timer in debug single step?
I’ve used the following as a means to do a relatively simple timer.
SystemTickCnt is a variable, declared "volatile long", that is incremented once per system clock tick of 1 ms.
The code below implements a blocking millisecond delay. It's accurate (as accurate as the timer itself) and portable between processors running at different speeds.
In the debugger, if I single-step the following code, it loops forever, when it should exit once SystemTickCnt passes the captured value of SystemTickCnt+5. The code works fine outside of debug.
dlycnt = SystemTickCnt+5;
while(dlycnt >= SystemTickCnt);
When I single-step, it reaches the first line, then loops forever on the while. If I let it run to that first line (without executing it), then disable breakpoints and resume, the code still apparently loops forever, as it never reaches any breakpoint placed after this loop.
I read that by default the timer peripheral is not stopped during debug, but for some reason the >= condition is never overcome.
Any ideas?