Timing precision using ticks
I have set up a control task that starts a frequency generator and then delays itself for a set number of millisecond ticks. The idea is that when it resumes, it turns the frequency generator off and an exactly known number of rising edges will have occurred.
My question is: just how accurate is tick timing? Presumably, I could run the frequency generator at 1 kHz for 1000 ticks and get exactly 1000 rising edges. What are the odds I'd get 999 or 1001?
Now, pushing the limits: if I were running at 2, 5, or 10 kHz, how likely am I to be able to stop the frequency generator at exactly the right number of rising edges?
For the sake of argument, let's say there is only ONE task running.
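Roughly, the task looks like this (a minimal sketch assuming FreeRTOS; start_frequency_generator() and stop_frequency_generator() are hypothetical placeholders for my hardware code):

#include "FreeRTOS.h"
#include "task.h"

/* Hypothetical hardware helpers - not part of FreeRTOS. */
extern void start_frequency_generator( void );
extern void stop_frequency_generator( void );

static void vControlTask( void *pvParameters )
{
    ( void ) pvParameters;

    for( ;; )
    {
        start_frequency_generator();

        /* Block for 1000 ticks (1 ms tick assumed).  At a 1 kHz generator
           frequency, the hope is that exactly 1000 rising edges occur
           inside this window. */
        vTaskDelay( pdMS_TO_TICKS( 1000 ) );

        stop_frequency_generator();

        /* ...process the result, then repeat... */
    }
}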
Thanks for your input,
Matt
Timing precision using ticks
I don't think it can be guaranteed, because critical sections (during which the tick interrupt is held off) cause jitter in the tick.
If the port you are using supports interrupt nesting, then you can run an interrupt above the system call interrupt priority to avoid that jitter, and it should work well.
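For example, something along these lines (a sketch only; TIMER_IRQHandler and stop_frequency_generator() are hypothetical names, and the priority setup is port specific):

/* Timer ISR configured at a priority logically above
   configMAX_SYSCALL_INTERRUPT_PRIORITY.  The kernel's critical sections
   never mask interrupts at that level, so there is no tick jitter - but
   the handler must NOT call any FreeRTOS API functions. */
void TIMER_IRQHandler( void )
{
    /* Clear the timer's interrupt flag here (device specific). */

    /* Touch the hardware directly; no kernel calls at this priority. */
    stop_frequency_generator();
}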
Also check the timer peripheral to see if you can do what you want purely using the peripheral hardware.
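Many timers can, for example, generate a programmed number of periods and then stop themselves (one-pulse or repetition-counter modes). A sketch with hypothetical register names - check the device reference manual for the real ones:

#define EDGE_COUNT    1000u   /* rising edges wanted */

void start_exact_burst( void )
{
    TIMER->RELOAD = TIMER_CLOCK_HZ / 1000u;   /* 1 kHz output frequency */
    TIMER->REPEAT = EDGE_COUNT - 1u;          /* auto-stop after EDGE_COUNT periods */
    TIMER->CTRL  |= TIMER_ONE_SHOT | TIMER_ENABLE;

    /* The timer halts itself after exactly EDGE_COUNT edges, so the count
       is independent of the CPU and of any RTOS tick jitter. */
}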
Regards.
Timing precision using ticks
Richard, thanks for your input.
I'm almost at the point where the OS is just a big waste of code space if I take that function out of a task… all I'll be left with is a com task, which could just as easily be polled from the main loop, and a bunch of interrupts.