I'm getting unexpected timer delays: the measured time is always one millisecond longer than expected. For example, when I have
~~~
#define configTICK_RATE_HZ ( 1000 )

xTimer1 = xTimerCreate( "clock1",      /* timer name */
                        1,             /* period: 1 tick */
                        pdTRUE,        /* auto-reload */
                        NULL,          /* timer ID */
                        clockTimer1 ); /* callback */
xTimerReset( xTimer1, portMAX_DELAY );
~~~
I would expect the timer callback to fire with a 1 millisecond delay; instead I measure 2 ms using std::chrono.
When I have configTICK_RATE_HZ ( 100 ) and keep the rest the same, I would expect a 10 millisecond delay, but instead I measure 11 milliseconds.