Hi @jmledran ,
So, when you set your /cmd_vel rate to 20 Hz, your controller's measured cycle time is around 0.0525 seconds, and when you set /cmd_vel to 50 Hz, the cycle time is around 0.0250 seconds.
20 Hz = 0.05 sec, so when the controller takes a little more time than that, you must set the controller frequency to a smaller value, for example 16 Hz, so that the allowed period is 0.0625 secs > 0.05xx secs. [You cannot use 18 Hz, because 18 Hz = 0.0556 secs, which is very close to the 0.0530 secs I see in the picture you posted, so the controller can still miss a few steps.]
Also, it is better to keep all frequencies as even numbers or multiples of 5. So, {12, 14, 15, 16, 18, 20, …} are good and {11, 13, 17, 19, …} are bad.
Similarly, for 50 Hz = 0.02 sec, you set the controller frequency to 32 Hz, so 0.03125 secs > 0.02xx secs. Here again, you can't set 35 Hz, since 35 Hz = 0.0286 secs, which is close to the maximum time in your terminal output (0.0280 secs), so chances are this might still miss a few controller iterations. Therefore, 32 Hz is better than 35 Hz.
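The headroom check above can be sketched in a few lines of Python. This is just an illustration of the reasoning, not anything from move_base itself; the function names (`period`, `has_headroom`) and the 0.003 s safety margin are my own choices:

```python
def period(hz):
    """Convert a frequency in Hz to a period in seconds."""
    return 1.0 / hz

def has_headroom(controller_hz, worst_cycle_s, margin=0.003):
    """True if the controller period exceeds the worst observed
    cycle time by at least `margin` seconds (margin is arbitrary)."""
    return period(controller_hz) >= worst_cycle_s + margin

# Worst cycle times from your outputs: 0.0530 s at 20 Hz, 0.0280 s at 50 Hz.
print(has_headroom(16, 0.0530))  # True  (0.0625 s period, plenty of room)
print(has_headroom(18, 0.0530))  # False (0.0556 s is too close to 0.0530 s)
print(has_headroom(32, 0.0280))  # True  (0.03125 s period)
print(has_headroom(35, 0.0280))  # False (0.0286 s is too close to 0.0280 s)
```

This matches the picks above: 16 Hz and 32 Hz pass, 18 Hz and 35 Hz fail.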
Some math:
16 / 20 = 0.8
32 / 50 = 0.64
So, as the cmd_vel frequency increases, the safe ratio of controller frequency to cmd_vel frequency decreases. I am not sure if the relationship is linear or exponential from the data provided. So I believe the safest relation between the rate of /cmd_vel and the controller frequency could be given as:
controller_frequency <= (0.5 * cmd_vel_frequency)
to be on the safe side and keep things working.
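The rule of thumb above, combined with the even/multiple-of-5 preference, can be sketched like this. The helper names (`nice_frequency`, `safe_controller_frequency`) are hypothetical, not move_base parameters:

```python
def nice_frequency(max_hz):
    """Largest integer frequency <= max_hz that is even or a
    multiple of 5 (the 'good' frequencies listed earlier)."""
    for hz in range(int(max_hz), 0, -1):
        if hz % 2 == 0 or hz % 5 == 0:
            return hz
    return 1

def safe_controller_frequency(cmd_vel_hz):
    """Apply controller_frequency <= 0.5 * cmd_vel_frequency,
    then round down to a 'nice' frequency."""
    return nice_frequency(0.5 * cmd_vel_hz)

print(safe_controller_frequency(20))  # 10
print(safe_controller_frequency(50))  # 25
```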
So, if you set the cmd_vel frequency to 20 Hz and the controller frequency to 10 Hz, 10 iterations per second is still fast! In my opinion, anything less than 4 Hz is slow, and anything less than 1 Hz is super slow.
This is my observation from the outputs that you have posted.
Let me know if this helps you!
Regards,
Girish