Hello everyone, I have a small problem and a question at the same time; I am not sure if this is just a lack of understanding on my part.
Can I make the robot move according to information from another sensor? For example, I have an ultra-wideband (UWB) sensor that publishes position information for my TurtleBot Burger on a topic. Now I want the robot to drive to a certain point based on the position provided by the UWB sensor, not on the information derived from the lidar (like amcl_pose). Is this actually possible?
What I want to do is:
The robot should have a certain target to drive to, for example the point (X = 1000 mm, Y = 1200 mm). (I have already written code for this part; that was easy.)
Before the robot drives to this point, it must know where it is and move towards (X = 1000 mm, Y = 1200 mm) accordingly. This position information should be taken from the UWB topic and not from amcl_pose.
Yes, there are many different methods available for pose estimation and robot localization. I am assuming that you are familiar with the ROS concept of TF (transforms), robot localization and the Navigation Stack. The ROS navigation/localization packages revolve around three key coordinate frames: /map, /odom and /base_link. You can create these reference frames using all kinds of different sensors, for instance wheel odometry, visual odometry, IMU, laser scan matching, GPS, visual markers, UWB sensors and others.
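To make the frame idea concrete: a localization source (AMCL, or your UWB node in its place) typically publishes the /map -> /odom correction, which composed with wheel odometry (/odom -> /base_link) yields the robot's pose in the map. A minimal sketch of that composition in 2D, with made-up example numbers (not from any real robot):

```python
import math

def compose(a, b):
    """Compose two 2-D poses (x, y, theta): apply b in a's frame."""
    ax, ay, at = a
    bx, by, bt = b
    return (ax + math.cos(at) * bx - math.sin(at) * by,
            ay + math.sin(at) * bx + math.cos(at) * by,
            at + bt)

def invert(p):
    """Invert a 2-D pose (x, y, theta)."""
    x, y, t = p
    return (-math.cos(t) * x - math.sin(t) * y,
            math.sin(t) * x - math.cos(t) * y,
            -t)

# Hypothetical readings: the UWB sensor gives the robot's pose in the
# map frame, wheel odometry gives its pose in the odom frame.
map_base = (1.0, 1.2, math.pi / 2)   # map  -> base_link (from UWB)
odom_base = (0.3, 0.1, math.pi / 2)  # odom -> base_link (from odometry)

# The localization node's job: publish the map -> odom correction.
map_odom = compose(map_base, invert(odom_base))

# Composing the correction with odometry recovers the map-frame pose.
recovered = compose(map_odom, odom_base)
```

In ROS this composition is exactly what TF does for you once your UWB node broadcasts the /map -> /odom transform instead of AMCL.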
I think this is a very broad question; there really is no short explanation. An answer would not only need to address TFs, robot navigation and sensor fusion (to name a few) but also the specifics of your own hardware.
If any of these subjects is new to you, I can recommend the following courses:
Note, however, that these courses cover the most general cases (lidar sensors, wheel odometry) and are not specific to UWB sensors. Regardless, you should be able to take your project a few steps further and reach a stage where you have more specific questions or roadblocks to solve.
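As a concrete starting point for the "drive to (X, Y) using the UWB pose" part, here is a minimal go-to-goal sketch. It is just the control law: in a real node you would call it from the callback of your UWB pose subscriber and publish the result as a geometry_msgs/Twist on /cmd_vel (topic name and gains are assumptions, not tuned for a real Burger):

```python
import math

def go_to_goal(x, y, theta, goal_x, goal_y,
               k_lin=0.5, k_ang=1.5, tolerance=0.05):
    """Proportional go-to-goal law (a sketch, not a tuned controller).

    (x, y, theta): current pose from the UWB topic, in metres/radians.
    Returns (linear, angular) velocity commands; (0, 0) once at the goal.
    """
    dx, dy = goal_x - x, goal_y - y
    distance = math.hypot(dx, dy)
    if distance < tolerance:
        return 0.0, 0.0                          # close enough: stop
    heading = math.atan2(dy, dx)                 # direction to the goal
    error = math.atan2(math.sin(heading - theta),
                       math.cos(heading - theta))  # wrap to [-pi, pi]
    return k_lin * distance, k_ang * error

# Robot at the origin facing +X, goal at (1.0, 1.2) m, i.e. the
# (1000 mm, 1200 mm) target from the question:
v, w = go_to_goal(0.0, 0.0, 0.0, 1.0, 1.2)
```

In practice you would also clamp v and w to the Burger's velocity limits and turn in place first when the heading error is large.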