Move Group compute_cartesian_path Function not working on UR5 Gazebo Simulation

Hi, I have been trying to familiarize myself with the rospy move group interface, particularly the compute_cartesian_path function. I am using the simulated UR5 from ros-industrial/universal_robot (http://wiki.ros.org/universal_robot). Whenever I run the function with an array of recorded waypoints, I get the error "Fail: ABORTED: Invalid goal constraints." In the terminal where MoveIt is running, I see "Computed Cartesian path with 1 points (followed 0.000000% of requested trajectory)", followed by an error similar to the first one. The robot does nothing, and the demonstration I recorded is not reproduced. I have had no luck finding a solution, so any help would be much appreciated. Below is the relevant segment of my code:

import sys
import copy

import numpy as np
import matplotlib.pyplot as plt
import rospy
import moveit_commander
import moveit_msgs.msg

# pose_to_xyz, xyz_to_pose, and smooth_a_trajectory are helper
# functions defined elsewhere in my script.

def move_ur5_record_demonstration_and_reproduce():
    # Initialize MoveIt
    moveit_commander.roscpp_initialize(sys.argv)

    # Set up the ROS node
    rospy.init_node('rob_rep_traj', anonymous=True)

    # Create robot commander, scene, and group
    robot = moveit_commander.RobotCommander()
    scene = moveit_commander.PlanningSceneInterface()
    group = moveit_commander.MoveGroupCommander("manipulator")

    # Publish to RViz to visualize trajectories
    display_trajectory_publisher = rospy.Publisher("/display_planned_path", moveit_msgs.msg.DisplayTrajectory, queue_size=1)
    rospy.sleep(3)

    # Set rate
    rt = rospy.Rate(100)

    # Define the starting point
    end_effector_link = group.get_end_effector_link()
    start_pose = group.get_current_pose(end_effector_link).pose
    print("Start pose: ")
    print(start_pose)

    # Record trajectory for the next 3 seconds
    pose_trajectory = [start_pose]
    print("Get ready to record a trajectory! You will have 3 seconds\n Starting in...\n")
    for num in [5, 4, 3, 2, 1]:
        print(str(num) + "...\n")
        rospy.sleep(1)
    print("Start moving the robot!!!!!")
    now = rospy.get_time()
    while rospy.get_time() <= (now + 3.0):
        pose_trajectory.append(copy.deepcopy(group.get_current_pose(end_effector_link).pose))
        rt.sleep()

    # Test the length
    print("Done! Recorded " + str(len(pose_trajectory)) + " poses!\n")
    rospy.sleep(1)

    # Set goal to move back to starting point
    group.set_pose_target(start_pose, end_effector_link)
    plan1 = group.plan()
    rospy.sleep(3)

    # Show plan in RViz
    print("Please view the return-to-start plan in RViz")
    display_trajectory = moveit_msgs.msg.DisplayTrajectory()
    display_trajectory.trajectory_start = robot.get_current_state()
    display_trajectory.trajectory.append(plan1)
    display_trajectory_publisher.publish(display_trajectory)
    rospy.sleep(10)

    # Move to starting point, then stop
    group.execute(plan1, wait=True)
    rospy.sleep(1)
    group.stop()
    group.clear_pose_targets()

    # Smooth the trajectory, convert back to pose objects
    smooth_traj = smooth_a_trajectory(pose_to_xyz(pose_trajectory), 50)
    smooth_pose_trajectory = xyz_to_pose(smooth_traj)

    # Add current state as first point in trajectory to reproduce
    np.insert(smooth_pose_trajectory, 0, copy.deepcopy(group.get_current_pose(end_effector_link).pose))

    # Plot the trajectory, show to user
    ax = plt.axes(projection='3d')
    ax.plot3D(smooth_traj[0], smooth_traj[1], smooth_traj[2], 'red')
    plt.show()

    # Get plan to reproduce
    print("Computing path plan to reproduce...\n")
    for pose in smooth_pose_trajectory:
        print(pose)
    (final_path_plan, fraction) = group.compute_cartesian_path(smooth_pose_trajectory, 0.01, 0.0)
    plan2 = group.plan()
    rospy.sleep(5)

    # Publish plan in RViz, give time to view
    print("Please view the reproduced path plan in RViz\n")
    display_trajectory = moveit_msgs.msg.DisplayTrajectory()
    display_trajectory.trajectory_start = robot.get_current_state()
    display_trajectory.trajectory.append(plan2)
    display_trajectory_publisher.publish(display_trajectory)
    rospy.sleep(10)

    # Go to goal and stop
    group.execute(plan2, wait=True)
    group.stop()
    group.clear_pose_targets()

    # Shut down MoveIt
    moveit_commander.roscpp_shutdown()

Thanks.

Hi,

I would start by using a pose that you know the robot arm can actually reach.
For example, position the arm with rqt joint movements and then read the end-effector xyz pose through TF. That way you will have a real, feasible pose. Then use that as the target and see if the motion works.
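
For anyone trying this, here is a minimal sketch of that TF lookup, assuming the standard universal_robot frame names base_link and tool0 (adjust them to match your URDF):

import rospy
import tf2_ros

rospy.init_node('check_ee_pose')

# Buffer + listener subscribe to /tf and cache the transforms
tf_buffer = tf2_ros.Buffer()
listener = tf2_ros.TransformListener(tf_buffer)

# Latest transform from base_link to tool0, waiting up to 4 s for it
t = tf_buffer.lookup_transform('base_link', 'tool0', rospy.Time(0), rospy.Duration(4.0))
print(t.transform.translation)  # real, feasible xyz of the end effector
print(t.transform.rotation)     # orientation as a quaternion

That pose is expressed in the base_link frame, so before comparing it against what get_current_pose() reports, check the pose's header.frame_id to make sure the frames match.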

Thanks for the response.

I have been getting the pose through the MoveGroupCommander interface in moveit_commander. Would that not suffice as well?

Hi,

It should suffice in theory, but the TF check is just to be sure about everything related to the frames.
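
For completeness, a quick way to print the frames MoveIt itself is using (these are standard moveit_commander calls; "manipulator" is the UR5 group name from the question):

import sys
import rospy
import moveit_commander

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node('frame_check', anonymous=True)
group = moveit_commander.MoveGroupCommander("manipulator")

print(group.get_planning_frame())                # frame plans are computed in
print(group.get_pose_reference_frame())          # frame pose targets are interpreted in
print(group.get_current_pose().header.frame_id)  # frame of the reported pose
print(group.get_end_effector_link())             # link used as the end effector

If these agree with the frames used in the TF lookup, the poses from the commander should be equivalent.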
