Hi,
when I launch the Qlearn RL algorithm, I get an error in the Task environment of this type:
File "/home/user/catkin_ws/src/my_moving_cube_pkg/scripts/my_one_disk_walk.py", line 81, in __init__
super(MyMovingCubeOneDiskWalkEnv, self).__init__()
TypeError: __init__() takes exactly 2 arguments (1 given)
[movingcube_gym-1] process has died [pid 3988, exit code 1,
I think this is because the Robot Environment class constructor requires an extra argument (the initial roll velocity).
I sorted it out like this:
super(MyMovingCubeOneDiskWalkEnv, self).__init__(self.init_roll_vel)
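For context, the Python 2 message "takes exactly 2 arguments (1 given)" counts self, so the parent's __init__ expects one argument besides self that the super() call is not passing. A minimal standalone sketch of that mechanism (the class names mirror the ones in the course files, but this is plain illustrative Python, not the actual openai_ros code):

```python
class MyCubeSingleDiskEnv(object):
    # The parent requires one extra argument besides self.
    def __init__(self, init_roll_vel):
        self.init_roll_vel = init_roll_vel


class BrokenTaskEnv(MyCubeSingleDiskEnv):
    def __init__(self):
        # Missing argument: raises TypeError
        # (Python 2: "__init__() takes exactly 2 arguments (1 given)")
        super(BrokenTaskEnv, self).__init__()


class FixedTaskEnv(MyCubeSingleDiskEnv):
    def __init__(self):
        # Define the value first (in the course it comes from the
        # ROS parameter server), then pass it up to the parent.
        self.init_roll_vel = 1.0
        super(FixedTaskEnv, self).__init__(self.init_roll_vel)


try:
    BrokenTaskEnv()
except TypeError as e:
    print("Broken:", e)

env = FixedTaskEnv()
print("Fixed, init_roll_vel =", env.init_roll_vel)
```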
Extra info (the involved files, as explained in the Jupyter notebook):
my_cube_single_disk_env.py
my_one_disk_walk.py
qlearn.py
start_training.py
Am I right?
Hi!
You can check the __init__ values needed in the class "MyCubeSingleDiskEnv", which "MyMovingCubeOneDiskWalkEnv" inherits from. Whatever arguments the __init__() of MyCubeSingleDiskEnv declares are what it needs to start.
If you go to the file my_cube_single_disk_env.py:
#! /usr/bin/env python

from openai_ros import robot_gazebo_env


class MyCubeSingleDiskEnv(robot_gazebo_env.RobotGazeboEnv):
    """Superclass for all CubeSingleDisk environments."""

    def __init__(self, init_roll_vel):
        """Initializes a new CubeSingleDisk environment."""
        # Variables that we give through the constructor.
        self.init_roll_vel = init_roll_vel

        self.controllers_list = ['my_robot_controller1', 'my_robot_controller2', ..., 'my_robot_controllerX']

        self.robot_name_space = "my_robot_namespace"

        reset_controls_bool = True or False

        # We launch the init function of the Parent Class robot_gazebo_env.RobotGazeboEnv
        super(MyCubeSingleDiskEnv, self).__init__(controllers_list=self.controllers_list,
                                                  robot_name_space=self.robot_name_space,
                                                  reset_controls=reset_controls_bool)
As you can see, it needs that init_roll_vel as an __init__ input.
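So the fix in the Task environment is to define self.init_roll_vel (in the course it is read from the ROS parameter server) and pass it up. Here is a hedged sketch of how the value travels through the three constructors, using simplified stub classes in place of the real openai_ros ones (the controller names and the 0.0 default are illustrative assumptions, not the course values):

```python
# Stubs that mimic the openai_ros constructor chain. The argument names
# match the quoted code above, but the bodies are simplified.
class RobotGazeboEnv(object):
    def __init__(self, controllers_list, robot_name_space, reset_controls):
        self.controllers_list = controllers_list
        self.robot_name_space = robot_name_space
        self.reset_controls = reset_controls


class MyCubeSingleDiskEnv(RobotGazeboEnv):
    def __init__(self, init_roll_vel):
        # Robot Environment: receives the value and stores it.
        self.init_roll_vel = init_roll_vel
        super(MyCubeSingleDiskEnv, self).__init__(
            controllers_list=['my_robot_controller1'],  # illustrative
            robot_name_space="my_robot_namespace",
            reset_controls=True)


class MyMovingCubeOneDiskWalkEnv(MyCubeSingleDiskEnv):
    def __init__(self):
        # Task Environment: must define the value before calling super(),
        # e.g. from rospy.get_param('/moving_cube/init_roll_vel') in the
        # real file. 0.0 is a placeholder here.
        self.init_roll_vel = 0.0
        super(MyMovingCubeOneDiskWalkEnv, self).__init__(self.init_roll_vel)


env = MyMovingCubeOneDiskWalkEnv()
print(env.init_roll_vel, env.robot_name_space)
```

Calling super().__init__() with no argument at the Task level is exactly what produced your TypeError, so yes, passing self.init_roll_vel is the right fix.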