How to use the resulting model from Deep Q-learning (from live class 32)

I have followed the video from live class 32 and pretty much understand what is going on. However, once we have the final trained model, what are the steps to use it with the simulation or a real robot?

Thanks

In theory, if you did the training in simulation and the simulation is accurate enough, transferring to the real robot is just a matter of loading the trained model and executing the actions it outputs. Please let us know if you have a practical question or something related to a real robot implementation. We would love to see that!
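As a rough illustration of that idea, here is a minimal sketch: the policy trained in simulation is loaded from its saved file and then stepped against whatever environment wraps the real robot. The environment id "MyRealCubeEnv-v0" and the model path are placeholders for whatever you register for your own robot; only the environment changes, the loading code stays the same.

#!/usr/bin/env python
# Sketch only: assumes you have registered a Gym environment that wraps the
# real robot's sensors and actuators (the id "MyRealCubeEnv-v0" is a placeholder).
import gym
from baselines import deepq

env = gym.make("MyRealCubeEnv-v0")  # placeholder id for your real-robot environment

# Restore the policy trained in simulation; total_timesteps=0 means no training happens here.
act = deepq.learn(
    env,
    network='mlp',
    total_timesteps=0,
    load_path="/path/to/movingcube_model.pkl"  # placeholder path to the saved model
)

obs, done = env.reset(), False
while not done:
    # The same network that was trained in simulation now picks actions for the real robot.
    obs, _, done, _ = env.step(act(obs[None])[0])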

Thank you. Once I have finished training the robot in simulation and am ready to implement it, I might have more questions.

Here is some example code that uses a trained and saved model:

#!/usr/bin/env python
import os
import gym
from baselines import deepq
import rospy
import rospkg


def main():
    # Initialize a ROS node so the openai_ros-based environment can talk to the
    # simulation (the node name is just illustrative)
    rospy.init_node('movingcube_deepq_run', anonymous=True, log_level=rospy.WARN)

    env = gym.make("MyMovingCubeOneDiskWalkEnv-v0")

    # Set the path where the learned model was saved
    rospack = rospkg.RosPack()
    pkg_path = rospack.get_path('my_moving_cube_pkg')
    models_dir_path = os.path.join(pkg_path, "models_saved")
    if not os.path.exists(models_dir_path):
        os.makedirs(models_dir_path)

    out_model_file_path = os.path.join(models_dir_path, "movingcube_model.pkl")

    # Load the trained model. With total_timesteps=0 no new training is done;
    # deepq.learn just restores the weights from load_path and returns the policy.
    # Set total_timesteps > 0 if you want to continue training the saved model.
    act = deepq.learn(
        env,
        network='mlp',
        total_timesteps=0,
        load_path=out_model_file_path
    )

    # Run the trained policy: reset the environment and keep executing the
    # action the network chooses for the current observation until the episode ends.
    while True:
        obs, done = env.reset(), False
        episode_rew = 0
        while not done:
            obs, rew, done, _ = env.step(act(obs[None])[0])
            episode_rew += rew
        print("Episode reward", episode_rew)


if __name__ == '__main__':
    main()
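For completeness, the example above assumes that movingcube_model.pkl was written at the end of training. With baselines' deepq, the object returned by deepq.learn can be saved to that same path; a short sketch, reusing the same out_model_file_path as above (the total_timesteps value is just illustrative):

# At the end of your training script:
act = deepq.learn(
    env,
    network='mlp',
    total_timesteps=100000,  # illustrative value, use whatever you trained with
)
print("Saving model to " + out_model_file_path)
act.save(out_model_file_path)  # writes movingcube_model.pkl, later restored via load_path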

Thank you very much.