Parrot Drone Environment

A drone simulation for testing your AI algorithms with a flying robot equipped with two cameras and a sonar.

Parrot Drone Simulation

Robot Environment

openai_ros.robot_envs.parrotdrone_env module

class openai_ros.robot_envs.parrotdrone_env.ParrotDroneEnv[source]

Bases: openai_ros.robot_gazebo_env.RobotGazeboEnv

Superclass for all ParrotDrone environments.


Initializes a new ParrotDroneEnv environment.

To check any topic we need to have the simulation running, which requires two things: 1) Unpause the simulation: without this the stream of data does not flow. This applies to simulations that are paused for whatever reason. 2) If the simulation was already running, we need to reset the controllers. This has to do with the fact that some plugins, such as tf, do not understand a simulation reset and must themselves be reset to work properly.

The Sensors: the accessible sensors are the ones considered useful for AI learning.

Sensor Topic List:

* /drone/down_camera/image_raw: RGB camera facing down.
* /drone/front_camera/image_raw: RGB camera facing front.
* /drone/imu: IMU of the drone, giving acceleration and orientation relative to the world.
* /drone/sonar: Sonar readings facing front.
* /drone/gt_pose: Ground-truth position and orientation in global space.
* /drone/gt_vel: Ground-truth linear velocity; the angular part does not record anything.

Actuators Topic List:

* /cmd_vel: Move the drone around once it has taken off.
* /drone/takeoff: Publish to it to take off.
* /drone/land: Publish to it to make the ParrotDrone land.
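The actuator topics above follow a simple pattern: publish to /drone/takeoff first, send velocity commands to /cmd_vel while airborne, then publish to /drone/land. A minimal sketch of that sequence, using a stub class in place of rospy.Publisher (an assumption, so the snippet runs without ROS; in a live session you would publish std_msgs/Empty to the takeoff/land topics and geometry_msgs/Twist to /cmd_vel):

```python
class StubPublisher:
    """Stands in for rospy.Publisher; records what was published."""
    def __init__(self, topic):
        self.topic = topic
        self.messages = []

    def publish(self, msg):
        self.messages.append(msg)

takeoff_pub = StubPublisher('/drone/takeoff')
cmd_vel_pub = StubPublisher('/cmd_vel')
land_pub = StubPublisher('/drone/land')

takeoff_pub.publish('Empty()')           # take off first ...
cmd_vel_pub.publish({'linear_x': 0.5})   # ... /cmd_vel only has effect once airborne ...
land_pub.publish('Empty()')              # ... and land when the episode is over
```

The important point is the ordering: /cmd_vel messages are ignored until the drone has taken off.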


__module__ = 'openai_ros.robot_envs.parrotdrone_env'

Checks that all the publishers are working.


Checks that all the sensors, publishers and other simulation systems are operational.

_compute_reward(observations, done)[source]

Calculates the reward to give based on the observations given.


Initialises the variables that need to be reset at the start of each episode.


Checks whether the episode is done, based on the observations given.


Applies the given action to the simulation.


Sets the robot to its initial pose.

check_array_similar(ref_value_array, check_value_array, epsilon)[source]

It checks whether check_value_array is similar to ref_value_array, within epsilon.
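The exact tolerance rule is not shown in the docstring; a plausible sketch, assuming an element-wise absolute-difference check against epsilon (the helper name arrays_similar is illustrative, not the library's):

```python
def arrays_similar(ref_value_array, check_value_array, epsilon):
    """Return True if every element of check_value_array is within
    epsilon of the corresponding element of ref_value_array."""
    return all(abs(ref - chk) <= epsilon
               for ref, chk in zip(ref_value_array, check_value_array))

print(arrays_similar([0.0, 0.5, 1.0], [0.01, 0.49, 1.04], 0.05))  # True
```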


Sends the landing command and checks that the drone has landed. It unpauses the simulation and pauses it again afterwards, so the call is a self-contained action.

move_base(linear_speed_vector, angular_speed, epsilon=0.05, update_rate=10)[source]

It will move the base based on the linear and angular speeds given, and wait until those twists are achieved, reading from the odometry topic.

:param linear_speed_vector: Speed along the XYZ axes of the robot base frame (drones can move in any direction).
:param angular_speed: Angular turning speed of the robot base frame (this drone only turns about the Z axis).
:param epsilon: Acceptable difference between the requested speed and the odometry readings.
:param update_rate: Rate at which we check the odometry.


Sends the takeoff command and checks that the drone has taken off. It unpauses the simulation and pauses it again afterwards, so the call is a self-contained action.

wait_for_height(heigh_value_to_check, smaller_than, epsilon, update_rate)[source]

Checks whether the current height is smaller or bigger than a given value.

:param heigh_value_to_check: Height threshold to compare against.
:param smaller_than: If True, we will wait until the height is smaller than the value given; otherwise until it is bigger.
:param epsilon: Acceptable error in the height readings.
:param update_rate: Rate at which we check the height.
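Presumably wait_for_height polls the drone's z position at update_rate until the condition holds. A sketch of that stopping condition, assuming epsilon acts as a tolerance band around the threshold (the function name height_condition_met is illustrative):

```python
def height_condition_met(current_height, heigh_value_to_check,
                         smaller_than, epsilon):
    """True once the height has crossed the target in the requested
    direction, allowing an epsilon tolerance around the threshold."""
    if smaller_than:
        # e.g. waiting to descend below a landing height
        return current_height <= heigh_value_to_check + epsilon
    # e.g. waiting to climb above a takeoff height
    return current_height >= heigh_value_to_check - epsilon
```

In the real method this predicate would be evaluated on each new altitude reading until it returns True.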


Because this ParrotDrone's position is global, we do not really have a way to know whether it is moving in the desired direction: that would require evaluating the difference in position and speed in the local reference frame.

wait_until_twist_achieved(cmd_vel_value, epsilon, update_rate)[source]

# TODO: Make it work using TF conversions

We wait for the given cmd_vel twist to be reached by the robot, reading from the odometry.

:param cmd_vel_value: Twist we want to wait to be reached.
:param epsilon: Acceptable error in the odometry readings.
:param update_rate: Rate at which we check the odometry.
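The acceptance test applied to each odometry reading is not spelled out; a sketch, assuming it compares the linear x/y/z and angular z components one by one against the commanded twist within epsilon (the dict-based twist_achieved helper is illustrative; the real code would compare geometry_msgs/Twist fields):

```python
def twist_achieved(cmd_vel, odom_vel, epsilon):
    """cmd_vel / odom_vel: dicts with keys 'lx', 'ly', 'lz' (linear m/s)
    and 'az' (angular rad/s).  True when every component of the odometry
    reading is within epsilon of the commanded value."""
    return all(abs(cmd_vel[key] - odom_vel[key]) <= epsilon
               for key in ('lx', 'ly', 'lz', 'az'))
```

wait_until_twist_achieved would evaluate this on each odometry message, sleeping between checks at update_rate, and return once it is True.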