Parrot Drone Environment

A drone simulation for testing your AI algorithms on a flying robot equipped with two cameras and a sonar.

Parrot Drone Simulation

Robot Environment

openai_ros.robot_envs.parrotdrone_env module

class openai_ros.robot_envs.parrotdrone_env.ParrotDroneEnv[source]

Bases: openai_ros.robot_gazebo_env.RobotGazeboEnv

Superclass for all ParrotDrone environments.

__init__()[source]

Initializes a new ParrotDroneEnv environment.

To check any topic, we need the simulation running, so we need to do two things:
1) Unpause the simulation: without this the stream of data does not flow. This applies to simulations that are paused for whatever reason.
2) If the simulation was already running for some reason, we need to reset the controllers. This has to do with the fact that some plugins with tf do not understand the reset of the simulation and need to be reset themselves to work properly.

The Sensors: The accessible sensors are the ones considered useful for AI learning.

Sensor Topic List:
* /drone/down_camera/image_raw: RGB camera facing down.
* /drone/front_camera/image_raw: RGB camera facing front.
* /drone/imu: IMU of the drone, giving acceleration and orientation relative to the world.
* /drone/sonar: Sonar readings facing front.
* /drone/gt_pose: Position and orientation in global space.
* /drone/gt_vel: Linear velocity; the angular part does not record anything.

Actuators Topic List:
* /cmd_vel: Moves the drone around once it has taken off.
* /drone/takeoff: Publish into it to take off.
* /drone/land: Publish into it to make the ParrotDrone land.

A minimal sketch exercising these topics directly is shown below.

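As a quick illustration of the topic lists above, here is a minimal, standalone rospy sketch that exercises those topics directly, outside of ParrotDroneEnv. The message types (std_msgs/Empty for takeoff and land, geometry_msgs/Twist for /cmd_vel, sensor_msgs/Range for the sonar) are assumptions based on standard ROS packages, not something this class documents:

    #!/usr/bin/env python
    # Hypothetical standalone sketch: exercise the documented topics with plain rospy.
    import rospy
    from std_msgs.msg import Empty
    from geometry_msgs.msg import Twist
    from sensor_msgs.msg import Range

    rospy.init_node('parrotdrone_topics_demo')

    takeoff_pub = rospy.Publisher('/drone/takeoff', Empty, queue_size=1)
    land_pub = rospy.Publisher('/drone/land', Empty, queue_size=1)
    cmd_vel_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)

    rospy.sleep(1.0)              # give the publishers time to connect
    takeoff_pub.publish(Empty())  # take off before sending velocity commands

    twist = Twist()
    twist.linear.x = 0.5          # move forward at 0.5 m/s
    cmd_vel_pub.publish(twist)

    # Read one sonar measurement (assumed to be sensor_msgs/Range)
    sonar = rospy.wait_for_message('/drone/sonar', Range, timeout=5.0)
    rospy.loginfo("Sonar range: %.2f m", sonar.range)

    land_pub.publish(Empty())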

__module__ = 'openai_ros.robot_envs.parrotdrone_env'
_check_all_publishers_ready()[source]

Checks that all the publishers are working.

_check_all_sensors_ready()[source]
_check_all_systems_ready()[source]

Checks that all the sensors, publishers and other simulation systems are operational.

_check_cmd_vel_pub_connection()[source]
_check_down_camera_rgb_image_raw_ready()[source]
_check_front_camera_rgb_image_raw_ready()[source]
_check_gt_pose_ready()[source]
_check_gt_vel_ready()[source]
_check_imu_ready()[source]
_check_land_pub_connection()[source]
_check_sonar_ready()[source]
_check_takeoff_pub_connection()[source]
_compute_reward(observations, done)[source]

Calculates the reward to give based on the observations given.

_down_camera_rgb_image_raw_callback(data)[source]
_front_camera_rgb_image_raw_callback(data)[source]
_get_obs()[source]
_gt_pose_callback(data)[source]
_gt_vel_callback(data)[source]
_imu_callback(data)[source]
_init_env_variables()[source]

Initializes the variables that need to be reset each time an episode starts.

_is_done(observations)[source]

Checks whether the episode is done, based on the given observations.

_set_action(action)[source]

Applies the given action to the simulation.

_set_init_pose()[source]

Sets the robot in its initial pose.
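The methods above (_set_init_pose, _init_env_variables, _set_action, _get_obs, _is_done, _compute_reward) are virtual: they are filled in by the task environments that inherit from ParrotDroneEnv. Below is a hypothetical sketch of such a subclass; the class name, action mapping, observation, and reward logic are illustrative only, and it assumes that linear_speed_vector is a geometry_msgs Vector3 and that get_gt_pose() returns a geometry_msgs Pose:

    # Hypothetical task environment built on top of ParrotDroneEnv.
    import numpy
    from geometry_msgs.msg import Vector3
    from openai_ros.robot_envs import parrotdrone_env

    class MyParrotDroneTaskEnv(parrotdrone_env.ParrotDroneEnv):

        def _set_init_pose(self):
            # Start every episode hovering in place (zero twist).
            self.move_base(Vector3(0.0, 0.0, 0.0), 0.0, epsilon=0.05, update_rate=10)
            return True

        def _init_env_variables(self):
            self.cumulated_reward = 0.0

        def _set_action(self, action):
            # Map a discrete action index to a velocity command (illustrative).
            forward, yaw = {0: (0.5, 0.0), 1: (-0.5, 0.0), 2: (0.0, 0.5)}[action]
            self.move_base(Vector3(forward, 0.0, 0.0), yaw)

        def _get_obs(self):
            pose = self.get_gt_pose()
            return numpy.array([pose.position.x, pose.position.y, pose.position.z])

        def _is_done(self, observations):
            # End the episode if the drone climbs above 3 m (arbitrary threshold).
            return observations[2] > 3.0

        def _compute_reward(self, observations, done):
            # Reward staying airborne; penalize ending the episode.
            return -10.0 if done else 1.0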

_sonar_callback(data)[source]
check_array_similar(ref_value_array, check_value_array, epsilon)[source]

Checks whether check_value_array is similar to ref_value_array, within the given epsilon.
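A hypothetical call, assuming env is an already-constructed ParrotDroneEnv and that plain Python lists are accepted as arrays:

    # Compare a commanded velocity vector against a measured one (hypothetical values).
    commanded = [0.5, 0.0, 0.0]
    measured = [0.48, 0.01, -0.02]
    similar = env.check_array_similar(commanded, measured, epsilon=0.05)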

get_down_camera_rgb_image_raw()[source]
get_front_camera_rgb_image_raw()[source]
get_gt_pose()[source]
get_gt_vel()[source]
get_imu()[source]
get_sonar()[source]
land()[source]

Sends the landing command and checks that the drone has landed. It unpauses the simulation and pauses it again so that it is a self-contained action.

move_base(linear_speed_vector, angular_speed, epsilon=0.05, update_rate=10)[source]

It will move the base based on the given linear and angular speeds, and it will wait until those twists are achieved, reading from the odometry topic.
:param linear_speed_vector: Speed in the XYZ axes of the robot base frame, because drones can move in any direction.
:param angular_speed: Angular turning speed of the robot base frame, because this drone only turns on the Z axis.
:param epsilon: Acceptable difference between the requested speed and the odometry readings.
:param update_rate: Rate at which we check the odometry.
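A hypothetical call, assuming env is an already-initialised ParrotDroneEnv (or a task environment built on it) and that linear_speed_vector is a geometry_msgs Vector3:

    from geometry_msgs.msg import Vector3

    # Climb at 0.3 m/s while yawing at 0.2 rad/s, then block until the
    # readings report those speeds within epsilon.
    climb = Vector3(0.0, 0.0, 0.3)
    env.move_base(climb, angular_speed=0.2, epsilon=0.05, update_rate=10)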

takeoff()[source]

Sends the takeoff command and checks that the drone has taken off. It unpauses the simulation and pauses it again so that it is a self-contained action.
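A hypothetical episode skeleton using takeoff() and land(); both handle the unpause/pause internally, so no extra Gazebo handling is needed around them. env is an assumed ParrotDroneEnv instance, and get_gt_pose() is assumed to return a geometry_msgs Pose:

    import rospy

    env.takeoff()                    # blocks until the drone reports having taken off
    pose = env.get_gt_pose()         # read the ground-truth pose after takeoff
    rospy.loginfo("Height after takeoff: %.2f m", pose.position.z)
    env.land()                       # blocks until the drone reports having landed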

wait_for_height(heigh_value_to_check, smaller_than, epsilon, update_rate)[source]

Waits until the current height is smaller or bigger than a given value.
:param heigh_value_to_check: Height value to compare the current height against.
:param smaller_than: If True, we will wait until the height is smaller than the given value; otherwise, until it is bigger.
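A hypothetical use right after a takeoff command, waiting until the drone is above 0.8 m (i.e. smaller_than=False); env is an assumed ParrotDroneEnv instance, and the meaning of epsilon and update_rate is inferred from the other waiting helpers:

    env.takeoff()
    # Wait until the drone's height is bigger than 0.8 m, checking at 10 Hz
    # with a 0.05 m tolerance.
    env.wait_for_height(heigh_value_to_check=0.8,
                        smaller_than=False,
                        epsilon=0.05,
                        update_rate=10)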

wait_time_for_execute_movement()[source]

Because this Parrot Drone's position is global, there is no direct way to know whether it is moving in the desired direction; that would require evaluating the difference in position and speed in the local reference frame. Instead, this method simply waits a fixed amount of time for the movement to execute.

wait_until_twist_achieved(cmd_vel_value, epsilon, update_rate)[source]

# TODO: Make it work using TF conversions
We wait for the given cmd_vel twist to be reached by the robot, reading from the odometry.
:param cmd_vel_value: Twist we want to wait to reach.
:param epsilon: Acceptable error in the odometry readings.
:param update_rate: Rate at which we check the odometry.
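A hypothetical direct use: publish a Twist on /cmd_vel yourself, then wait for the readings to confirm it (move_base publishes a twist and waits in a similar way); env is an assumed ParrotDroneEnv instance:

    import rospy
    from geometry_msgs.msg import Twist

    cmd_vel_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
    rospy.sleep(0.5)                 # let the publisher connect

    cmd = Twist()
    cmd.linear.x = 0.4               # forward at 0.4 m/s
    cmd_vel_pub.publish(cmd)

    # Block until the velocity readings match the commanded twist within epsilon.
    env.wait_until_twist_achieved(cmd, epsilon=0.05, update_rate=10)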