Developer guide
This page provides an overview of how the R2P2 simulator behaves, and aims to serve as a guide on how to extend it. It does not, however, go in-depth into implementation details.
Configuration of the R2P2 simulator is performed using JSON files. In the provided package, these files live under the conf/ folder. Note, however, that they may be stored anywhere in the system, provided that the correct path to the resource is supplied when required.
R2P2 distinguishes three different kinds of configuration files: scenario files, robot files and controller files. Scenario files supply specific information about scenario configuration, such as stage file location, robot configuration file location, and controller configuration file location. Robot files specify physical parameters of the robot so that it may properly be simulated. Finally, controller files include information required to configure the corresponding controller; note that the parameters required by different kinds of controllers may vary, and so they will not be covered in-depth as part of this developer guide. Examples will, however, be provided for the sake of clarity.
Scenario files always comprise four different parameters: a stage file, a robot file, a controller file and a gui boolean:
- stage: path to the image file representing the stage that will be used in this scenario. Refer to Creating a new stage for further considerations on the properties of these images.
- robot: path or paths to the configuration JSON files describing the physical hardware or set of hardware to be simulated. Refer to Robot files for more information about the structure of this kind of file.
- controller: path or paths to the configuration JSON files describing the controller or set of controllers that will be used for this scenario. Refer to Controller files for more information about the structure of this kind of file.
- gui: boolean indicating whether a graphical representation of the simulation should be used. Unless resource or time constraints apply, it is recommended to set this parameter to true. Please keep in mind that setting it to false will use fixed, simulated timesteps corresponding to a 60 FPS frame rate in order to ensure stability.
All required paths can be expressed as absolute or relative within the system. Note that relative paths are resolved from the folder where the main R2P2 Python module, r2p2.py, is stored, and not from the folder containing the configuration file.
The following is an example derived from the scenario-default.json provided under the conf folder within the simulator's package. Be advised that alterations to this file will change how the simulator behaves when it is loaded without supplying a specific scenario file, as this is the fallback value for said parameter.
{
"stage": "../res/demo1.png",
"robot": ["../conf/robot.json"],
"controller": "../conf/controller-telecom.json",
"gui": true
}
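For illustration, the following sketch shows one way such a scenario file could be loaded and its relative paths resolved against the folder containing r2p2.py. The load_scenario helper and the r2p2_dir argument are hypothetical, not part of the simulator's API:

import json
import os

# Hypothetical helper, not part of R2P2: load a scenario file and resolve
# its relative paths against the folder containing r2p2.py.
def load_scenario(path, r2p2_dir):
    with open(path) as f:
        scenario = json.load(f)

    def resolve(p):
        return p if os.path.isabs(p) else os.path.normpath(os.path.join(r2p2_dir, p))

    scenario["stage"] = resolve(scenario["stage"])
    # "robot" and "controller" may hold either a single path or a list of paths.
    for key in ("robot", "controller"):
        value = scenario[key]
        scenario[key] = [resolve(p) for p in value] if isinstance(value, list) else resolve(value)
    return scenario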
Robot files represent the hardware used for the simulation and, as such, comprise several required parameters. Although the list may seem daunting, most parameters are not used unless operating with a task planning controller, or a similar custom controller.
- id: Identifier of the robot. It should be unique within the simulation, and is used to tag specific log messages, in order to provide clarity.
- x: Initial position over the horizontal axis, expressed in pixels.
- y: Initial position over the vertical axis, expressed in pixels.
- orientation: Initial orientation angle, expressed in degrees for readability purposes.
- sonar_range: minimum and maximum distance at which the sonars operate, with both bounds included as part of the range. Expressed in pixels. Note that any obstacles located at a distance outside this range will not be detected by the robot.
- radius: radius of the robot, expressed in pixels.
- max_speed: maximum speed the robot can reach, expressed in pixels per second. Any speed above this value will be clamped to it.
- step: pixels moved per timestep unit. Only maintained for the sake of legacy compatibility with earlier versions of the provided task planning controller, and likely to be deprecated in the future.
- battery: battery charge, expressed as an arbitrary integer. Currently only used by task planning applications.
- charging_rate: amount of battery recharged per frame while actively charging. Currently only used by task planning applications.
- movement_cost: amount of battery drained per frame while moving. Currently only used by task planning applications.
- reading_cost: amount of battery drained per CO2 reading performed. Currently only used by task planning applications.
- picture_cost: amount of battery drained per picture taken. Currently only used by task planning applications.
- generic_cost: amount of battery drained by performing any action other than those specified earlier. Currently only used by task planning applications.
- color: color with which to represent the robot, expressed as a string. Note that certain colors may not be supported.
The following is an example derived from the robot.json provided under the conf folder within the simulator's package:
{
"id": 0,
"x": 260,
"y": 100,
"orientation": 0,
"sonar_range": [0.5, 25],
"radius": 5,
"sonars": [0, 45, 70, 290, 315],
"max_speed": 50,
"step": 10,
"battery": 100,
"charging_rate": 5,
"movement_cost": 0,
"reading_cost": 10,
"picture_cost": 5,
"generic_cost": 0,
"color": "blue"
}
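As a quick illustration of the structure above, the hypothetical snippet below checks that a robot file declares every parameter listed in this section; it is not part of the simulator itself:

import json

# Hypothetical check, not part of R2P2: verify that a robot configuration
# file declares every parameter described above.
REQUIRED_KEYS = {
    "id", "x", "y", "orientation", "sonar_range", "radius", "max_speed",
    "step", "battery", "charging_rate", "movement_cost", "reading_cost",
    "picture_cost", "generic_cost", "color",
}

with open("../conf/robot.json") as f:
    robot_conf = json.load(f)

missing = REQUIRED_KEYS - robot_conf.keys()
if missing:
    print("Missing robot parameters:", ", ".join(sorted(missing)))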
Controller files present several different structures, depending on the specific controller. As such, examples for the controllers provided as part of the simulator are supplied below, in order to give an idea of how to create new controllers and configuration files.
At its most basic, a controller file requires a single parameter: a reference to the class that needs to be used to instantiate the controller in question. This class is expressed in the form file_name.Controller_Name, with file_name being a Python file located under r2p2/controllers. The most basic example can be found in the controller-naive.json file located under conf/:
{
"class": "naive_controller.Naive_Controller"
}
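One plausible way such a class reference could be resolved, shown purely for illustration and not necessarily how R2P2 implements it, is via importlib; the resolve_controller_class helper below is hypothetical and assumes r2p2/controllers is on the import path:

import importlib

# Hypothetical helper: turn a "file_name.Controller_Name" reference into the
# corresponding class object. Assumes the controllers folder is importable.
def resolve_controller_class(reference):
    module_name, class_name = reference.rsplit(".", 1)
    module = importlib.import_module(module_name)
    return getattr(module, class_name)

# Example usage:
# controller_cls = resolve_controller_class("naive_controller.Naive_Controller")
# controller = controller_cls()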
For a controller to be functional, it must be provided with all required parameters. The following examples provide an overview of what these parameters are, as well as legal values for them:
{
"class": "telecom_controller.Telecom_Controller"
}
{
"class": "pid_controller.Sequential_PID_Controller",
"goal": [[125, 110], [100, 140], [110, 110], [140, 46], [82, 42], [101, 104], [559, 265], [600, 300], [700, 250], [750, 200], [850, 270], [815, 400], [614, 550], [300, 478]],
"ap": 8,
"ai": 0.015,
"ad": 0.02,
"lp": 0.4,
"li": 0.015,
"ld": 0.02
}
{
"class": "pid_controller.path_planning_controller",
"grid_size": 30,
"waypoints": [[40, 40], [310, 300], [290, 300],
[195, 200], [200, 100],[375, 407],
[344, 348], [285, 112], [58, 216],
[105, 341], [186, 314], [170, 410],
[46, 410], [167, 340], [296, 257],
[173, 350], [117, 101], [65, 136],
[77, 296], [129, 344], [163, 345],
[296, 362], [291, 259], [90, 85],
[132, 305], [91, 308]],
"algorithm": "A*",
"heuristic": "naive",
"start": [25, 26],
"goal": [12, 12]
}
{
"class": "pddl_executor.PDDL_Executor",
"plan_path": "../res/planning.txt"
}
{
"class": "neurocontroller.Neuro_controller",
"weights": [0.0, 0.0],
"hidden_layer": [10, 7, 4],
"activation": "tanh",
"time": 20,
"evolve": true
}
Note that the examples providing a waypoints attribute are supplying the definition of a navigation mesh that can be used for path planning purposes. This parameter is not required unless directly interacting with path planning over a custom navigation mesh, instead of letting the simulator configure one automatically. The granularity of an automatically generated mesh can be configured using the grid_size parameter instead, which selects the number of divisions on both axes.
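For instance, a hypothetical path planning configuration that relies solely on an automatically generated mesh can omit waypoints entirely; the sketch below writes such a file, with an arbitrary output path and arbitrary grid coordinates:

import json

# Hypothetical configuration relying on an automatically generated mesh:
# no "waypoints" entry, only a grid_size controlling its granularity.
auto_mesh_conf = {
    "class": "pid_controller.path_planning_controller",
    "grid_size": 20,
    "algorithm": "A*",
    "heuristic": "naive",
    "start": [5, 5],
    "goal": [15, 15],
}

with open("../conf/controller-auto-mesh.json", "w") as f:
    json.dump(auto_mesh_conf, f, indent=4)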
Creating a new stage simply requires producing a new image representing it with any drawing program. It can be stored anywhere, and needs to be saved as either a JPEG or a non-transparent PNG image to ensure compatibility. A few rules must be observed while designing it, however, due to how the simulator parses and uses these images:
- World space is represented as pixel space. As such, image size needs to be considered when attempting to create notably large or small stages. This is required in order to avoid the simulator window overflowing the screen, or being too small to be seen. Bear in mind that the robot or robots need to fit within the stage, and that the radius is expressed in pixels.
- Currently, the image is converted to a monochrome version of itself to minimize memory usage. Thus, although the use of colors is advised for those cases where working over a black and white representation would lead to confusion, bear in mind that specific colors will be ignored in favor of their brightness.
- Colors represent the difficulty of traversing a given space. Pure white implies no cost, while pure black implies the space is a wall and therefore impassable. The range between them indicates an increasing cost of traversing the pixel, which plays a major role when performing path planning tasks. This cost is not visually represented, with the value instead just being stored internally for the sake of facilitating specific calculations.
Additionally, it is advisable to use square canvases when creating stages for use with automatically generated navigation meshes for path planning purposes. This is due to the same number of divisions being applied to both axes.
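As an illustration of these rules, the sketch below generates a simple square stage with black border walls and a gray, higher-cost region. It assumes NumPy and Pillow are available, and the sizes, thicknesses and file name are arbitrary:

import numpy as np
from PIL import Image

# Illustrative stage: pure white is free space, pure black is a wall,
# and intermediate brightness encodes an increasing traversal cost.
SIZE = 400  # square canvas, convenient for automatically generated meshes
stage = np.full((SIZE, SIZE), 255, dtype=np.uint8)

# Black border walls, thick enough to be clearly impassable.
stage[:10, :] = 0
stage[-10:, :] = 0
stage[:, :10] = 0
stage[:, -10:] = 0

# A gray patch: passable, but more expensive for path planning purposes.
stage[180:220, 100:300] = 128

Image.fromarray(stage).save("../res/custom_stage.png")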
The internal API is split between four main files that provide the bulk of the functionality of the simulator: utils.py, robot.py, controller.py and path_planning.py.
- utils.py provides the boilerplate code, as well as the main execution loop of the simulator. It contains all the code used to load information, generate the simulation, represent it and store its internal state, as well as auxiliary functions regarding collision detection. Under normal circumstances, it should not need to be modified. It should be noted that it contains ray casting methods, such as los_raycasting(src, dst, img), which performs raycasting from src to dst over the img provided (img must be a 2D nparray representation of the scenario, which may be accessed through utils.npdata).
- robot.py contains the definition of the Robot class, which also includes the physics engine as part of its internal state update method. Normally, there should be no need to modify or subclass this class, as it is designed to generate an adequate object based on parameters provided in JSON format.
- controller.py contains the abstract definition of a Controller. Of note, this class should never be instantiated directly, instead requiring that a subclass be created for such a purpose. Any Controller subclass must implement a custom control(self, dst) method, where dst refers to the readings provided by the robot's sensors, and the output is always a tuple of the form (speed, angular_velocity). It is also advised to call super().control(dst) in the first line of this method to ensure that sensor data is always up to date. The body of this function may calculate target speed and angular velocity however the user decides, including using any number of function calls within it to aid in policy implementation.
- path_planning.py contains all the functionality related to path planning tasks, except specific heuristics and algorithm implementations. This file contains the code in charge of generating custom navigation meshes, loading specified algorithms and allowing their proper execution. Two auxiliary functions, register_search_method(label, function) and register_heuristic(label, function), are provided to register path planning algorithms and heuristics respectively (see the sketch after this list). Note that path planning algorithms must be implemented as a function that takes a start point, an end point, a grid or mesh, and a heuristic's label as input. Heuristics, on the other hand, must be functions that receive two points within the mesh as inputs, and return a numerical value as output.
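The following is a minimal sketch of how a custom heuristic might be registered, assuming the signatures described above. The module layout is hypothetical, and the sketch assumes mesh points can be indexed as (x, y) pairs, which is not guaranteed by the API:

# Hypothetical extension module; it must be imported somewhere for the
# registration call to actually run.
import path_planning

def manhattan(a, b):
    # Heuristics receive two points of the mesh and return a numerical value.
    # Indexing them as (x, y) pairs is an assumption made for this sketch only.
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

path_planning.register_heuristic("manhattan", manhattan)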
As a last note, robots can generate events to notify their controllers. Currently, only the collision event is supported. This event is triggered automatically, and invokes the controller's on_collision callback, which, by default, is empty. To ensure compatibility, it should receive a tuple of coordinates as input and return nothing. Users may define custom code to act on collision situations by overriding this function, for example, to write into a log.
Although adding new elements to the simulator is not a complicated process, several rules need to be observed for them to behave properly:
- Heuristics and path planning algorithms must be registered using path_planning.register_heuristic(label, function) and path_planning.register_search_method(label, function) prior to using them. Note that, if these function calls are placed within the file where the new element is defined, that file will need to be imported at some point in order for the registration to run.
- Controllers may be added by creating a new Python file under r2p2/controllers and subclassing the Controller abstract class. The control method must be overridden, and a constructor must be created which supplies a controller_type string to the superclass constructor. This constructor may take any additional required arguments, and may disregard the string entirely if so desired, as it is only used for logging purposes. Optionally, methods such as register_robot and write_to_log may be customized in order to specify special behaviours. A sketch of such a subclass is provided after this list.
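The following is a minimal sketch of such a subclass, assuming the Controller base class behaves as described above; the file name, class name, import path, threshold and speeds are illustrative only:

# Hypothetical file: r2p2/controllers/spin_controller.py
# The exact import path of the Controller base class may differ.
from controller import Controller

class Spin_Controller(Controller):
    def __init__(self):
        # The controller_type string is only used for logging purposes.
        super().__init__("SPIN")

    def control(self, dst):
        # Keep sensor data up to date, as recommended above.
        super().control(dst)
        # Assuming dst is a sequence of sonar distance readings, turn in
        # place when an obstacle is close; otherwise move forward.
        if dst and min(dst) < 10:
            return 0, 30   # (speed, angular_velocity)
        return 20, 0

    def on_collision(self, pos):
        # Optional: react to collision events reported by the robot.
        print("Collision detected at", pos)

Such a controller would then be referenced from a controller file through its class reference, in this case spin_controller.Spin_Controller.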
Of course, additional auxiliary files may be generated at any time in order to implement new custom functions. However, these functions will need to be called from your own code in order for them to have any effect.