@@ -606,7 +599,7 @@ TypeError
- « Previous
+ « Previous
Next »
diff --git a/apis_cpp/index.html b/apis_cpp/index.html
index 053bc54..cc171a5 100755
--- a/apis_cpp/index.html
+++ b/apis_cpp/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/camera_views/index.html b/camera_views/index.html
index d466e37..4235a46 100755
--- a/camera_views/index.html
+++ b/camera_views/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/cmake_linux/index.html b/cmake_linux/index.html
index a7d4a9c..dd9cf0f 100755
--- a/cmake_linux/index.html
+++ b/cmake_linux/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/custom_drone/index.html b/custom_drone/index.html
index 620452f..005f867 100755
--- a/custom_drone/index.html
+++ b/custom_drone/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/distance_sensor/index.html b/distance_sensor/index.html
index 3b5d3d9..cf28a15 100755
--- a/distance_sensor/index.html
+++ b/distance_sensor/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/dynamic_objects/index.html b/dynamic_objects/index.html
index 14469a5..3fb69e7 100755
--- a/dynamic_objects/index.html
+++ b/dynamic_objects/index.html
@@ -5,14 +5,14 @@
- Setup Dynamic Objects for Scenario Environments for AirSim - Cosys-AirSim
+ Dynamic Objects - Cosys-AirSim
@@ -50,7 +50,7 @@
Installing AirSim
-
Using AirSim
- Support
-
diff --git a/flight_controller/index.html b/flight_controller/index.html
index 88dec05..3d47f45 100755
--- a/flight_controller/index.html
+++ b/flight_controller/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/gazebo_drone/index.html b/gazebo_drone/index.html
index edb09de..da6f235 100755
--- a/gazebo_drone/index.html
+++ b/gazebo_drone/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/gpulidar/index.html b/gpulidar/index.html
index d9f40db..a76aa04 100755
--- a/gpulidar/index.html
+++ b/gpulidar/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/image_apis/index.html b/image_apis/index.html
index cd7a4c7..abeb385 100755
--- a/image_apis/index.html
+++ b/image_apis/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
@@ -566,8 +559,8 @@ Segmentation
To retrieve the color map and know which color is assigned to each color index, you can use:
colorMap = client.simGetSegmentationColorMap()
-An example can be found in segmentation_test.py .
-For a script that generates a full list of objects and their associated color, please see the script segmentation_generate_list.py .
+An example can be found in segmentation_test.py (Cosys-AirSim/PythonClient/segmentation/segmentation_test.py).
+For a script that generates a full list of objects and their associated color, please see the script segmentation_generate_list.py (Cosys-AirSim/PythonClient/segmentation/segmentation_generate_list.py).
How to Find Mesh names?
To get the desired ground truth segmentation you will need to know the names of the meshes in your Unreal environment. To do this, you can use the API:
currentObjectList = client.simListInstanceSegmentationObjects()
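Putting these two calls together, here is a minimal sketch that pairs each mesh name with its segmentation color. It assumes a running simulator, the cosysairsim Python client, the standard AirSim client entry points (VehicleClient, confirmConnection), and that the i-th listed object maps to the i-th colormap entry, as segmentation_generate_list.py does:
import cosysairsim as airsim

client = airsim.VehicleClient()
client.confirmConnection()

# Mesh names of all annotated objects; index i is assumed to map to colorMap[i].
objectList = client.simListInstanceSegmentationObjects()
colorMap = client.simGetSegmentationColorMap()

for index, name in enumerate(objectList):
    print(name, "->", colorMap[index])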
diff --git a/index.html b/index.html
index f0dfe66..7f23154 100755
--- a/index.html
+++ b/index.html
@@ -85,6 +85,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/install_linux/index.html b/install_linux/index.html
index c680be2..c161abb 100755
--- a/install_linux/index.html
+++ b/install_linux/index.html
@@ -73,6 +73,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
@@ -293,7 +286,6 @@ Build Cosys-Airsim
Build Unreal Environment
Finally, you will need an Unreal project that hosts the environment for your vehicles. Cosys-AirSim comes with a built-in "Blocks Environment" which you can use, or you can create your own. Please see setting up Unreal Environment if you'd like to set up your own environment.
-The other environments available often need additional asset packs to be downloaded first, read here for more information.
How to Use Cosys-AirSim
Once Cosys-AirSim is set up:
- Navigate to the environment folder (for example, for Blocks it is Unreal\Environments\Blocks
), and run update_from_git.sh
.
diff --git a/install_precompiled/index.html b/install_precompiled/index.html
index f372fe0..d3cacbd 100755
--- a/install_precompiled/index.html
+++ b/install_precompiled/index.html
@@ -61,6 +61,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/install_windows/index.html b/install_windows/index.html
index 4cd7602..307da9b 100755
--- a/install_windows/index.html
+++ b/install_windows/index.html
@@ -71,6 +71,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
@@ -295,7 +288,6 @@ How to Use Cosys-AirSim
!!! tip
Go to 'Edit->Editor Preferences', type 'CPU' in the 'Search' box, and ensure that 'Use Less CPU when in Background' is unchecked.
See Using APIs and settings.json for various options available.
-The other environments available often need additional asset packs to be downloaded first, read here for more information.
FAQ
I get an error Il ‘P1’, version ‘X’, does not match ‘P2’, version ‘X’
This is caused by multiple versions of Visual Studio installed on the machine. The build script of Cosys-AirSim will use the latest version it can find, so you need to make sure Unreal does the same.
diff --git a/instance_segmentation/index.html b/instance_segmentation/index.html
index 9919e36..4fb5ec3 100755
--- a/instance_segmentation/index.html
+++ b/instance_segmentation/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
@@ -270,13 +263,13 @@ Limitations
Landscape objects aren't supported. This is the special object type in Unreal to make terrain with. As a work-around, StaticMesh terrain must be used.
Foliage objects aren't supported. This is the special object type in Unreal to place trees, grass and other plants that move with the wind. As a work-around, StaticMesh objects must be used.
Brush objects aren't supported. This is a special object type in Unreal to create your own meshes with. As a work-around, you can convert them to a StaticMesh.
-These and other unsupported object types that are less common that either will not be rendered (decals, text, foliage, ...) or will by default be given the RGB color value of [149,149,149] or 0,0,0 .
+These and other less common, unsupported object types (brush objects, landscape, ...) will either not be rendered (decals, text, foliage, ...) or will by default be given the RGB color value of [149,149,149] or [0,0,0].
Usage
By default, at the start of the simulation, the instance segmentation system gives a random color to each object.
Please see the Image API documentation on how to manually set or get the color information.
-For an example of the Instance Segmentation API, please see the script segmentation_test.py .
-For a script that generates a full list of objects and their associated color, please see the script segmentation_generate_list.py .
+For an example of the Instance Segmentation API, please see the script segmentation_test.py (Cosys-Airsim/PythonClient/segmentation/segmentation_test.py).
+For a script that generates a full list of objects and their associated color, please see the script segmentation_generate_list.py (Cosys-Airsim/PythonClient/segmentation/segmentation_generate_list.py).
When a new object is spawned in your environment, for example by a C++ or Blueprint extension you made,
and you want it to work with the instance segmentation system, you can use the extended function ASimModeBase::AddNewActorToSegmentation(AActor)
which is also available in blueprints.
Make sure to provide human-readable names to the objects in your environment, as the ground truth tables that the AirSim API provides will use your object naming to create the table.
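To eyeball the resulting layer from a camera, a hedged sketch using the standard Image API follows; it assumes a camera named front_center and an uncompressed 3-byte-per-pixel response (channel count and order can vary per build, so adjust the reshape accordingly):
import numpy as np
import cosysairsim as airsim

client = airsim.VehicleClient()
client.confirmConnection()

# Request one uncompressed instance segmentation frame.
request = airsim.ImageRequest("front_center", airsim.ImageType.Segmentation, pixels_as_float=False, compress=False)
response = client.simGetImages([request])[0]

# Assumed 3 bytes per pixel; use 4 if your build returns BGRA.
img = np.frombuffer(response.image_data_uint8, dtype=np.uint8)
img = img.reshape(response.height, response.width, 3)

unique_colors = np.unique(img.reshape(-1, 3), axis=0)
print(len(unique_colors), "unique instance colors in view")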
diff --git a/lidar/index.html b/lidar/index.html
index 6dc12f6..0d803bb 100755
--- a/lidar/index.html
+++ b/lidar/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/log_viewer/index.html b/log_viewer/index.html
index ac7a865..5fac0e4 100755
--- a/log_viewer/index.html
+++ b/log_viewer/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/matlab/index.html b/matlab/index.html
index aec4de4..abf826b 100755
--- a/matlab/index.html
+++ b/matlab/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/mavlinkcom/index.html b/mavlinkcom/index.html
index ff3bbbe..532d502 100755
--- a/mavlinkcom/index.html
+++ b/mavlinkcom/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/mavlinkcom_mocap/index.html b/mavlinkcom_mocap/index.html
index 8e8e963..50dd70b 100755
--- a/mavlinkcom_mocap/index.html
+++ b/mavlinkcom_mocap/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/meshes/index.html b/meshes/index.html
index d8bd717..c3bd37d 100755
--- a/meshes/index.html
+++ b/meshes/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/modify_recording_data/index.html b/modify_recording_data/index.html
index 128d70b..92d96ef 100755
--- a/modify_recording_data/index.html
+++ b/modify_recording_data/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/multi_vehicle/index.html b/multi_vehicle/index.html
index f5b8d21..d5a72de 100755
--- a/multi_vehicle/index.html
+++ b/multi_vehicle/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/object_detection/index.html b/object_detection/index.html
index 02813dd..943562b 100755
--- a/object_detection/index.html
+++ b/object_detection/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/pfm/index.html b/pfm/index.html
index 3db430e..4b96014 100755
--- a/pfm/index.html
+++ b/pfm/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/playback/index.html b/playback/index.html
index 84c2575..8f43436 100755
--- a/playback/index.html
+++ b/playback/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
@@ -262,8 +255,7 @@
Playback
AirSim supports playing back the high-level commands in a *.mavlink log file that were recorded using the MavLinkTest app
for the purpose of comparing real and simulated flight.
-The recording.mavlink is an example of a log file captured using a real drone using the following
-command line:
+Example command line:
MavLinkTest -serial:/dev/ttyACM0,115200 -logdir:.
Then the log file contains the commands performed, which included several "orbit" commands, the resulting GPS map of the flight
diff --git a/px4_build/index.html b/px4_build/index.html
index 837feec..719628e 100755
--- a/px4_build/index.html
+++ b/px4_build/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/px4_lockstep/index.html b/px4_lockstep/index.html
index 4a78258..c027383 100755
--- a/px4_lockstep/index.html
+++ b/px4_lockstep/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/px4_logging/index.html b/px4_logging/index.html
index 3f2a093..23d015c 100755
--- a/px4_logging/index.html
+++ b/px4_logging/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/px4_multi_vehicle/index.html b/px4_multi_vehicle/index.html
index b71b1fd..54f0bc8 100755
--- a/px4_multi_vehicle/index.html
+++ b/px4_multi_vehicle/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/px4_setup/index.html b/px4_setup/index.html
index 6dbd646..e2ed602 100755
--- a/px4_setup/index.html
+++ b/px4_setup/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
@@ -345,7 +338,6 @@ FAQ
Drone doesn't fly properly, it just goes "crazy".
There are a few reasons that can cause this. First, make sure your drone doesn't fall a large distance when starting the simulator. This might happen if you have created a custom Unreal environment and the Player Start is placed too high above the ground. It seems that when this happens, the internal calibration in PX4 gets confused.
You should also use QGroundControl and make sure you can arm and takeoff in QGroundControl properly.
-Finally, this also can be a machine performance issue in some rare cases, check your hard drive performance .
Can I use Arducopter or other MavLink implementations?
Our code is tested with the PX4 firmware . We have not tested Arducopter or other MavLink implementations. Some of the flight APIs do use the
PX4 custom modes in the MAV_CMD_DO_SET_MODE messages (like PX4_CUSTOM_MAIN_MODE_AUTO)
diff --git a/px4_sitl/index.html b/px4_sitl/index.html
index 0d85665..f1c3d9c 100755
--- a/px4_sitl/index.html
+++ b/px4_sitl/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/px4_sitl_wsl2/index.html b/px4_sitl_wsl2/index.html
index 1181c69..ff19569 100755
--- a/px4_sitl_wsl2/index.html
+++ b/px4_sitl_wsl2/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/remote_control/index.html b/remote_control/index.html
index d96bc3e..4e8361b 100755
--- a/remote_control/index.html
+++ b/remote_control/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
@@ -295,7 +288,7 @@ RC Setup for Default Config
Other Devices
AirSim can detect a large variety of devices; however, devices other than those above might need extra configuration. In the future we will add the ability to set this config through settings.json. For now, if things are not working, you might want to try workarounds such as x360ce or change code in the SimJoystick.cpp file .
Note on FrSky Taranis X9D Plus
-FrSky Taranis X9D Plus is real UAV remote control with an advantage that it has USB port so it can be directly connected to PC. You can download AirSim config file and follow this tutorial to import it in your RC. You should then see "sim" model in RC with all channels configured properly.
+FrSky Taranis X9D Plus is a real UAV remote control with the advantage that it has a USB port, so it can be directly connected to a PC. You can download the AirSim config file and follow this tutorial to import it into your RC. You should then see the "sim" model in the RC with all channels configured properly.
Note on Linux
Currently the default config on Linux is for using an Xbox controller. This means other devices might not work properly. In the future we will add the ability to configure the RC in settings.json, but for now you might have to change code in the SimJoystick.cpp file to use other devices.
RC Setup for PX4
diff --git a/retexturing/index.html b/retexturing/index.html
index 4d2ddd2..f471e47 100755
--- a/retexturing/index.html
+++ b/retexturing/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/ros_cplusplus/index.html b/ros_cplusplus/index.html
index 56278f0..bb75117 100755
--- a/ros_cplusplus/index.html
+++ b/ros_cplusplus/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/ros_python/index.html b/ros_python/index.html
index 60b538b..da1456d 100755
--- a/ros_python/index.html
+++ b/ros_python/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/search.html b/search.html
index c965024..42d8098 100755
--- a/search.html
+++ b/search.html
@@ -52,6 +52,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/search/search_index.json b/search/search_index.json
index a27ab3e..b7aa979 100755
--- a/search/search_index.json
+++ b/search/search_index.json
@@ -1 +1 @@
-{"config":{"indexing":"full","lang":["en"],"min_search_length":3,"prebuild_index":false,"separator":"[\\s\\-]+"},"docs":[{"location":"","text":"Cosys-AirSim Cosys-AirSim is a simulator for drones, cars and more, with extensive API support, built on Unreal Engine . It is open-source, cross platform, and supports hardware-in-loop with popular flight controllers such as PX4 for physically and visually realistic simulations. It is developed as an Unreal plugin that can simply be dropped into any Unreal environment. This fork is based on last public AirSim release from Microsoft's GitHub. Cosys-Lab made extensive modifications to the AirSim platform to support multiple projects and research goals. Please contact a Cosys-Lab researcher to get more in depth information on our work or if you wish to collaborate. The original AirSim MIT license applies to all native AirSim source files. Please note that we use that same MIT license as which applies to all changes made by Cosys-Lab in case you plan to do anything within this repository. Do note that this repository is provided as is, will not be actively updated and comes without warranty or support. Please contact a Cosys-Lab researcher to get more in depth information on which branch or version is best for your work. Associated publications Cosys-AirSim: A Real-Time Simulation Framework Expanded for Complex Industrial Applications @inproceedings{cosysairsim2023jansen, author={Jansen, Wouter and Verreycken, Erik and Schenck, Anthony and Blanquart, Jean-Edouard and Verhulst, Connor and Huebel, Nico and Steckel, Jan}, booktitle={2023 Annual Modeling and Simulation Conference (ANNSIM)}, title={COSYS-AIRSIM: A Real-Time Simulation Framework Expanded for Complex Industrial Applications}, year={2023}, volume={}, number={}, pages={37-48}, doi={}} You can also find the presentation of the live tutorial of Cosys-AirSim at ANNSIM '23 conference here together with the associated videos. Physical LiDAR Simulation in Real-Time Engine @inproceedings{lidarsim2022jansen, author={Jansen, Wouter and Huebel, Nico and Steckel, Jan}, booktitle={2022 IEEE Sensors}, title={Physical LiDAR Simulation in Real-Time Engine}, year={2022}, volume={}, number={}, pages={1-4}, doi={10.1109/SENSORS52175.2022.9967197}} } Simulation of Pulse-Echo Radar for Vehicle Control and SLAM @Article{echosim2021schouten, author={Schouten, Girmi and Jansen, Wouter and Steckel, Jan}, title={Simulation of Pulse-Echo Radar for Vehicle Control and SLAM}, JOURNAL={Sensors}, volume={21}, year={2021}, number={2}, article-number={523}, doi={10.3390/s21020523} } Cosys-Lab Modifications Added support for Unreal up to 5.4 ( Note that Unreal 5.3/5.4 breaks camera scene rendering by default in custom environments ) Added multi-layer annotation for groundtruth label generation with RGB, greyscale and texture options. Extensive API integration and available for camera and GPU-LiDAR sensors. Added Instance Segmentation . Added Echo sensor type for simulation of sensors like sonar and radar. Added GPU LIDAR sensor type : Uses GPU acceleration to simulate a LiDAR sensor. Can support much higher point density then normal LiDAR and behaves more authentic and has realistic intensity generation. Added skid steering SimMode and vehicle type . ClearPath Husky and Pioneer P3DX implemented as vehicle types using this new vehicle model. Added Matlab API Client implementation as an easy to install Matlab toolbox. Added various random but deterministic dynamic object types and world configuration options . 
Added BoxCar vehicle model to the Car SimMode to have a smaller vehicle to use in indoor spaces. Updated ComputerVision mode : Now has full API and Simulation just like other vehicle types. It mostly means it can now have sensors attached (outside of IMU). Improved handling and camera operation. Updated LIDAR sensor type : Fixed not tracing correctly, added ground truth (point labels) generation, added range-noise generation. Improved API pointcloud delivery to be full scan instead of being frame-rate dependent and partial. Updated the camera, Echo and (GPU-)LiDAR sensors to be uncoupled from the vehicle and be placed as external world sensors. Updated sensors like cameras, Echo sensor and GPU-LiDAR to ignore certain objects with the MarkedIgnore Unreal tag and enabling the \"IgnoreMarked\" setting in the settings file . Updated cameras sensor with more distortion features such as chromatic aberration, motion blur and lens distortion. Updated Python ROS implementation with completely new implementation and feature set. Updated C++ ROS2 implementation to support custom Cosys-AirSim features. Dropped support for Unity Environments. Some more details on our changes can be found in the changelog . How to Get It Download and install from precompiled plugin - Windows/Linux Download and install it Install and use from source - Windows Install/Build it Install and use from source - Linux Install/Build it How to Use It Documentation View our detailed documentation on all aspects of Cosys-AirSim. Original AirSim Paper More technical details are available in AirSim paper (FSR 2017 Conference) . Please cite this as: @inproceedings{airsim2017fsr, author = {Shital Shah and Debadeepta Dey and Chris Lovett and Ashish Kapoor}, title = {AirSim: High-Fidelity Visual and Physical Simulation for Autonomous Vehicles}, year = {2017}, booktitle = {Field and Service Robotics}, eprint = {arXiv:1705.05065}, url = {https://arxiv.org/abs/1705.05065} } License This project is released under the MIT License. Please review the License file for more details.","title":"Home"},{"location":"#cosys-airsim","text":"Cosys-AirSim is a simulator for drones, cars and more, with extensive API support, built on Unreal Engine . It is open-source, cross platform, and supports hardware-in-loop with popular flight controllers such as PX4 for physically and visually realistic simulations. It is developed as an Unreal plugin that can simply be dropped into any Unreal environment. This fork is based on last public AirSim release from Microsoft's GitHub. Cosys-Lab made extensive modifications to the AirSim platform to support multiple projects and research goals. Please contact a Cosys-Lab researcher to get more in depth information on our work or if you wish to collaborate. The original AirSim MIT license applies to all native AirSim source files. Please note that we use that same MIT license as which applies to all changes made by Cosys-Lab in case you plan to do anything within this repository. Do note that this repository is provided as is, will not be actively updated and comes without warranty or support. 
Please contact a Cosys-Lab researcher to get more in depth information on which branch or version is best for your work.","title":"Cosys-AirSim"},{"location":"#associated-publications","text":"Cosys-AirSim: A Real-Time Simulation Framework Expanded for Complex Industrial Applications @inproceedings{cosysairsim2023jansen, author={Jansen, Wouter and Verreycken, Erik and Schenck, Anthony and Blanquart, Jean-Edouard and Verhulst, Connor and Huebel, Nico and Steckel, Jan}, booktitle={2023 Annual Modeling and Simulation Conference (ANNSIM)}, title={COSYS-AIRSIM: A Real-Time Simulation Framework Expanded for Complex Industrial Applications}, year={2023}, volume={}, number={}, pages={37-48}, doi={}} You can also find the presentation of the live tutorial of Cosys-AirSim at ANNSIM '23 conference here together with the associated videos. Physical LiDAR Simulation in Real-Time Engine @inproceedings{lidarsim2022jansen, author={Jansen, Wouter and Huebel, Nico and Steckel, Jan}, booktitle={2022 IEEE Sensors}, title={Physical LiDAR Simulation in Real-Time Engine}, year={2022}, volume={}, number={}, pages={1-4}, doi={10.1109/SENSORS52175.2022.9967197}} } Simulation of Pulse-Echo Radar for Vehicle Control and SLAM @Article{echosim2021schouten, author={Schouten, Girmi and Jansen, Wouter and Steckel, Jan}, title={Simulation of Pulse-Echo Radar for Vehicle Control and SLAM}, JOURNAL={Sensors}, volume={21}, year={2021}, number={2}, article-number={523}, doi={10.3390/s21020523} }","title":"Associated publications"},{"location":"#cosys-lab-modifications","text":"Added support for Unreal up to 5.4 ( Note that Unreal 5.3/5.4 breaks camera scene rendering by default in custom environments ) Added multi-layer annotation for groundtruth label generation with RGB, greyscale and texture options. Extensive API integration and available for camera and GPU-LiDAR sensors. Added Instance Segmentation . Added Echo sensor type for simulation of sensors like sonar and radar. Added GPU LIDAR sensor type : Uses GPU acceleration to simulate a LiDAR sensor. Can support much higher point density then normal LiDAR and behaves more authentic and has realistic intensity generation. Added skid steering SimMode and vehicle type . ClearPath Husky and Pioneer P3DX implemented as vehicle types using this new vehicle model. Added Matlab API Client implementation as an easy to install Matlab toolbox. Added various random but deterministic dynamic object types and world configuration options . Added BoxCar vehicle model to the Car SimMode to have a smaller vehicle to use in indoor spaces. Updated ComputerVision mode : Now has full API and Simulation just like other vehicle types. It mostly means it can now have sensors attached (outside of IMU). Improved handling and camera operation. Updated LIDAR sensor type : Fixed not tracing correctly, added ground truth (point labels) generation, added range-noise generation. Improved API pointcloud delivery to be full scan instead of being frame-rate dependent and partial. Updated the camera, Echo and (GPU-)LiDAR sensors to be uncoupled from the vehicle and be placed as external world sensors. Updated sensors like cameras, Echo sensor and GPU-LiDAR to ignore certain objects with the MarkedIgnore Unreal tag and enabling the \"IgnoreMarked\" setting in the settings file . Updated cameras sensor with more distortion features such as chromatic aberration, motion blur and lens distortion. Updated Python ROS implementation with completely new implementation and feature set. 
Updated C++ ROS2 implementation to support custom Cosys-AirSim features. Dropped support for Unity Environments. Some more details on our changes can be found in the changelog .","title":"Cosys-Lab Modifications"},{"location":"#how-to-get-it","text":"","title":"How to Get It"},{"location":"#download-and-install-from-precompiled-plugin-windowslinux","text":"Download and install it","title":"Download and install from precompiled plugin - Windows/Linux"},{"location":"#install-and-use-from-source-windows","text":"Install/Build it","title":"Install and use from source - Windows"},{"location":"#install-and-use-from-source-linux","text":"Install/Build it","title":"Install and use from source - Linux"},{"location":"#how-to-use-it","text":"","title":"How to Use It"},{"location":"#documentation","text":"View our detailed documentation on all aspects of Cosys-AirSim.","title":"Documentation"},{"location":"#original-airsim-paper","text":"More technical details are available in AirSim paper (FSR 2017 Conference) . Please cite this as: @inproceedings{airsim2017fsr, author = {Shital Shah and Debadeepta Dey and Chris Lovett and Ashish Kapoor}, title = {AirSim: High-Fidelity Visual and Physical Simulation for Autonomous Vehicles}, year = {2017}, booktitle = {Field and Service Robotics}, eprint = {arXiv:1705.05065}, url = {https://arxiv.org/abs/1705.05065} }","title":"Original AirSim Paper"},{"location":"#license","text":"This project is released under the MIT License. Please review the License file for more details.","title":"License"},{"location":"InfraredCamera/","text":"This is a tutorial for generating simulated thermal infrared (IR) images using Cosys-AirSim. To generate your own data, you may use two python files: create_ir_segmentation_map.py and capture_ir_segmentation.py . create_ir_segmentation_map.py uses temperature, emissivity, and camera response information to estimate the thermal digital count that could be expected for the objects in the environment, and then reassigns the segmentation IDs in Cosys-AirSim to match these digital counts. It should be run before starting to capture thermal IR data. Otherwise, digital counts in the IR images will be incorrect. The camera response, temperature, and emissivity data are all included for the Africa environment. capture_ir_segmentation.py is run after the segmentation IDs have been reassigned. It tracks objects of interest and records the infrared and scene images from the multirotor. It uses Computer Vision mode. Finally, the details about how temperatures were estimated for plants and animals in the Africa environment, etc. can be found in this paper: @inproceedings{bondi2018airsim, title={AirSim-W: A Simulation Environment for Wildlife Conservation with UAVs}, author={Bondi, Elizabeth and Dey, Debadeepta and Kapoor, Ashish and Piavis, Jim and Shah, Shital and Fang, Fei and Dilkina, Bistra and Hannaford, Robert and Iyer, Arvind and Joppa, Lucas and others}, booktitle={Proceedings of the 1st ACM SIGCAS Conference on Computing and Sustainable Societies}, pages={40}, year={2018}, organization={ACM} } nb","title":"Infrared Camera"},{"location":"adding_new_apis/","text":"Adding New APIs to AirSim Adding new APIs requires modifying the source code. Much of the changes are mechanical and required for various levels of abstractions that AirSim supports. The main files required to be modified are described below along with some commits and PRs for demonstration. 
Specific sections of the PRs or commits might be linked in some places, but it'll be helpful to have a look at the entire diff to get a better sense of the workflow. Also, don't hesitate in opening an issue or a draft PR also if unsure about how to go about making changes or to get feedback. Implementing the API Before adding the wrapper code to call and handle the API, it needs to be implemented first. The exact files where this will occur varies depending on what it does. Few examples are given below which might help you in getting started. Vehicle-based APIs moveByVelocityBodyFrameAsync API for velocity-based movement in the multirotor's X-Y frame. The main implementation is done in MultirotorBaseApi.cpp , where most of the multirotor APIs are implemented. In some cases, additional structures might be needed for storing data, getRotorStates API is a good example for this, here the RotorStates struct is defined in 2 places for conversion from RPC to internal code. It also requires modifications in AirLib as well as Unreal/Plugins for the implementation. Environment-related APIs These APIs need to interact with the simulation environment itself, hence it's likely that it'll be implemented inside the Unreal/Plugins folder. simCreateVoxelGrid API to generate and save a binvox-formatted grid of the environment - WorldSimApi.cpp simAddVehicle API to create vehicles at runtime - SimMode*, WorldSimApi files Physics-related APIs simSetWind API shows an example of modifying the physics behaviour and adding an API + settings field for the same. See the PR for details about the code. RPC Wrappers The APIs use msgpack-rpc protocol over TCP/IP through rpclib developed by Tam\u00c3\u00a1s Szelei which allows you to use variety of programming languages including C++, C#, Python, Java etc. When AirSim starts, it opens port 41451 (this can be changed via settings ) and listens for incoming request. The Python or C++ client code connects to this port and sends RPC calls using msgpack serialization format . To add the RPC code to call the new API, follow the steps below. Follow the implementation of other APIs defined in the files. Add an RPC handler in the server which calls your implemented method in RpcLibServerBase.cpp . Vehicle-specific APIs are in their respective vehicle subfolder. Add the C++ client API method in RpcClientBase.cpp Add the Python client API method in client.py . If needed, add or modify a structure definition in types.py Testing Testing is required to ensure that the API is working as expected. For this, as expected, you'll have to use the source-built AirSim and Blocks environment. Apart from this, if using the Python APIs, you'll have to use the airsim package from source rather than the PyPI package. Below are 2 ways described to go about using the package from source - Use setup_path.py . It will setup the path such that the local airsim module is used instead of the pip installed package. This is the method used in many of the scripts since the user doesn't need to do anything other than run the script. Place your example script in one of the folders inside PythonClient like multirotor , car , etc. You can also create one to keep things separate, and copy the setup_path.py file from another folder. Add import setup_path before import cosysairsim as airsim in your files. Now the latest main API (or any branch currently checked out) will be used. Use a local project pip install . 
Regular install would create a copy of the current source and use it, whereas Editable install ( pip install -e . from inside the PythonClient folder) would change the package whenever the Python API files are changed. Editable install has the benefit when working on several branches or API is not finalized. It is recommended to use a virtual environment for dealing with Python packaging so as to not break any existing setup. When opening a PR, make sure to follow the coding guidelines . Also add a docstring for the API in the Python files, and please include any example scripts and settings required in the script as well.","title":"Adding new APIs"},{"location":"adding_new_apis/#adding-new-apis-to-airsim","text":"Adding new APIs requires modifying the source code. Much of the changes are mechanical and required for various levels of abstractions that AirSim supports. The main files required to be modified are described below along with some commits and PRs for demonstration. Specific sections of the PRs or commits might be linked in some places, but it'll be helpful to have a look at the entire diff to get a better sense of the workflow. Also, don't hesitate in opening an issue or a draft PR also if unsure about how to go about making changes or to get feedback.","title":"Adding New APIs to AirSim"},{"location":"adding_new_apis/#implementing-the-api","text":"Before adding the wrapper code to call and handle the API, it needs to be implemented first. The exact files where this will occur varies depending on what it does. Few examples are given below which might help you in getting started.","title":"Implementing the API"},{"location":"adding_new_apis/#vehicle-based-apis","text":"moveByVelocityBodyFrameAsync API for velocity-based movement in the multirotor's X-Y frame. The main implementation is done in MultirotorBaseApi.cpp , where most of the multirotor APIs are implemented. In some cases, additional structures might be needed for storing data, getRotorStates API is a good example for this, here the RotorStates struct is defined in 2 places for conversion from RPC to internal code. It also requires modifications in AirLib as well as Unreal/Plugins for the implementation.","title":"Vehicle-based APIs"},{"location":"adding_new_apis/#environment-related-apis","text":"These APIs need to interact with the simulation environment itself, hence it's likely that it'll be implemented inside the Unreal/Plugins folder. simCreateVoxelGrid API to generate and save a binvox-formatted grid of the environment - WorldSimApi.cpp simAddVehicle API to create vehicles at runtime - SimMode*, WorldSimApi files","title":"Environment-related APIs"},{"location":"adding_new_apis/#physics-related-apis","text":"simSetWind API shows an example of modifying the physics behaviour and adding an API + settings field for the same. See the PR for details about the code.","title":"Physics-related APIs"},{"location":"adding_new_apis/#rpc-wrappers","text":"The APIs use msgpack-rpc protocol over TCP/IP through rpclib developed by Tam\u00c3\u00a1s Szelei which allows you to use variety of programming languages including C++, C#, Python, Java etc. When AirSim starts, it opens port 41451 (this can be changed via settings ) and listens for incoming request. The Python or C++ client code connects to this port and sends RPC calls using msgpack serialization format . To add the RPC code to call the new API, follow the steps below. Follow the implementation of other APIs defined in the files. 
Add an RPC handler in the server which calls your implemented method in RpcLibServerBase.cpp . Vehicle-specific APIs are in their respective vehicle subfolder. Add the C++ client API method in RpcClientBase.cpp Add the Python client API method in client.py . If needed, add or modify a structure definition in types.py","title":"RPC Wrappers"},{"location":"adding_new_apis/#testing","text":"Testing is required to ensure that the API is working as expected. For this, as expected, you'll have to use the source-built AirSim and Blocks environment. Apart from this, if using the Python APIs, you'll have to use the airsim package from source rather than the PyPI package. Below are 2 ways described to go about using the package from source - Use setup_path.py . It will setup the path such that the local airsim module is used instead of the pip installed package. This is the method used in many of the scripts since the user doesn't need to do anything other than run the script. Place your example script in one of the folders inside PythonClient like multirotor , car , etc. You can also create one to keep things separate, and copy the setup_path.py file from another folder. Add import setup_path before import cosysairsim as airsim in your files. Now the latest main API (or any branch currently checked out) will be used. Use a local project pip install . Regular install would create a copy of the current source and use it, whereas Editable install ( pip install -e . from inside the PythonClient folder) would change the package whenever the Python API files are changed. Editable install has the benefit when working on several branches or API is not finalized. It is recommended to use a virtual environment for dealing with Python packaging so as to not break any existing setup. When opening a PR, make sure to follow the coding guidelines . Also add a docstring for the API in the Python files, and please include any example scripts and settings required in the script as well.","title":"Testing"},{"location":"annotation/","text":"Annotation in Cosys-AirSim A multi-layer annotation system is implemented into Cosys-AirSim. It uses Proxy Mesh rendering to allow for each object in the world to be annotated by a greyscale value, an RGB color or a texture that fits the mesh. An annotation layer allows the user to tag individual actors and/or their child-components with a certain annotation component. This can be used to create ground truth data for machine learning models or to create a visual representation of the environment. Let's say you want to train a model to detect cars or pedestrians, you create an RGB annotation layer where you can tag all the cars and pedestrians in the environment with a certain RGB color respectively. Through the API you can then get the image of this RGB annotation layer (GPU LiDAR is also supported next to cameras). Or you want to assign a ripeness value to all the apples in your environment, you can create a greyscale annotation layer where you can tag all the apples with a certain greyscale value between 0 and 1. Similarly, you can also load a texture to a specific mesh component only visible in the annotation layer. For example when trying to show where defects are in a mesh. The annotation system uses actor and/or component tags to set these values for the 3 modes (greyscale, RGB, texture). You can add these manually or use the APIs (RPC API, Unreal Blueprint, Unreal c++). Limitations 2744000 different RGB colors are currently available to be assigned to unique objects. 
If your environment during a run requires more colors, you will generate errors and new objects will be assigned color [0,0,0]. Only static and skeletal meshes are supported. Landscape objects aren't supported. This is the special object type in Unreal to make terrain with. As a work-around, StaticMesh terrain must be used. Foliage objects aren't supported. This is the special object type in Unreal to place trees, grass and other plants that move with the wind. As a work-around, StaticMesh objects must be used. Brush objects aren't supported. This is a special object type in Unreal to create your own meshes with. As a work-around, you can convert them to a StaticMesh. These and other unsupported object types that are less common that either will not be rendered (decals, text, foliage, ...) or will by default be given the RGB color value of [149,149,149] or 0,0,0 . Usage Settings JSON definition of layers To use the annotation system, you need to set the annotation mode in the settings.json file. You can define as many as you want and use them simultaneously. You will always have to ID them by the name. Here you define each layer with a name, the type and some other settings, often specific to the type. For example: { ... \"Annotation\": [ { \"Name\": \"RGBTestDirect\", \"Type\": 0, \"Default\": true, \"SetDirect\": true, \"ViewDistance\": 10 }, { \"Name\": \"RGBTestIndex\", \"Type\": 0, \"Default\": true, \"SetDirect\": false }, { \"Name\": \"GreyscaleTest\", \"Type\": 1, \"Default\": true, \"ViewDistance\": 5 }, { \"Name\": \"TextureTestDirect\", \"Type\": 2, \"Default\": true, \"SetDirect\": true }, { \"Name\": \"TextureTestRelativePath\", \"Type\": 2, \"Default\": false, \"SetDirect\": false, \"TexturePath\": \"/Game/AnnotationTest\", \"TexturePrefix\": \"Test1\" } ], ... } The types are: RGB = 0, Greyscale = 1, Texture = 2 The Default setting applies to all types and is what happens when no tag is set for na actor/component. When set to false, the mesh will not be rendered in the annotation layer. When set to true, the mesh will be rendered in the annotation layer with the default value of the layer. The ViewDistance setting applies to all types and allows you to set the maximum distance in meters at which the annotation layer is rendered. This only applies to the camera sensor output as for LiDAR you can set the maximum range distance of the sensor differently. This value is by default set to -1 which means infinite draw distance. Type 1: RGB Similar to instance segmentation , you can use the RGB annotation layer to tag objects in the environment with a unique color. You can do this by directly setting the color yourself (direct mode), or by assigning the object an index (0-2744000 unique colors) that will be linked to the colormap. To use direct mode, set the settings of this layer with SetDirect to true . For index mode, set to false . Actor/component tags have the following format: annotationName_R_G_B for direct mode or annotationName_ID for direct mode. So if for example your RGB annotation layer is called RGBTestDirect , you can tag an actor with the tag RGBTestDirect_255_0_0 to give it a red color. Or for index mode, RGBTest_5 to give it the fifth color in the colormap. When Default is set to 1, all objects without a tag for this layer will be rendered in black. The instance segmentation API function to get the colormap also applies to the RGB index mode. 
For example in Python you can use: colorMap = client.simGetSegmentationColorMap() Several RPC API functions are available to influence or retrieve the RGB annotation layer. Currently, it is not possible to use the RPC API to add new actors or components to the annotation system, you can only update their values. For example in Python: simSetAnnotationObjectID(annotation_name, mesh_name, object_id, is_name_regex=False/True) to update the color of an object in index mode (regex allows to set multiple with wildcards for example) when it already exists in the annotation system simSetAnnotationObjectColor(annotation_name, mesh_name, r, g, b, is_name_regex=False/True) to update the color of an object in direct mode (regex allows to set multiple with wildcards for example) when it already exists in the annotation system simGetAnnotationObjectID(annotation_name, mesh_name) to get the ID of an object in index mode simGetAnnotationObjectColor(annotation_name, mesh_name) to get the color of an object in direct mode simIsValidColor(r,g,b) You can check if a color is valid using this function The same is available in Unreal Blueprint and Unreal c++. You can find the functions in the Annotation category. Add RGBDirect Annotation Tag to Component/Actor(annotation_name, component/actor, color, update_annotation=true/false) to set the color of an object in direct mode Update RGBDirect Annotation Tag to Component/Actor(annotation_name, component/actor, color, update_annotation=true/false) to update the color of an object in direct mode already in the system Add RGBIndex Annotation Tag to Component/Actor(annotation_name, component/actor, object_id, update_annotation=true/false) to set the index of an object in index mode Update RGBIndex Annotation Tag to Component/Actor(annotation_name, component/actor, object_id, update_annotation=true/false) to update the index of an object in index mode already in the system Is Annotation RGB Valid(color) You can check if a color is valid using this function Note that enabling update_annotation is a relatively slow process, specially on actors with lots of annotated components. Ideally set update_annotation to false during the process of adding tags to the actor and only turn on update_annotation for the last component or actor you want to update. Alternatively, you can use the Add New Actor To Annotation() blueprint function to update the annotation layer for this actor after you have added all tags. Type 2: Greyscale You can use the greyscale annotation layer to tag objects in the environment with a float value between 0 and 1. Note that this has the precision of uint8. Actor/component tags have the following format: annotationName_value . So if for example your RGB annotation layer is called GreyscaleTest , you can tag an actor with the tag GreyscaleTest_0.76 to give it a value of 0.76 which would result in a color of (194, 194, 194). When Default is set to 1, all objects without a tag for this layer will be rendered in black. Several RPC API functions are available to influence or retrieve the RGB annotation layer. Currently, it is not possible to use the RPC API to add new actors or components to the annotation system, you can only update their values. 
For example in Python: simSetAnnotationObjectValue(annotation_name, mesh_name, greyscale_value, is_name_regex=False/True) to update the value of an object (regex allows to set multiple with wildcards for example) when it already exists in the annotation system simGetAnnotationObjectValue(annotation_name, mesh_name) to get the value of an object The same is available in Unreal Blueprint and Unreal c++. You can find the functions in the Annotation category. Add Greyscale Annotation Tag to Component/Actor(annotation_name, component/actor, value, update_annotation=true/false) to update the value of an object when it already exists in the annotation system Update Greyscale Annotation Tag to Component/Actor(annotation_name, component/actor, value, update_annotation=true/false) to update the value of an object Note that enabling update_annotation is a relatively slow process, specially on actors with lots of annotated components. Ideally set update_annotation to false during the process of adding tags to the actor and only turn on update_annotation for the last component or actor you want to update. Alternatively, you can use the Add New Actor To Annotation() blueprint function to update the annotation layer for this actor after you have added all tags. Type 3: Texture You can use the texture annotation layer to tag objects in the environment with a specific texture. This can be a color or greyscale texture, or you can mix them. Choice is up to you. You can do this by directly setting the texture yourself (direct mode), or by assigning a texture that is loaded based on a set path and the name of the mesh. To use direct mode, set the settings of this layer with SetDirect to true . For path reference mode, set to false . Actor/component tags have the following format: annotationName_texturepath for direct mode. The Unreal texture path name has to be rather specific: - If your texture is in the environment content folder, you must add /Game/ in front of the path. - If it is in the Cosys-AirSim plugin content folder, you must add /AirSim/ in front of the path. - For Engine textures, you must add /Engine/ in front of the path. So if for example your texture annotation layer is called TextureTestDirect , and your texture TestTexture is in the game content folder under a subfolder AnnotationTest you can tag an actor with the tag TextureTest_/Game/AnnotationTest/TestTexture to give it this texture. For path reference mod, the content of the tag is not really important as long as it contains the name of the annotation layer and an underscore, for example annotationName_enable . What is important is in reference mode is that you have a texture in the content folder with the name of the mesh if you do enable this object by setting a tag. You must place your textures in the folder defined by the TexturePath setting in the settings.json file for this layer. And the texture must have the same name as the mesh and start with the prefix set by the TexturePrefix setting in the settings.json file for this layer followed by a hyphen. So for example if you have a static mesh called Cylinder and your texture layer is called TextureTestDirect with the settings TexturePath set to /Game/AnnotationTest and TexturePrefix set to Test1 , you must have a texture called Test1-Cylinder in the folder /Game/AnnotationTest . When Default is set to 1, all objects without a tag for this layer will be rendered in black. Several RPC API functions are available to influence or retrieve the RGB annotation layer. 
Currently, it is not possible to use the RPC API to add new actors or components to the annotation system, you can only update their values. For example in Python: simSetAnnotationObjectTextureByPath(annotation_name, mesh_name, texture_path, is_name_regex=False/True) to set the texture of an object in direct mode, the texture path should be same format as described above, for example /Game/MyTextures/TestTexture1 (regex allows to set multiple with wildcards for example) simEnableAnnotationObjectTextureByPath(annotation_name, mesh_name, r, g, b, is_name_regex=False/True) to enable the texture of an object in relative path mode, this does require a texture in the relative path as described above! (regex allows to set multiple with wildcards for example) simGetAnnotationObjectTexturePath(annotation_name, mesh_name) to get the texture path of an object The same is available in Unreal Blueprint and Unreal c++. You can find the functions in the Annotation category. Add Texture Direct Annotation Tag to Component/Actor By Path(annotation_name, component/actor, texture_path, update_annotation=true/false) to set the texture of an object in direct mode, the texture path should be same format as described above, for example /Game/MyTextures/TestTexture1 Update Texture Direct Annotation Tag to Component/Actor By Path(annotation_name, component/actor, texture_path, update_annotation=true/false) to update texture of an object in direct mode that is already in the system, the texture path should be same format as described above, for example /Game/MyTextures/TestTexture1 Add Texture Direct Annotation Tag to Component/Actor(annotation_name, component/actor, texture, update_annotation=true/false) to set the texture of an object in direct mode, the texture can be directly referenced as UTexture* Object Update Texture Direct Annotation Tag to Component/Actor(annotation_name, component/actor, texture, update_annotation=true/false) to update texture of an object in direct mode that is already in the system, the texture can be directly referenced as UTexture* Object Enable Texture By Path Annotation Tag to Component/Actor(annotation_name, component/actor, update_annotation=true/false) to enable the texture of an object in relative path mode, this does require a texture in the relative path as described above! Note that enabling update_annotation is a relatively slow process, specially on actors with lots of annotated components. Ideally set update_annotation to false during the process of adding tags to the actor and only turn on update_annotation for the last component or actor you want to update. Alternatively, you can use the Add New Actor To Annotation() blueprint function to update the annotation layer for this actor after you have added all tags. Common functionality By default, when the world loads, all meshes are checked for tags and the annotation layers are updated accordingly. With the unreal blueprint and c++ functions however, you can also decide to update the annotation layer only when you want to with the update_annotation argument. If you have many objects to update, this can save a lot of time by doing it only for the last object. Some API functions exist for all types, for example in Python: simListAnnotationObjects(annotation_name) to get a list of all objects within this annotation layer. simListAnnotationPoses(annotation_name, ned=True/False, only_visible=False/True) to get the 3D poses of all objects in this annotation layer. 
The returned pose is in NED coordinates in SI units with its origin at Player Start by default or in Unreal NED frame if the ned boolean argument is set to talse . Similarly, for Unreal Blueprint and Unreal c++. You can find the functions in the Annotation category. Does Annotation Layer Exist(annotation_name) to figure out if a layer exists or not Add New Actor To Annotation(annotation_name, actor, update_annotation=true/false) if you manually added a tag, and want to update the annotation layer with this actor. This is useful to run after adding multiple tags to the actor and its components with the other api calls, and you want to update the annotation layer only once, otherwise it will be much slower. Delete Actor From Annotation(annotation_name, actor, update_annotation=true/false) if you manually remove all tags from an actor for this layer and remove it from the annotation layer Force Update Annotation(annotation_name) to force an update of the annotation layer. Getting annotation data from sensors The easiest way to get the images from annotation cameras, is through the image API. See the Image API documentation for more information. GPU LiDAR is also supported, but each GPU Lidar can only render one annotation layer. See the GPU LiDAR documentation for more information. You can also display the annotation layers in the subwindows. See the Settings documentation for more information. For example: { ... \"SubWindows\": [ { \"WindowID\": 0, \"CameraName\": \"front_center\", \"ImageType\": 10, \"VehicleName\": \"robot1\", \"Annotation\": \"GreyscaleTest\", \"Visible\": false }, ... Credits The method used to use Proxy meshes to segment object is a derivative of and inspired by the work of UnrealCV . Their work is licensed under the MIT License. It is made by students from Johns Hopkins University and Peking University under the supervision of Prof. Alan Yuille and Prof. Yizhou Wang. You can read the paper on their work here .","title":"Annotation"},{"location":"annotation/#annotation-in-cosys-airsim","text":"A multi-layer annotation system is implemented into Cosys-AirSim. It uses Proxy Mesh rendering to allow for each object in the world to be annotated by a greyscale value, an RGB color or a texture that fits the mesh. An annotation layer allows the user to tag individual actors and/or their child-components with a certain annotation component. This can be used to create ground truth data for machine learning models or to create a visual representation of the environment. Let's say you want to train a model to detect cars or pedestrians, you create an RGB annotation layer where you can tag all the cars and pedestrians in the environment with a certain RGB color respectively. Through the API you can then get the image of this RGB annotation layer (GPU LiDAR is also supported next to cameras). Or you want to assign a ripeness value to all the apples in your environment, you can create a greyscale annotation layer where you can tag all the apples with a certain greyscale value between 0 and 1. Similarly, you can also load a texture to a specific mesh component only visible in the annotation layer. For example when trying to show where defects are in a mesh. The annotation system uses actor and/or component tags to set these values for the 3 modes (greyscale, RGB, texture). 
You can add these manually or use the APIs (RPC API, Unreal Blueprint, Unreal C++).","title":"Annotation in Cosys-AirSim"},{"location":"annotation/#limitations","text":"2744000 different RGB colors are currently available to be assigned to unique objects. If your environment during a run requires more colors, you will generate errors and new objects will be assigned the color [0,0,0]. Only static and skeletal meshes are supported. Landscape objects aren't supported. This is the special object type in Unreal to make terrain with. As a work-around, StaticMesh terrain must be used. Foliage objects aren't supported. This is the special object type in Unreal to place trees, grass and other plants that move with the wind. As a work-around, StaticMesh objects must be used. Brush objects aren't supported. This is a special object type in Unreal to create your own meshes with. As a work-around, you can convert them to a StaticMesh. These and other less common unsupported object types will either not be rendered (decals, text, foliage, ...) or will by default be given the RGB color value of [149,149,149] or [0,0,0] .","title":"Limitations"},{"location":"annotation/#usage","text":"","title":"Usage"},{"location":"annotation/#settings-json-definition-of-layers","text":"To use the annotation system, you need to set the annotation mode in the settings.json file. You can define as many layers as you want and use them simultaneously. You will always have to identify them by name. Here you define each layer with a name, the type and some other settings, often specific to the type. For example: { ... \"Annotation\": [ { \"Name\": \"RGBTestDirect\", \"Type\": 0, \"Default\": true, \"SetDirect\": true, \"ViewDistance\": 10 }, { \"Name\": \"RGBTestIndex\", \"Type\": 0, \"Default\": true, \"SetDirect\": false }, { \"Name\": \"GreyscaleTest\", \"Type\": 1, \"Default\": true, \"ViewDistance\": 5 }, { \"Name\": \"TextureTestDirect\", \"Type\": 2, \"Default\": true, \"SetDirect\": true }, { \"Name\": \"TextureTestRelativePath\", \"Type\": 2, \"Default\": false, \"SetDirect\": false, \"TexturePath\": \"/Game/AnnotationTest\", \"TexturePrefix\": \"Test1\" } ], ... } The types are: RGB = 0, Greyscale = 1, Texture = 2 The Default setting applies to all types and is what happens when no tag is set for an actor/component. When set to false, the mesh will not be rendered in the annotation layer. When set to true, the mesh will be rendered in the annotation layer with the default value of the layer. The ViewDistance setting applies to all types and allows you to set the maximum distance in meters at which the annotation layer is rendered. This only applies to the camera sensor output, as for LiDAR you can set the maximum range distance of the sensor differently. This value is by default set to -1, which means infinite draw distance.","title":"Settings JSON definition of layers"},{"location":"annotation/#type-1-rgb","text":"Similar to instance segmentation , you can use the RGB annotation layer to tag objects in the environment with a unique color. You can do this by directly setting the color yourself (direct mode), or by assigning the object an index (0-2744000 unique colors) that will be linked to the colormap. To use direct mode, set the settings of this layer with SetDirect to true . For index mode, set it to false . Actor/component tags have the following format: annotationName_R_G_B for direct mode or annotationName_ID for index mode.
So if for example your RGB annotation layer is called RGBTestDirect , you can tag an actor with the tag RGBTestDirect_255_0_0 to give it a red color. Or for index mode, with a layer called RGBTestIndex , the tag RGBTestIndex_5 gives it the fifth color in the colormap. When Default is set to true , all objects without a tag for this layer will be rendered in black. The instance segmentation API function to get the colormap also applies to the RGB index mode. For example in Python you can use: colorMap = client.simGetSegmentationColorMap() Several RPC API functions are available to influence or retrieve the RGB annotation layer. Currently, it is not possible to use the RPC API to add new actors or components to the annotation system; you can only update their values. For example in Python: simSetAnnotationObjectID(annotation_name, mesh_name, object_id, is_name_regex=False/True) to update the ID of an object in index mode (regex allows setting multiple with wildcards, for example) when it already exists in the annotation system simSetAnnotationObjectColor(annotation_name, mesh_name, r, g, b, is_name_regex=False/True) to update the color of an object in direct mode (regex allows setting multiple with wildcards, for example) when it already exists in the annotation system simGetAnnotationObjectID(annotation_name, mesh_name) to get the ID of an object in index mode simGetAnnotationObjectColor(annotation_name, mesh_name) to get the color of an object in direct mode simIsValidColor(r,g,b) to check whether a color is valid The same is available in Unreal Blueprint and Unreal C++. You can find the functions in the Annotation category. Add RGBDirect Annotation Tag to Component/Actor(annotation_name, component/actor, color, update_annotation=true/false) to set the color of an object in direct mode Update RGBDirect Annotation Tag to Component/Actor(annotation_name, component/actor, color, update_annotation=true/false) to update the color of an object in direct mode already in the system Add RGBIndex Annotation Tag to Component/Actor(annotation_name, component/actor, object_id, update_annotation=true/false) to set the index of an object in index mode Update RGBIndex Annotation Tag to Component/Actor(annotation_name, component/actor, object_id, update_annotation=true/false) to update the index of an object in index mode already in the system Is Annotation RGB Valid(color) to check whether a color is valid Note that enabling update_annotation is a relatively slow process, especially on actors with lots of annotated components. Ideally, set update_annotation to false during the process of adding tags to the actor and only turn on update_annotation for the last component or actor you want to update. Alternatively, you can use the Add New Actor To Annotation() blueprint function to update the annotation layer for this actor after you have added all tags.","title":"Type 1: RGB"},{"location":"annotation/#type-2-greyscale","text":"You can use the greyscale annotation layer to tag objects in the environment with a float value between 0 and 1. Note that this has the precision of uint8. Actor/component tags have the following format: annotationName_value . So if for example your greyscale annotation layer is called GreyscaleTest , you can tag an actor with the tag GreyscaleTest_0.76 to give it a value of 0.76, which would result in a color of (194, 194, 194). When Default is set to true , all objects without a tag for this layer will be rendered in black.
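To make the uint8 precision concrete, here is a quick sanity check of the value-to-color mapping described above (a hypothetical helper for illustration, not part of the API):

```python
def greyscale_to_rgb(value: float) -> tuple:
    """Map a [0, 1] greyscale annotation value to its rendered RGB color."""
    # uint8 precision: 0.76 -> round(0.76 * 255) = 194 -> (194, 194, 194)
    level = round(value * 255)
    return (level, level, level)

assert greyscale_to_rgb(0.76) == (194, 194, 194)
```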
Several RPC API functions are available to influence or retrieve the greyscale annotation layer. Currently, it is not possible to use the RPC API to add new actors or components to the annotation system; you can only update their values. For example in Python: simSetAnnotationObjectValue(annotation_name, mesh_name, greyscale_value, is_name_regex=False/True) to update the value of an object (regex allows setting multiple with wildcards, for example) when it already exists in the annotation system simGetAnnotationObjectValue(annotation_name, mesh_name) to get the value of an object The same is available in Unreal Blueprint and Unreal C++. You can find the functions in the Annotation category. Add Greyscale Annotation Tag to Component/Actor(annotation_name, component/actor, value, update_annotation=true/false) to set the value of an object Update Greyscale Annotation Tag to Component/Actor(annotation_name, component/actor, value, update_annotation=true/false) to update the value of an object that already exists in the annotation system Note that enabling update_annotation is a relatively slow process, especially on actors with lots of annotated components. Ideally, set update_annotation to false during the process of adding tags to the actor and only turn on update_annotation for the last component or actor you want to update. Alternatively, you can use the Add New Actor To Annotation() blueprint function to update the annotation layer for this actor after you have added all tags.","title":"Type 2: Greyscale"},{"location":"annotation/#type-3-texture","text":"You can use the texture annotation layer to tag objects in the environment with a specific texture. This can be a color or greyscale texture, or you can mix them. The choice is up to you. You can do this by directly setting the texture yourself (direct mode), or by assigning a texture that is loaded based on a set path and the name of the mesh. To use direct mode, set the settings of this layer with SetDirect to true . For path reference mode, set it to false . Actor/component tags have the following format: annotationName_texturepath for direct mode. The Unreal texture path name has to be rather specific: - If your texture is in the environment content folder, you must add /Game/ in front of the path. - If it is in the Cosys-AirSim plugin content folder, you must add /AirSim/ in front of the path. - For Engine textures, you must add /Engine/ in front of the path. So if for example your texture annotation layer is called TextureTestDirect , and your texture TestTexture is in the game content folder under a subfolder AnnotationTest , you can tag an actor with the tag TextureTestDirect_/Game/AnnotationTest/TestTexture to give it this texture. For path reference mode, the content of the tag is not really important as long as it contains the name of the annotation layer and an underscore, for example annotationName_enable . What is important in reference mode is that you have a texture in the content folder with the name of the mesh if you enable this object by setting a tag. You must place your textures in the folder defined by the TexturePath setting in the settings.json file for this layer. The texture must have the same name as the mesh and start with the prefix set by the TexturePrefix setting in the settings.json file for this layer, followed by a hyphen.
So for example if you have a static mesh called Cylinder and your texture layer is called TextureTestRelativePath with the settings TexturePath set to /Game/AnnotationTest and TexturePrefix set to Test1 , you must have a texture called Test1-Cylinder in the folder /Game/AnnotationTest . When Default is set to true , all objects without a tag for this layer will be rendered in black. Several RPC API functions are available to influence or retrieve the texture annotation layer. Currently, it is not possible to use the RPC API to add new actors or components to the annotation system; you can only update their values. For example in Python: simSetAnnotationObjectTextureByPath(annotation_name, mesh_name, texture_path, is_name_regex=False/True) to set the texture of an object in direct mode, the texture path should be in the same format as described above, for example /Game/MyTextures/TestTexture1 (regex allows setting multiple with wildcards, for example) simEnableAnnotationObjectTextureByPath(annotation_name, mesh_name, is_name_regex=False/True) to enable the texture of an object in relative path mode, this does require a texture in the relative path as described above! (regex allows setting multiple with wildcards, for example) simGetAnnotationObjectTexturePath(annotation_name, mesh_name) to get the texture path of an object The same is available in Unreal Blueprint and Unreal C++. You can find the functions in the Annotation category. Add Texture Direct Annotation Tag to Component/Actor By Path(annotation_name, component/actor, texture_path, update_annotation=true/false) to set the texture of an object in direct mode, the texture path should be in the same format as described above, for example /Game/MyTextures/TestTexture1 Update Texture Direct Annotation Tag to Component/Actor By Path(annotation_name, component/actor, texture_path, update_annotation=true/false) to update the texture of an object in direct mode that is already in the system, the texture path should be in the same format as described above, for example /Game/MyTextures/TestTexture1 Add Texture Direct Annotation Tag to Component/Actor(annotation_name, component/actor, texture, update_annotation=true/false) to set the texture of an object in direct mode, the texture can be directly referenced as a UTexture* object Update Texture Direct Annotation Tag to Component/Actor(annotation_name, component/actor, texture, update_annotation=true/false) to update the texture of an object in direct mode that is already in the system, the texture can be directly referenced as a UTexture* object Enable Texture By Path Annotation Tag to Component/Actor(annotation_name, component/actor, update_annotation=true/false) to enable the texture of an object in relative path mode, this does require a texture in the relative path as described above! Note that enabling update_annotation is a relatively slow process, especially on actors with lots of annotated components. Ideally, set update_annotation to false during the process of adding tags to the actor and only turn on update_annotation for the last component or actor you want to update. Alternatively, you can use the Add New Actor To Annotation() blueprint function to update the annotation layer for this actor after you have added all tags.","title":"Type 3: Texture"},{"location":"annotation/#common-functionality","text":"By default, when the world loads, all meshes are checked for tags and the annotation layers are updated accordingly.
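To tie the texture APIs above together, a minimal Python sketch (assuming a running simulator, the TextureTestDirect layer from the settings example, a mesh named Cylinder and an existing /Game/AnnotationTest/TestTexture asset):

```python
import cosysairsim as airsim

client = airsim.CarClient()
client.confirmConnection()

# direct mode: point the mesh at a texture asset by its Unreal path
client.simSetAnnotationObjectTextureByPath(
    "TextureTestDirect", "Cylinder", "/Game/AnnotationTest/TestTexture")

# read the texture path back to verify
print(client.simGetAnnotationObjectTexturePath("TextureTestDirect", "Cylinder"))
```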
With the Unreal Blueprint and C++ functions, however, you can also decide to update the annotation layer only when you want to, using the update_annotation argument. If you have many objects to update, this can save a lot of time by doing it only for the last object. Some API functions exist for all types, for example in Python: simListAnnotationObjects(annotation_name) to get a list of all objects within this annotation layer. simListAnnotationPoses(annotation_name, ned=True/False, only_visible=False/True) to get the 3D poses of all objects in this annotation layer. The returned pose is in NED coordinates in SI units with its origin at Player Start by default, or in the Unreal NED frame if the ned boolean argument is set to false . Similar functions are available in Unreal Blueprint and Unreal C++. You can find them in the Annotation category. Does Annotation Layer Exist(annotation_name) to figure out if a layer exists or not Add New Actor To Annotation(annotation_name, actor, update_annotation=true/false) if you manually added a tag and want to update the annotation layer with this actor. This is useful to run after adding multiple tags to the actor and its components with the other API calls, when you want to update the annotation layer only once; otherwise it would be much slower. Delete Actor From Annotation(annotation_name, actor, update_annotation=true/false) if you manually removed all tags from an actor for this layer and want to remove it from the annotation layer Force Update Annotation(annotation_name) to force an update of the annotation layer.","title":"Common functionality"},{"location":"annotation/#getting-annotation-data-from-sensors","text":"The easiest way to get the images from annotation cameras is through the image API. See the Image API documentation for more information. GPU LiDAR is also supported, but each GPU LiDAR can only render one annotation layer. See the GPU LiDAR documentation for more information. You can also display the annotation layers in the subwindows. See the Settings documentation for more information. For example: { ... \"SubWindows\": [ { \"WindowID\": 0, \"CameraName\": \"front_center\", \"ImageType\": 10, \"VehicleName\": \"robot1\", \"Annotation\": \"GreyscaleTest\", \"Visible\": false }, ...","title":"Getting annotation data from sensors"},{"location":"annotation/#credits","text":"The method of using proxy meshes to segment objects is derived from and inspired by the work of UnrealCV . Their work is licensed under the MIT License. It is made by students from Johns Hopkins University and Peking University under the supervision of Prof. Alan Yuille and Prof. Yizhou Wang. You can read the paper on their work here .","title":"Credits"},{"location":"apis/","text":"AirSim APIs Introduction AirSim exposes APIs so you can interact with vehicles in the simulation programmatically. You can use these APIs to retrieve images, get state, control the vehicle and so on. Python Quickstart If you want to use Python to call AirSim APIs, we recommend using Anaconda with Python 3.5 or later; however, some code may also work with Python 2.7 ( help us improve compatibility!). First install this package: pip install msgpack-rpc-python Once you can run AirSim, choose Car as the vehicle and then navigate to the PythonClient\car\ folder and run: python hello_car.py If you are using Visual Studio 2019 then just open AirSim.sln, set PythonClient as the startup project and choose car\hello_car.py as your startup script.
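As a quick first check after installing the package, the following minimal sketch connects to a running simulator (use the multirotor client instead when simulating a drone):

```python
import cosysairsim as airsim

# use airsim.MultirotorClient() instead when simulating a drone
client = airsim.CarClient()
client.confirmConnection()  # reports connection progress in the console
```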
Installing AirSim Package You can also install the AirSim Python module to your Python environment to use anywhere by running pip install . in the PythonClient folder. Notes 1. You may notice a file setup_path.py in our example folders. This file has simple code to detect if the airsim package is available in a parent folder; in that case we use that instead of the pip-installed package so you always use the latest code. 2. AirSim is still under heavy development, which means you might frequently need to update the package to use new APIs. C++ Users If you want to use C++ APIs and examples, please see the C++ APIs Guide . Hello Car Here's how to use AirSim APIs using Python to control a simulated car (see also the C++ example ): # ready to run example: PythonClient/car/hello_car.py import cosysairsim as airsim import time # connect to the AirSim simulator client = airsim.CarClient() client.confirmConnection() client.enableApiControl(True) car_controls = airsim.CarControls() while True: # get state of the car car_state = client.getCarState() print(\"Speed %d, Gear %d\" % (car_state.speed, car_state.gear)) # set the controls for car car_controls.throttle = 1 car_controls.steering = 1 client.setCarControls(car_controls) # let car drive a bit time.sleep(1) # get camera images from the car responses = client.simGetImages([ airsim.ImageRequest(0, airsim.ImageType.DepthVis), airsim.ImageRequest(1, airsim.ImageType.DepthPlanar, True)]) print('Retrieved images: %d' % len(responses)) # do something with images for response in responses: if response.pixels_as_float: print(\"Type %d, size %d\" % (response.image_type, len(response.image_data_float))) airsim.write_pfm('py1.pfm', airsim.get_pfm_array(response)) else: print(\"Type %d, size %d\" % (response.image_type, len(response.image_data_uint8))) airsim.write_file('py1.png', response.image_data_uint8) Hello Drone Here's how to use AirSim APIs using Python to control a simulated quadrotor (see also the C++ example ): # ready to run example: PythonClient/multirotor/hello_drone.py import cosysairsim as airsim import os # connect to the AirSim simulator client = airsim.MultirotorClient() client.confirmConnection() client.enableApiControl(True) client.armDisarm(True) # Async methods return a Future. Call join() to wait for task to complete. client.takeoffAsync().join() client.moveToPositionAsync(-10, 10, -10, 5).join() # take images responses = client.simGetImages([ airsim.ImageRequest(\"0\", airsim.ImageType.DepthVis), airsim.ImageRequest(\"1\", airsim.ImageType.DepthPlanar, True)]) print('Retrieved images: %d' % len(responses)) # do something with the images for response in responses: if response.pixels_as_float: print(\"Type %d, size %d\" % (response.image_type, len(response.image_data_float))) airsim.write_pfm(os.path.normpath('/temp/py1.pfm'), airsim.get_pfm_array(response)) else: print(\"Type %d, size %d\" % (response.image_type, len(response.image_data_uint8))) airsim.write_file(os.path.normpath('/temp/py1.png'), response.image_data_uint8) Common APIs reset : This resets the vehicle to its original starting state. Note that you must call enableApiControl and armDisarm again after the call to reset . confirmConnection : Checks the state of the connection every 1 sec and reports it in the console so the user can see the connection progress. enableApiControl : For safety reasons, by default API control for an autonomous vehicle is not enabled and the human operator has full control (usually via RC or joystick in the simulator). The client must make this call to request control via the API.
It is possible that the human operator of the vehicle has disallowed API control, which would mean that enableApiControl has no effect. This can be checked by isApiControlEnabled . isApiControlEnabled : Returns true if API control is established. If false (which is the default) then API calls would be ignored. After a successful call to enableApiControl , isApiControlEnabled should return true. ping : If the connection is established then this call will return true; otherwise it will block until timeout. simPrintLogMessage : Prints the specified message in the simulator's window. If message_param is also supplied then it's printed next to the message; in that case, if this API is called with the same message value but a different message_param again, the previous line is overwritten with the new line (instead of the API creating a new line on the display). For example, simPrintLogMessage(\"Iteration: \", to_string(i)) keeps updating the same line on the display when the API is called with different values of i. The valid values of the severity parameter are 0 to 3 inclusive, corresponding to different colors. simGetObjectPose(ned=true) , simSetObjectPose : Gets and sets the pose of the specified object in the Unreal environment. Here the object means \"actor\" in Unreal terminology. They are searched by tag as well as name. Please note that the names shown in the UE Editor are auto-generated in each run and are not permanent. So if you want to refer to an actor by name, you must change its auto-generated name in the UE Editor. Alternatively you can add a tag to the actor, which can be done by clicking on that actor in the Unreal Editor and then going to the Tags property , clicking the \"+\" sign and adding some string value. If multiple actors have the same tag then the first match is returned. If no matches are found then a NaN pose is returned. The returned pose is in NED coordinates in SI units with its origin at Player Start by default, or in the Unreal NED frame if the ned boolean argument is set to false . For simSetObjectPose , the specified actor must have Mobility set to Movable or you will get undefined behavior. simSetObjectPose has a teleport parameter, which means the object is moved through any other objects in its way; it returns true if the move was successful simListSceneObjects : Provides a list of all objects in the environment. You can also use a regular expression to filter specific objects by name. For example, to select all meshes whose names start with \"wall\", you can use simListSceneObjects(\"wall[\\w]*\") . Image/Computer Vision/Instance segmentation APIs AirSim offers comprehensive image APIs to retrieve synchronized images from multiple cameras along with ground truth including depth, disparity, surface normals and vision. You can set the resolution, FOV, motion blur etc. parameters in settings.json . There is also an API for detecting collision state. See also the complete code that generates a specified number of stereo images and ground truth depth with normalization to the camera plane, computation of the disparity image and saving it to pfm format . Furthermore, the Instance Segmentation system can also be manipulated through the API. More on image APIs, Computer Vision mode and instance segmentation configuration . Pause and Continue APIs AirSim allows you to pause and continue the simulation through the pause(is_paused) API. To pause the simulation call pause(True) and to continue the simulation call pause(False) .
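For example, in Python (the client exposes these as simPause and simContinueForTime, matching the pause(is_paused) and continueForTime(seconds) calls described here):

```python
import cosysairsim as airsim

client = airsim.CarClient()
client.confirmConnection()

client.simPause(True)         # freeze the simulation
# ... do some expensive computation while paused ...
client.simContinueForTime(2)  # run for 2 simulated seconds, then pause again
client.simPause(False)        # resume normal simulation
```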
You may have a scenario, especially while using reinforcement learning, where you run the simulation for a specified amount of time and then automatically pause. While the simulation is paused, you may then do some expensive computation, send a new command and then again run the simulation for a specified amount of time. This can be achieved by the API continueForTime(seconds) . This API runs the simulation for the specified number of seconds and then pauses the simulation. For example usage, please see pause_continue_car.py and pause_continue_drone.py . Collision API The collision information can be obtained using the simGetCollisionInfo API. This call returns a struct that has information not only about whether a collision occurred but also the collision position, surface normal, penetration depth and so on. Time of Day API AirSim assumes there exists a sky sphere of class EngineSky/BP_Sky_Sphere in your environment with an ADirectionalLight actor. By default, the position of the sun in the scene doesn't move with time. You can use settings to set up the latitude, longitude, date and time which AirSim uses to compute the position of the sun in the scene. You can also use the following API call to set the sun position according to a given date time: simSetTimeOfDay(self, is_enabled, start_datetime = \"\", is_start_datetime_dst = False, celestial_clock_speed = 1, update_interval_secs = 60, move_sun = True) The is_enabled parameter must be True to enable the time of day effect. If it is False then the sun position is reset to its original position in the environment. Other parameters are the same as in settings . Line-of-sight and world extent APIs To test line-of-sight in the sim from a vehicle to a point or between two points, see simTestLineOfSightToPoint(point, vehicle_name) and simTestLineOfSightBetweenPoints(point1, point2), respectively. Sim world extent, in the form of a vector of two GeoPoints, can be retrieved using simGetWorldExtents(). Weather APIs By default all weather effects are disabled. To enable weather effects, first call: simEnableWeather(True) Various weather effects can be enabled by using the simSetWeatherParameter method which takes WeatherParameter , for example, client.simSetWeatherParameter(airsim.WeatherParameter.Rain, 0.25); The second parameter value is from 0 to 1. The first parameter provides the following options: class WeatherParameter: Rain = 0 Roadwetness = 1 Snow = 2 RoadSnow = 3 MapleLeaf = 4 RoadLeaf = 5 Dust = 6 Fog = 7 Please note that the Roadwetness , RoadSnow and RoadLeaf effects require adding materials to your scene. Please see the example code for more details. Recording APIs Recording APIs can be used to start recording data through APIs. Data to be recorded can be specified using settings . To start recording, use - client.startRecording() Similarly, to stop recording, use client.stopRecording() . To check whether recording is running, call client.isRecording() , which returns a bool . This API works along with toggling recording using the R button; therefore, if it's enabled using the R key, isRecording() will return True , and recording can be stopped via the API using stopRecording() . Similarly, recording started using the API will be stopped if the R key is pressed in the Viewport. A LogMessage will also appear in the top-left of the viewport if recording is started or stopped using the API. Note that this will only save the data as specified in the settings. For full freedom in storing data such as certain sensor information, or in a different format or layout, use the other APIs to fetch the data and save as desired.
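A short sketch of the recording calls described above (assuming a connected client):

```python
import time
import cosysairsim as airsim

client = airsim.CarClient()
client.confirmConnection()

client.startRecording()   # begins recording as configured in settings.json
time.sleep(5)             # drive or fly for a while
if client.isRecording():
    client.stopRecording()
```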
Check out Modifying Recording Data for details on how to modify the kinematics data being recorded. Wind API Wind can be changed during simulation using simSetWind() . Wind is specified in the world frame, in NED direction and in m/s values. E.g., to set a 20 m/s wind in the North (forward) direction: # Set wind to (20,0,0) in NED (forward direction) wind = airsim.Vector3r(20, 0, 0) client.simSetWind(wind) Also see the example script in set_wind.py Lidar APIs AirSim offers APIs to retrieve point cloud data from (GPU)LiDAR sensors on vehicles. You can set the number of channels, points per second, horizontal and vertical FOV, etc. parameters in settings.json . More on lidar APIs and settings , GPUlidar APIs and settings and sensor settings Light Control APIs Lights that can be manipulated inside AirSim can be created via the simSpawnObject() API by passing either PointLightBP or SpotLightBP as the asset_name parameter and True as the is_blueprint parameter. Once a light has been spawned, it can be manipulated using the following API: simSetLightIntensity : This allows you to edit a light's intensity or brightness. It takes two parameters, light_name , the name of the light object returned by a previous call to simSpawnObject() , and intensity , a float value. Texture APIs Textures can be dynamically set on objects via these APIs: simSetObjectMaterial : This sets an object's material using an existing Unreal material asset. It takes two string parameters, object_name and material_name . simSetObjectMaterialFromTexture : This sets an object's material using a path to a texture. It takes two string parameters, object_name and texture_path . Multiple Vehicles AirSim supports multiple vehicles and controlling them through APIs. Please see the Multiple Vehicles doc. Coordinate System All AirSim APIs use the NED coordinate system, i.e., +X is North, +Y is East and +Z is Down. All units are in the SI system. Please note that this is different from the coordinate system used internally by Unreal Engine. In Unreal Engine, +Z is up instead of down and the length unit is in centimeters instead of meters. AirSim APIs take care of the appropriate conversions. The starting point of the vehicle is always coordinates (0, 0, 0) in the NED system. Thus when converting from Unreal coordinates to NED, we first subtract the starting offset and then scale by 100 for the cm to m conversion. The vehicle is spawned in the Unreal environment where the Player Start component is placed. There is a setting called OriginGeopoint in settings.json which assigns the geographic latitude, longitude and altitude to the Player Start component. If wanted, one can move the Unreal origin to the same location as the AirSim origin player start position by setting the MoveWorldOrigin in the settings.json to true . Vehicle Specific APIs APIs for Car Car has the following APIs available: setCarControls : This allows you to set throttle, steering, handbrake and auto or manual gear. getCarState : This retrieves the state information including speed, current gear and 6 kinematics quantities: position, orientation, linear and angular velocity, linear and angular acceleration. All quantities are in the NED coordinate system, SI units in the world frame except for angular velocity and accelerations which are in the body frame. Image APIs . APIs for Multirotor A multirotor can be controlled by specifying angles, velocity vector, destination position or some combination of these. There are corresponding move* APIs for this purpose. When doing position control, we need to use some path following algorithm.
By default AirSim uses the carrot following algorithm. This is often referred to as \"high level control\" because you just need to specify a high level goal and the firmware takes care of the rest. Currently the lowest level control available in AirSim is the moveByAngleThrottleAsync API. getMultirotorState This API returns the state of the vehicle in one call. The state includes collision, estimated kinematics (i.e. kinematics computed by fusing sensors), and timestamp (nanoseconds since epoch). The kinematics here means 6 quantities: position, orientation, linear and angular velocity, linear and angular acceleration. Please note that simple_flight currently doesn't support a state estimator, which means estimated and ground truth kinematics values would be the same for simple_flight. Estimated kinematics are however available for PX4 except for angular acceleration. All quantities are in the NED coordinate system, SI units in the world frame except for angular velocity and accelerations which are in the body frame. Async methods, duration and max_wait_seconds Many API methods have parameters named duration or max_wait_seconds and they have Async as a suffix, for example, takeoffAsync . These methods will return immediately after starting the task in AirSim so that your client code can do something else while that task is being executed. If you want to wait for this task to complete then you can call waitOnLastTask like this: //C++ client.takeoffAsync()->waitOnLastTask(); # Python client.takeoffAsync().join() If you start another command then it automatically cancels the previous task and starts the new command. This allows a pattern where your code continuously does the sensing, computes a new trajectory to follow and issues that path to the vehicle in AirSim. Each newly issued trajectory cancels the previous trajectory, allowing your code to continuously do the update as new sensor data arrives. All Async methods return concurrent.futures.Future in Python ( std::future in C++). Please note that these future classes currently do not allow you to check status or cancel the task; they only allow you to wait for the task to complete. AirSim does provide the API cancelLastTask , however. drivetrain There are two modes in which you can fly the vehicle: the drivetrain parameter is set to either airsim.DrivetrainType.ForwardOnly or airsim.DrivetrainType.MaxDegreeOfFreedom . When you specify ForwardOnly, you are saying that the vehicle's front should always point in the direction of travel. So if you want the drone to take a left turn then it would first rotate so the front points to the left. This mode is useful when you have only a front camera and you are operating the vehicle using the FPV view. This is more or less like travelling in a car where you always have the front view. The MaxDegreeOfFreedom means you don't care where the front points to. So when you take a left turn, you just start going left like a crab. Quadrotors can go in any direction regardless of where the front points to. The MaxDegreeOfFreedom enables this mode. yaw_mode yaw_mode is a struct YawMode with two fields, yaw_or_rate and is_rate . If the is_rate field is True then the yaw_or_rate field is interpreted as an angular velocity in degrees/sec, which means you want the vehicle to rotate continuously around its axis at that angular velocity while moving. If is_rate is False then yaw_or_rate is interpreted as an angle in degrees, which means you want the vehicle to rotate to the specific angle (i.e. yaw) and keep that angle while moving.
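To make drivetrain and yaw_mode concrete, a minimal sketch (positions, velocities and yaw values are illustrative only):

```python
import cosysairsim as airsim

client = airsim.MultirotorClient()
client.confirmConnection()
client.enableApiControl(True)
client.armDisarm(True)
client.takeoffAsync().join()

# ForwardOnly: the nose points along the direction of travel (here +X, yaw 0)
client.moveToPositionAsync(
    10, 0, -5, 2,
    drivetrain=airsim.DrivetrainType.ForwardOnly,
    yaw_mode=airsim.YawMode(is_rate=False, yaw_or_rate=0)).join()

# MaxDegreeOfFreedom: hold a fixed 90-degree yaw while translating sideways
client.moveToPositionAsync(
    0, 10, -5, 2,
    drivetrain=airsim.DrivetrainType.MaxDegreeOfFreedom,
    yaw_mode=airsim.YawMode(is_rate=False, yaw_or_rate=90)).join()
```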
You can probably see that when yaw_mode.is_rate == true , the drivetrain parameter shouldn't be set to ForwardOnly because you would be contradicting yourself by asking to keep the front pointing ahead but also rotate continuously. However if you have yaw_mode.is_rate = false in ForwardOnly mode then you can do some funky stuff. For example, you can have the drone do circles and have yaw_or_rate set to 90 so the camera is always pointed to the center (\"super cool selfie mode\"). In MaxDegreeOfFreedom mode you can also do some funky stuff by setting yaw_mode.is_rate = true and, say, yaw_mode.yaw_or_rate = 20 . This will cause the drone to rotate while following its path, which may allow you to do 360° scanning. In most cases, you just don't want the yaw to change, which you can do by setting a yaw rate of 0. The shorthand for this is airsim.YawMode.Zero() (or in C++: YawMode::Zero() ). lookahead and adaptive_lookahead When you ask the vehicle to follow a path, AirSim uses the \"carrot following\" algorithm. This algorithm operates by looking ahead on the path and adjusting its velocity vector. The parameters for this algorithm are specified by lookahead and adaptive_lookahead . Most of the time you want the algorithm to auto-decide the values, which you can do by simply setting lookahead = -1 and adaptive_lookahead = 0 . Using APIs on Real Vehicles We want to be able to run the same code that runs in simulation on the real vehicle. This allows you to test your code in the simulator and deploy it to a real vehicle. Generally speaking, APIs therefore shouldn't allow you to do something that cannot be done on a real vehicle (for example, getting the ground truth). But, of course, the simulator has much more information and it would be useful in applications that may not care about running things on a real vehicle. For this reason, we clearly delineate sim-only APIs by attaching the sim prefix, for example, simGetGroundTruthKinematics . This way you can avoid using these simulation-only APIs if you care about running your code on real vehicles. AirLib is a self-contained library that you can put on an offboard computing module such as the Gigabyte barebone Mini PC. This module then can talk to flight controllers such as PX4 using the exact same code and flight controller protocol. The code you write for testing in the simulator remains unchanged. See AirLib on custom drones . Adding New APIs to AirSim See the Adding New APIs page References and Examples C++ API Examples Car Examples Multirotor Examples Computer Vision Examples Move on Path demo showing video of fast multirotor flight through Modular Neighborhood environment Building a Hexacopter Building Point Clouds FAQ Unreal is slowed down dramatically when I run API If you see Unreal getting slowed down dramatically when the Unreal Engine window loses focus then go to 'Edit->Editor Preferences' in the Unreal Editor, in the 'Search' box type 'CPU' and ensure that the 'Use Less CPU when in Background' option is unchecked. Do I need anything else on Windows? You should install VS2019 with VC++, Windows SDK 10.0 and Python. To use Python APIs you will need Python 3.5 or later (install it using Anaconda). Which version of Python should I use? We recommend Anaconda to get Python tools and libraries. Our code is tested with Python 3.5.3 :: Anaconda 4.4.0. This is important because older versions have been known to have problems .
I get error on import cv2 You can install OpenCV using: conda install opencv pip install opencv-python TypeError: unsupported operand type(s) for *: 'AsyncIOLoop' and 'float' This error happens if you install Jupyter, which somehow breaks the msgpackrpc library. Create a new Python environment with the minimal required packages.","title":"Core APIs"},{"location":"apis/#airsim-apis","text":"","title":"AirSim APIs"},{"location":"apis/#introduction","text":"AirSim exposes APIs so you can interact with vehicles in the simulation programmatically. You can use these APIs to retrieve images, get state, control the vehicle and so on.","title":"Introduction"},{"location":"apis/#python-quickstart","text":"If you want to use Python to call AirSim APIs, we recommend using Anaconda with Python 3.5 or later; however, some code may also work with Python 2.7 ( help us improve compatibility!). First install this package: pip install msgpack-rpc-python Once you can run AirSim, choose Car as the vehicle and then navigate to the PythonClient\car\ folder and run: python hello_car.py If you are using Visual Studio 2019 then just open AirSim.sln, set PythonClient as the startup project and choose car\hello_car.py as your startup script.","title":"Python Quickstart"},{"location":"apis/#installing-airsim-package","text":"You can also install the AirSim Python module to your Python environment to use anywhere by running pip install . in the PythonClient folder. Notes 1. You may notice a file setup_path.py in our example folders. This file has simple code to detect if the airsim package is available in a parent folder; in that case we use that instead of the pip-installed package so you always use the latest code. 2. AirSim is still under heavy development, which means you might frequently need to update the package to use new APIs.","title":"Installing AirSim Package"},{"location":"apis/#c-users","text":"If you want to use C++ APIs and examples, please see the C++ APIs Guide .","title":"C++ Users"},{"location":"apis/#hello-car","text":"Here's how to use AirSim APIs using Python to control a simulated car (see also the C++ example ): # ready to run example: PythonClient/car/hello_car.py import cosysairsim as airsim import time # connect to the AirSim simulator client = airsim.CarClient() client.confirmConnection() client.enableApiControl(True) car_controls = airsim.CarControls() while True: # get state of the car car_state = client.getCarState() print(\"Speed %d, Gear %d\" % (car_state.speed, car_state.gear)) # set the controls for car car_controls.throttle = 1 car_controls.steering = 1 client.setCarControls(car_controls) # let car drive a bit time.sleep(1) # get camera images from the car responses = client.simGetImages([ airsim.ImageRequest(0, airsim.ImageType.DepthVis), airsim.ImageRequest(1, airsim.ImageType.DepthPlanar, True)]) print('Retrieved images: %d' % len(responses)) # do something with images for response in responses: if response.pixels_as_float: print(\"Type %d, size %d\" % (response.image_type, len(response.image_data_float))) airsim.write_pfm('py1.pfm', airsim.get_pfm_array(response)) else: print(\"Type %d, size %d\" % (response.image_type, len(response.image_data_uint8))) airsim.write_file('py1.png', response.image_data_uint8)","title":"Hello Car"},{"location":"apis/#hello-drone","text":"Here's how to use AirSim APIs using Python to control a simulated quadrotor (see also the C++ example ): # ready to run example: PythonClient/multirotor/hello_drone.py import cosysairsim as airsim import os # connect to the AirSim simulator
client = airsim.MultirotorClient() client.confirmConnection() client.enableApiControl(True) client.armDisarm(True) # Async methods return a Future. Call join() to wait for task to complete. client.takeoffAsync().join() client.moveToPositionAsync(-10, 10, -10, 5).join() # take images responses = client.simGetImages([ airsim.ImageRequest(\"0\", airsim.ImageType.DepthVis), airsim.ImageRequest(\"1\", airsim.ImageType.DepthPlanar, True)]) print('Retrieved images: %d' % len(responses)) # do something with the images for response in responses: if response.pixels_as_float: print(\"Type %d, size %d\" % (response.image_type, len(response.image_data_float))) airsim.write_pfm(os.path.normpath('/temp/py1.pfm'), airsim.get_pfm_array(response)) else: print(\"Type %d, size %d\" % (response.image_type, len(response.image_data_uint8))) airsim.write_file(os.path.normpath('/temp/py1.png'), response.image_data_uint8)","title":"Hello Drone"},{"location":"apis/#common-apis","text":"reset : This resets the vehicle to its original starting state. Note that you must call enableApiControl and armDisarm again after the call to reset . confirmConnection : Checks the state of the connection every 1 sec and reports it in the console so the user can see the connection progress. enableApiControl : For safety reasons, by default API control for an autonomous vehicle is not enabled and the human operator has full control (usually via RC or joystick in the simulator). The client must make this call to request control via the API. It is possible that the human operator of the vehicle has disallowed API control, which would mean that enableApiControl has no effect. This can be checked by isApiControlEnabled . isApiControlEnabled : Returns true if API control is established. If false (which is the default) then API calls would be ignored. After a successful call to enableApiControl , isApiControlEnabled should return true. ping : If the connection is established then this call will return true; otherwise it will block until timeout. simPrintLogMessage : Prints the specified message in the simulator's window. If message_param is also supplied then it's printed next to the message; in that case, if this API is called with the same message value but a different message_param again, the previous line is overwritten with the new line (instead of the API creating a new line on the display). For example, simPrintLogMessage(\"Iteration: \", to_string(i)) keeps updating the same line on the display when the API is called with different values of i. The valid values of the severity parameter are 0 to 3 inclusive, corresponding to different colors. simGetObjectPose(ned=true) , simSetObjectPose : Gets and sets the pose of the specified object in the Unreal environment. Here the object means \"actor\" in Unreal terminology. They are searched by tag as well as name. Please note that the names shown in the UE Editor are auto-generated in each run and are not permanent. So if you want to refer to an actor by name, you must change its auto-generated name in the UE Editor. Alternatively you can add a tag to the actor, which can be done by clicking on that actor in the Unreal Editor and then going to the Tags property , clicking the \"+\" sign and adding some string value. If multiple actors have the same tag then the first match is returned. If no matches are found then a NaN pose is returned. The returned pose is in NED coordinates in SI units with its origin at Player Start by default, or in the Unreal NED frame if the ned boolean argument is set to false .
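For instance, a short sketch of getting and then setting an object pose (the actor tag some_object is hypothetical):

```python
import cosysairsim as airsim

client = airsim.CarClient()
client.confirmConnection()

# "some_object" is a hypothetical actor tag; a NaN pose is returned if nothing matches
pose = client.simGetObjectPose("some_object")

# teleport the object one meter up (NED: -Z is up); requires Mobility set to Movable
pose.position.z_val -= 1.0
success = client.simSetObjectPose("some_object", pose, teleport=True)
```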
For simSetObjectPose , the specified actor must have Mobility set to Movable or you will get undefined behavior. simSetObjectPose has a teleport parameter, which means the object is moved through any other objects in its way; it returns true if the move was successful simListSceneObjects : Provides a list of all objects in the environment. You can also use a regular expression to filter specific objects by name. For example, to select all meshes whose names start with \"wall\", you can use simListSceneObjects(\"wall[\\w]*\") .","title":"Common APIs"},{"location":"apis/#imagecomputer-visioninstance-segmentation-apis","text":"AirSim offers comprehensive image APIs to retrieve synchronized images from multiple cameras along with ground truth including depth, disparity, surface normals and vision. You can set the resolution, FOV, motion blur etc. parameters in settings.json . There is also an API for detecting collision state. See also the complete code that generates a specified number of stereo images and ground truth depth with normalization to the camera plane, computation of the disparity image and saving it to pfm format . Furthermore, the Instance Segmentation system can also be manipulated through the API. More on image APIs, Computer Vision mode and instance segmentation configuration .","title":"Image/Computer Vision/Instance segmentation APIs"},{"location":"apis/#pause-and-continue-apis","text":"AirSim allows you to pause and continue the simulation through the pause(is_paused) API. To pause the simulation call pause(True) and to continue the simulation call pause(False) . You may have a scenario, especially while using reinforcement learning, where you run the simulation for a specified amount of time and then automatically pause. While the simulation is paused, you may then do some expensive computation, send a new command and then again run the simulation for a specified amount of time. This can be achieved by the API continueForTime(seconds) . This API runs the simulation for the specified number of seconds and then pauses the simulation. For example usage, please see pause_continue_car.py and pause_continue_drone.py .","title":"Pause and Continue APIs"},{"location":"apis/#collision-api","text":"The collision information can be obtained using the simGetCollisionInfo API. This call returns a struct that has information not only about whether a collision occurred but also the collision position, surface normal, penetration depth and so on.","title":"Collision API"},{"location":"apis/#time-of-day-api","text":"AirSim assumes there exists a sky sphere of class EngineSky/BP_Sky_Sphere in your environment with an ADirectionalLight actor. By default, the position of the sun in the scene doesn't move with time. You can use settings to set up the latitude, longitude, date and time which AirSim uses to compute the position of the sun in the scene. You can also use the following API call to set the sun position according to a given date time: simSetTimeOfDay(self, is_enabled, start_datetime = \"\", is_start_datetime_dst = False, celestial_clock_speed = 1, update_interval_secs = 60, move_sun = True) The is_enabled parameter must be True to enable the time of day effect. If it is False then the sun position is reset to its original position in the environment.
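For example (the date/time string is illustrative only):

```python
import cosysairsim as airsim

client = airsim.MultirotorClient()
client.confirmConnection()

# enable the time-of-day effect from an illustrative start time, advancing the
# sun at real-time speed and updating its position once per minute
client.simSetTimeOfDay(
    True,
    start_datetime="2018-02-12 15:20:00",
    is_start_datetime_dst=True,
    celestial_clock_speed=1,
    update_interval_secs=60,
    move_sun=True)
```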
Other parameters are the same as in settings .","title":"Time of Day API"},{"location":"apis/#line-of-sight-and-world-extent-apis","text":"To test line-of-sight in the sim from a vehicle to a point or between two points, see simTestLineOfSightToPoint(point, vehicle_name) and simTestLineOfSightBetweenPoints(point1, point2), respectively. Sim world extent, in the form of a vector of two GeoPoints, can be retrieved using simGetWorldExtents().","title":"Line-of-sight and world extent APIs"},{"location":"apis/#weather-apis","text":"By default all weather effects are disabled. To enable weather effects, first call: simEnableWeather(True) Various weather effects can be enabled by using the simSetWeatherParameter method which takes WeatherParameter , for example, client.simSetWeatherParameter(airsim.WeatherParameter.Rain, 0.25); The second parameter value is from 0 to 1. The first parameter provides the following options: class WeatherParameter: Rain = 0 Roadwetness = 1 Snow = 2 RoadSnow = 3 MapleLeaf = 4 RoadLeaf = 5 Dust = 6 Fog = 7 Please note that the Roadwetness , RoadSnow and RoadLeaf effects require adding materials to your scene. Please see the example code for more details.","title":"Weather APIs"},{"location":"apis/#recording-apis","text":"Recording APIs can be used to start recording data through APIs. Data to be recorded can be specified using settings . To start recording, use - client.startRecording() Similarly, to stop recording, use client.stopRecording() . To check whether recording is running, call client.isRecording() , which returns a bool . This API works along with toggling recording using the R button; therefore, if it's enabled using the R key, isRecording() will return True , and recording can be stopped via the API using stopRecording() . Similarly, recording started using the API will be stopped if the R key is pressed in the Viewport. A LogMessage will also appear in the top-left of the viewport if recording is started or stopped using the API. Note that this will only save the data as specified in the settings. For full freedom in storing data such as certain sensor information, or in a different format or layout, use the other APIs to fetch the data and save as desired. Check out Modifying Recording Data for details on how to modify the kinematics data being recorded.","title":"Recording APIs"},{"location":"apis/#wind-api","text":"Wind can be changed during simulation using simSetWind() . Wind is specified in the world frame, in NED direction and in m/s values. E.g., to set a 20 m/s wind in the North (forward) direction: # Set wind to (20,0,0) in NED (forward direction) wind = airsim.Vector3r(20, 0, 0) client.simSetWind(wind) Also see the example script in set_wind.py","title":"Wind API"},{"location":"apis/#lidar-apis","text":"AirSim offers APIs to retrieve point cloud data from (GPU)LiDAR sensors on vehicles. You can set the number of channels, points per second, horizontal and vertical FOV, etc. parameters in settings.json . More on lidar APIs and settings , GPUlidar APIs and settings and sensor settings","title":"Lidar APIs"},{"location":"apis/#light-control-apis","text":"Lights that can be manipulated inside AirSim can be created via the simSpawnObject() API by passing either PointLightBP or SpotLightBP as the asset_name parameter and True as the is_blueprint parameter. Once a light has been spawned, it can be manipulated using the following API: simSetLightIntensity : This allows you to edit a light's intensity or brightness.
It takes two parameters, light_name , the name of the light object returned by a previous call to simSpawnObject() , and intensity , a float value.","title":"Light Control APIs"},{"location":"apis/#texture-apis","text":"Textures can be dynamically set on objects via these APIs: simSetObjectMaterial : This sets an object's material using an existing Unreal material asset. It takes two string parameters, object_name and material_name . simSetObjectMaterialFromTexture : This sets an object's material using a path to a texture. It takes two string parameters, object_name and texture_path .","title":"Texture APIs"},{"location":"apis/#multiple-vehicles","text":"AirSim supports multiple vehicles and controlling them through APIs. Please see the Multiple Vehicles doc.","title":"Multiple Vehicles"},{"location":"apis/#coordinate-system","text":"All AirSim APIs use the NED coordinate system, i.e., +X is North, +Y is East and +Z is Down. All units are in the SI system. Please note that this is different from the coordinate system used internally by Unreal Engine. In Unreal Engine, +Z is up instead of down and the length unit is in centimeters instead of meters. AirSim APIs take care of the appropriate conversions. The starting point of the vehicle is always coordinates (0, 0, 0) in the NED system. Thus when converting from Unreal coordinates to NED, we first subtract the starting offset and then scale by 100 for the cm to m conversion. The vehicle is spawned in the Unreal environment where the Player Start component is placed. There is a setting called OriginGeopoint in settings.json which assigns the geographic latitude, longitude and altitude to the Player Start component. If wanted, one can move the Unreal origin to the same location as the AirSim origin player start position by setting the MoveWorldOrigin in the settings.json to true .","title":"Coordinate System"},{"location":"apis/#vehicle-specific-apis","text":"","title":"Vehicle Specific APIs"},{"location":"apis/#apis-for-car","text":"Car has the following APIs available: setCarControls : This allows you to set throttle, steering, handbrake and auto or manual gear. getCarState : This retrieves the state information including speed, current gear and 6 kinematics quantities: position, orientation, linear and angular velocity, linear and angular acceleration. All quantities are in the NED coordinate system, SI units in the world frame except for angular velocity and accelerations which are in the body frame. Image APIs .","title":"APIs for Car"},{"location":"apis/#apis-for-multirotor","text":"A multirotor can be controlled by specifying angles, velocity vector, destination position or some combination of these. There are corresponding move* APIs for this purpose. When doing position control, we need to use some path following algorithm. By default AirSim uses the carrot following algorithm. This is often referred to as \"high level control\" because you just need to specify a high level goal and the firmware takes care of the rest. Currently the lowest level control available in AirSim is the moveByAngleThrottleAsync API.","title":"APIs for Multirotor"},{"location":"apis/#getmultirotorstate","text":"This API returns the state of the vehicle in one call. The state includes collision, estimated kinematics (i.e. kinematics computed by fusing sensors), and timestamp (nanoseconds since epoch). The kinematics here means 6 quantities: position, orientation, linear and angular velocity, linear and angular acceleration.
Please note that simple_flight currently doesn't support a state estimator, which means estimated and ground truth kinematics values would be the same for simple_flight. Estimated kinematics are however available for PX4 except for angular acceleration. All quantities are in the NED coordinate system, SI units in the world frame except for angular velocity and accelerations which are in the body frame.","title":"getMultirotorState"},{"location":"apis/#async-methods-duration-and-max_wait_seconds","text":"Many API methods have parameters named duration or max_wait_seconds and they have Async as a suffix, for example, takeoffAsync . These methods will return immediately after starting the task in AirSim so that your client code can do something else while that task is being executed. If you want to wait for this task to complete then you can call waitOnLastTask like this: //C++ client.takeoffAsync()->waitOnLastTask(); # Python client.takeoffAsync().join() If you start another command then it automatically cancels the previous task and starts the new command. This allows a pattern where your code continuously does the sensing, computes a new trajectory to follow and issues that path to the vehicle in AirSim. Each newly issued trajectory cancels the previous trajectory, allowing your code to continuously do the update as new sensor data arrives. All Async methods return concurrent.futures.Future in Python ( std::future in C++). Please note that these future classes currently do not allow you to check status or cancel the task; they only allow you to wait for the task to complete. AirSim does provide the API cancelLastTask , however.","title":"Async methods, duration and max_wait_seconds"},{"location":"apis/#drivetrain","text":"There are two modes in which you can fly the vehicle: the drivetrain parameter is set to either airsim.DrivetrainType.ForwardOnly or airsim.DrivetrainType.MaxDegreeOfFreedom . When you specify ForwardOnly, you are saying that the vehicle's front should always point in the direction of travel. So if you want the drone to take a left turn then it would first rotate so the front points to the left. This mode is useful when you have only a front camera and you are operating the vehicle using the FPV view. This is more or less like travelling in a car where you always have the front view. The MaxDegreeOfFreedom means you don't care where the front points to. So when you take a left turn, you just start going left like a crab. Quadrotors can go in any direction regardless of where the front points to. The MaxDegreeOfFreedom enables this mode.","title":"drivetrain"},{"location":"apis/#yaw_mode","text":"yaw_mode is a struct YawMode with two fields, yaw_or_rate and is_rate . If the is_rate field is True then the yaw_or_rate field is interpreted as an angular velocity in degrees/sec, which means you want the vehicle to rotate continuously around its axis at that angular velocity while moving. If is_rate is False then yaw_or_rate is interpreted as an angle in degrees, which means you want the vehicle to rotate to the specific angle (i.e. yaw) and keep that angle while moving. You can probably see that when yaw_mode.is_rate == true , the drivetrain parameter shouldn't be set to ForwardOnly because you would be contradicting yourself by asking to keep the front pointing ahead but also rotate continuously. However if you have yaw_mode.is_rate = false in ForwardOnly mode then you can do some funky stuff. For example, you can have the drone do circles and have yaw_or_rate set to 90 so the camera is always pointed to the center (\"super cool selfie mode\").
In MaxDegreeOfFreedom you can also get some funky behavior by setting yaw_mode.is_rate = true and, say, yaw_mode.yaw_or_rate = 20 . This will cause the drone to follow its path while rotating, which may allow you to do 360-degree scanning. In most cases, you just don't want the yaw to change, which you can do by setting a yaw rate of 0. The shorthand for this is airsim.YawMode.Zero() (or in C++: YawMode::Zero() ).","title":"yaw_mode"},{"location":"apis/#lookahead-and-adaptive_lookahead","text":"When you ask the vehicle to follow a path, AirSim uses the \"carrot following\" algorithm. This algorithm operates by looking ahead on the path and adjusting its velocity vector. The parameters for this algorithm are specified by lookahead and adaptive_lookahead . Most of the time you want the algorithm to auto-decide the values by simply setting lookahead = -1 and adaptive_lookahead = 0 .","title":"lookahead and adaptive_lookahead"},{"location":"apis/#using-apis-on-real-vehicles","text":"We want to be able to run the same code in simulation and on a real vehicle. This allows you to test your code in the simulator and deploy it to a real vehicle. Generally speaking, APIs therefore shouldn't allow you to do something that cannot be done on a real vehicle (for example, getting the ground truth). But, of course, the simulator has much more information, which is useful in applications that may not care about running on a real vehicle. For this reason, we clearly delineate sim-only APIs by attaching the sim prefix, for example, simGetGroundTruthKinematics . This way you can avoid using these simulation-only APIs if you care about running your code on real vehicles. AirLib is a self-contained library that you can put on an offboard computing module such as the Gigabyte barebone mini PC. This module can then talk to flight controllers such as PX4 using the exact same code and flight controller protocol. The code you write for testing in the simulator remains unchanged. See AirLib on custom drones .","title":"Using APIs on Real Vehicles"},{"location":"apis/#adding-new-apis-to-airsim","text":"See the Adding New APIs page.","title":"Adding New APIs to AirSim"},{"location":"apis/#references-and-examples","text":"C++ API Examples Car Examples Multirotor Examples Computer Vision Examples Move on Path demo showing video of fast multirotor flight through Modular Neighborhood environment Building a Hexacopter Building Point Clouds","title":"References and Examples"},{"location":"apis/#faq","text":"","title":"FAQ"},{"location":"apis/#unreal-is-slowed-down-dramatically-when-i-run-api","text":"If you see Unreal slowing down dramatically when the Unreal Engine window loses focus, go to 'Edit->Editor Preferences' in Unreal Editor, type 'CPU' in the 'Search' box and ensure that 'Use Less CPU when in Background' is unchecked.","title":"Unreal is slowed down dramatically when I run API"},{"location":"apis/#do-i-need-anything-else-on-windows","text":"You should install VS2019 with VC++, Windows SDK 10.0 and Python. To use the Python APIs you will need Python 3.5 or later (install it using Anaconda).","title":"Do I need anything else on Windows?"},{"location":"apis/#which-version-of-python-should-i-use","text":"We recommend Anaconda to get Python tools and libraries. Our code is tested with Python 3.5.3 :: Anaconda 4.4.0.
This is important because older versions have been known to have problems .","title":"Which version of Python should I use?"},{"location":"apis/#i-get-error-on-import-cv2","text":"You can install OpenCV using: conda install opencv pip install opencv-python","title":"I get error on import cv2"},{"location":"apis/#typeerror-unsupported-operand-types-for-asyncioloop-and-float","text":"This error happens if you install Jupyter, which somehow breaks the msgpackrpc library. Create a new Python environment with only the minimal required packages.","title":"TypeError: unsupported operand type(s) for *: 'AsyncIOLoop' and 'float'"},{"location":"apis_cpp/","text":"Using C++ APIs for AirSim Please read the general API doc first if you haven't already. This document describes C++ examples and other C++ specific details. Quick Start The fastest way to get started is to open AirSim.sln in Visual Studio 2017. You will see Hello Car and Hello Drone examples in the solution. These examples will show you the include paths and lib paths you will need to set up in your VC++ projects. If you are using Linux then you will specify these paths either in your cmake file or on the compiler command line. Include and Lib Folders Include folders: $(ProjectDir)..\AirLib\deps\rpclib\include;include;$(ProjectDir)..\AirLib\deps\eigen3;$(ProjectDir)..\AirLib\include Dependencies: rpc.lib Lib folders: $(ProjectDir)\..\AirLib\deps\MavLinkCom\lib\$(Platform)\$(Configuration);$(ProjectDir)\..\AirLib\deps\rpclib\lib\$(Platform)\$(Configuration);$(ProjectDir)\..\AirLib\lib\$(Platform)\$(Configuration) Hello Car Here's how to use the AirSim C++ APIs to control a simulated car (see also the Python example ): // ready to run example: https://github.com/Cosys-Lab/Cosys-AirSim/blob/main/HelloCar/main.cpp #include <iostream> #include \"vehicles/car/api/CarRpcLibClient.hpp\" int main() { msr::airlib::CarRpcLibClient client; client.enableApiControl(true); //this disables manual control CarControllerBase::CarControls controls; std::cout << \"Press enter to drive forward\" << std::endl; std::cin.get(); controls.throttle = 1; client.setCarControls(controls); std::cout << \"Press Enter to activate handbrake\" << std::endl; std::cin.get(); controls.handbrake = true; client.setCarControls(controls); std::cout << \"Press Enter to take turn and drive backward\" << std::endl; std::cin.get(); controls.handbrake = false; controls.throttle = -1; controls.steering = 1; client.setCarControls(controls); std::cout << \"Press Enter to stop\" << std::endl; std::cin.get(); client.setCarControls(CarControllerBase::CarControls()); return 0; } Hello Drone Here's how to use the AirSim C++ APIs to control a simulated drone (see also the Python example ): // ready to run example: https://github.com/Cosys-Lab/Cosys-AirSim/blob/main/HelloDrone/main.cpp #include <iostream> #include \"vehicles/multirotor/api/MultirotorRpcLibClient.hpp\" int main() { using namespace std; msr::airlib::MultirotorRpcLibClient client; cout << \"Press Enter to enable API control\" << endl; cin.get(); client.enableApiControl(true); cout << \"Press Enter to arm the drone\" << endl; cin.get(); client.armDisarm(true); cout << \"Press Enter to takeoff\" << endl; cin.get(); client.takeoffAsync(5)->waitOnLastTask(); cout << \"Press Enter to move 5 meters in x direction with 1 m/s velocity\" << endl; cin.get(); auto position = client.getMultirotorState().getPosition(); // from current location client.moveToPositionAsync(position.x() + 5, position.y(), position.z(), 1)->waitOnLastTask(); cout << \"Press Enter to
land\" << endl; cin.get(); client.landAsync()->waitOnLastTask(); return 0; } See Also Examples of how to use internal infrastructure in AirSim in your other projects DroneShell app shows how to make simple interface using C++ APIs to control drones Python APIs","title":"C++ APIs"},{"location":"apis_cpp/#using-c-apis-for-airsim","text":"Please read general API doc first if you haven't already. This document describes C++ examples and other C++ specific details.","title":"Using C++ APIs for AirSim"},{"location":"apis_cpp/#quick-start","text":"Fastest way to get started is to open AirSim.sln in Visual Studio 2017. You will see Hello Car and Hello Drone examples in the solution. These examples will show you the include paths and lib paths you will need to setup in your VC++ projects. If you are using Linux then you will specify these paths either in your cmake file or on compiler command line.","title":"Quick Start"},{"location":"apis_cpp/#include-and-lib-folders","text":"Include folders: $(ProjectDir)..\\AirLib\\deps\\rpclib\\include;include;$(ProjectDir)..\\AirLib\\deps\\eigen3;$(ProjectDir)..\\AirLib\\include Dependencies: rpc.lib Lib folders: $(ProjectDir)\\..\\AirLib\\deps\\MavLinkCom\\lib\\$(Platform)\\$(Configuration);$(ProjectDir)\\..\\AirLib\\deps\\rpclib\\lib\\$(Platform)\\$(Configuration);$(ProjectDir)\\..\\AirLib\\lib\\$(Platform)\\$(Configuration)","title":"Include and Lib Folders"},{"location":"apis_cpp/#hello-car","text":"Here's how to use AirSim APIs using Python to control simulated car (see also Python example ): // ready to run example: https://github.com/Cosys-Lab/Cosys-AirSim/blob/main/HelloCar/main.cpp #include #include \"vehicles/car/api/CarRpcLibClient.hpp\" int main() { msr::airlib::CarRpcLibClient client; client.enableApiControl(true); //this disables manual control CarControllerBase::CarControls controls; std::cout << \"Press enter to drive forward\" << std::endl; std::cin.get(); controls.throttle = 1; client.setCarControls(controls); std::cout << \"Press Enter to activate handbrake\" << std::endl; std::cin.get(); controls.handbrake = true; client.setCarControls(controls); std::cout << \"Press Enter to take turn and drive backward\" << std::endl; std::cin.get(); controls.handbrake = false; controls.throttle = -1; controls.steering = 1; client.setCarControls(controls); std::cout << \"Press Enter to stop\" << std::endl; std::cin.get(); client.setCarControls(CarControllerBase::CarControls()); return 0; }","title":"Hello Car"},{"location":"apis_cpp/#hello-drone","text":"Here's how to use AirSim APIs using Python to control simulated car (see also Python example ): // ready to run example: https://github.com/Cosys-Lab/Cosys-AirSim/blob/main/HelloDrone/main.cpp #include #include \"vehicles/multirotor/api/MultirotorRpcLibClient.hpp\" int main() { using namespace std; msr::airlib::MultirotorRpcLibClient client; cout << \"Press Enter to enable API control\" << endl; cin.get(); client.enableApiControl(true); cout << \"Press Enter to arm the drone\" << endl; cin.get(); client.armDisarm(true); cout << \"Press Enter to takeoff\" << endl; cin.get(); client.takeoffAsync(5)->waitOnLastTask(); cout << \"Press Enter to move 5 meters in x direction with 1 m/s velocity\" << endl; cin.get(); auto position = client.getMultirotorState().getPosition(); // from current location client.moveToPositionAsync(position.x() + 5, position.y(), position.z(), 1)->waitOnLastTask(); cout << \"Press Enter to land\" << endl; cin.get(); client.landAsync()->waitOnLastTask(); return 0; }","title":"Hello 
Drone"},{"location":"apis_cpp/#see-also","text":"Examples of how to use internal infrastructure in AirSim in your other projects DroneShell app shows how to make simple interface using C++ APIs to control drones Python APIs","title":"See Also"},{"location":"camera_views/","text":"Camera Views The camera views that are shown on screen are the camera views you can fetch via the simGetImages API . From left to right is the depth view, segmentation view and the FPV view. See Image APIs for description of various available views. Turning ON/OFF Views Press F1 key to see keyboard shortcuts for turning on/off any or all views. You can also select various view modes there, such as \"Fly with Me\" mode, FPV mode and \"Ground View\" mode. Controlling Manual Camera You can switch to manual camera control by pressing the M key. While manual camera control mode is selected, you can use the following keys to control the camera: |Key|Action| ---|--- |Arrow keys|move the camera forward/back and left/right| |Page up/down|move the camera up/down| |W/A/S/D|control pitch up/down and yaw left/right| |Left shift|increase movement speed| |Left control|decrease movement speed| Configuring Sub-Windows Now you can select what is shown by each of above sub windows. For instance, you can chose to show surface normals in first window (instead of depth) and disparity in second window (instead of segmentation). Below is the settings value you can use in settings.json : { \"SubWindows\": [ {\"WindowID\": 1, \"CameraName\": \"0\", \"ImageType\": 5, \"VehicleName\": \"\", \"Visible\": false}, {\"WindowID\": 2, \"CameraName\": \"0\", \"ImageType\": 3, \"VehicleName\": \"\", \"Visible\": false} ] } Performance Impact Note : This section is outdated and has not been updated for new performance enhancement changes. Now rendering these views does impact the FPS performance of the game, since this is additional work for the GPU. The following shows the impact on FPS when you open these views. This is measured on Intel core i7 computer with 32 gb RAM and a GeForce GTX 1080 graphics card running the Modular Neighborhood map, using cooked debug bits, no debugger or GameEditor open. The normal state with no subviews open is measuring around 16 ms per frame, which means it is keeping a nice steady 60 FPS (which is the target FPS). As it climbs up to 35ms the FPS drops to around 28 frames per second, spiking to 40ms means a few drops to 25 fps. The simulator can still function and fly correctly when all this is going on even in the worse case because the physics is decoupled from the rendering. However if the delay gets too high such that the communication with PX4 hardware is interrupted due to overly busy CPU then the flight can stall due to timeout in the offboard control messages. On the computer where this was measured the drone could fly the path.py program without any problems with all views open, and with 3 python scripts running to capture each view type. But there was one stall during this flight, but it recovered gracefully and completed the path. So it was right on the limit. The following shows the impact on CPU, perhaps a bit surprisingly, the CPU impact is also non trivial.","title":"Camera Views"},{"location":"camera_views/#camera-views","text":"The camera views that are shown on screen are the camera views you can fetch via the simGetImages API . From left to right is the depth view, segmentation view and the FPV view. 
","title":"Camera Views"},{"location":"camera_views/#turning-onoff-views","text":"Press the F1 key to see keyboard shortcuts for turning on/off any or all views. You can also select various view modes there, such as \"Fly with Me\" mode, FPV mode and \"Ground View\" mode.","title":"Turning ON/OFF Views"},{"location":"camera_views/#controlling-manual-camera","text":"You can switch to manual camera control by pressing the M key. While manual camera control mode is selected, you can use the following keys to control the camera: |Key|Action| ---|--- |Arrow keys|move the camera forward/back and left/right| |Page up/down|move the camera up/down| |W/A/S/D|control pitch up/down and yaw left/right| |Left shift|increase movement speed| |Left control|decrease movement speed|","title":"Controlling Manual Camera"},{"location":"camera_views/#configuring-sub-windows","text":"Now you can select what is shown in each of the above sub-windows. For instance, you can choose to show surface normals in the first window (instead of depth) and disparity in the second window (instead of segmentation). Below are the settings values you can use in settings.json : { \"SubWindows\": [ {\"WindowID\": 1, \"CameraName\": \"0\", \"ImageType\": 5, \"VehicleName\": \"\", \"Visible\": false}, {\"WindowID\": 2, \"CameraName\": \"0\", \"ImageType\": 3, \"VehicleName\": \"\", \"Visible\": false} ] }","title":"Configuring Sub-Windows"},{"location":"camera_views/#performance-impact","text":"Note : This section is outdated and has not been updated for new performance enhancement changes. Rendering these views does impact the FPS performance of the game, since this is additional work for the GPU. The following shows the impact on FPS when you open these views. This was measured on an Intel Core i7 computer with 32 GB RAM and a GeForce GTX 1080 graphics card running the Modular Neighborhood map, using cooked debug bits, with no debugger or GameEditor open. The normal state with no subviews open measures around 16 ms per frame, which means it is keeping a nice steady 60 FPS (which is the target FPS). As it climbs up to 35 ms the FPS drops to around 28 frames per second; spiking to 40 ms means a few drops to 25 FPS. The simulator can still function and fly correctly when all this is going on, even in the worst case, because the physics is decoupled from the rendering. However, if the delay gets so high that communication with the PX4 hardware is interrupted due to an overly busy CPU, the flight can stall due to timeouts in the offboard control messages. On the computer where this was measured the drone could fly the path.py program without any problems with all views open, and with 3 Python scripts running to capture each view type. There was one stall during this flight, but it recovered gracefully and completed the path. So it was right on the limit.
The following shows the impact on the CPU; perhaps a bit surprisingly, the CPU impact is also non-trivial.","title":"Performance Impact"},{"location":"cmake_linux/","text":"Installing cmake on Linux If you don't have cmake version 3.10 (for example, 3.2.2 is the default on Ubuntu 14), you can run the following: mkdir ~/cmake-3.10.2 cd ~/cmake-3.10.2 wget https://cmake.org/files/v3.10/cmake-3.10.2-Linux-x86_64.sh Now you have to run this command by itself (it is interactive): sh cmake-3.10.2-Linux-x86_64.sh --prefix ~/cmake-3.10.2 Answer 'n' to the question about creating another cmake-3.10.2-Linux-x86_64 folder and then sudo update-alternatives --install /usr/bin/cmake cmake ~/cmake-3.10.2/bin/cmake 60 Now type cmake --version to make sure your cmake version is 3.10.2.","title":"Installing cmake on Linux"},{"location":"cmake_linux/#installing-cmake-on-linux","text":"If you don't have cmake version 3.10 (for example, 3.2.2 is the default on Ubuntu 14), you can run the following: mkdir ~/cmake-3.10.2 cd ~/cmake-3.10.2 wget https://cmake.org/files/v3.10/cmake-3.10.2-Linux-x86_64.sh Now you have to run this command by itself (it is interactive): sh cmake-3.10.2-Linux-x86_64.sh --prefix ~/cmake-3.10.2 Answer 'n' to the question about creating another cmake-3.10.2-Linux-x86_64 folder and then sudo update-alternatives --install /usr/bin/cmake cmake ~/cmake-3.10.2/bin/cmake 60 Now type cmake --version to make sure your cmake version is 3.10.2.","title":"Installing cmake on Linux"},{"location":"custom_drone/","text":"AirLib on a Real Drone The AirLib library can be compiled and deployed on the companion computer on a real drone. For our testing, we mounted a Gigabyte Brix BXi7-5500 ultra compact PC on the drone, connected to the Pixhawk flight controller over USB. The Gigabyte PC is running Ubuntu, so we are able to SSH into it over Wi-Fi: Once connected you can run MavLinkTest with this command line: MavLinkTest -serial:/dev/ttyACM0,115200 -logdir:. This will produce a log file of the flight which can then be used for playback in the simulator . You can also add -proxy:192.168.1.100:14550 to connect MavLinkTest to a remote computer where you can run QGroundControl or our PX4 Log Viewer, which is another handy way to see what is going on with your drone. MavLinkTest has some simple commands for testing your drone; here's a simple example: arm takeoff 5 orbit 10 2 This will arm the drone, take off to 5 meters, then fly an orbit pattern with a radius of 10 meters at 2 m/s. Type '?' to find all available commands. Note: Some commands (for example, orbit ) are named differently and have different syntax in MavLinkTest and DroneShell (for example, circlebypath -radius 10 -velocity 21 ). When you land the drone you can stop MavLinkTest and copy the *.mavlink log file that was generated. DroneServer and DroneShell Once you are happy that MavLinkTest is working, you can also run DroneServer and DroneShell as follows. First, run MavLinkTest with a local proxy to send everything to DroneServer: MavLinkTest -serial:/dev/ttyACM0,115200 -logdir:. -proxy:127.0.0.1:14560 Change ~/Documents/AirSim/settings.json to say \"serial\":false, because we want DroneServer to look for this UDP connection. DroneServer 0 Lastly, you can now connect DroneShell to this instance of DroneServer and use the DroneShell commands to fly your drone: DroneShell ==||=> Welcome to DroneShell 1.0. Type ? for help. Microsoft Research (c) 2016. Waiting for drone to report a valid GPS location...
==||=> requestcontrol ==||=> arm ==||=> takeoff ==||=> circlebypath -radius 10 -velocity 2 PX4 Specific Tools You can run the MavlinkCom library and MavLinkTest app to test the connection between your companion computer and flight controller. How Does This Work? AirSim uses the MavLinkCom component developed by @lovettchris. MavLinkCom has a proxy architecture where you can open a connection to PX4 using either serial or UDP, and other components then share this connection. When PX4 sends a MavLink message, all components receive that message. If any component sends a message, it is received by PX4 only. This allows you to connect any number of components to PX4. This code opens a connection for LogViewer and QGC. You can add more if you like. If you want to use QGC and AirSim together, you will need to let QGC own the serial port. QGC opens up a TCP connection that acts as a proxy: any other component can connect to QGC and send a MavLink message, and QGC then forwards that message to PX4. So you tell AirSim to connect to QGC and let QGC own the serial port. For the companion board, the way we did it earlier was to have a Gigabyte Brix on the drone. This is a full-fledged x86 computer that connects to PX4 through USB. We had Ubuntu on the Brix and ran DroneServer . The DroneServer created an API endpoint that we can talk to via C++ client code (or Python code) and it translated API calls to MavLink messages. That way you can write your code against the same API, test it in the simulator and then run the same code on an actual vehicle. So the companion computer has DroneServer running along with client code.","title":"AirSim on Real Drones"},{"location":"custom_drone/#airlib-on-a-real-drone","text":"The AirLib library can be compiled and deployed on the companion computer on a real drone. For our testing, we mounted a Gigabyte Brix BXi7-5500 ultra compact PC on the drone, connected to the Pixhawk flight controller over USB. The Gigabyte PC is running Ubuntu, so we are able to SSH into it over Wi-Fi: Once connected you can run MavLinkTest with this command line: MavLinkTest -serial:/dev/ttyACM0,115200 -logdir:. This will produce a log file of the flight which can then be used for playback in the simulator . You can also add -proxy:192.168.1.100:14550 to connect MavLinkTest to a remote computer where you can run QGroundControl or our PX4 Log Viewer, which is another handy way to see what is going on with your drone. MavLinkTest has some simple commands for testing your drone; here's a simple example: arm takeoff 5 orbit 10 2 This will arm the drone, take off to 5 meters, then fly an orbit pattern with a radius of 10 meters at 2 m/s. Type '?' to find all available commands. Note: Some commands (for example, orbit ) are named differently and have different syntax in MavLinkTest and DroneShell (for example, circlebypath -radius 10 -velocity 21 ). When you land the drone you can stop MavLinkTest and copy the *.mavlink log file that was generated.","title":"AirLib on a Real Drone"},{"location":"custom_drone/#droneserver-and-droneshell","text":"Once you are happy that MavLinkTest is working, you can also run DroneServer and DroneShell as follows. First, run MavLinkTest with a local proxy to send everything to DroneServer: MavLinkTest -serial:/dev/ttyACM0,115200 -logdir:. -proxy:127.0.0.1:14560 Change ~/Documents/AirSim/settings.json to say \"serial\":false, because we want DroneServer to look for this UDP connection.
DroneServer 0 Lastly, you can now connect DroneShell to this instance of DroneServer and use the DroneShell commands to fly your drone: DroneShell ==||=> Welcome to DroneShell 1.0. Type ? for help. Microsoft Research (c) 2016. Waiting for drone to report a valid GPS location... ==||=> requestcontrol ==||=> arm ==||=> takeoff ==||=> circlebypath -radius 10 -velocity 2","title":"DroneServer and DroneShell"},{"location":"custom_drone/#px4-specific-tools","text":"You can run the MavlinkCom library and MavLinkTest app to test the connection between your companion computer and flight controller.","title":"PX4 Specific Tools"},{"location":"custom_drone/#how-does-this-work","text":"AirSim uses the MavLinkCom component developed by @lovettchris. MavLinkCom has a proxy architecture where you can open a connection to PX4 using either serial or UDP, and other components then share this connection. When PX4 sends a MavLink message, all components receive that message. If any component sends a message, it is received by PX4 only. This allows you to connect any number of components to PX4. This code opens a connection for LogViewer and QGC. You can add more if you like. If you want to use QGC and AirSim together, you will need to let QGC own the serial port. QGC opens up a TCP connection that acts as a proxy: any other component can connect to QGC and send a MavLink message, and QGC then forwards that message to PX4. So you tell AirSim to connect to QGC and let QGC own the serial port. For the companion board, the way we did it earlier was to have a Gigabyte Brix on the drone. This is a full-fledged x86 computer that connects to PX4 through USB. We had Ubuntu on the Brix and ran DroneServer . The DroneServer created an API endpoint that we can talk to via C++ client code (or Python code) and it translated API calls to MavLink messages. That way you can write your code against the same API, test it in the simulator and then run the same code on an actual vehicle. So the companion computer has DroneServer running along with client code.","title":"How Does This Work?"},{"location":"distance_sensor/","text":"Distance Sensor By default, the Distance Sensor points to the front of the vehicle. It can be pointed in any direction by modifying the settings. Configurable Parameters - Parameter Description X Y Z Position of the sensor relative to the vehicle (in NED, in meters) (Default (0,0,0)-Multirotor, (0,0,-1)-Car) Yaw Pitch Roll Orientation of the sensor relative to the vehicle (degrees) (Default (0,0,0)) MinDistance Minimum distance measured by the distance sensor (metres, only used to fill the Mavlink message for PX4) (Default 0.2m) MaxDistance Maximum distance measured by the distance sensor (metres) (Default 40.0m) ExternalController Whether data is to be sent to an external controller such as ArduPilot or PX4 if being used (default true ) For example, to make the sensor point towards the ground (for altitude measurement similar to a barometer), the orientation can be modified as follows - \"Distance\": { \"SensorType\": 5, \"Enabled\" : true, \"Yaw\": 0, \"Pitch\": -90, \"Roll\": 0 } Note: For Cars, the sensor is placed 1 meter above the vehicle center by default. This is required since otherwise the sensor gives strange data due to it being inside the vehicle. This doesn't affect the sensor values, say, when measuring the distance between 2 cars. See PythonClient/car/distance_sensor_multi.py for an example usage.
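A minimal Python sketch of reading the sensor back (the sensor name \"Distance\" below is illustrative; use whatever name you gave the sensor in settings.json ):

import airsim

client = airsim.CarClient()
client.confirmConnection()
data = client.getDistanceSensorData(distance_sensor_name="Distance", vehicle_name="")
print(data.distance)                         # measured distance in meters
print(data.min_distance, data.max_distance)  # configured range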
","title":"Distance Sensor"},{"location":"distance_sensor/#distance-sensor","text":"By default, the Distance Sensor points to the front of the vehicle. It can be pointed in any direction by modifying the settings. Configurable Parameters - Parameter Description X Y Z Position of the sensor relative to the vehicle (in NED, in meters) (Default (0,0,0)-Multirotor, (0,0,-1)-Car) Yaw Pitch Roll Orientation of the sensor relative to the vehicle (degrees) (Default (0,0,0)) MinDistance Minimum distance measured by the distance sensor (metres, only used to fill the Mavlink message for PX4) (Default 0.2m) MaxDistance Maximum distance measured by the distance sensor (metres) (Default 40.0m) ExternalController Whether data is to be sent to an external controller such as ArduPilot or PX4 if being used (default true ) For example, to make the sensor point towards the ground (for altitude measurement similar to a barometer), the orientation can be modified as follows - \"Distance\": { \"SensorType\": 5, \"Enabled\" : true, \"Yaw\": 0, \"Pitch\": -90, \"Roll\": 0 } Note: For Cars, the sensor is placed 1 meter above the vehicle center by default. This is required since otherwise the sensor gives strange data due to it being inside the vehicle. This doesn't affect the sensor values, say, when measuring the distance between 2 cars. See PythonClient/car/distance_sensor_multi.py for an example usage.","title":"Distance Sensor"},{"location":"dynamic_objects/","text":"Setup Dynamic Objects for Scenario Environments for AirSim The available environments often feature some custom-made dynamic blueprints that can be used to create random but deterministic changes in your environment. Location While these can be found in the environments available, they are also separately saved in Unreal/Environments/DynamicObjects . Copy the C++ files to your environment's Source folder ( Environments/Source/levelname/ ) and copy the uassets to your Contents folder. Features Dynamic AI humans walking between waypoints Dynamic spawning of stacked goods (pallets etc.) Dynamic static object spawning (either always the same or picked from a set of options) Small dynamic changes such as random open doors. All randomization is controllable by a seed to make sure you can simulate the same setup again. Other animate objects such as modular conveyor belts and robotic arms are available as well. Some features can also be configured with a launchfile/launch parameters. Usage There are several object types and settings to make the environment dynamic. Here follow some simple instructions to tweak and alter their behaviour using the created blueprints. Seed & World Dynamics Configuration To control the randomisation functionality used in the dynamic objects, a controllable seed number is used. In every level using the dynamic objects, an actor of the class Dynamic World Master has to be present. This object will visually show the chosen seed every time the simulation is started (it can be hidden as well with a toggle). There are a few other toggles available as well. The seed and these other settings can be controlled both in standalone (packaged builds) and in the editor: - Editor : - To control the seed in the editor, change the Editor Seed setting of the Dynamic World Master actor in the level. If it is set to 0, it will generate a random seed number. But if set to anything else, it will use that number as the seed for all dynamic objects.
- To make the world static and turn off all dynamic changes throughout the simulation (conveyor belts, randomized changes to statics), set the Editor Is Static boolean to true. Default is false. - To toggle the AI in the world on and off, set the Editor Spawn AI boolean. Defaults to true. - Standalone : - To control the seed when using the simulator as standalone, use the launch parameter -startSeed INT with the integer being the chosen seed value. If not set, it will choose a random one. - To make the world static and turn off all dynamic changes throughout the simulation (conveyor belts, randomized changes to statics), add a launch parameter -isStatic BOOL with the boolean set to true. If not set, it defaults to false. - Toggle the AI in the world on and off with the launch parameter -spawnAI BOOL . If not set, it defaults to true. Start Point In order for your environment to have multiple starting points, the Dynamic World Master can be configured to teleport the AirSim vehicle after launch to one of several manually defined starting points. To define a new start point in your environment, place objects of the type Target Point in your environment. At launch these will be marked as potential starting points for the simulator. They are used in the order that they appear in the World Outliner. To configure which starting point is used, you can configure the number: - Editor : To control the starting point in the editor, change the Editor Start Point setting of the Dynamic World Master actor in the level. - Standalone : To control the starting point in a standalone build, use the launch parameter -startPoint INT with the integer being the chosen starting point. Dynamic Marked Objects Objects in your environment can be marked as 'dynamic'. The main purpose is to offer a simple setup with a configurable amount of dynamic objects that can move or be deleted. All dynamic objects (those that can be removed or moved) need to have their actor tag set to DynamicObject . To control the dynamic objects a few parameters are available: - Remove percentage : number of marked objects to remove randomly - Move percentage : number of marked objects to move and rotate slightly - Move offset : maximum distance the object can move in cm - Rotation offset : maximum rotation in degrees the object can rotate These are available in the editor as well via the DynamicWorldMaster actor's settings, or can be set up via launch parameters. - Editor : Search for the settings Editor Remove Percentage , Editor Move Percentage , Editor Move Offset Value and Editor Rotation Offset Value to configure the dynamic marked objects system. - Standalone : Use the launch parameters -removePercentage INT , -movePercentage INT , -moveOffsetValue INT and -moveRotationValue INT to configure the dynamic marked objects system. Furthermore, you can mark an object as Guide with an actor tag. This will print out the horizontal distance of all marked dynamic objects to these Guide Objects for debugging or validation purposes. LaunchFile You can also use a file to define the previous dynamic setting configurations line per line. Then by pressing a button ( O ) you can switch to the next configuration. This file has the following structure: seed,removePercentage,movePercentage,moveOffsetValue,moveRotationValue For example you can create launchfile.ini , each line defining a new configuration: 0,50,25,50,50 450,10,10,50,50 450,10,10,50,50 500,10,10,50,50 450,10,10,50,50 Do note that this only configures those 5 settings. The Starting point , Is Static and Spawn AI settings are not configured this way and are configured just as before. To make the environment load this file, you need to define it, which, similarly to before, differs between the editor and standalone: - Editor : - To control the launchfile in the editor, enable the Use Launch File toggle and set the Editor Launch File field of the DynamicWorldMaster actor in the level to the absolute filepath of the launchfile. - Standalone : - To control the launchfile when using the simulator as standalone, use the launch parameter -launchFile STRING and set it to the absolute filepath of the launchfile.
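For example, a packaged build could be launched like this (the executable name and file path are purely illustrative):

./MyEnvironment.sh -startSeed 450 -isStatic false -spawnAI true -launchFile /home/user/launchfile.ini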
Dynamic Static Spawners Some blueprints are also available to be used to spawn dynamic objects. Currently, there are 4 dynamic static object spawner blueprints available: - RandomStackSpawner : This can be used to create a dynamically formed stack of goods, like a pallet with boxes spawned on top. One can control it with the following settings: Setting Description Static The StaticMesh that needs to be stacked dynamically Min Width Count Minimum amount of statics to spawn in the Y direction Max Width Count Maximum amount of statics to spawn in the Y direction Min Length Count Minimum amount of statics to spawn in the X direction Max Length Count Maximum amount of statics to spawn in the X direction Min Height Count Minimum amount of statics to spawn in the Z direction Max Height Count Maximum amount of statics to spawn in the Z direction Random Rotation Boolean to toggle the application of a random rotation to the object (for barrels and other cylindrical objects) Random Position Offset Value in cm to apply a random offset in position in any direction Chance To Spawn Percentage of chance to spawn each object in the stack Chance to Change Percentage of chance to alter the stack configuration every so many seconds Average Time Between Changes Average time delta in seconds between changes Max Time Between Changes Offset Maximum time delta offset in seconds between changes (to avoid having all objects change at the same time, a small random offset is used) Nav Collision W/L/H This setting can be used to create an area around the object spawner which the Dynamic AI pathfinding will stay away from. RandomStackSpawnerSwitcher : This can be used to create a dynamically formed stack of goods, like a pallet with boxes spawned on top. The difference with the one above is that this one can randomly select a 'goods'/object type, and its stacking settings, from a Data Table object. One can control it with the following settings: Setting Description Data Table The Data Table object of the type RandomStackSpawnerSwitcherStruct to set the object types and their settings similar to the ones above for the normal RandomStackSpawner Chance To Spawn Percentage of chance to spawn each object in the stack Chance to Change Percentage of chance to alter the stack configuration every so many seconds Chance To Switch Percentage of chance to switch to a different object type from the Data Table Average Time Between Changes Average time delta in seconds between changes Max Time Between Changes Offset Maximum time delta offset in seconds between changes (to avoid having all objects change at the same time, a small random offset is used) RandomStaticModifier : This can be used to spawn a singular static and alter its spawn transform dynamically.
One can control it with the following settings: Setting Description Static The StaticMesh that needs to be spawned Chance To Spawn Percentage of chance to spawn the object Max Rotation Offset Maximum rotation in degrees (both positive and negative) to alter the transform Max XPosition Offset Maximum position offset in cm (both positive and negative) to alter the transform in the X axis Max YPosition Offset Maximum position offset in cm (both positive and negative) to alter the transform in the Y axis Chance to Change Percentage of chance to alter the configuration every so many seconds Average Time Between Changes Average time delta in seconds between changes Max Time Between Changes Offset Maximum time delta offset in seconds between changes (to avoid having all objects change at the same time, a small random offset is used) RandomStaticPicker : This can be used to spawn a singular randomly picked static out of a list of chosen statics and alter its spawn transform dynamically. One can control it with the following settings: Setting Description Statics The list of StaticMesh objects that can be picked from to spawn one Chance To Spawn Percentage of chance to spawn the object Max Rotation Offset Maximum rotation in degrees (both positive and negative) to alter the transform Max XPosition Offset Maximum position offset in cm (both positive and negative) to alter the transform in the X axis Max YPosition Offset Maximum position offset in cm (both positive and negative) to alter the transform in the Y axis Chance to Change Percentage of chance to alter the configuration every so many seconds Average Time Between Changes Average time delta in seconds between changes Max Time Between Changes Offset Maximum time delta offset in seconds between changes (to avoid having all objects change at the same time, a small random offset is used) There are some other simpler dynamic objects such as doors and conveyor belts that have self-explanatory settings very similar to those above. All are based on the seed randomisation. Grouped AI A human-looking AI is also available to walk dynamically between a set of self-chosen waypoints. They are based on the DetourAIController of Unreal so they will avoid each other and the user pretty well. In order to have more control over them, some custom blueprints were created. Their main features are that the AI themselves, the waypoints and the spawners can be assigned a group ID number, so that all functionality is grouped. They also use the seed randomisation so that they can be spawned at the same waypoints and target the same waypoints each time if the same seed value is chosen. The following blueprints are available: - GroupedTargetPoint : These are the TargetPoints (waypoints) that the AI will walk between. Their only setting is the group ID number to decide which AI group will be able to pick this waypoint to spawn AI and be the target waypoint for them. - GroupedAI : One can manually spawn an AI by placing these in the world. They need to be assigned the Group ID manually to choose which waypoints to target. - GroupedAISpawner : To automate the spawning of AI, one can use this blueprint. It will spawn AI at the waypoints of the same group. A setting is available to configure the fill percentage. This will set the percentage of waypoints to spawn AI upon. One also has to choose which Skeletal Meshes and their Animation Blueprints can be picked from. Spline Animations Making statics and skeletal meshes move along a spline path at a fixed speed.
See the video below for more information on how it works:","title":"Setup Dynamic Objects for Scenario Environments for AirSim"},{"location":"dynamic_objects/#setup-dynamic-objects-for-scenario-environments-for-airsim","text":"The available environments often feature some custom-made dynamic blueprints that can be used to create random but deterministic changes in your environment.","title":"Setup Dynamic Objects for Scenario Environments for AirSim"},{"location":"dynamic_objects/#location","text":"While these can be found in the environments available, they are also separately saved in Unreal/Environments/DynamicObjects . Copy the C++ files to your environment's Source folder ( Environments/Source/levelname/ ) and copy the uassets to your Contents folder.","title":"Location"},{"location":"dynamic_objects/#features","text":"Dynamic AI humans walking between waypoints Dynamic spawning of stacked goods (pallets etc.) Dynamic static object spawning (either always the same or picked from a set of options) Small dynamic changes such as random open doors. All randomization is controllable by a seed to make sure you can simulate the same setup again. Other animate objects such as modular conveyor belts and robotic arms are available as well. Some features can also be configured with a launchfile/launch parameters.","title":"Features"},{"location":"dynamic_objects/#usage","text":"There are several object types and settings to make the environment dynamic. Here follow some simple instructions to tweak and alter their behaviour using the created blueprints.","title":"Usage"},{"location":"dynamic_objects/#seed-world-dynamics-configuration","text":"To control the randomisation functionality used in the dynamic objects, a controllable seed number is used. In every level using the dynamic objects, an actor of the class Dynamic World Master has to be present. This object will visually show the chosen seed every time the simulation is started (it can be hidden as well with a toggle). There are a few other toggles available as well. The seed and these other settings can be controlled both in standalone (packaged builds) and in the editor: - Editor : - To control the seed in the editor, change the Editor Seed setting of the Dynamic World Master actor in the level. If it is set to 0, it will generate a random seed number. But if set to anything else, it will use that number as the seed for all dynamic objects.
To define a new start point in your environment, place objects of the type Target Point in your environment. At launch these will be marked as potential starting points for the simulator. They are used in the order that they appear in the World Outliner. To configure which starting point is used, you can configure the number: - Editor : To control the starting point in the editor, change the Editor Start Point setting of the Dynamic World Master actor in the level. - Standalone : To control the starting point in a standalone build, use the launch parameter -startPoint INT with the integer being the chosen starting point.","title":"Start Point"},{"location":"dynamic_objects/#dynamic-marked-objects","text":"Objects in your environment can be marked as 'dynamic'. The main purpose is to offer a simple setup with a configurable amount of dynamic objects that can move or be deleted. All dynamic objects (those that can be removed or moved) need to have their actor tag set to DynamicObject . To control the dynamic objects a few parameters are available: - Remove percentage : number of marked objects to remove randomly - Move percentage : number of marked objects to move and rotate slightly - Move offset : maximum distance the object can move in cm - Rotation offset : maximum rotation in degrees the object can rotate These are available in the editor as well via the DynamicWorldMaster actor's settings, or can be set up via launch parameters. - Editor : Search for the settings Editor Remove Percentage , Editor Move Percentage , Editor Move Offset Value and Editor Rotation Offset Value to configure the dynamic marked objects system. - Standalone : Use the launch parameters -removePercentage INT , -movePercentage INT , -moveOffsetValue INT and -moveRotationValue INT to configure the dynamic marked objects system. Furthermore, you can mark an object as Guide with an actor tag. This will print out the horizontal distance of all marked dynamic objects to these Guide Objects for debugging or validation purposes.","title":"Dynamic Marked Objects"},{"location":"dynamic_objects/#launchfile","text":"You can also use a file to define the previous dynamic setting configurations line per line. Then by pressing a button ( O ) you can switch to the next configuration. This file has the following structure: seed,removePercentage,movePercentage,moveOffsetValue,moveRotationValue For example you can create launchfile.ini , each line defining a new configuration: 0,50,25,50,50 450,10,10,50,50 450,10,10,50,50 500,10,10,50,50 450,10,10,50,50 Do note that this only configures those 5 settings. The Starting point , Is Static and Spawn AI settings are not configured this way and are configured just as before. To make the environment load this file, you need to define it, which, similarly to before, differs between the editor and standalone: - Editor : - To control the launchfile in the editor, enable the Use Launch File toggle and set the Editor Launch File field of the DynamicWorldMaster actor in the level to the absolute filepath of the launchfile. - Standalone : - To control the launchfile when using the simulator as standalone, use the launch parameter -launchFile STRING and set it to the absolute filepath of the launchfile.","title":"LaunchFile"},{"location":"dynamic_objects/#dynamic-static-spawners","text":"Some blueprints are also available to be used to spawn dynamic objects.
Currently, there are 4 dynamic static object spawner blueprints available: - RandomStackSpawner : This can be used to create a dynamically formed stack of goods, like a pallet with boxes spawned on top. One can control it with the following settings: Setting Description Static The StaticMesh that needs to be stacked dynamically Min Width Count Minimum amount of statics to spawn in the Y direction Max Width Count Maximum amount of statics to spawn in the Y direction Min Length Count Minimum amount of statics to spawn in the X direction Max Length Count Maximum amount of statics to spawn in the X direction Min Height Count Minimum amount of statics to spawn in the Z direction Max Height Count Maximum amount of statics to spawn in the Z direction Random Rotation Boolean to toggle the application of a random rotation to the object (for barrels and other cylindrical objects) Random Position Offset Value in cm to apply a random offset in position in any direction Chance To Spawn Percentage of chance to spawn each object in the stack Chance to Change Percentage of chance to alter the stack configuration every so many seconds Average Time Between Changes Average time delta in seconds between changes Max Time Between Changes Offset Maximum time delta offset in seconds between changes (to avoid having all objects change at the same time, a small random offset is used) Nav Collision W/L/H This setting can be used to create an area around the object spawner which the Dynamic AI pathfinding will stay away from. RandomStackSpawnerSwitcher : This can be used to create a dynamically formed stack of goods, like a pallet with boxes spawned on top. The difference with the one above is that this one can randomly select a 'goods'/object type, and its stacking settings, from a Data Table object. One can control it with the following settings: Setting Description Data Table The Data Table object of the type RandomStackSpawnerSwitcherStruct to set the object types and their settings similar to the ones above for the normal RandomStackSpawner Chance To Spawn Percentage of chance to spawn each object in the stack Chance to Change Percentage of chance to alter the stack configuration every so many seconds Chance To Switch Percentage of chance to switch to a different object type from the Data Table Average Time Between Changes Average time delta in seconds between changes Max Time Between Changes Offset Maximum time delta offset in seconds between changes (to avoid having all objects change at the same time, a small random offset is used) RandomStaticModifier : This can be used to spawn a singular static and alter its spawn transform dynamically.
One can control it with the following settings: Setting Description Static The StaticMesh that needs to be spawned Chance To Spawn Percentage of chance to spawn the object Max Rotation Offset Maximum rotation in degrees (both positive and negative) to alter the transform Max XPosition Offset Maximum position offset in cm (both positive and negative) to alter the transform in the X axis Max YPosition Offset Maximum position offset in cm (both positive and negative) to alter the transform in the Y axis Chance to Change Percentage of chance to alter the configuration every so many seconds Average Time Between Changes Average time delta in seconds between changes Max Time Between Changes Offset Maximum time delta offset in seconds between changes (to avoid having all objects change at the same time, a small random offset is used) RandomStaticPicker : This can be used to spawn a singular randomly picked static out of a list of chosen statics and alter its spawn transform dynamically. One can control it with the following settings: Setting Description Statics The list of StaticMesh objects that can be picked from to spawn one Chance To Spawn Percentage of chance to spawn the object Max Rotation Offset Maximum rotation in degrees (both positive and negative) to alter the transform Max XPosition Offset Maximum position offset in cm (both positive and negative) to alter the transform in the X axis Max YPosition Offset Maximum position offset in cm (both positive and negative) to alter the transform in the Y axis Chance to Change Percentage of chance to alter the configuration every so many seconds Average Time Between Changes Average time delta in seconds between changes Max Time Between Changes Offset Maximum time delta offset in seconds between changes (to avoid having all objects change at the same time, a small random offset is used) There are some other simpler dynamic objects such as doors and conveyor belts that have self-explanatory settings very similar to those above. All are based on the seed randomisation.","title":"Dynamic Static Spawners"},{"location":"dynamic_objects/#grouped-ai","text":"A human-looking AI is also available to walk dynamically between a set of self-chosen waypoints. They are based on the DetourAIController of Unreal so they will avoid each other and the user pretty well. In order to have more control over them, some custom blueprints were created. Their main features are that the AI themselves, the waypoints and the spawners can be assigned a group ID number, so that all functionality is grouped. They also use the seed randomisation so that they can be spawned at the same waypoints and target the same waypoints each time if the same seed value is chosen. The following blueprints are available: - GroupedTargetPoint : These are the TargetPoints (waypoints) that the AI will walk between. Their only setting is the group ID number to decide which AI group will be able to pick this waypoint to spawn AI and be the target waypoint for them. - GroupedAI : One can manually spawn an AI by placing these in the world. They need to be assigned the Group ID manually to choose which waypoints to target. - GroupedAISpawner : To automate the spawning of AI, one can use this blueprint. It will spawn AI at the waypoints of the same group. A setting is available to configure the fill percentage. This will set the percentage of waypoints to spawn AI upon.
One also has to choose which Skeletal Meshes and their Animation Blueprints can be picked from.","title":"Grouped AI"},{"location":"dynamic_objects/#spline-animations","text":"Making statics and skeletal meshes move along a spline path at a fixed speed. See the video below for more information on how it works:","title":"Spline Animations"},{"location":"echo/","text":"How to Use Echo sensor modalities in Cosys-AirSim Cosys-AirSim supports Echo sensors for multirotors and cars. Echo sensors can be configured to behave like sonar, radar or other echo-based sensor types. Enabling an echo sensor and its other settings can be configured via the AirSimSettings json. Please see general sensors for information on configuration of general/shared sensor settings. Enabling echo sensor on a vehicle By default, echo sensors are not enabled. To enable one, set the SensorType and Enabled attributes in the settings json. \"echo1\": { \"SensorType\": 7, \"Enabled\" : true, Multiple echo sensors can be enabled on a vehicle . Echo configuration The following parameters can currently be configured via the settings json. Parameter Description X Y Z Position of the echo sensor relative to the vehicle (in NED, in meters) Roll Pitch Yaw Orientation of the echo sensor relative to the vehicle (in degrees, yaw-pitch-roll order to front vector +X) External Uncouple the sensor from the vehicle. If enabled, the position and orientation will be relative to Unreal world coordinates ExternalLocal When in external mode, if this is enabled the retrieved pose of the sensor will be in Local NED coordinates (from the starting position of the vehicle) and not the converted Unreal NED coordinates, which is the default runParallel Uses CPU parallelisation to speed up the ray casting for active sensing. This disables all debug drawing except for the final reflected points if enabled (DrawReflectedPoints) SenseActive Enable active sensing where the sensor will emit a signal and receive signals from the reflections SensePassive Enable passive sensing where the sensor will receive signals from other active sources in the world (Passive Echo Beacons, see details below) PassiveRadius The radius in meters in which the sensor will receive signals from passive sources if that mode is enabled NumberOfTraces Amount of traces (rays) being cast. If set to a negative value, it will only do 2D sensing in the horizontal plane! SensorLowerAzimuthLimit The lower azimuth angle limit in degrees for receiving signals on the sensor (default = -90) SensorUpperAzimuthLimit The upper azimuth angle limit in degrees for receiving signals on the sensor (default = 90) SensorLowerElevationLimit The lower elevation angle limit in degrees for receiving signals on the sensor (default = -90) SensorUpperElevationLimit The upper elevation angle limit in degrees for receiving signals on the sensor (default = 90) MeasurementFrequency The frequency of the sensor (measurements/s) SensorDiameter The diameter of the sensor plane used to capture the reflecting traces (meter) ReflectionOpeningAngle Opening angle of reflections (degrees) ReflectionLimit Maximum amount of reflections that can happen.
ReflectionDistanceLimit Maximum distance between two reflections (meters) AttenuationPerDistance Attenuation of signal wrt distance traveled (dB/m) AttenuationPerReflection Attenuation of signal wrt reflections (dB) AttenuationLimit Attenuation at which the signal is considered dissipated (dB) DistanceLimit Maximum distance a reflection can travel (meters) PauseAfterMeasurement Pause the simulation after each measurement. Useful for API interaction to be synced IgnoreMarked Remove objects with the Unreal Tag MarkedIgnore from the sensor data DrawReflectedPoints Draw debug points in world where reflected points are captured by the sensor DrawReflectedLines Draw debug lines in world from reflected points to the sensor DrawReflectedPaths Draw the full paths of the reflected points DrawInitialPoints Draw the points of the initial half sphere where the traces (rays) are cast DrawExternalPoints Draw a pointcloud coming through the API from an external source DrawBounceLines Draw lines of all bouncing reflections of the traces with their color depending on attenuation DrawPassiveSources Draw debug points and reflection lines for all detected passive echo sources (original sources and their reflection echoes against objects) DrawPassiveLines Draw debug lines of the sensor to the passive echo sources that are detected with line of sight. DrawSensor Draw the physical sensor in the world on the vehicle with 3D axes shown where the sensor is e.g., { \"SeeDocsAt\": \"https://cosys-lab.github.io/settings/\", \"SettingsVersion\": 2.0, \"SimMode\": \"SkidVehicle\", \"Vehicles\": { \"CPHusky\": { \"VehicleType\": \"CPHusky\", \"AutoCreate\": true, \"Sensors\": { \"SonarSensor1\": { \"SensorType\": 7, \"Enabled\": true, \"X\": 0, \"Y\": 0, \"Z\": -0.55, \"Roll\": 0, \"Pitch\": 0, \"Yaw\": 0, \"SenseActive\": true, \"SensePassive\": false, \"MeasurementFrequency\": 5, \"NumberOfTraces\": 10000, \"SensorDiameter\": 0.5, \"SensorLowerAzimuthLimit\": -90, \"SensorUpperAzimuthLimit\": 90, \"SensorLowerElevationLimit\": -90, \"SensorUpperElevationLimit\": 90, \"AttenuationPerDistance\": 0, \"AttenuationPerReflection\": 0, \"AttenuationLimit\": -100, \"DistanceLimit\": 10, \"ReflectionLimit\": 3, \"ReflectionDistanceLimit\": 0.4, \"ReflectionOpeningAngle\": 10 } } } } } Passive Echo Beacons While the default configuration of the echo sensor is to emit a signal and receive the reflections, it is also possible to have passive echo sources in the world. These are objects that emit a signal and the echo sensor will receive the reflections of these signals. This can be used to simulate other echo sources in the world that are not the echo sensor itself. One can define these from the Unreal Editor itself or through the AirSimSettings json file. In the Editor, use the search function to look for Passive Echo Beacon and add it to the world. You can alter the settings from the Details panel. In the AirSimSettings json file you can define new beacons under the PassiveEchoBeacons section. The beacons have the following settings: Parameter Description X Y Z Position of the beacon relative to the Unreal World origin, so not in the robot reference frame! (in NED, in meters) Roll Pitch Yaw Orientation of the beacon relative to the Unreal World origin, so not in the robot reference frame! (in degrees, yaw-pitch-roll order to front vector +X) Enable Toggle the beacon on or off. InitialDirections Amount of traces (rays) being cast. This defines the resolution of the resulting reflection point cloud.
SensorLowerAzimuthLimit The lower azimuth angle limit in degrees for sending out the initial rays of the source. (default = -90) SensorUpperAzimuthLimit The upper azimuth angle limit in degrees for sending out the initial rays of the source. (default = 90) SensorLowerElevationLimit The lower elevation angle limit in degrees for sending out the initial rays of the source. (default = -90) SensorUpperElevationLimit The upper elevation angle limit in degrees for sending out the initial rays of the source. (default = 90) ReflectionLimit Maximum amount of reflections that can happen. ReflectionDistanceLimit Maximum distance between two reflections (meters) ReflectionOnlyFinal Only save the final reflection along a trace. This will ignore all other reflections that happen along the trace in the data AttenuationPerDistance Attenuation of signal wrt distance traveled (dB/m) AttenuationPerReflection Attenuation of signal wrt reflections (dB) AttenuationLimit Attenuation at which the signal is considered dissipated (dB) DistanceLimit Maximum distance a reflection can travel (meters) DrawDebugAllPoints Draw debug points in world where reflected points are happening due to this source. It will also show the reflection direction with a line DrawDebugAllLines Draw all lines that are being cast from the source to the reflections, not only the ones that are reflected DrawDebugLocation Draw 3D axes shown where the source is DrawDebugDuration Duration in seconds that the debug points and lines will be shown in the world. -1 is infinite. In the settings file this can look like this example: { \"SeeDocsAt\": \"https://cosys-lab.github.io/settings/\", \"SettingsVersion\": 2.0, \"SimMode\": \"SkidVehicle\", \"ViewMode\": \"\", \"Vehicles\": { \"airsimvehicle\": { \"VehicleType\": \"CPHusky\", \"AutoCreate\": true, \"Sensors\": { ... \"echo\": { \"SensorType\": 7, \"Enabled\": true, ... \"DrawPassiveSources\": false, \"DrawPassiveLines\": true, \"DrawSensor\": true, \"SenseActive\": false, \"SensePassive\": true, \"PassiveRadius\" : 10 } } } }, \"PassiveEchoBeacons\": { \"passiveEchoBeacon1\": { \"X\": 5, \"Y\": 5, \"Z\": -5, \"Roll\": 0, \"Pitch\": 0, \"Yaw\": 0, \"Enable\" : true, \"InitialDirections\": 1000, \"SensorLowerAzimuthLimit\": -90, \"SensorUpperAzimuthLimit\": 90, \"SensorLowerElevationLimit\": -90, \"SensorUpperElevationLimit\": 90, \"AttenuationPerDistance\": 0, \"AttenuationPerReflection\": 0, \"AttenuationLimit\": -100, \"DistanceLimit\": 10, \"ReflectionLimit\": 3, \"DrawDebugAllPoints\": true, \"DrawDebugAllLines\": false, \"DrawDebugLocation\": true, \"DrawDebugDuration\": -1 } } } Client API Use getEchoData(sensor name, vehicle name) API to retrieve the echo sensor data. The API returns Point-Cloud(s) as a flat array of floats, the final attenuation, total distance and reflection count (+ reflection normal for passive beacon reflections) along with the timestamp of the capture and sensor pose. Echo Pose: Default: Echo sensor pose in the vehicle frame / External: If set to External (see table) the coordinates will be in either Unreal NED when ExternalLocal is false or Local NED (from the starting position of the vehicle) when ExternalLocal is true. Active Point-Cloud: The floats represent [x, y, z, attenuation, total_distance, reflection_count] for each point hit within the range in the last scan in NED format. Active Groundtruth: For each point of the Active Point-Cloud a label string is kept that has the name of the object that the point belongs to.
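As an illustration of this layout, here is a minimal Python sketch that reshapes the flat float array into per-point records. The sensor and vehicle names are taken from the settings example above, and the point_cloud attribute name is an assumption following the same convention as the GPU Lidar API documented elsewhere in these docs:

import numpy as np
import cosysairsim as airsim

client = airsim.CarClient()
client.confirmConnection()

# Sensor and vehicle names from the settings example above (adjust to your own settings)
echo_data = client.getEchoData('SonarSensor1', 'CPHusky')

# Active point-cloud: flat floats of [x, y, z, attenuation, total_distance, reflection_count]
points = np.array(echo_data.point_cloud, dtype=np.float32)  # attribute name assumed
points = points.reshape(-1, 6)
xyz = points[:, 0:3]             # NED coordinates, in meters
attenuation = points[:, 3]       # attenuation, in dB
total_distance = points[:, 4]    # distance traveled, in meters
reflection_count = points[:, 5]  # number of reflections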
Passive Point-Cloud: The floats represent [x, y, z, attenuation, total_distance, reflection_count, reflection angle x, reflection angle y, reflection angle z] for each point hit within the range in the last scan in NED format. Passive Groundtruth: For each point of the Passive Point-Cloud two strings are kept. The first is a label string representing the object of the reflection and the second is the name of the Passive Echo Beacon that was the source of this reflection. Use setEchoData(sensor name, vehicle name, echo data) API to render an external pointcloud back to the simulation. It expects the data to be [x, y, z] as a flat array of floats.","title":"Pulse Echo"},{"location":"echo/#how-to-use-echo-sensor-modalities-in-cosys-airsim","text":"Cosys-AirSim supports Echo sensors for multirotors and cars. Echo sensors can be configured to behave like sonar, radar or other echo-based sensor types. The enablement of an echo sensor and the other settings can be configured via AirSimSettings json. Please see general sensors for information on configuration of general/shared sensor settings.","title":"How to Use Echo sensor modalities in Cosys-AirSim"},{"location":"echo/#enabling-echo-sensor-on-a-vehicle","text":"By default, echo sensors are not enabled. To enable one, set the SensorType and Enabled attributes in settings json. \"echo1\": { \"SensorType\": 7, \"Enabled\" : true, Multiple echo sensors can be enabled on a vehicle.","title":"Enabling echo sensor on a vehicle"},{"location":"echo/#echo-configuration","text":"The following parameters can be configured right now via settings json. Parameter Description X Y Z Position of the echo sensor relative to the vehicle (in NED, in meters) Roll Pitch Yaw Orientation of the echo sensor relative to the vehicle (in degrees, yaw-pitch-roll order to front vector +X) External Uncouple the sensor from the vehicle. If enabled, the position and orientation will be relative to Unreal world coordinates ExternalLocal When in external mode, if this is enabled the retrieved pose of the sensor will be in Local NED coordinates (from the starting position of the vehicle) and not the converted Unreal NED coordinates, which is the default runParallel Uses CPU parallelisation for speeding up the ray casting for active sensing. This disables all debug drawing except for the final reflected points if enabled (DrawReflectedPoints) SenseActive Enable active sensing where the sensor will emit a signal and receive signals from the reflections SensePassive Enable passive sensing where the sensor will receive signals from other active sources in the world (Passive Echo Beacons, see details below) PassiveRadius The radius in meters in which the sensor will receive signals from passive sources if that mode is enabled NumberOfTraces Amount of traces (rays) being cast. If set to a negative value, it will only do 2D sensing in the horizontal plane!
SensorLowerAzimuthLimit The lower azimuth angle limit in degrees for receiving signals on the sensor (default = -90) SensorUpperAzimuthLimit The upper azimuth angle limit in degrees for receiving signals on the sensor (default = 90) SensorLowerElevationLimit The lower elevation angle limit in degrees for receiving signals on the sensor (default = -90) SensorUpperElevationLimit The upper elevation angle limit in degrees for receiving signals on the sensor (default = 90) MeasurementFrequency The frequency of the sensor (measurements/s) SensorDiameter The diameter of the sensor plane used to capture the reflecting traces (meter) ReflectionOpeningAngle Opening angle of reflections (degrees) ReflectionLimit Maximum amount of reflections that can happen. ReflectionDistanceLimit Maximum distance between two reflections (meters) AttenuationPerDistance Attenuation of signal wrt distance traveled (dB/m) AttenuationPerReflection Attenuation of signal wrt reflections (dB) AttenuationLimit Attenuation at which the signal is considered dissipated (dB) DistanceLimit Maximum distance a reflection can travel (meters) PauseAfterMeasurement Pause the simulation after each measurement. Useful for API interaction to be synced IgnoreMarked Remove objects with the Unreal Tag MarkedIgnore from the sensor data DrawReflectedPoints Draw debug points in world where reflected points are captured by the sensor DrawReflectedLines Draw debug lines in world from reflected points to the sensor DrawReflectedPaths Draw the full paths of the reflected points DrawInitialPoints Draw the points of the initial half sphere where the traces (rays) are cast DrawExternalPoints Draw a pointcloud coming through the API from an external source DrawBounceLines Draw lines of all bouncing reflections of the traces with their color depending on attenuation DrawPassiveSources Draw debug points and reflection lines for all detected passive echo sources (original sources and their reflection echoes against objects) DrawPassiveLines Draw debug lines of the sensor to the passive echo sources that are detected with line of sight. DrawSensor Draw the physical sensor in the world on the vehicle with 3D axes shown where the sensor is e.g., { \"SeeDocsAt\": \"https://cosys-lab.github.io/settings/\", \"SettingsVersion\": 2.0, \"SimMode\": \"SkidVehicle\", \"Vehicles\": { \"CPHusky\": { \"VehicleType\": \"CPHusky\", \"AutoCreate\": true, \"Sensors\": { \"SonarSensor1\": { \"SensorType\": 7, \"Enabled\": true, \"X\": 0, \"Y\": 0, \"Z\": -0.55, \"Roll\": 0, \"Pitch\": 0, \"Yaw\": 0, \"SenseActive\": true, \"SensePassive\": false, \"MeasurementFrequency\": 5, \"NumberOfTraces\": 10000, \"SensorDiameter\": 0.5, \"SensorLowerAzimuthLimit\": -90, \"SensorUpperAzimuthLimit\": 90, \"SensorLowerElevationLimit\": -90, \"SensorUpperElevationLimit\": 90, \"AttenuationPerDistance\": 0, \"AttenuationPerReflection\": 0, \"AttenuationLimit\": -100, \"DistanceLimit\": 10, \"ReflectionLimit\": 3, \"ReflectionDistanceLimit\": 0.4, \"ReflectionOpeningAngle\": 10 } } } } }","title":"Echo configuration"},{"location":"echo/#passive-echo-beacons","text":"While the default configuration of the echo sensor is to emit a signal and receive the reflections, it is also possible to have passive echo sources in the world. These are objects that emit a signal and the echo sensor will receive the reflections of these signals. This can be used to simulate other echo sources in the world that are not the echo sensor itself.
One can define these from the Unreal Editor itself or through the AirSimSettings json file. In the Editor, use the search function to look for Passive Echo Beacon and add it to the world. You can alter the settings from the Details panel. In the AirSimSettings json file you can define new beacons under the PassiveEchoBeacons section. The beacons have the following settings: Parameter Description X Y Z Position of the beacon relative to the Unreal World origin, so not in the robot reference frame! (in NED, in meters) Roll Pitch Yaw Orientation of the beacon relative to the Unreal World origin, so not in the robot reference frame! (in degrees, yaw-pitch-roll order to front vector +X) Enable Toggle the beacon on or off. InitialDirections Amount of traces (rays) being cast. This defines the resolution of the resulting reflection point cloud. SensorLowerAzimuthLimit The lower azimuth angle limit in degrees for sending out the initial rays of the source. (default = -90) SensorUpperAzimuthLimit The upper azimuth angle limit in degrees for sending out the initial rays of the source. (default = 90) SensorLowerElevationLimit The lower elevation angle limit in degrees for sending out the initial rays of the source. (default = -90) SensorUpperElevationLimit The upper elevation angle limit in degrees for sending out the initial rays of the source. (default = 90) ReflectionLimit Maximum amount of reflections that can happen. ReflectionDistanceLimit Maximum distance between two reflections (meters) ReflectionOnlyFinal Only save the final reflection along a trace. This will ignore all other reflections that happen along the trace in the data AttenuationPerDistance Attenuation of signal wrt distance traveled (dB/m) AttenuationPerReflection Attenuation of signal wrt reflections (dB) AttenuationLimit Attenuation at which the signal is considered dissipated (dB) DistanceLimit Maximum distance a reflection can travel (meters) DrawDebugAllPoints Draw debug points in world where reflected points are happening due to this source. It will also show the reflection direction with a line DrawDebugAllLines Draw all lines that are being cast from the source to the reflections, not only the ones that are reflected DrawDebugLocation Draw 3D axes shown where the source is DrawDebugDuration Duration in seconds that the debug points and lines will be shown in the world. -1 is infinite. In the settings file this can look like this example: { \"SeeDocsAt\": \"https://cosys-lab.github.io/settings/\", \"SettingsVersion\": 2.0, \"SimMode\": \"SkidVehicle\", \"ViewMode\": \"\", \"Vehicles\": { \"airsimvehicle\": { \"VehicleType\": \"CPHusky\", \"AutoCreate\": true, \"Sensors\": { ... \"echo\": { \"SensorType\": 7, \"Enabled\": true, ...
\"DrawPassiveSources\": false, \"DrawPassiveLines\": true, \"DrawSensor\": true, \"SenseActive\": false, \"SensePassive\": true, \"PassiveRadius\" : 10 } } } }, \"PassiveEchoBeacons\": { \"passiveEchoBeacon1\": { \"X\": 5, \"Y\": 5, \"Z\": -5, \"Roll\": 0, \"Pitch\": 0, \"Yaw\": 0, \"Enable\" : true, \"InitialDirections\": 1000, \"SensorLowerAzimuthLimit\": -90, \"SensorUpperAzimuthLimit\": 90, \"SensorLowerElevationLimit\": -90, \"SensorUpperElevationLimit\": 90, \"AttenuationPerDistance\": 0, \"AttenuationPerReflection\": 0, \"AttenuationLimit\": -100, \"DistanceLimit\": 10, \"ReflectionLimit\": 3, \"DrawDebugAllPoints\": true, \"DrawDebugAllLines\": false, \"DrawDebugLocation\": true, \"DrawDebugDuration\": -1 } } }","title":"Passive Echo Beacons"},{"location":"echo/#client-api","text":"Use getEchoData(sensor name, vehicle name) API to retrieve the echo sensor data. The API returns Point-Cloud(s) as a flat array of floats, the final attenuation, total distance and reflection count (+ reflection normal for passive beacon reflections) along with the timestamp of the capture and sensor pose. Echo Pose: Default:Active Point-Cloud: Echo sensor pose in the vehicle frame / External: If set to External (see table) the coordinates will be in either Unreal NED when ExternalLocal is false or Local NED (from starting position from vehicle) when ExternalLocal is true . Active Point-Cloud The floats represent [x, y, z, attenuation, total_distance, reflection_count] for each point hit within the range in the last scan in NED format. Active Groundtruth: For each point of the Active Point-Cloud a label string is kept that has the name of the object that the point belongs to. Passive Point-Cloud: The floats represent [x, y, z, attenuation, total_distance, reflection_count, reflection angle x, reflection angle y, reflection angle z] for each point hit within the range in the last scan in NED format. Passive Groundtruth: For each point two strings are kept of the Passive Point-Cloud. The first a label string representing the object of the reflection and second the name of the Passive Echo Beacon that was the source of this reflection. Use setEchoData(sensor name, vehicle name, echo data) API to render an external pointcloud back to the simulation. It expects it to be [x,y,z] as a flat array of floats.","title":"Client API"},{"location":"event_sim/","text":"Cosys-AirSim provides a Python-based event camera simulator, aimed at performance and ability to run in real-time along with the sim. Event cameras An event camera is a special vision sensor that measures changes in logarithmic brightness and only reports 'events'. Each event is a set of four values that gets generated every time the absolute change in the logarithmic brightness exceeds a certain threshold. An event contains the timestamp of the measurement, pixel location (x and y coordinates) and the polarity: which is either +1/-1 based on whether the logarithmic brightness has increased or decreased. Most event cameras have a temporal resolution of the order of microseconds, making them significantly faster than RGB sensors, and also demonstrate a high dynamic range and low motion blur. More details about event cameras can be found in this tutorial from RPG-UZH Cosys-AirSim event simulator The Cosys-AirSim event simulator uses two consecutive RGB images (converted to grayscale), and computes \"past events\" that would have occurred during the transition based on the change in log luminance between the images. 
These events are reported as a stream of bytes, following this format: x, y, timestamp, pol. Here x and y are the pixel locations of the event firing, timestamp is the global timestamp in microseconds and pol is either +1/-1 depending on whether the brightness increased or decreased. Along with this bytestream, an accumulation of events over a 2D frame is also constructed, known as an 'event image' that visualizes +1 events as red and -1 as blue pixels. An example event image is shown below: Usage An example script to run the event simulator alongside Cosys-AirSim is located at https://github.com/Cosys-Lab/Cosys-AirSim/blob/main/PythonClient/eventcamera_sim/test_event_sim.py. The following optional command-line arguments can be passed to this script. args.width, args.height (float): Simulated event camera resolution args.save (bool): Whether or not to save the event data to a file args.debug (bool): Whether or not to display the simulated events as an image The implementation of the actual event simulation, written in Python and numba, is at https://github.com/Cosys-Lab/Cosys-AirSim/blob/main/PythonClient/eventcamera_sim/event_simulator.py. The event simulator is initialized as follows, with the arguments controlling the resolution of the camera. from event_simulator import * ev_sim = EventSimulator(W, H) The actual computation of the events is triggered through an image_callback function, which is executed every time a new RGB image is obtained. The first time this function is called, due to the lack of a 'previous' image, it acts as an initialization of the event sim. event_img, events = ev_sim.image_callback(img, ts_delta) This function, which behaves similarly to a callback (it is called every time a new image is received), returns an event image as a one dimensional array of +1/-1 values, thus indicating only whether events were seen at each pixel, but not the timing/number of events. This one dimensional array can be converted into the red/blue event image as seen in the function convert_event_img_rgb. events is a numpy array of events, each of the format x, y, timestamp, pol described above. Through this function, the event sim computes the difference between the past and the current image, and computes a stream of events which is then returned as a numpy array. This can then be appended to a file. There are quite a few parameters that can be tuned to achieve a desired level of visual fidelity/performance of the event simulation. The main factors to tune are the following: The resolution of the camera. The log luminance threshold TOL that determines whether or not a detected change counts as an event. Note: There is also currently a max limit on the number of events generated per pair of images, which can also be tuned. Algorithm The working of the event simulator loosely follows this set of operations: 1. Take the difference between the log intensities of the current and previous frames. 2. Iterating over all pixels, calculate the polarity for each pixel based on a threshold of change in log intensity. 3. Determine the number of events to be fired per pixel, based on the extent of intensity change over the threshold. Let $N_{max}$ be the maximum number of events that can occur at a single pixel, then the total number of firings to be simulated at pixel location $u$ would be $N_e(u) = \min(N_{max}, \frac{\Delta L(u)}{TOL})$. 4. Determine the timestamps for each interpolated event by spreading them over the time that has elapsed between the captures of the previous and current images: $t = t_{prev} + \frac{\Delta T}{N_e(u)}$ 5.
Generate the output bytestream by simulating events at every pixel and sorting by timestamp.","title":"Event camera"},{"location":"event_sim/#event-cameras","text":"An event camera is a special vision sensor that measures changes in logarithmic brightness and only reports 'events'. Each event is a set of four values that gets generated every time the absolute change in the logarithmic brightness exceeds a certain threshold. An event contains the timestamp of the measurement, the pixel location (x and y coordinates) and the polarity, which is either +1/-1 based on whether the logarithmic brightness has increased or decreased. Most event cameras have a temporal resolution of the order of microseconds, making them significantly faster than RGB sensors, and also demonstrate a high dynamic range and low motion blur. More details about event cameras can be found in this tutorial from RPG-UZH","title":"Event cameras"},{"location":"event_sim/#cosys-airsim-event-simulator","text":"The Cosys-AirSim event simulator uses two consecutive RGB images (converted to grayscale), and computes \"past events\" that would have occurred during the transition based on the change in log luminance between the images. These events are reported as a stream of bytes, following this format: x, y, timestamp, pol. Here x and y are the pixel locations of the event firing, timestamp is the global timestamp in microseconds and pol is either +1/-1 depending on whether the brightness increased or decreased. Along with this bytestream, an accumulation of events over a 2D frame is also constructed, known as an 'event image' that visualizes +1 events as red and -1 as blue pixels. An example event image is shown below:","title":"Cosys-AirSim event simulator"},{"location":"event_sim/#usage","text":"An example script to run the event simulator alongside Cosys-AirSim is located at https://github.com/Cosys-Lab/Cosys-AirSim/blob/main/PythonClient/eventcamera_sim/test_event_sim.py. The following optional command-line arguments can be passed to this script. args.width, args.height (float): Simulated event camera resolution args.save (bool): Whether or not to save the event data to a file args.debug (bool): Whether or not to display the simulated events as an image The implementation of the actual event simulation, written in Python and numba, is at https://github.com/Cosys-Lab/Cosys-AirSim/blob/main/PythonClient/eventcamera_sim/event_simulator.py. The event simulator is initialized as follows, with the arguments controlling the resolution of the camera. from event_simulator import * ev_sim = EventSimulator(W, H) The actual computation of the events is triggered through an image_callback function, which is executed every time a new RGB image is obtained. The first time this function is called, due to the lack of a 'previous' image, it acts as an initialization of the event sim. event_img, events = ev_sim.image_callback(img, ts_delta) This function, which behaves similarly to a callback (it is called every time a new image is received), returns an event image as a one dimensional array of +1/-1 values, thus indicating only whether events were seen at each pixel, but not the timing/number of events. This one dimensional array can be converted into the red/blue event image as seen in the function convert_event_img_rgb. events is a numpy array of events, each of the format x, y, timestamp, pol described above. Through this function, the event sim computes the difference between the past and the current image, and computes a stream of events which is then returned as a numpy array. This can then be appended to a file.
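To make this loop concrete, here is a minimal sketch of feeding consecutive sim images into the simulator. The camera name \"0\", the resolution and the microsecond timestamp handling are illustrative assumptions; see test_event_sim.py linked above for the complete version:

import time
import numpy as np
import cosysairsim as airsim
from event_simulator import EventSimulator

W, H = 320, 240  # assumed resolution; must match the size of the requested images
client = airsim.MultirotorClient()
client.confirmConnection()
ev_sim = EventSimulator(W, H)

t_prev = time.time_ns() // 1000  # global timestamp in microseconds
for _ in range(100):
    # Request an uncompressed RGB frame from camera "0"
    response = client.simGetImages([airsim.ImageRequest("0", airsim.ImageType.Scene, False, False)])[0]
    img = np.frombuffer(response.image_data_uint8, dtype=np.uint8)
    img = img.reshape(response.height, response.width, 3)
    t_now = time.time_ns() // 1000
    # The first call only initializes the simulator; subsequent calls
    # return the event image and the numpy array of events
    event_img, events = ev_sim.image_callback(img, t_now - t_prev)
    t_prev = t_now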
There are quite a few parameters that can be tuned to achieve a desired level of visual fidelity/performance of the event simulation. The main factors to tune are the following: The resolution of the camera. The log luminance threshold TOL that determines whether or not a detected change counts as an event. Note: There is also currently a max limit on the number of events generated per pair of images, which can also be tuned.","title":"Usage"},{"location":"event_sim/#algorithm","text":"The working of the event simulator loosely follows this set of operations: 1. Take the difference between the log intensities of the current and previous frames. 2. Iterating over all pixels, calculate the polarity for each pixel based on a threshold of change in log intensity. 3. Determine the number of events to be fired per pixel, based on the extent of intensity change over the threshold. Let $N_{max}$ be the maximum number of events that can occur at a single pixel, then the total number of firings to be simulated at pixel location $u$ would be $N_e(u) = \min(N_{max}, \frac{\Delta L(u)}{TOL})$. 4. Determine the timestamps for each interpolated event by spreading them over the time that has elapsed between the captures of the previous and current images: $t = t_{prev} + \frac{\Delta T}{N_e(u)}$ 5. Generate the output bytestream by simulating events at every pixel and sorting by timestamp.","title":"Algorithm"},{"location":"flight_controller/","text":"Flight Controller What is Flight Controller? \"Wait!\" you ask, \"Why do you need a flight controller for a simulator?\". The primary job of a flight controller is to take in the desired state as input, estimate the actual state using sensor data and then drive the actuators in such a way that the actual state comes as close as possible to the desired state. For quadrotors, the desired state can be specified as roll, pitch and yaw, for example. It then estimates the actual roll, pitch and yaw using the gyroscope and accelerometer. Then it generates appropriate motor signals so the actual state becomes the desired state. How Simulator uses Flight Controller? The simulator consumes the motor signals generated by the flight controller to figure out the force and thrust generated by each actuator (i.e. propellers in the case of a quadrotor). This is then used by the physics engine to compute the kinetic properties of the vehicle. This in turn generates simulated sensor data which is fed back to the flight controller. What is Hardware- and Software-in-Loop? Hardware-in-Loop (HITL or HIL) means the flight controller runs on actual hardware such as a Naze32 or Pixhawk chip. You then connect this hardware to the PC using a USB port. The simulator talks to the device to retrieve actuator signals and sends it simulated sensor data. This is obviously as close as you can get to the real thing. However, it typically requires more steps to set up and is usually harder to debug. One big issue is that the simulator clock and the device clock run at their own speed and accuracy. Also, the USB connection (which is usually only USB 2.0) may not be enough for real-time communication. In \"software-in-loop\" simulation (SITL or SIL) mode the firmware runs on your computer as opposed to a separate board. This is generally fine except that now you are not touching any code paths that are specific to your device. Also, none of your code now runs with the real-time clock usually provided by a specialized hardware board. For well-designed flight controllers with a software clock, these are usually not concerning issues. What Flight Controllers are Supported?
AirSim has a built-in flight controller called simple_flight and it is used by default. You don't need to do anything to use or configure it. AirSim also supports PX4 & ArduPilot as external flight controllers for advanced users. Using AirSim Without Flight Controller Yes, now it's possible to use AirSim without a flight controller. Please see the instructions here for how to use the so-called \"Computer Vision\" mode. If you don't need vehicle dynamics, we highly recommend using this mode.","title":"Flight Controller"},{"location":"flight_controller/#flight-controller","text":"","title":"Flight Controller"},{"location":"flight_controller/#what-is-flight-controller","text":"\"Wait!\" you ask, \"Why do you need a flight controller for a simulator?\". The primary job of a flight controller is to take in the desired state as input, estimate the actual state using sensor data and then drive the actuators in such a way that the actual state comes as close as possible to the desired state. For quadrotors, the desired state can be specified as roll, pitch and yaw, for example. It then estimates the actual roll, pitch and yaw using the gyroscope and accelerometer. Then it generates appropriate motor signals so the actual state becomes the desired state.","title":"What is Flight Controller?"},{"location":"flight_controller/#how-simulator-uses-flight-controller","text":"The simulator consumes the motor signals generated by the flight controller to figure out the force and thrust generated by each actuator (i.e. propellers in the case of a quadrotor). This is then used by the physics engine to compute the kinetic properties of the vehicle. This in turn generates simulated sensor data which is fed back to the flight controller.","title":"How Simulator uses Flight Controller?"},{"location":"flight_controller/#what-is-hardware-and-software-in-loop","text":"Hardware-in-Loop (HITL or HIL) means the flight controller runs on actual hardware such as a Naze32 or Pixhawk chip. You then connect this hardware to the PC using a USB port. The simulator talks to the device to retrieve actuator signals and sends it simulated sensor data. This is obviously as close as you can get to the real thing. However, it typically requires more steps to set up and is usually harder to debug. One big issue is that the simulator clock and the device clock run at their own speed and accuracy. Also, the USB connection (which is usually only USB 2.0) may not be enough for real-time communication. In \"software-in-loop\" simulation (SITL or SIL) mode the firmware runs on your computer as opposed to a separate board. This is generally fine except that now you are not touching any code paths that are specific to your device. Also, none of your code now runs with the real-time clock usually provided by a specialized hardware board. For well-designed flight controllers with a software clock, these are usually not concerning issues.","title":"What is Hardware- and Software-in-Loop?"},{"location":"flight_controller/#what-flight-controllers-are-supported","text":"AirSim has a built-in flight controller called simple_flight and it is used by default. You don't need to do anything to use or configure it. AirSim also supports PX4 & ArduPilot as external flight controllers for advanced users.","title":"What Flight Controllers are Supported?"},{"location":"flight_controller/#using-airsim-without-flight-controller","text":"Yes, now it's possible to use AirSim without a flight controller. Please see the instructions here for how to use the so-called \"Computer Vision\" mode.
If you don't need vehicle dynamics, we highly recommend using this mode.","title":"Using AirSim Without Flight Controller"},{"location":"gazebo_drone/","text":"Welcome to GazeboDrone GazeboDrone allows connecting a Gazebo drone to the Cosys-AirSim drone, using the Gazebo drone as a flight dynamic model (FDM) and Cosys-AirSim to generate environmental sensor data. It can be used for Multicopters, Fixed-wings or any other vehicle. Dependencies Gazebo Make sure you have installed the Gazebo dependencies: sudo apt-get install libgazebo9-dev AirLib This project is built with GCC 8, so AirLib needs to be built with GCC 8 too. Run from your AirSim root folder: ./clean.sh ./setup.sh ./build.sh --gcc Cosys-AirSim simulator The Cosys-AirSim UE plugin needs to be built with clang, so you can't use the one compiled in the previous step. You can use our binaries or you can clone AirSim again in another folder and build it without the above option, then you can run Blocks or your own environment. Cosys-AirSim settings Inside your settings.json file you need to add this line: \"PhysicsEngineName\":\"ExternalPhysicsEngine\". You may want to change the visual model of the Cosys-AirSim drone; for that you can follow this tutorial. Build Execute this from your Cosys-AirSim root folder: cd GazeboDrone mkdir build && cd build cmake -DCMAKE_C_COMPILER=gcc-8 -DCMAKE_CXX_COMPILER=g++-8 .. make Run First run the Cosys-AirSim simulator and your Gazebo model and then execute this from your Cosys-AirSim root folder: cd GazeboDrone/build ./GazeboDrone","title":"Import Gazebo models"},{"location":"gazebo_drone/#welcome-to-gazebodrone","text":"GazeboDrone allows connecting a Gazebo drone to the Cosys-AirSim drone, using the Gazebo drone as a flight dynamic model (FDM) and Cosys-AirSim to generate environmental sensor data. It can be used for Multicopters, Fixed-wings or any other vehicle.","title":"Welcome to GazeboDrone"},{"location":"gazebo_drone/#dependencies","text":"","title":"Dependencies"},{"location":"gazebo_drone/#gazebo","text":"Make sure you have installed the Gazebo dependencies: sudo apt-get install libgazebo9-dev","title":"Gazebo"},{"location":"gazebo_drone/#airlib","text":"This project is built with GCC 8, so AirLib needs to be built with GCC 8 too. Run from your AirSim root folder: ./clean.sh ./setup.sh ./build.sh --gcc","title":"AirLib"},{"location":"gazebo_drone/#cosys-airsim-simulator","text":"The Cosys-AirSim UE plugin needs to be built with clang, so you can't use the one compiled in the previous step. You can use our binaries or you can clone AirSim again in another folder and build it without the above option, then you can run Blocks or your own environment.","title":"Cosys-AirSim simulator"},{"location":"gazebo_drone/#cosys-airsim-settings","text":"Inside your settings.json file you need to add this line: \"PhysicsEngineName\":\"ExternalPhysicsEngine\". You may want to change the visual model of the Cosys-AirSim drone; for that you can follow this tutorial.","title":"Cosys-AirSim settings"},{"location":"gazebo_drone/#build","text":"Execute this from your Cosys-AirSim root folder: cd GazeboDrone mkdir build && cd build cmake -DCMAKE_C_COMPILER=gcc-8 -DCMAKE_CXX_COMPILER=g++-8 ..
make","title":"Build"},{"location":"gazebo_drone/#run","text":"First run the Cosys-AirSim simulator and your Gazebo model and then execute this from your Cosys-AirSim root folder: cd GazeboDrone/build ./GazeboDrone","title":"Run"},{"location":"gpulidar/","text":"How to Use GPU Lidar in Cosys-AirSim Cosys-AirSim supports a GPU accelerated Lidar for multirotors and cars. It uses a depth camera that rotates around to simulate a Lidar while exploiting the GPU to do most of the work. Should allow for a large increase in amount of points that can be simulated. The enablement of a GPU lidar and the other lidar settings can be configured via AirSimSettings json. Please see general sensors for information on configuration of general/shared sensor settings. Note that this sensor type is currently not supported for Multirotor mode. It only works for Car and Computervision. Enabling GPU lidar on a vehicle By default, GPU lidars are not enabled. To enable the sensor, set the SensorType and Enabled attributes in settings json. \"GPULidar1\": { \"SensorType\": 8, \"Enabled\" : true, Multiple GPU lidars can be enabled on a vehicle. But one has to turn off DrawDebugPoints! Ignoring glass and other material types One can set an object that should be invisible to LIDAR sensors (such as glass) by giving them an Unreal Tag called LidarIgnore . GPU Lidar configuration The following parameters can be configured right now via settings json. For some more information check the publication on this topic here . Parameter Description NumberOfChannels Number of channels/lasers of the lidar. When set to 1 it will act as a 2D horizontal LiDAR and will use the VerticalFOVUpper value as the vertical angle to scan. Range Range, in meters MeasurementsPerCycle amount of measurements in one full cycle (horizontal resolution) RotationsPerSecond Rotations per second Resolution Defines the resolution of the depth camera image that generates the Lidar point cloud HorizontalFOVStart Horizontal FOV start for the lidar, in degrees HorizontalFOVEnd Horizontal FOV end for the lidar, in degrees VerticalFOVUpper Vertical FOV upper limit for the lidar, in degrees VerticalFOVLower Vertical FOV lower limit for the lidar, in degrees X Y Z Position of the lidar relative to the vehicle (in NED, in meters) Roll Pitch Yaw Orientation of the lidar relative to the vehicle (in degrees, yaw-pitch-roll order to front vector +X) IgnoreMarked Remove objects with the Unreal Tag MarkedIgnore from the sensor data GroundTruth Generate ground truth labeling color values InstanceSegmentation Enable to set the generated ground truth to the instance segmentation labeling. Set to false to choose a different annotation label Annotation If GroundTruth is enabled and InstanceSegmentation is disabled, you can set this value to the name of the annotation you want to use. This will be used for the ground truth color labels. DrawSensor Draw the physical sensor in the world on the vehicle with a 3D axes shown where the sensor is External Uncouple the sensor from the vehicle. If enabled, the position and orientation will be relative to Unreal world coordinates in NED format from the settings file. ExternalLocal When in external mode, if this is enabled the retrieved pose of the sensor will be in Local NED coordinates(from starting position from vehicle) and not converted Unreal NED coordinates which is default GenerateIntensity Toggle intensity calculation on or off. This requires a surface material map to be available. See below for more information. 
rangeMaxLambertianPercentage Lambertian reflectivity percentage to max out on. Reflectivity acts linearly towards 0% below this value. rainMaxIntensity Rain intensity maximum to scale from in mm/hour. rainConstantA Constant A used to calculate the extinction coefficient in rain rainConstantB Constant B used to calculate the extinction coefficient in rain GenerateNoise Generate and add range-noise based on a normal distribution if set to true MinNoiseStandardDeviation The standard deviation to generate the noise normal distribution, in meters. This is the minimal noise (at 0 distance) NoiseDistanceScale To scale the noise with distance, set this parameter. This way the minimal noise is scaled depending on the distance compared to the total maximum range of the sensor { \"SeeDocsAt\": \"https://cosys-lab.github.io/settings/\", \"SettingsVersion\": 2.0, \"SimMode\": \"SkidVehicle\", \"Vehicles\": { \"airsimvehicle\": { \"VehicleType\": \"CPHusky\", \"AutoCreate\": true, \"Sensors\": { \"gpulidar1\": { \"SensorType\": 8, \"Enabled\" : true, \"External\": false, \"NumberOfChannels\": 32, \"Range\": 50, \"Resolution\": 1024, \"RotationsPerSecond\": 10, \"MeasurementsPerCycle\": 512, \"X\": 0, \"Y\": 0, \"Z\": -0.3, \"Roll\": 0, \"Pitch\": 0, \"Yaw\" : 0, \"VerticalFOVUpper\": 20, \"VerticalFOVLower\": -20, \"HorizontalFOVStart\": 0, \"HorizontalFOVEnd\": 360, \"DrawDebugPoints\": true, \"DrawMode\": 1, \"IgnoreMarked\": true, \"GroundTruth\": true, \"InstanceSegmentation\": true, \"Annotation\": \"\", \"GenerateIntensity\": false, \"rangeMaxLambertianPercentage\": 80, \"rainMaxIntensity\": 70, \"rainConstantA\": 0.01, \"rainConstantB\": 0.6, \"DrawSensor\": false } } } } } Intensity Surface Material map If 'GenerateIntensity' is enabled in the settings json, a surface material map is required. This map is used to calculate the intensity of the lidar points. e.g.: wood,0.9 aluminium,0.5 concrete,0.3 asphalt,0.1 This needs to be saved as 'materials.csv' in your Documents folder, where your settings json file also resides. Server side visualization for debugging By default, the lidar points are not drawn on the viewport. To enable the drawing of hit laser points on the viewport, please enable setting 'DrawDebugPoints' via settings json. This is only for testing purposes and will affect the data slightly. It also needs to be disabled when using multiple Lidar sensors to avoid artifacts! e.g.: \"Lidar1\": { ... \"DrawDebugPoints\": true }, You can also tweak the variation of debugging with the 'DrawMode' parameter: - 0 = no coloring - 1 = groundtruth color labels (instance segmentation or other annotation labels depending on settings) - 2 = material - 3 = impact angle - 4 = intensity e.g.: \"Lidar1\": { ... \"DrawDebugPoints\": true, \"DrawMode\": 4 }, Client API Use getGPULidarData(sensor name, vehicle name) API to retrieve the GPU Lidar data. The API returns a Point-Cloud as a flat array of floats along with the timestamp of the capture and lidar pose. Point-Cloud: The floats represent [x, y, z, rgb, intensity] coordinates for each point hit within the range in the last scan in NED format. Lidar Pose: Default: sensor pose in the vehicle frame / External: If set to External (see table) the coordinates will be in either Unreal NED when ExternalLocal is false or Local NED (from the starting position of the vehicle) when ExternalLocal is true. Rgb represents a float32 representation of the RGB8 value that is linked to either the instance segmentation system or a different annotation label.
See the Image API documentation, Annotation documentation and the instance segmentation documentation. The float32 comes from binary concatenation of the RGB8 values: rgb = value_segmentation.R << 16 | value_segmentation.G << 8 | value_segmentation.B It can be retrieved from the API and converted back to RGB8 with, for example, the following Python code: lidar_data = client.getGPULidarData('lidar', 'vehicle') points = np.array(lidar_data.point_cloud, dtype=np.dtype('f4')) points = np.reshape(points, (int(points.shape[0] / 5), 5)) rgb_values = points[:, 3].astype(np.uint32) rgb = np.zeros((np.shape(points)[0], 3)) xyz = points[:, 0:3] for index, rgb_value in enumerate(rgb_values): rgb[index, 0] = (rgb_value >> 16) & 0xFF rgb[index, 1] = (rgb_value >> 8) & 0xFF rgb[index, 2] = rgb_value & 0xFF","title":"GPU LIDAR"},{"location":"gpulidar/#how-to-use-gpu-lidar-in-cosys-airsim","text":"Cosys-AirSim supports a GPU accelerated Lidar for multirotors and cars. It uses a depth camera that rotates around to simulate a Lidar while exploiting the GPU to do most of the work. This should allow for a large increase in the amount of points that can be simulated. The enablement of a GPU lidar and the other lidar settings can be configured via AirSimSettings json. Please see general sensors for information on configuration of general/shared sensor settings. Note that this sensor type is currently not supported for Multirotor mode. It only works for Car and ComputerVision mode.","title":"How to Use GPU Lidar in Cosys-AirSim"},{"location":"gpulidar/#enabling-gpu-lidar-on-a-vehicle","text":"By default, GPU lidars are not enabled. To enable the sensor, set the SensorType and Enabled attributes in settings json. \"GPULidar1\": { \"SensorType\": 8, \"Enabled\" : true, Multiple GPU lidars can be enabled on a vehicle, but one has to turn off DrawDebugPoints!","title":"Enabling GPU lidar on a vehicle"},{"location":"gpulidar/#ignoring-glass-and-other-material-types","text":"One can set an object that should be invisible to LIDAR sensors (such as glass) by giving it an Unreal Tag called LidarIgnore.","title":"Ignoring glass and other material types"},{"location":"gpulidar/#gpu-lidar-configuration","text":"The following parameters can be configured right now via settings json. For some more information check the publication on this topic here. Parameter Description NumberOfChannels Number of channels/lasers of the lidar. When set to 1 it will act as a 2D horizontal LiDAR and will use the VerticalFOVUpper value as the vertical angle to scan. Range Range, in meters MeasurementsPerCycle Amount of measurements in one full cycle (horizontal resolution) RotationsPerSecond Rotations per second Resolution Defines the resolution of the depth camera image that generates the Lidar point cloud HorizontalFOVStart Horizontal FOV start for the lidar, in degrees HorizontalFOVEnd Horizontal FOV end for the lidar, in degrees VerticalFOVUpper Vertical FOV upper limit for the lidar, in degrees VerticalFOVLower Vertical FOV lower limit for the lidar, in degrees X Y Z Position of the lidar relative to the vehicle (in NED, in meters) Roll Pitch Yaw Orientation of the lidar relative to the vehicle (in degrees, yaw-pitch-roll order to front vector +X) IgnoreMarked Remove objects with the Unreal Tag MarkedIgnore from the sensor data GroundTruth Generate ground truth labeling color values InstanceSegmentation Enable to set the generated ground truth to the instance segmentation labeling.
Set to false to choose a different annotation label Annotation If GroundTruth is enabled and InstanceSegmentation is disabled, you can set this value to the name of the annotation you want to use. This will be used for the ground truth color labels. DrawSensor Draw the physical sensor in the world on the vehicle with 3D axes shown where the sensor is External Uncouple the sensor from the vehicle. If enabled, the position and orientation will be relative to Unreal world coordinates in NED format from the settings file. ExternalLocal When in external mode, if this is enabled the retrieved pose of the sensor will be in Local NED coordinates (from the starting position of the vehicle) and not the converted Unreal NED coordinates, which is the default GenerateIntensity Toggle intensity calculation on or off. This requires a surface material map to be available. See below for more information. rangeMaxLambertianPercentage Lambertian reflectivity percentage to max out on. Reflectivity acts linearly towards 0% below this value. rainMaxIntensity Rain intensity maximum to scale from in mm/hour. rainConstantA Constant A used to calculate the extinction coefficient in rain rainConstantB Constant B used to calculate the extinction coefficient in rain GenerateNoise Generate and add range-noise based on a normal distribution if set to true MinNoiseStandardDeviation The standard deviation to generate the noise normal distribution, in meters. This is the minimal noise (at 0 distance) NoiseDistanceScale To scale the noise with distance, set this parameter. This way the minimal noise is scaled depending on the distance compared to the total maximum range of the sensor { \"SeeDocsAt\": \"https://cosys-lab.github.io/settings/\", \"SettingsVersion\": 2.0, \"SimMode\": \"SkidVehicle\", \"Vehicles\": { \"airsimvehicle\": { \"VehicleType\": \"CPHusky\", \"AutoCreate\": true, \"Sensors\": { \"gpulidar1\": { \"SensorType\": 8, \"Enabled\" : true, \"External\": false, \"NumberOfChannels\": 32, \"Range\": 50, \"Resolution\": 1024, \"RotationsPerSecond\": 10, \"MeasurementsPerCycle\": 512, \"X\": 0, \"Y\": 0, \"Z\": -0.3, \"Roll\": 0, \"Pitch\": 0, \"Yaw\" : 0, \"VerticalFOVUpper\": 20, \"VerticalFOVLower\": -20, \"HorizontalFOVStart\": 0, \"HorizontalFOVEnd\": 360, \"DrawDebugPoints\": true, \"DrawMode\": 1, \"IgnoreMarked\": true, \"GroundTruth\": true, \"InstanceSegmentation\": true, \"Annotation\": \"\", \"GenerateIntensity\": false, \"rangeMaxLambertianPercentage\": 80, \"rainMaxIntensity\": 70, \"rainConstantA\": 0.01, \"rainConstantB\": 0.6, \"DrawSensor\": false } } } } }","title":"GPU Lidar configuration"},{"location":"gpulidar/#intensity-surface-material-map","text":"If 'GenerateIntensity' is enabled in the settings json, a surface material map is required. This map is used to calculate the intensity of the lidar points. e.g.: wood,0.9 aluminium,0.5 concrete,0.3 asphalt,0.1 This needs to be saved as 'materials.csv' in your Documents folder, where your settings json file also resides.","title":"Intensity Surface Material map"},{"location":"gpulidar/#server-side-visualization-for-debugging","text":"By default, the lidar points are not drawn on the viewport. To enable the drawing of hit laser points on the viewport, please enable setting 'DrawDebugPoints' via settings json. This is only for testing purposes and will affect the data slightly. It also needs to be disabled when using multiple Lidar sensors to avoid artifacts! e.g.: \"Lidar1\": { ...
\"DrawDebugPoints\": true }, You can also tweak the variation of debugging with the 'DrawMode' parameter: - 0 = no coloring - 1 = groundtruth color labels (instance segmentation or other annotation labels depending on settings) - 2 = material - 3 = impact angle - 4 = intensity e.g.: \"Lidar1\": { ... \"DrawDebugPoints\": true, \"DrawMode\": 4 },","title":"Server side visualization for debugging"},{"location":"gpulidar/#client-api","text":"Use getGPULidarData(sensor name, vehicle name) API to retrieve the GPU Lidar data. The API returns a Point-Cloud as a flat array of floats along with the timestamp of the capture and lidar pose. Point-Cloud: The floats represent [x,y,z, rgb, intensity] coordinate for each point hit within the range in the last scan in NED format. Lidar Pose: Default: sensor pose in the vehicle frame / External: If set to External (see table) the coordinates will be in either Unreal NED when ExternalLocal is false or Local NED (from starting position from vehicle) when ExternalLocal is true . Rgb represents a float32 representation of the RGB8 value that is linked either the instance segmentation system or a different annotation label. See the Image API documentation , Annotation documentation and the instance segmentation documentation . The float32 comes from binary concatenation of the RGB8 values : rgb = value_segmentation.R << 16 | value_segmentation.G << 8 | value_segmentation.B \\ It can be retrieved from the API and converted back to RGB8 with for example the following Python code: lidar_data = client.getGPULidarData('lidar', 'vehicle') points = np.array(lidar_data.point_cloud, dtype=np.dtype('f4')) points = np.reshape(points, (int(points.shape[0] / 5), 5)) rgb_values = points[:, 3].astype(np.uint32) rgb = np.zeros((np.shape(points)[0], 3)) xyz = points[:, 0:3] for index, rgb_value in enumerate(rgb_values): rgb[index, 0] = (rgb_value >> 16) & 0xFF rgb[index, 1] = (rgb_value >> 8) & 0xFF rgb[index, 2] = rgb_value & 0xFF","title":"Client API"},{"location":"image_apis/","text":"Image APIs Please read general API doc first if you are not familiar with AirSim APIs. Getting a Single Image Here's a sample code to get a single image from camera named \"0\". The returned value is bytes of png format image. To get uncompressed and other format as well as available cameras please see next sections. Python import cosysairsim as airsim # for car use CarClient() client = airsim.MultirotorClient() png_image = client.simGetImage(\"0\", airsim.ImageType.Scene) # do something with image C++ #include \"vehicles/multirotor/api/MultirotorRpcLibClient.hpp\" int getOneImage() { using namespace msr::airlib; // for car use CarRpcLibClient MultirotorRpcLibClient client; std::vector png_image = client.simGetImage(\"0\", VehicleCameraBase::ImageType::Scene); // do something with images } Getting Images with More Flexibility The simGetImages API which is slightly more complex to use than simGetImage API, for example, you can get left camera view, right camera view and depth image from left camera in a single API call. The simGetImages API also allows you to get uncompressed images as well as floating point single channel images (instead of 3 channel (RGB), each 8 bit). 
Python import cosysairsim as airsim # for car use CarClient() client = airsim.MultirotorClient() responses = client.simGetImages([ # png format airsim.ImageRequest(0, airsim.ImageType.Scene), # uncompressed RGB array bytes airsim.ImageRequest(1, airsim.ImageType.Scene, False, False), # floating point uncompressed image airsim.ImageRequest(1, airsim.ImageType.DepthPlanar, True)]) # do something with response which contains image data, pose, timestamp etc Using AirSim Images with NumPy If you plan to use numpy for image manipulation, you should get an uncompressed RGB image and then convert to numpy like this: responses = client.simGetImages([airsim.ImageRequest(\"0\", airsim.ImageType.Scene, False, False)]) response = responses[0] # get numpy array img1d = np.frombuffer(response.image_data_uint8, dtype=np.uint8) # reshape array to 3 channel image array H X W X 3 img_rgb = img1d.reshape(response.height, response.width, 3) # original image is flipped vertically img_rgb = np.flipud(img_rgb) # write to png airsim.write_png(os.path.normpath(filename + '.png'), img_rgb) Quick Tips The API simGetImage returns a binary string literal which means you can simply dump it in a binary file to create a .png file. However, if you want to process it in any other way, you can use the handy function airsim.string_to_uint8_array. This converts the binary string literal to a NumPy uint8 array. The API simGetImages can accept requests for multiple image types from any cameras in a single call. You can specify if the image is png compressed, RGB uncompressed or a float array. For png compressed images, you get a binary string literal. For a float array you get a Python list of float64 values. You can convert this float array to a NumPy 2D array using airsim.list_to_2d_float_array(response.image_data_float, response.width, response.height) You can also save the float array to a .pfm file (Portable Float Map format) using the airsim.write_pfm() function. If you are looking to query position and orientation information in sync with a call to one of the image APIs, you can use client.simPause(True) and client.simPause(False) to pause the simulation while calling the image API and querying the desired physics state, ensuring that the physics state remains the same immediately after the image API call. C++ int getStereoAndDepthImages() { using namespace msr::airlib; typedef VehicleCameraBase::ImageRequest ImageRequest; typedef VehicleCameraBase::ImageResponse ImageResponse; typedef VehicleCameraBase::ImageType ImageType; // for car use // CarRpcLibClient client; MultirotorRpcLibClient client; // get right, left and depth images. First as png, second as uncompressed RGB, third as float. std::vector<ImageRequest> request = { //png format ImageRequest(\"0\", ImageType::Scene), //uncompressed RGB array bytes ImageRequest(\"1\", ImageType::Scene, false, false), //floating point uncompressed image ImageRequest(\"1\", ImageType::DepthPlanar, true) }; const std::vector<ImageResponse>& response = client.simGetImages(request); // do something with response which contains image data, pose, timestamp etc } Ready to Run Complete Examples Python C++ For more complete, ready-to-run sample code please see the HelloDrone project for multirotors or the HelloCar project. See also other example code that generates a specified number of stereo images along with ground truth depth and disparity and saves them to pfm format. Available Cameras These are the default cameras already available in each vehicle.
Apart from these, you can add more cameras to the vehicles or add cameras that are not attached to any vehicle by setting them as external . Car The cameras on the car can be accessed by the following names in API calls: front_center , front_right , front_left , fpv and back_center . Here the FPV camera is at the driver's head position in the car. Multirotor The cameras on the drone can be accessed by the following names in API calls: front_center , front_right , front_left , bottom_center and back_center . Computer Vision Mode Camera names are the same as in the multirotor. Backward compatibility for camera names Before AirSim v1.2, cameras were accessed using ID numbers instead of names. For backward compatibility you can still use the following ID numbers for the above camera names, in the same order as above: \"0\" , \"1\" , \"2\" , \"3\" , \"4\" . In addition, the camera name \"\" is also available to access the default camera, which is generally the camera \"0\" . \"Computer Vision\" Mode You can use AirSim in the so-called \"Computer Vision\" mode. In this mode, the physics engine is disabled and there is no vehicle, just cameras (if you want to have the vehicle but without its kinematics, you can use the Multirotor mode with the physics engine set to ExternalPhysicsEngine ). It has a standard set of cameras and can have any sensor added, similar to other vehicles. You can move around using the keyboard (use F1 to see help on keys; additionally, use left shift to go faster and spacebar to hold in place, handy for when moving the camera manually). You can press the Record button to continuously generate images, or you can call APIs to move cameras around and take images. To activate this mode, edit the settings.json file that you can find in your Documents\AirSim folder (or ~/Documents/AirSim on Linux) and make sure the following values exist at the root level: { \"SettingsVersion\": 2.0, \"SimMode\": \"ComputerVision\" } This mode was inspired by the UnrealCV project . Setting Pose in Computer Vision Mode To move around the environment using APIs you can use the simSetVehiclePose API. This API takes a position and orientation and sets that on the invisible vehicle where the front-center camera is located. All the other cameras move along with it, keeping their relative positions. If you don't want to change the position (or orientation) then just set the components of the position (or orientation) to floating point nan values. The simGetVehiclePose allows you to retrieve the current pose. You can also use simGetGroundTruthKinematics to get the kinematics quantities for the movement. Many other non-vehicle specific APIs are also available, such as segmentation APIs, collision APIs and camera APIs. Camera APIs The simGetCameraInfo returns the FOV (in degrees) and projection matrix of a camera, as well as the pose, which can be: Default: The pose of the camera in the vehicle frame. External: If set to External the coordinates will be in either Unreal NED when ExternalLocal is false or Local NED (from the starting position of the vehicle) when ExternalLocal is true . Note that if MoveWorldOrigin in the settings.json is set to true the Unreal coordinates will be moved to have the same origin as the player start location, and as such this may affect where the sensor will spawn and which coordinates are returned when ExternalLocal is false .
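As a short illustrative sketch (assuming a connected Python client and the default front_center camera), the returned info can be inspected like this: camera_info = client.simGetCameraInfo('front_center') # query the camera info print(camera_info.fov) # field of view in degrees print(camera_info.proj_mat) # projection matrix print(camera_info.pose) # pose following the Default/External convention described above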
The simSetCameraPose sets the pose for the specified camera while taking an input pose as a combination of relative position and a quaternion in NED frame. The handy airsim.to_quaternion() function allows you to convert pitch, roll, yaw to a quaternion. For example, to set camera-0 to a 15-degree pitch while maintaining the same position, you can use: camera_pose = airsim.Pose(airsim.Vector3r(0, 0, 0), airsim.to_quaternion(0.261799, 0, 0)) # PRY in radians client.simSetCameraPose(0, camera_pose) simSetCameraFov allows changing the Field-of-View of the camera at runtime. simSetDistortionParams , simGetDistortionParams allow setting and fetching the distortion parameters K1, K2, K3, P1, P2. All Camera APIs take in two common parameters apart from the API-specific ones: camera_name (str) and vehicle_name (str). Camera and vehicle name are used to get the specific camera on the specific vehicle. Gimbal You can set stabilization for pitch, roll or yaw for any camera using settings . Changing Resolution and Camera Parameters To change resolution, FOV, etc., you can use settings.json . For example, the addition below in settings.json sets parameters for scene capture and uses the \"Computer Vision\" mode described above. If you omit any setting then the default values below will be used. For more information see the settings doc . If you are using a stereo camera, the distance between left and right is currently fixed at 25 cm. { \"SettingsVersion\": 2.0, \"CameraDefaults\": { \"CaptureSettings\": [ { \"ImageType\": 0, \"Width\": 256, \"Height\": 144, \"FOV_Degrees\": 90, \"AutoExposureBias\": 1.3, \"AutoExposureMaxBrightness\": 0.64, \"AutoExposureMinBrightness\": 0.03, \"MotionBlurAmount\": 1, \"MotionBlurMax\": 10, \"ChromaticAberrationScale\": 2, \"LumenGIEnable\": true, \"LumenReflectionEnable\": true, \"LumenFinalQuality\": 1, \"LumenSceneDetail\": 1, \"LumenSceneLightningDetail\": 1 } ] }, \"SimMode\": \"ComputerVision\" } What Do Pixel Values Mean in Different Image Types? Available ImageType Values Scene = 0, DepthPlanar = 1, DepthPerspective = 2, DepthVis = 3, DisparityNormalized = 4, Segmentation = 5, SurfaceNormals = 6, Infrared = 7, OpticalFlow = 8, OpticalFlowVis = 9, Annotation = 10 DepthPlanar and DepthPerspective You normally want to retrieve the depth image as float (i.e. set pixels_as_float = true ) and specify ImageType = DepthPlanar or ImageType = DepthPerspective in ImageRequest . For ImageType = DepthPlanar , you get depth in the camera plane, i.e., all points that are plane-parallel to the camera have the same depth. For ImageType = DepthPerspective , you get depth from the camera using a projection ray that hits that pixel. Depending on your use case, planar depth or perspective depth may be the ground truth image that you want. For example, you may be able to feed perspective depth to a ROS package such as depth_image_proc to generate a point cloud. Or planar depth may be more compatible with the estimated depth images generated by stereo algorithms such as SGM. DepthVis When you specify ImageType = DepthVis in ImageRequest , you get an image that helps depth visualization. In this case, each pixel value is interpolated from black to white depending on depth in the camera plane, in meters. Pure white pixels mean a depth of 100 m or more, while pure black means a depth of 0 meters. DisparityNormalized You normally want to retrieve the disparity image as float (i.e.
set pixels_as_float = true and specify ImageType = DisparityNormalized in ImageRequest ) in which case each pixel is (Xl - Xr)/Xmax , which is thereby normalized to values between 0 and 1. Segmentation When you specify ImageType = Segmentation in ImageRequest , you get an image that gives you the ground truth instance segmentation of the scene. At startup, AirSim assigns a random color index to each mesh available in the environment. The RGB values for each color index ID can be retrieved from the API. You can assign a specific value to a specific mesh using APIs. For example, the Python code below sets the object ID for the mesh called \"Ground\" to 20 in the Blocks environment and hence changes its color in the Segmentation view to the 20th color of the instance segmentation colormap: Note that this will not check whether this color is already assigned to a different object! success = client.simSetSegmentationObjectID(\"Ground\", 20) The return value is a boolean type that lets you know if the mesh was found. Notice that typical Unreal environments, like Blocks, usually have many other meshes that comprise the same object, for example, \"Ground_2\", \"Ground_3\" and so on. As it is tedious to set the object ID for all of these meshes, AirSim also supports regular expressions. For example, the code below sets all meshes which have names starting with \"ground\" (ignoring case) to 21 with just one line: success = client.simSetSegmentationObjectID(\"ground[\\w]*\", 21, True) The return value is true if at least one mesh was found using regular expression matching. When wanting to retrieve the segmentation image through the API, it is recommended that you request an uncompressed image to ensure you get precise RGB values for the segmentation image: responses = client.simGetImages([airsim.ImageRequest( \"front_center\", airsim.ImageType.Segmentation, False, False)]) img_rgb_string = responses[0].image_data_uint8 rgbarray = np.frombuffer(img_rgb_string, np.uint8) rgbarray_shaped = rgbarray.reshape((540,960,3)) img = Image.fromarray(rgbarray_shaped, 'RGB') img.show() To retrieve the color map and know which color is assigned to each color index you can use: colorMap = client.simGetSegmentationColorMap() An example can be found in segmentation_test.py (Cosys-AirSim/PythonClient/segmentation/segmentation_test.py). For a script that generates a full list of objects and their associated color, please see the script segmentation_generate_list.py (Cosys-AirSim/PythonClient/segmentation/segmentation_generate_list.py). How to Find Mesh names? To get the desired ground truth segmentation you will need to know the names of the meshes in your Unreal environment. To do this, you can use the API: currentObjectList = client.simListInstanceSegmentationObjects() This will use an understandable naming depending on the hierarchy the object belongs to in the Unreal World (example box_2_fullpalletspawner_5_pallet_4 or door_window_door_38 ). Note that this provides a different result from simListSceneObjects() , as that one will make a simple list of all Unreal Actors in the scene, without keeping the hierarchy in mind. An extension to simListInstanceSegmentationObjects() is simListInstanceSegmentationPoses(ned=True, only_visible=True) which will retrieve the 3D object pose of each element in the same order as the first-mentioned function. only_visible allows you to only get the objects that are physically visible in the scene. Once you decide on the meshes you are interested in, note down their names and use the above API to set their object IDs.
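As a small illustrative sketch (assuming a connected client and that the colormap rows line up in order with the color index IDs used by the instance segmentation system), you could pair each listed mesh name with a color from the colormap: object_list = client.simListInstanceSegmentationObjects() color_map = client.simGetSegmentationColorMap() for object_index, object_name in enumerate(object_list): print(object_name, color_map[object_index, :]) # mesh name and its [R, G, B] color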
Changing Colors for Object IDs At present the color for each object ID is fixed as in this palette . We will be adding the ability to change colors for object IDs to desired values shortly. In the meantime you can open the segmentation image in your favorite image editor and get the RGB values you are interested in. Startup Object IDs At the start, AirSim assigns color indexes to each object found in the environment of type UStaticMeshComponent or USkinnedMeshComponent . It then makes an understandable naming depending on the hierarchy the object belongs to in the Unreal World (example box_2_fullpalletspawner_5_pallet_4 or door_window_door_38 ). Getting Object ID for Mesh The simGetSegmentationObjectID API allows you to get the object ID for a given mesh name. More information Please see the instance segmentation documentation for some more information on the segmentation system created by Cosys-Lab. Infrared Currently, this is just a map from object ID to grey scale 0-255. So any mesh with object ID 42 shows up with color (42, 42, 42). Please see the segmentation section for more details on how to set object IDs. Typically, the noise setting can be applied for this image type to get a slightly more realistic effect. We are still working on adding other infrared artifacts and any contributions are welcome. OpticalFlow and OpticalFlowVis These image types return information about motion perceived from the point of view of the camera. OpticalFlow returns a 2-channel image where the channels correspond to vx and vy respectively. OpticalFlowVis is similar to OpticalFlow but converts flow data to RGB for a more 'visual' output. Object Detection This feature lets you generate object detection using existing cameras in AirSim; find more info here . Annotation The annotation system allows you to choose different groundtruth labeling techniques to create more data from your simulation. Find more info here . When enabling annotation layers, one can choose to render images from these layers as well. If the image type is set to Annotation you usually also need to supply the name of the annotation layer as defined in the settings. For example with Python, you can use the following examples for RGB and greyscale annotation layers. responses = client.simGetImages([airsim.ImageRequest( \"front_center\", airsim.ImageType.Annotation, False, False, \"RGBTest\")]) img_rgb_string = responses[0].image_data_uint8 rgbarray = np.frombuffer(img_rgb_string, np.uint8) rgbarray_shaped = rgbarray.reshape((540,960,3)) img = Image.fromarray(rgbarray_shaped, 'RGB') img.show() responses = client.simGetImages([airsim.ImageRequest( \"front_center\", airsim.ImageType.Annotation, False, False, \"GreyscaleTest\")]) img_rgb_string = responses[0].image_data_uint8 rgbarray = np.frombuffer(img_rgb_string, np.uint8) rgbarray_shaped = rgbarray.reshape((540,960,3)) greyscale_values = np.divide(rgbarray_shaped[:,:,0], 255) img = Image.fromarray(rgbarray_shaped[:,:,0]) img.show() Lumen Lighting for Scene camera Unreal 5 introduces Lumen lighting. Because the cameras use scene capture components, enabling Lumen for them can be costly in terms of performance. Settings have been added specifically for the scene camera to customize the usage of Lumen for Global Illumination and Reflections. The LumenGIEnable and LumenReflectionEnable settings enable or disable Lumen for the camera. The LumenFinalQuality (0.25-2) setting determines the quality of the final image. The LumenSceneDetail (0.25-4) setting determines the quality of the scene.
The LumenSceneLightningDetail (0.25-2) setting determines the quality of the lighting in the scene.","title":"Image APIs"},{"location":"image_apis/#image-apis","text":"Please read the general API doc first if you are not familiar with AirSim APIs.","title":"Image APIs"},{"location":"image_apis/#getting-a-single-image","text":"Here's sample code to get a single image from the camera named \"0\". The returned value is the bytes of a png format image. To get uncompressed and other formats, as well as the available cameras, please see the next sections.","title":"Getting a Single Image"},{"location":"image_apis/#python","text":"import cosysairsim as airsim # for car use CarClient() client = airsim.MultirotorClient() png_image = client.simGetImage(\"0\", airsim.ImageType.Scene) # do something with image","title":"Python"},{"location":"image_apis/#c","text":"#include \"vehicles/multirotor/api/MultirotorRpcLibClient.hpp\" int getOneImage() { using namespace msr::airlib; // for car use CarRpcLibClient MultirotorRpcLibClient client; std::vector<uint8_t> png_image = client.simGetImage(\"0\", VehicleCameraBase::ImageType::Scene); // do something with images }","title":"C++"},{"location":"image_apis/#getting-images-with-more-flexibility","text":"The simGetImages API is slightly more complex to use than the simGetImage API; for example, you can get the left camera view, right camera view and the depth image from the left camera in a single API call. The simGetImages API also allows you to get uncompressed images as well as floating point single channel images (instead of 3 channel (RGB), each 8 bit).","title":"Getting Images with More Flexibility"},{"location":"image_apis/#python_1","text":"import cosysairsim as airsim # for car use CarClient() client = airsim.MultirotorClient() responses = client.simGetImages([ # png format airsim.ImageRequest(0, airsim.ImageType.Scene), # uncompressed RGB array bytes airsim.ImageRequest(1, airsim.ImageType.Scene, False, False), # floating point uncompressed image airsim.ImageRequest(1, airsim.ImageType.DepthPlanar, True)]) # do something with response which contains image data, pose, timestamp etc","title":"Python"},{"location":"image_apis/#using-airsim-images-with-numpy","text":"If you plan to use numpy for image manipulation, you should get an uncompressed RGB image and then convert it to numpy like this: responses = client.simGetImages([airsim.ImageRequest(\"0\", airsim.ImageType.Scene, False, False)]) response = responses[0] # get numpy array img1d = np.frombuffer(response.image_data_uint8, dtype=np.uint8) # reshape array to 3 channel image array H X W X 3 img_rgb = img1d.reshape(response.height, response.width, 3) # original image is flipped vertically img_rgb = np.flipud(img_rgb) # write to png airsim.write_png(os.path.normpath(filename + '.png'), img_rgb)","title":"Using AirSim Images with NumPy"},{"location":"image_apis/#quick-tips","text":"The API simGetImage returns a binary string literal which means you can simply dump it in a binary file to create a .png file. However, if you want to process it in any other way, you can use the handy function airsim.string_to_uint8_array . This converts a binary string literal to a NumPy uint8 array. The API simGetImages can accept requests for multiple image types from any cameras in a single call. You can specify whether an image is png compressed, RGB uncompressed or a float array. For png compressed images, you get a binary string literal . For a float array you get a Python list of float64.
You can convert this float array to a NumPy 2D array using airsim.list_to_2d_float_array(response.image_data_float, response.width, response.height) You can also save the float array to a .pfm file (Portable Float Map format) using the airsim.write_pfm() function. If you are looking to query position and orientation information in sync with a call to one of the image APIs, you can use client.simPause(True) and client.simPause(False) to pause the simulation while calling the image API and querying the desired physics state, ensuring that the physics state remains the same immediately after the image API call.","title":"Quick Tips"},{"location":"image_apis/#c_1","text":"int getStereoAndDepthImages() { using namespace msr::airlib; typedef VehicleCameraBase::ImageRequest ImageRequest; typedef VehicleCameraBase::ImageResponse ImageResponse; typedef VehicleCameraBase::ImageType ImageType; // for car use // CarRpcLibClient client; MultirotorRpcLibClient client; // get right, left and depth images. First two as png, third as floating point. std::vector<ImageRequest> request = { //png format ImageRequest(\"0\", ImageType::Scene), //uncompressed RGB array bytes ImageRequest(\"1\", ImageType::Scene, false, false), //floating point uncompressed image ImageRequest(\"1\", ImageType::DepthPlanar, true) }; const std::vector<ImageResponse>& response = client.simGetImages(request); // do something with response which contains image data, pose, timestamp etc }","title":"C++"},{"location":"image_apis/#ready-to-run-complete-examples","text":"","title":"Ready to Run Complete Examples"},{"location":"image_apis/#python_2","text":"","title":"Python"},{"location":"image_apis/#c_2","text":"For more complete, ready-to-run sample code please see the HelloDrone project for multirotors or the HelloCar project . See also other example code that generates a specified number of stereo images along with ground truth depth and disparity and saves them to pfm format .","title":"C++"},{"location":"image_apis/#available-cameras","text":"These are the default cameras already available in each vehicle. Apart from these, you can add more cameras to the vehicles or add cameras that are not attached to any vehicle by setting them as external .","title":"Available Cameras"},{"location":"image_apis/#car","text":"The cameras on the car can be accessed by the following names in API calls: front_center , front_right , front_left , fpv and back_center . Here the FPV camera is at the driver's head position in the car.","title":"Car"},{"location":"image_apis/#multirotor","text":"The cameras on the drone can be accessed by the following names in API calls: front_center , front_right , front_left , bottom_center and back_center .","title":"Multirotor"},{"location":"image_apis/#computer-vision-mode","text":"Camera names are the same as in the multirotor.","title":"Computer Vision Mode"},{"location":"image_apis/#backward-compatibility-for-camera-names","text":"Before AirSim v1.2, cameras were accessed using ID numbers instead of names. For backward compatibility you can still use the following ID numbers for the above camera names, in the same order as above: \"0\" , \"1\" , \"2\" , \"3\" , \"4\" . In addition, the camera name \"\" is also available to access the default camera, which is generally the camera \"0\" .","title":"Backward compatibility for camera names"},{"location":"image_apis/#computer-vision-mode_1","text":"You can use AirSim in the so-called \"Computer Vision\" mode. In this mode, the physics engine is disabled and there is no vehicle, just cameras (if you want to have the vehicle but without its kinematics, you can use the Multirotor mode with the physics engine set to ExternalPhysicsEngine ). It has a standard set of cameras and can have any sensor added, similar to other vehicles. You can move around using the keyboard (use F1 to see help on keys; additionally, use left shift to go faster and spacebar to hold in place, handy for when moving the camera manually). You can press the Record button to continuously generate images, or you can call APIs to move cameras around and take images. To activate this mode, edit the settings.json file that you can find in your Documents\AirSim folder (or ~/Documents/AirSim on Linux) and make sure the following values exist at the root level: { \"SettingsVersion\": 2.0, \"SimMode\": \"ComputerVision\" } This mode was inspired by the UnrealCV project .","title":"\"Computer Vision\" Mode"},{"location":"image_apis/#setting-pose-in-computer-vision-mode","text":"To move around the environment using APIs you can use the simSetVehiclePose API. This API takes a position and orientation and sets that on the invisible vehicle where the front-center camera is located. All the other cameras move along with it, keeping their relative positions. If you don't want to change the position (or orientation) then just set the components of the position (or orientation) to floating point nan values. The simGetVehiclePose allows you to retrieve the current pose. You can also use simGetGroundTruthKinematics to get the kinematics quantities for the movement. Many other non-vehicle specific APIs are also available, such as segmentation APIs, collision APIs and camera APIs.","title":"Setting Pose in Computer Vision Mode"},{"location":"image_apis/#camera-apis","text":"The simGetCameraInfo returns the FOV (in degrees) and projection matrix of a camera, as well as the pose, which can be: Default: The pose of the camera in the vehicle frame. External: If set to External the coordinates will be in either Unreal NED when ExternalLocal is false or Local NED (from the starting position of the vehicle) when ExternalLocal is true . Note that if MoveWorldOrigin in the settings.json is set to true the Unreal coordinates will be moved to have the same origin as the player start location, and as such this may affect where the sensor will spawn and which coordinates are returned when ExternalLocal is false . The simSetCameraPose sets the pose for the specified camera while taking an input pose as a combination of relative position and a quaternion in NED frame. The handy airsim.to_quaternion() function allows you to convert pitch, roll, yaw to a quaternion. For example, to set camera-0 to a 15-degree pitch while maintaining the same position, you can use: camera_pose = airsim.Pose(airsim.Vector3r(0, 0, 0), airsim.to_quaternion(0.261799, 0, 0)) # PRY in radians client.simSetCameraPose(0, camera_pose) simSetCameraFov allows changing the Field-of-View of the camera at runtime. simSetDistortionParams , simGetDistortionParams allow setting and fetching the distortion parameters K1, K2, K3, P1, P2. All Camera APIs take in two common parameters apart from the API-specific ones: camera_name (str) and vehicle_name (str).
Camera and vehicle name are used to get the specific camera on the specific vehicle.","title":"Camera APIs"},{"location":"image_apis/#gimbal","text":"You can set stabilization for pitch, roll or yaw for any camera using settings .","title":"Gimbal"},{"location":"image_apis/#changing-resolution-and-camera-parameters","text":"To change resolution, FOV, etc., you can use settings.json . For example, the addition below in settings.json sets parameters for scene capture and uses the \"Computer Vision\" mode described above. If you omit any setting then the default values below will be used. For more information see the settings doc . If you are using a stereo camera, the distance between left and right is currently fixed at 25 cm. { \"SettingsVersion\": 2.0, \"CameraDefaults\": { \"CaptureSettings\": [ { \"ImageType\": 0, \"Width\": 256, \"Height\": 144, \"FOV_Degrees\": 90, \"AutoExposureBias\": 1.3, \"AutoExposureMaxBrightness\": 0.64, \"AutoExposureMinBrightness\": 0.03, \"MotionBlurAmount\": 1, \"MotionBlurMax\": 10, \"ChromaticAberrationScale\": 2, \"LumenGIEnable\": true, \"LumenReflectionEnable\": true, \"LumenFinalQuality\": 1, \"LumenSceneDetail\": 1, \"LumenSceneLightningDetail\": 1 } ] }, \"SimMode\": \"ComputerVision\" }","title":"Changing Resolution and Camera Parameters"},{"location":"image_apis/#what-does-pixel-values-mean-in-different-image-types","text":"","title":"What Do Pixel Values Mean in Different Image Types?"},{"location":"image_apis/#available-imagetype-values","text":"Scene = 0, DepthPlanar = 1, DepthPerspective = 2, DepthVis = 3, DisparityNormalized = 4, Segmentation = 5, SurfaceNormals = 6, Infrared = 7, OpticalFlow = 8, OpticalFlowVis = 9, Annotation = 10","title":"Available ImageType Values"},{"location":"image_apis/#depthplanar-and-depthperspective","text":"You normally want to retrieve the depth image as float (i.e. set pixels_as_float = true ) and specify ImageType = DepthPlanar or ImageType = DepthPerspective in ImageRequest . For ImageType = DepthPlanar , you get depth in the camera plane, i.e., all points that are plane-parallel to the camera have the same depth. For ImageType = DepthPerspective , you get depth from the camera using a projection ray that hits that pixel. Depending on your use case, planar depth or perspective depth may be the ground truth image that you want. For example, you may be able to feed perspective depth to a ROS package such as depth_image_proc to generate a point cloud. Or planar depth may be more compatible with the estimated depth images generated by stereo algorithms such as SGM.","title":"DepthPlanar and DepthPerspective"},{"location":"image_apis/#depthvis","text":"When you specify ImageType = DepthVis in ImageRequest , you get an image that helps depth visualization. In this case, each pixel value is interpolated from black to white depending on depth in the camera plane, in meters. Pure white pixels mean a depth of 100 m or more, while pure black means a depth of 0 meters.","title":"DepthVis"},{"location":"image_apis/#disparitynormalized","text":"You normally want to retrieve the disparity image as float (i.e. set pixels_as_float = true and specify ImageType = DisparityNormalized in ImageRequest ) in which case each pixel is (Xl - Xr)/Xmax , which is thereby normalized to values between 0 and 1.","title":"DisparityNormalized"},{"location":"image_apis/#segmentation","text":"When you specify ImageType = Segmentation in ImageRequest , you get an image that gives you the ground truth instance segmentation of the scene.
At startup, AirSim assigns a random color index to each mesh available in the environment. The RGB values for each color index ID can be retrieved from the API. You can assign a specific value to a specific mesh using APIs. For example, the Python code below sets the object ID for the mesh called \"Ground\" to 20 in the Blocks environment and hence changes its color in the Segmentation view to the 20th color of the instance segmentation colormap: Note that this will not check whether this color is already assigned to a different object! success = client.simSetSegmentationObjectID(\"Ground\", 20) The return value is a boolean type that lets you know if the mesh was found. Notice that typical Unreal environments, like Blocks, usually have many other meshes that comprise the same object, for example, \"Ground_2\", \"Ground_3\" and so on. As it is tedious to set the object ID for all of these meshes, AirSim also supports regular expressions. For example, the code below sets all meshes which have names starting with \"ground\" (ignoring case) to 21 with just one line: success = client.simSetSegmentationObjectID(\"ground[\\w]*\", 21, True) The return value is true if at least one mesh was found using regular expression matching. When wanting to retrieve the segmentation image through the API, it is recommended that you request an uncompressed image to ensure you get precise RGB values for the segmentation image: responses = client.simGetImages([airsim.ImageRequest( \"front_center\", airsim.ImageType.Segmentation, False, False)]) img_rgb_string = responses[0].image_data_uint8 rgbarray = np.frombuffer(img_rgb_string, np.uint8) rgbarray_shaped = rgbarray.reshape((540,960,3)) img = Image.fromarray(rgbarray_shaped, 'RGB') img.show() To retrieve the color map and know which color is assigned to each color index you can use: colorMap = client.simGetSegmentationColorMap() An example can be found in segmentation_test.py (Cosys-AirSim/PythonClient/segmentation/segmentation_test.py). For a script that generates a full list of objects and their associated color, please see the script segmentation_generate_list.py (Cosys-AirSim/PythonClient/segmentation/segmentation_generate_list.py).","title":"Segmentation"},{"location":"image_apis/#how-to-find-mesh-names","text":"To get the desired ground truth segmentation you will need to know the names of the meshes in your Unreal environment. To do this, you can use the API: currentObjectList = client.simListInstanceSegmentationObjects() This will use an understandable naming depending on the hierarchy the object belongs to in the Unreal World (example box_2_fullpalletspawner_5_pallet_4 or door_window_door_38 ). Note that this provides a different result from simListSceneObjects() , as that one will make a simple list of all Unreal Actors in the scene, without keeping the hierarchy in mind. An extension to simListInstanceSegmentationObjects() is simListInstanceSegmentationPoses(ned=True, only_visible=True) which will retrieve the 3D object pose of each element in the same order as the first-mentioned function. only_visible allows you to only get the objects that are physically visible in the scene. Once you decide on the meshes you are interested in, note down their names and use the above API to set their object IDs.","title":"How to Find Mesh names?"},{"location":"image_apis/#changing-colors-for-object-ids","text":"At present the color for each object ID is fixed as in this palette . We will be adding the ability to change colors for object IDs to desired values shortly.
In the meantime you can open the segmentation image in your favorite image editor and get the RGB values you are interested in.","title":"Changing Colors for Object IDs"},{"location":"image_apis/#startup-object-ids","text":"At the start, AirSim assigns color indexes to each object found in the environment of type UStaticMeshComponent or USkinnedMeshComponent . It then makes an understandable naming depending on the hierarchy the object belongs to in the Unreal World (example box_2_fullpalletspawner_5_pallet_4 or door_window_door_38 ).","title":"Startup Object IDs"},{"location":"image_apis/#getting-object-id-for-mesh","text":"The simGetSegmentationObjectID API allows you to get the object ID for a given mesh name.","title":"Getting Object ID for Mesh"},{"location":"image_apis/#more-information","text":"Please see the instance segmentation documentation for some more information on the segmentation system created by Cosys-Lab.","title":"More information"},{"location":"image_apis/#infrared","text":"Currently, this is just a map from object ID to grey scale 0-255. So any mesh with object ID 42 shows up with color (42, 42, 42). Please see the segmentation section for more details on how to set object IDs. Typically, the noise setting can be applied for this image type to get a slightly more realistic effect. We are still working on adding other infrared artifacts and any contributions are welcome.","title":"Infrared"},{"location":"image_apis/#opticalflow-and-opticalflowvis","text":"These image types return information about motion perceived from the point of view of the camera. OpticalFlow returns a 2-channel image where the channels correspond to vx and vy respectively. OpticalFlowVis is similar to OpticalFlow but converts flow data to RGB for a more 'visual' output.","title":"OpticalFlow and OpticalFlowVis"},{"location":"image_apis/#object-detection","text":"This feature lets you generate object detection using existing cameras in AirSim; find more info here .","title":"Object Detection"},{"location":"image_apis/#annotation","text":"The annotation system allows you to choose different groundtruth labeling techniques to create more data from your simulation. Find more info here . When enabling annotation layers, one can choose to render images from these layers as well. If the image type is set to Annotation you usually also need to supply the name of the annotation layer as defined in the settings. For example with Python, you can use the following examples for RGB and greyscale annotation layers. responses = client.simGetImages([airsim.ImageRequest( \"front_center\", airsim.ImageType.Annotation, False, False, \"RGBTest\")]) img_rgb_string = responses[0].image_data_uint8 rgbarray = np.frombuffer(img_rgb_string, np.uint8) rgbarray_shaped = rgbarray.reshape((540,960,3)) img = Image.fromarray(rgbarray_shaped, 'RGB') img.show() responses = client.simGetImages([airsim.ImageRequest( \"front_center\", airsim.ImageType.Annotation, False, False, \"GreyscaleTest\")]) img_rgb_string = responses[0].image_data_uint8 rgbarray = np.frombuffer(img_rgb_string, np.uint8) rgbarray_shaped = rgbarray.reshape((540,960,3)) greyscale_values = np.divide(rgbarray_shaped[:,:,0], 255) img = Image.fromarray(rgbarray_shaped[:,:,0]) img.show()","title":"Annotation"},{"location":"image_apis/#lumen-lightning-for-scene-camera","text":"Unreal 5 introduces Lumen lighting. Because the cameras use scene capture components, enabling Lumen for them can be costly in terms of performance.
Settings have been added specifically for the scene camera to customize the usage of Lumen for Global Illumination and Reflections. The LumenGIEnable and LumenReflectionEnable settings enable or disable Lumen for the camera. The LumenFinalQuality (0.25-2) setting determines the quality of the final image. The LumenSceneDetail (0.25-4) setting determines the quality of the scene. The LumenSceneLightningDetail (0.25-2) setting determines the quality of the lighting in the scene.","title":"Lumen Lighting for Scene camera"},{"location":"install_linux/","text":"Install or Build Cosys-AirSim on Linux The current recommended and tested environment is Ubuntu 22.04 LTS . Theoretically, you can build on other distros as well, but we haven't tested it. Install Unreal Engine Download the latest version of Unreal Engine 5.4 from the official download page (https://www.unrealengine.com/en-US/linux). This will require an Epic Games account. Once the zip archive is downloaded you can extract it to where you want to install the Unreal Engine. unzip Linux_Unreal_Engine_5.4.X.zip -d destination_folder If you chose a folder such as /opt/UnrealEngine , make sure to provide permissions and to set the owner, otherwise you might run into issues: sudo chmod -R 777 /opt/UnrealEngine sudo chown -R yourusername /opt/UnrealEngine From where you installed Unreal Engine, you can run Engine/Binaries/Linux/UnrealEditor from the terminal to launch Unreal Engine. For more information you can read the quick start guide . You can alternatively install Unreal Engine from source if you do not use a Ubuntu distribution, see the documentation linked above for more information. Build Cosys-AirSim Clone Cosys-AirSim and build it: # go to the folder where you clone GitHub projects git clone https://github.com/Cosys-Lab/Cosys-AirSim.git cd Cosys-AirSim ./setup.sh ./build.sh Build Unreal Environment Finally, you will need an Unreal project that hosts the environment for your vehicles. Cosys-AirSim comes with a built-in \"Blocks Environment\" which you can use, or you can create your own. Please see setting up Unreal Environment if you'd like to set up your own environment. The other environments available often need additional asset packs to be downloaded first, read here for more information. How to Use Cosys-AirSim Once Cosys-AirSim is set up: - Navigate to the environment folder (for example, for Blocks it is Unreal\Environments\Blocks ), and run update_from_git.sh . - Go to the UnrealEngine installation folder and start Unreal by running ./Engine/Binaries/Linux/UnrealEditor . - When Unreal Engine prompts for opening or creating a project, select Browse and choose Cosys-AirSim/Unreal/Environments/Blocks (or your custom Unreal project). - Alternatively, the project file can be passed as a commandline argument. For Blocks: ./Engine/Binaries/Linux/UnrealEditor /Unreal/Environments/Blocks/Blocks.uproject - If you get prompts to convert the project, look for the More Options or Convert-In-Place option. If you get prompted to build, choose Yes. If you get prompted to disable the Cosys-AirSim plugin, choose No. - After the Unreal Editor loads, press the Play button. See Using APIs and settings.json for various options available for Cosys-AirSim usage. !!! tip Go to 'Edit->Editor Preferences', in the 'Search' box type 'CPU' and ensure that the 'Use Less CPU when in Background' is unchecked. [Optional] Setup Remote Control (Multirotor Only) A remote control is required if you want to fly manually.
Alternatively, you can use APIs for programmatic control or use the so-called Computer Vision mode to move around using the keyboard.","title":"Install from Source on Linux"},{"location":"install_linux/#intall-or-build-cosys-airsim-on-linux","text":"The current recommended and tested environment is Ubuntu 22.04 LTS . Theoretically, you can build on other distros as well, but we haven't tested it.","title":"Install or Build Cosys-AirSim on Linux"},{"location":"install_linux/#install-unreal-engine","text":"Download the latest version of Unreal Engine 5.4 from the official download page (https://www.unrealengine.com/en-US/linux). This will require an Epic Games account. Once the zip archive is downloaded you can extract it to where you want to install the Unreal Engine. unzip Linux_Unreal_Engine_5.4.X.zip -d destination_folder If you chose a folder such as /opt/UnrealEngine , make sure to provide permissions and to set the owner, otherwise you might run into issues: sudo chmod -R 777 /opt/UnrealEngine sudo chown -R yourusername /opt/UnrealEngine From where you installed Unreal Engine, you can run Engine/Binaries/Linux/UnrealEditor from the terminal to launch Unreal Engine. For more information you can read the quick start guide . You can alternatively install Unreal Engine from source if you do not use a Ubuntu distribution, see the documentation linked above for more information.","title":"Install Unreal Engine"},{"location":"install_linux/#build-cosys-airsim","text":"Clone Cosys-AirSim and build it: # go to the folder where you clone GitHub projects git clone https://github.com/Cosys-Lab/Cosys-AirSim.git cd Cosys-AirSim ./setup.sh ./build.sh","title":"Build Cosys-AirSim"},{"location":"install_linux/#build-unreal-environment","text":"Finally, you will need an Unreal project that hosts the environment for your vehicles. Cosys-AirSim comes with a built-in \"Blocks Environment\" which you can use, or you can create your own. Please see setting up Unreal Environment if you'd like to set up your own environment. The other environments available often need additional asset packs to be downloaded first, read here for more information.","title":"Build Unreal Environment"},{"location":"install_linux/#how-to-use-cosys-airsim","text":"Once Cosys-AirSim is set up: - Navigate to the environment folder (for example, for Blocks it is Unreal\Environments\Blocks ), and run update_from_git.sh . - Go to the UnrealEngine installation folder and start Unreal by running ./Engine/Binaries/Linux/UnrealEditor . - When Unreal Engine prompts for opening or creating a project, select Browse and choose Cosys-AirSim/Unreal/Environments/Blocks (or your custom Unreal project). - Alternatively, the project file can be passed as a commandline argument. For Blocks: ./Engine/Binaries/Linux/UnrealEditor /Unreal/Environments/Blocks/Blocks.uproject - If you get prompts to convert the project, look for the More Options or Convert-In-Place option. If you get prompted to build, choose Yes. If you get prompted to disable the Cosys-AirSim plugin, choose No. - After the Unreal Editor loads, press the Play button. See Using APIs and settings.json for various options available for Cosys-AirSim usage. !!! tip Go to 'Edit->Editor Preferences', in the 'Search' box type 'CPU' and ensure that the 'Use Less CPU when in Background' is unchecked.","title":"How to Use Cosys-AirSim"},{"location":"install_linux/#optional-setup-remote-control-multirotor-only","text":"A remote control is required if you want to fly manually.
See the remote control setup for more details. Alternatively, you can use APIs for programmatic control or use the so-called Computer Vision mode to move around using the keyboard.","title":"[Optional] Setup Remote Control (Multirotor Only)"},{"location":"install_precompiled/","text":"Download and install precompiled Plugin If you do not wish to build the plugin from source, you can download the precompiled plugin from the releases page for the right version of Unreal you are using. It does not come with an environment so you will need to create your own Unreal project. Follow this step-by-step guide . The releases page also comes with additional downloads and links to the several API implementations for ROS(2) and the Python and Matlab API clients for that specific version of the Cosys-AirSim plugin.","title":"Download and install from precompiled plugin"},{"location":"install_precompiled/#download-and-install-precompiled-plugin","text":"If you do not wish to build the plugin from source, you can download the precompiled plugin from the releases page for the right version of Unreal you are using. It does not come with an environment so you will need to create your own Unreal project. Follow this step-by-step guide . The releases page also comes with additional downloads and links to the several API implementations for ROS(2) and the Python and Matlab API clients for that specific version of the Cosys-AirSim plugin.","title":"Download and install precompiled Plugin"},{"location":"install_windows/","text":"Install or Build Cosys-AirSim on Windows Install Unreal Engine Download the Epic Games Launcher. While the Unreal Engine is open source and free to download, registration is still required. Run the Epic Games Launcher, open the Unreal Engine tab on the left pane. Click on the Install button on the top right, which should show the option to download Unreal Engine 5.4.X . Choose the install location to suit your needs, as shown in the images below. If you have multiple versions of Unreal installed then make sure the version you are using is set to current by clicking the down arrow next to the Launch button for the version. Build Cosys-AirSim Install Visual Studio 2022. Make sure to select Desktop Development with C++ and Windows 10/11 SDK 10.0.X (choose latest) and select the latest .NET Framework SDK under the 'Individual Components' tab while installing VS 2022. More info here . Start Developer Command Prompt for VS 2022 . Clone the repo: git clone https://github.com/Cosys-Lab/Cosys-AirSim.git , and go to the AirSim directory with cd Cosys-AirSim . Run build.cmd from the command line. This will create ready-to-use plugin bits in the Unreal\Plugins folder that can be dropped into any Unreal project. Build Unreal Project Finally, you will need an Unreal project that hosts the environment for your vehicles. Make sure to close and re-open the Unreal Engine and the Epic Games Launcher before building your first environment if you haven't done so already. After restarting the Epic Games Launcher it will ask you to associate project file extensions with Unreal Engine, click on 'fix now' to fix it. Cosys-AirSim comes with a built-in \"Blocks Environment\" which you can use, or you can create your own. Please see setting up Unreal Environment . Setup Remote Control (Multirotor only) A remote control is required if you want to fly manually. See the remote control setup for more details.
Alternatively, you can use APIs for programmatic control or use the so-called Computer Vision mode to move around using the keyboard. How to Use Cosys-AirSim Once Cosys-AirSim is set up by following the above steps, you can: 1. Navigate to the folder Unreal\Environments\Blocks and run update_from_git.bat . 2. Double click on the .sln file to load the Blocks project in Unreal\Environments\Blocks (or the .sln file in your own custom Unreal project). If you don't see the .sln file then you probably haven't completed the steps in the Build Unreal Project section above. 3. Select your Unreal project as the Start Up project (for example, the Blocks project) and make sure the Build config is set to \"Development Editor\" and x64. 4. After the Unreal Editor loads, press the Play button. !!! tip Go to 'Edit->Editor Preferences', in the 'Search' box type 'CPU' and ensure that the 'Use Less CPU when in Background' is unchecked. See Using APIs and settings.json for various options available. The other environments available often need additional asset packs to be downloaded first, read here for more information. FAQ I get an error Il \u2018P1\u2019, version \u2018X\u2019, does not match \u2018P2\u2019, version \u2018X\u2019 This is caused by multiple versions of Visual Studio installed on the machine. The build script of Cosys-AirSim will use the latest version it can find, so you need to make sure Unreal does the same. Open or create a file called BuildConfiguration.xml in C:\Users\USERNAME\AppData\Roaming\Unreal Engine\UnrealBuildTool and add the following: Latest I get error C100 : An internal error has occurred in the compiler when running build.cmd We have noticed this happening with VS version 15.9.0 and have checked-in a workaround in the Cosys-AirSim code. If you have this VS version, please make sure to pull the latest Cosys-AirSim code. I get error \"'corecrt.h': No such file or directory\" or \"Windows SDK version 8.1 not found\" Very likely you don't have the Windows SDK installed with Visual Studio. How do I use PX4 firmware with Cosys-AirSim? By default, Cosys-AirSim uses its own built-in firmware called simple_flight . There is no additional setup if you just want to go with it. If you want to switch to using PX4 instead then please see this guide . I made changes in Visual Studio but there is no effect Sometimes the Unreal + VS build system doesn't recompile if you make changes to only header files. To ensure a recompile, make some Unreal based cpp file \"dirty\" like AirSimGameMode.cpp. Unreal still uses VS2015 or I'm getting some link error Running several versions of VS can lead to issues when compiling UE projects. One problem that may arise is that UE will try to compile with an older version of VS which may or may not work. There are two settings in Unreal, one for the engine and one for the project, to adjust the version of VS to be used. 1. Edit -> Editor preferences -> General -> Source code 2. Edit -> Project Settings -> Platforms -> Windows -> Toolchain -> CompilerVersion In some cases, these settings will still not lead to the desired result and errors such as the following might be produced: LINK : fatal error LNK1181: cannot open input file 'ws2_32.lib' To resolve such issues the following procedure can be applied: 1. Uninstall all old versions of VS using the VisualStudioUninstaller 2. Repair/Install VS2017 3.
Restart the machine and install the Epic Games Launcher and the desired version of the engine.","title":"Install from Source on Windows"},{"location":"install_windows/#install-or-build-cosys-airsim-on-windows","text":"","title":"Install or Build Cosys-AirSim on Windows"},{"location":"install_windows/#install-unreal-engine","text":"Download the Epic Games Launcher. While the Unreal Engine is open source and free to download, registration is still required. Run the Epic Games Launcher, open the Unreal Engine tab on the left pane. Click on the Install button on the top right, which should show the option to download Unreal Engine 5.4.X . Choose the install location to suit your needs, as shown in the images below. If you have multiple versions of Unreal installed then make sure the version you are using is set to current by clicking the down arrow next to the Launch button for the version.","title":"Install Unreal Engine"},{"location":"install_windows/#build-cosys-airsim","text":"Install Visual Studio 2022. Make sure to select Desktop Development with C++ and Windows 10/11 SDK 10.0.X (choose latest) and select the latest .NET Framework SDK under the 'Individual Components' tab while installing VS 2022. More info here . Start Developer Command Prompt for VS 2022 . Clone the repo: git clone https://github.com/Cosys-Lab/Cosys-AirSim.git , and go to the AirSim directory with cd Cosys-AirSim . Run build.cmd from the command line. This will create ready-to-use plugin bits in the Unreal\Plugins folder that can be dropped into any Unreal project.","title":"Build Cosys-AirSim"},{"location":"install_windows/#build-unreal-project","text":"Finally, you will need an Unreal project that hosts the environment for your vehicles. Make sure to close and re-open the Unreal Engine and the Epic Games Launcher before building your first environment if you haven't done so already. After restarting the Epic Games Launcher it will ask you to associate project file extensions with Unreal Engine, click on 'fix now' to fix it. Cosys-AirSim comes with a built-in \"Blocks Environment\" which you can use, or you can create your own. Please see setting up Unreal Environment .","title":"Build Unreal Project"},{"location":"install_windows/#setup-remote-control-multirotor-only","text":"A remote control is required if you want to fly manually. See the remote control setup for more details. Alternatively, you can use APIs for programmatic control or use the so-called Computer Vision mode to move around using the keyboard.","title":"Setup Remote Control (Multirotor only)"},{"location":"install_windows/#how-to-use-cosys-airsim","text":"Once Cosys-AirSim is set up by following the above steps, you can: 1. Navigate to the folder Unreal\Environments\Blocks and run update_from_git.bat . 2. Double click on the .sln file to load the Blocks project in Unreal\Environments\Blocks (or the .sln file in your own custom Unreal project). If you don't see the .sln file then you probably haven't completed the steps in the Build Unreal Project section above. 3. Select your Unreal project as the Start Up project (for example, the Blocks project) and make sure the Build config is set to \"Development Editor\" and x64. 4. After the Unreal Editor loads, press the Play button. !!! tip Go to 'Edit->Editor Preferences', in the 'Search' box type 'CPU' and ensure that the 'Use Less CPU when in Background' is unchecked. See Using APIs and settings.json for various options available. The other environments available often need additional asset packs to be downloaded first, read here for more information.","title":"How to Use Cosys-AirSim"},{"location":"install_windows/#faq","text":"","title":"FAQ"},{"location":"install_windows/#i-get-an-error-il-p1-version-x-does-not-match-p2-version-x","text":"This is caused by multiple versions of Visual Studio installed on the machine. The build script of Cosys-AirSim will use the latest version it can find, so you need to make sure Unreal does the same. Open or create a file called BuildConfiguration.xml in C:\Users\USERNAME\AppData\Roaming\Unreal Engine\UnrealBuildTool and add the following: Latest ","title":"I get an error Il \u2018P1\u2019, version \u2018X\u2019, does not match \u2018P2\u2019, version \u2018X\u2019"},{"location":"install_windows/#i-get-error-c100-an-internal-error-has-occurred-in-the-compiler-when-running-buildcmd","text":"We have noticed this happening with VS version 15.9.0 and have checked-in a workaround in the Cosys-AirSim code. If you have this VS version, please make sure to pull the latest Cosys-AirSim code.","title":"I get error C100 : An internal error has occurred in the compiler when running build.cmd"},{"location":"install_windows/#i-get-error-corecrth-no-such-file-or-directory-or-windows-sdk-version-81-not-found","text":"Very likely you don't have the Windows SDK installed with Visual Studio.","title":"I get error \"'corecrt.h': No such file or directory\" or \"Windows SDK version 8.1 not found\""},{"location":"install_windows/#how-do-i-use-px4-firmware-with-cosys-airsim","text":"By default, Cosys-AirSim uses its own built-in firmware called simple_flight . There is no additional setup if you just want to go with it. If you want to switch to using PX4 instead then please see this guide .","title":"How do I use PX4 firmware with Cosys-AirSim?"},{"location":"install_windows/#i-made-changes-in-visual-studio-but-there-is-no-effect","text":"Sometimes the Unreal + VS build system doesn't recompile if you make changes to only header files. To ensure a recompile, make some Unreal based cpp file \"dirty\" like AirSimGameMode.cpp.","title":"I made changes in Visual Studio but there is no effect"},{"location":"install_windows/#unreal-still-uses-vs2015-or-im-getting-some-link-error","text":"Running several versions of VS can lead to issues when compiling UE projects. One problem that may arise is that UE will try to compile with an older version of VS which may or may not work. There are two settings in Unreal, one for the engine and one for the project, to adjust the version of VS to be used. 1. Edit -> Editor preferences -> General -> Source code 2. Edit -> Project Settings -> Platforms -> Windows -> Toolchain -> CompilerVersion In some cases, these settings will still not lead to the desired result and errors such as the following might be produced: LINK : fatal error LNK1181: cannot open input file 'ws2_32.lib' To resolve such issues the following procedure can be applied: 1. Uninstall all old versions of VS using the VisualStudioUninstaller 2. Repair/Install VS2017 3. Restart the machine and install the Epic Games Launcher and the desired version of the engine.","title":"Unreal still uses VS2015 or I'm getting some link error"},{"location":"instance_segmentation/","text":"Instance Segmentation in Cosys-AirSim An instance segmentation system is implemented in Cosys-AirSim. It uses proxy mesh rendering to allow each object in the world to get its own color.
The other environments available often need additional asset packs to be downloaded first, read here for more information.","title":"How to Use Cosys-AirSim"},{"location":"install_windows/#faq","text":"","title":"FAQ"},{"location":"install_windows/#i-get-an-error-il-p1-version-x-does-not-match-p2-version-x","text":"This is caused by multiple versions of Visual Studio installed on the machine. The build script of Cosys-AirSim will use the latest versions it can find so need to make Unreal does the same. Open or create a file called BuildConfiguration.xml in C:\\Users\\USERNAME\\AppData\\Roaming\\Unreal Engine\\UnrealBuildTool and add the following: Latest ","title":"I get an error Il \u2018P1\u2019, version \u2018X\u2019, does not match \u2018P2\u2019, version \u2018X\u2019"},{"location":"install_windows/#i-get-error-c100-an-internal-error-has-occurred-in-the-compiler-when-running-buildcmd","text":"We have noticed this happening with VS version 15.9.0 and have checked-in a workaround in Cosys-AirSim code. If you have this VS version, please make sure to pull the latest Cosys-AirSim code.","title":"I get error C100 : An internal error has occurred in the compiler when running build.cmd"},{"location":"install_windows/#i-get-error-corecrth-no-such-file-or-directory-or-windows-sdk-version-81-not-found","text":"Very likely you don't have Windows SDK installed with Visual Studio.","title":"I get error \"'corecrt.h': No such file or directory\" or \"Windows SDK version 8.1 not found\""},{"location":"install_windows/#how-do-i-use-px4-firmware-with-cosys-airsim","text":"By default, Cosys-AirSim uses its own built-in firmware called simple_flight . There is no additional setup if you just want to go with it. If you want to switch to using PX4 instead then please see this guide .","title":"How do I use PX4 firmware with Cosys-AirSim?"},{"location":"install_windows/#i-made-changes-in-visual-studio-but-there-is-no-effect","text":"Sometimes the Unreal + VS build system doesn't recompile if you make changes to only header files. To ensure a recompile, make some Unreal based cpp file \"dirty\" like AirSimGameMode.cpp.","title":"I made changes in Visual Studio but there is no effect"},{"location":"install_windows/#unreal-still-uses-vs2015-or-im-getting-some-link-error","text":"Running several versions of VS can lead to issues when compiling UE projects. One problem that may arise is that UE will try to compile with an older version of VS which may or may not work. There are two settings in Unreal, one for for the engine and one for the project, to adjust the version of VS to be used. 1. Edit -> Editor preferences -> General -> Source code 2. Edit -> Project Settings -> Platforms -> Windows -> Toolchain ->CompilerVersion In some cases, these settings will still not lead to the desired result and errors such as the following might be produced: LINK : fatal error LNK1181: cannot open input file 'ws2_32.lib' To resolve such issues the following procedure can be applied: 1. Uninstall all old versions of VS using the VisualStudioUninstaller 2. Repair/Install VS2017 3. Restart machine and install Epic launcher and desired version of the engine","title":"Unreal still uses VS2015 or I'm getting some link error"},{"location":"instance_segmentation/","text":"Instance Segmentation in Cosys-AirSim An Instance segmentation system is implemented into Cosys-AirSim. It uses Proxy Mesh rendering to allow for each object in the world to get its own color. 
Limitations 2744000 different colors are currently available to be assigned to unique objects. If your environment during a run requires more colors, you will generate errors and new objects will be assigned color [0,0,0]. Only static and skeletal meshes are supported. Landscape objects aren't supported. This is the special object type in Unreal to make terrain with. As a work-around, StaticMesh terrain must be used. Foliage objects aren't supported. This is the special object type in Unreal to place trees, grass and other plants that move with the wind. As a work-around, StaticMesh objects must be used. Brush objects aren't supported. This is a special object type in Unreal to create your own meshes with. As a work-around, you can convert them to a StaticMesh. These and other unsupported object types that are less common that either will not be rendered (decals, text, foliage, ...) or will by default be given the RGB color value of [149,149,149] or 0,0,0 . Usage By default, at the start of the simulation, it will give a random color to each object. Please see the Image API documentation on how to manually set or get the color information. For an example of the Instance Segmentation API, please see the script segmentation_test.py . For a script that generates a full list of objects and their associated color, please see the script segmentation_generate_list.py . When a new object is spawned in your environment by for example a c++ or blueprint extension you made, and you want it to work with the instance segmentation system, you can use the extended function ASimModeBase::AddNewActorToSegmentation(AActor) which is also available in blueprints. Make sure to provide human-readable names to your objects in your environment as the ground truth tables that the AirSim API can provide will use your object naming to create the table. Credits The method used to use Proxy meshes to segment object is a derivative of and inspired by the work of UnrealCV . Their work is licensed under the MIT License. It is made by students from Johns Hopkins University and Peking University under the supervision of Prof. Alan Yuille and Prof. Yizhou Wang. You can read the paper on their work here .","title":"Instance Segmentation"},{"location":"instance_segmentation/#instance-segmentation-in-cosys-airsim","text":"An Instance segmentation system is implemented into Cosys-AirSim. It uses Proxy Mesh rendering to allow for each object in the world to get its own color.","title":"Instance Segmentation in Cosys-AirSim"},{"location":"instance_segmentation/#limitations","text":"2744000 different colors are currently available to be assigned to unique objects. If your environment during a run requires more colors, you will generate errors and new objects will be assigned color [0,0,0]. Only static and skeletal meshes are supported. Landscape objects aren't supported. This is the special object type in Unreal to make terrain with. As a work-around, StaticMesh terrain must be used. Foliage objects aren't supported. This is the special object type in Unreal to place trees, grass and other plants that move with the wind. As a work-around, StaticMesh objects must be used. Brush objects aren't supported. This is a special object type in Unreal to create your own meshes with. As a work-around, you can convert them to a StaticMesh. These and other unsupported object types that are less common that either will not be rendered (decals, text, foliage, ...) 
or will by default be given the RGB color value of [149,149,149] or [0,0,0].","title":"Limitations"},{"location":"instance_segmentation/#usage","text":"By default, at the start of the simulation, the system will give a random color to each object. Please see the Image API documentation on how to manually set or get the color information. For an example of the Instance Segmentation API, please see the script segmentation_test.py . For a script that generates a full list of objects and their associated color, please see the script segmentation_generate_list.py . When a new object is spawned in your environment by, for example, a C++ or Blueprint extension you made, and you want it to work with the instance segmentation system, you can use the extended function ASimModeBase::AddNewActorToSegmentation(AActor) which is also available in Blueprints. Make sure to provide human-readable names to your objects in your environment, as the ground truth tables that the AirSim API can provide will use your object naming to create the table.","title":"Usage"},{"location":"instance_segmentation/#credits","text":"The method of using proxy meshes to segment objects is a derivative of, and inspired by, the work of UnrealCV . Their work is licensed under the MIT License. It is made by students from Johns Hopkins University and Peking University under the supervision of Prof. Alan Yuille and Prof. Yizhou Wang. You can read the paper on their work here .","title":"Credits"},{"location":"lidar/","text":"How to Use Lidar in AirSim AirSim supports Lidar for multirotors and cars. Lidar can be enabled, and the other lidar settings configured, via the AirSimSettings json. Please see general sensors for information on configuration of general/shared sensor settings. Enabling lidar on a vehicle By default, lidars are not enabled. To enable lidar, set the SensorType and Enabled attributes in settings json. \"Lidar1\": { \"SensorType\": 6, \"Enabled\" : true, } Multiple lidars can be enabled on a vehicle. Ignoring glass and other material types One can set an object that should be invisible to LIDAR sensors (such as glass) to have no collision for Unreal Traces in order to have it be 'invisible' for lidar sensors. Lidar configuration The following parameters can be configured right now via settings json. Parameter Description NumberOfChannels Number of channels/lasers of the lidar. When set to 1 it will act as a 2D horizontal LiDAR and will use the VerticalFOVUpper value as the vertical angle to scan. Range Range, in meters MeasurementsPerCycle Horizontal resolution. Amount of points in one cycle. RotationsPerSecond Rotations per second HorizontalFOVStart Horizontal FOV start for the lidar, in degrees HorizontalFOVEnd Horizontal FOV end for the lidar, in degrees VerticalFOVUpper Vertical FOV upper limit for the lidar, in degrees VerticalFOVLower Vertical FOV lower limit for the lidar, in degrees X Y Z Position of the lidar relative to the vehicle (in NED, in meters) Roll Pitch Yaw Orientation of the lidar relative to the vehicle (in degrees, yaw-pitch-roll order to front vector +X) GenerateNoise Generate and add range-noise based on normal distribution if set to true MinNoiseStandardDeviation The standard deviation to generate the noise normal distribution, in meters. This is the minimal noise (at 0 distance) NoiseDistanceScale To scale the noise with distance, set this parameter.
This way the minimal noise is scaled depending on the distance compared to the total maximum range of the sensor UpdateFrequency Amount of times per second that the sensor should update and calculate the next set of points DrawSensor Draw the physical sensor in the world on the vehicle with a 3D axes shown where the sensor is LimitPoints Limit the amount of points that can be calculated in one measurement (to work around freezes due to bad performance). Will result in incomplete pointclouds External Uncouple the sensor from the vehicle. If enabled, the position and orientation will be relative to Unreal world coordinates ExternalLocal When in external mode, if this is enabled the retrieved pose of the sensor will be in Local NED coordinates (from the starting position of the vehicle) and not converted Unreal NED coordinates, which is the default { \"SeeDocsAt\": \"https://cosys-lab.github.io/settings/\", \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"Vehicles\": { \"Drone1\": { \"VehicleType\": \"simpleflight\", \"AutoCreate\": true, \"Sensors\": { \"LidarSensor1\": { \"SensorType\": 6, \"Enabled\" : true, \"NumberOfChannels\": 16, \"RotationsPerSecond\": 10, \"MeasurementsPerCycle\": 512, \"X\": 0, \"Y\": 0, \"Z\": -1, \"Roll\": 0, \"Pitch\": 0, \"Yaw\" : 0, \"VerticalFOVUpper\": -15, \"VerticalFOVLower\": -25, \"HorizontalFOVStart\": -20, \"HorizontalFOVEnd\": 20, \"DrawDebugPoints\": true }, \"LidarSensor2\": { \"SensorType\": 6, \"Enabled\" : true, \"NumberOfChannels\": 4, \"RotationsPerSecond\": 10, \"MeasurementsPerCycle\": 64, \"X\": 0, \"Y\": 0, \"Z\": -1, \"Roll\": 0, \"Pitch\": 0, \"Yaw\" : 0, \"VerticalFOVUpper\": -15, \"VerticalFOVLower\": -25, \"DrawDebugPoints\": true } } } } } Server side visualization for debugging By default, the lidar points are not drawn on the viewport. To enable the drawing of hit laser points on the viewport, please enable the setting DrawDebugPoints via settings json. \"Lidar1\": { ... \"DrawDebugPoints\": true }, Client API Use the getLidarData(sensor name, vehicle name) API to retrieve the Lidar data. The API returns a full scan Point-Cloud as a flat array of floats along with the timestamp of the capture and lidar pose. Point-Cloud: The floats represent the [x,y,z] coordinates for each point hit within the range in the last scan, in NED format. It will be [0,0,0] for a laser that didn't get any reflection (out of range). Pose: Default: Sensor pose in the vehicle frame / External: If set to External (see table) the coordinates will be in either Unreal NED when ExternalLocal is false or Local NED (from the starting position of the vehicle) when ExternalLocal is true . Groundtruth: For each point of the Point-Cloud a label string is kept that has the name of the object that the point belongs to; a laser that didn't reflect anything will have the label out_of_range .","title":"LIDAR"},{"location":"lidar/#how-to-use-lidar-in-airsim","text":"AirSim supports Lidar for multirotors and cars. Lidar can be enabled, and the other lidar settings configured, via the AirSimSettings json. Please see general sensors for information on configuration of general/shared sensor settings.","title":"How to Use Lidar in AirSim"},{"location":"lidar/#enabling-lidar-on-a-vehicle","text":"By default, lidars are not enabled. To enable lidar, set the SensorType and Enabled attributes in settings json.
\"Lidar1\": { \"SensorType\": 6, \"Enabled\" : true, } Multiple lidars can be enabled on a vehicle.","title":"Enabling lidar on a vehicle"},{"location":"lidar/#ignoring-glass-and-other-material-types","text":"One can set an object that should be invisible to LIDAR sensors (such as glass) to have no collision for Unreal Traces in order to have it be 'invisible' for lidar sensors.","title":"Ignoring glass and other material types"},{"location":"lidar/#lidar-configuration","text":"The following parameters can be configured right now via settings json. Parameter Description NumberOfChannels Number of channels/lasers of the lidar. When set to 1 it will act as a 2D horizontal LiDAR and will use the VerticalFOVUpper value as the vertical angle to scan. Range Range, in meters MeasurementsPerCycle Horizontal resolution. Amount of points in one cycle. RotationsPerSecond Rotations per second HorizontalFOVStart Horizontal FOV start for the lidar, in degrees HorizontalFOVEnd Horizontal FOV end for the lidar, in degrees VerticalFOVUpper Vertical FOV upper limit for the lidar, in degrees VerticalFOVLower Vertical FOV lower limit for the lidar, in degrees X Y Z Position of the lidar relative to the vehicle (in NED, in meters) Roll Pitch Yaw Orientation of the lidar relative to the vehicle (in degrees, yaw-pitch-roll order to front vector +X) GenerateNoise Generate and add range-noise based on normal distribution if set to true MinNoiseStandardDeviation The standard deviation to generate the noise normal distribution, in meters. This is the minimal noise (at 0 distance) NoiseDistanceScale To scale the noise with distance, set this parameter. This way the minimal noise is scaled depending on the distance compared to the total maximum range of the sensor UpdateFrequency Amount of times per second that the sensor should update and calculate the next set of points DrawSensor Draw the physical sensor in the world on the vehicle with a 3D axes shown where the sensor is LimitPoints Limit the amount of points that can be calculated in one measurement (to work around freezes due to bad performance). Will result in incomplete pointclouds External Uncouple the sensor from the vehicle. If enabled, the position and orientation will be relative to Unreal world coordinates ExternalLocal When in external mode, if this is enabled the retrieved pose of the sensor will be in Local NED coordinates (from the starting position of the vehicle) and not converted Unreal NED coordinates, which is the default { \"SeeDocsAt\": \"https://cosys-lab.github.io/settings/\", \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"Vehicles\": { \"Drone1\": { \"VehicleType\": \"simpleflight\", \"AutoCreate\": true, \"Sensors\": { \"LidarSensor1\": { \"SensorType\": 6, \"Enabled\" : true, \"NumberOfChannels\": 16, \"RotationsPerSecond\": 10, \"MeasurementsPerCycle\": 512, \"X\": 0, \"Y\": 0, \"Z\": -1, \"Roll\": 0, \"Pitch\": 0, \"Yaw\" : 0, \"VerticalFOVUpper\": -15, \"VerticalFOVLower\": -25, \"HorizontalFOVStart\": -20, \"HorizontalFOVEnd\": 20, \"DrawDebugPoints\": true }, \"LidarSensor2\": { \"SensorType\": 6, \"Enabled\" : true, \"NumberOfChannels\": 4, \"RotationsPerSecond\": 10, \"MeasurementsPerCycle\": 64, \"X\": 0, \"Y\": 0, \"Z\": -1, \"Roll\": 0, \"Pitch\": 0, \"Yaw\" : 0, \"VerticalFOVUpper\": -15, \"VerticalFOVLower\": -25, \"DrawDebugPoints\": true } } } } }","title":"Lidar configuration"},{"location":"lidar/#server-side-visualization-for-debugging","text":"By default, the lidar points are not drawn on the viewport.
To enable the drawing of hit laser points on the viewport, please enable the setting DrawDebugPoints via settings json. \"Lidar1\": { ... \"DrawDebugPoints\": true },","title":"Server side visualization for debugging"},{"location":"lidar/#client-api","text":"Use the getLidarData(sensor name, vehicle name) API to retrieve the Lidar data. The API returns a full scan Point-Cloud as a flat array of floats along with the timestamp of the capture and lidar pose. Point-Cloud: The floats represent the [x,y,z] coordinates for each point hit within the range in the last scan, in NED format. It will be [0,0,0] for a laser that didn't get any reflection (out of range). Pose: Default: Sensor pose in the vehicle frame / External: If set to External (see table) the coordinates will be in either Unreal NED when ExternalLocal is false or Local NED (from the starting position of the vehicle) when ExternalLocal is true . Groundtruth: For each point of the Point-Cloud a label string is kept that has the name of the object that the point belongs to; a laser that didn't reflect anything will have the label out_of_range .","title":"Client API"},{"location":"log_viewer/","text":"Log Viewer The LogViewer is a Windows WPF app that presents the MavLink streams that it is getting from the Unreal Simulator. You can use this to monitor what is happening on the drone while it is flying. For example, the picture below shows a real-time graph of the x, y and z gyro sensor information being generated by the simulator. Usage You can open a log file (it supports .mavlink and PX4 *.ulg files); you will then see the contents of the log in a tree view on the left, and whatever metrics you select will be added to the right side. You can close each individual chart with the little close box in the top right of each chart and you can group charts so they share the same vertical axis using the group charts button on the top toolbar. There is also a map option which will plot the GPS path the drone took. You can also load multiple log files so you can compare the data from each. Realtime You can also get a realtime view if you connect the LogViewer before you run the simulation. For this to work you need to configure the settings.json with the following settings: { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"Vehicles\": { \"PX4\": { ..., \"LogViewerHostIp\": \"127.0.0.1\", \"LogViewerPort\": 14388, } } } Note: do not use the \"Logs\" setting when you want realtime LogViewer logging. Logging to a file using \"Logs\" is mutually exclusive with LogViewer logging. Simply press the blue connector button on the top right corner of the window, select the Socket tab , enter the port number 14388 , and select your localhost network. If you are using WSL 2 on Windows then select vEthernet (WSL) . If you do choose vEthernet (WSL) then make sure you also set LocalHostIp and LogViewerHostIp to the matching WSL ethernet address, something like 172.31.64.1 . Then press the record button (triangle on the right hand side of the toolbar). Now start the simulator, and the data will start streaming into LogViewer. The drone view in Log Viewer shows the actual estimated position coming from the PX4, so that is a great way to check whether the PX4 is in sync with the simulator. Sometimes you can see some drift here as the attitude estimation catches up with reality; this can become more visible after a bad crash. Installation If you can't build the LogViewer.sln, there is also a ClickOnce installer .
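To avoid editing the file by hand, the Realtime configuration above can also be generated from a script. The following is a minimal sketch, assuming the default settings.json location under Documents/AirSim and a PX4 vehicle entry; both the path and the vehicle block are assumptions to adapt to your setup, and note that this overwrites any existing settings.json.

```python
import json
from pathlib import Path

# Assumed default settings.json location on Windows; adjust if you use a custom path.
settings_path = Path.home() / "Documents" / "AirSim" / "settings.json"

settings = {
    "SettingsVersion": 2.0,
    "SimMode": "Multirotor",
    "Vehicles": {
        "PX4": {
            "VehicleType": "PX4Multirotor",
            # Stream MavLink data to a LogViewer listening on this host/port.
            "LogViewerHostIp": "127.0.0.1",
            "LogViewerPort": 14388,
        }
    },
}

settings_path.parent.mkdir(parents=True, exist_ok=True)
settings_path.write_text(json.dumps(settings, indent=2))
print(f"Wrote {settings_path}")
```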
Configuration The magic port number 14388 can be configured in the simulator by editing the settings.json file . If you change the port number in the LogViewer connection dialog then be sure to make the matching changes in your settings.json file. Debugging See PX4 Logging for more information on how to use the LogViewer to debug situations you are seeing.","title":"MavLink LogViewer"},{"location":"log_viewer/#log-viewer","text":"The LogViewer is a Windows WPF app that presents the MavLink streams that it is getting from the Unreal Simulator. You can use this to monitor what is happening on the drone while it is flying. For example, the picture below shows a real-time graph of the x, y and z gyro sensor information being generated by the simulator.","title":"Log Viewer"},{"location":"log_viewer/#usage","text":"You can open a log file (it supports .mavlink and PX4 *.ulg files); you will then see the contents of the log in a tree view on the left, and whatever metrics you select will be added to the right side. You can close each individual chart with the little close box in the top right of each chart and you can group charts so they share the same vertical axis using the group charts button on the top toolbar. There is also a map option which will plot the GPS path the drone took. You can also load multiple log files so you can compare the data from each.","title":"Usage"},{"location":"log_viewer/#realtime","text":"You can also get a realtime view if you connect the LogViewer before you run the simulation. For this to work you need to configure the settings.json with the following settings: { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"Vehicles\": { \"PX4\": { ..., \"LogViewerHostIp\": \"127.0.0.1\", \"LogViewerPort\": 14388, } } } Note: do not use the \"Logs\" setting when you want realtime LogViewer logging. Logging to a file using \"Logs\" is mutually exclusive with LogViewer logging. Simply press the blue connector button on the top right corner of the window, select the Socket tab , enter the port number 14388 , and select your localhost network. If you are using WSL 2 on Windows then select vEthernet (WSL) . If you do choose vEthernet (WSL) then make sure you also set LocalHostIp and LogViewerHostIp to the matching WSL ethernet address, something like 172.31.64.1 . Then press the record button (triangle on the right hand side of the toolbar). Now start the simulator, and the data will start streaming into LogViewer. The drone view in Log Viewer shows the actual estimated position coming from the PX4, so that is a great way to check whether the PX4 is in sync with the simulator. Sometimes you can see some drift here as the attitude estimation catches up with reality; this can become more visible after a bad crash.","title":"Realtime"},{"location":"log_viewer/#installation","text":"If you can't build the LogViewer.sln, there is also a ClickOnce installer .","title":"Installation"},{"location":"log_viewer/#configuration","text":"The magic port number 14388 can be configured in the simulator by editing the settings.json file . If you change the port number in the LogViewer connection dialog then be sure to make the matching changes in your settings.json file.","title":"Configuration"},{"location":"log_viewer/#debugging","text":"See PX4 Logging for more information on how to use the LogViewer to debug situations you are seeing.","title":"Debugging"},{"location":"matlab/","text":"How to use AirSim with Matlab AirSim and Matlab can be integrated using Python.
An example Matlab client is provided demonstrating how to interact with AirSim from Matlab. This can be used from source or installed as a toolbox (install from File Exchange , or from source by double-clicking or dragging into Matlab the file Cosys-AirSim Matlab API Client.mltbx ) Prerequisites These instructions are for Matlab 2024a (with toolboxes for the client: Computer Vision, Aerospace, Signal Processing Toolbox), UE 5.3 and the latest AirSim release. It also requires the AirSim python package to be installed. For this, go into the PythonClient folder and use pip to install it to the Python environment that is also used in Matlab, with pip install . You can find out in Matlab what Python version is used with pe = pyenv; pe.Version You should have these components installed and working before proceeding. Usage This is a client implementation of the RPC API for Matlab for the Cosys-AirSim simulation framework. A main class AirSimClient is available which implements all API calls. Do note that at this point not all functions have been tested and most function documentation was auto-generated. This is still a WIP client. Initial setup When starting with this wrapper, first try to make a connection to the Cosys-AirSim simulation. vehicle_name = \"airsimvehicle\"; airSimClient = AirSimClient(IsDrone=false, ApiControl=false, IP=\"127.0.0.1\", port=41451, vehicleName=vehicle_name); Now the client object can be used to run API methods from. All functions have some help text written for more information on them. Example This example will: Connect to AirSim Get/set vehicle pose Get instance segmentation groundtruth table Get object pose(s) Get sensor data (imu, echo (active/passive), (gpu)LiDAR, camera (info, rgb, depth, segmentation, annotation)) Do note that the AirSim Matlab client has almost all API functions available but not all are listed in this test script. For a full list see the source code of the AirSimClient class. Do note that, next to the toolboxes listed above in the Prerequisites, the test script requires the following Matlab toolboxes: Lidar Toolbox Navigation Toolbox Robotics System Toolbox ROS Toolbox UAV Toolbox Setup connection %Define client vehicle_name = \"airsimvehicle\"; airSimClient = AirSimClient(IsDrone=false, ApiControl=false, IP=\"127.0.0.1\", port=41451, vehicleName=vehicle_name); Groundtruth labels % Get groundtruth look-up-table of all objects and their instance % segmentation colors for the cameras and GPU LiDAR groundtruthLUT = airSimClient.getInstanceSegmentationLUT(); Get some poses % All poses are right handed coordinate system X Y Z and % orientations are defined as quaternions W X Y Z.
% Get poses of all objects in the scene, this takes a while for large % scenes so it is commented out by default poses = airSimClient.getAllObjectPoses(false, false); % Get vehicle pose vehiclePoseLocal = airSimClient.getVehiclePose(); vehiclePoseWorld = airSimClient.getObjectPose(vehicle_name, false); % Get a random object pose or choose one if you know its name useChosenObject = false; chosenObject = \"Cylinder3\"; if useChosenObject finalName = chosenObject; else randomIndex = randi(size(groundtruthLUT, 1), 1); randomName = groundtruthLUT.name(randomIndex); finalName = randomName; end objectPoseLocal = airSimClient.getObjectPose(finalName, true); objectPoseWorld = airSimClient.getObjectPose(finalName, false); figure; subplot(1, 2, 1); plotTransforms([vehiclePoseLocal.position; objectPoseLocal.position], [vehiclePoseLocal.orientation; objectPoseLocal.orientation], FrameLabel=[\"Vehicle\"; finalName], AxisLabels=\"on\") axis equal; grid on; xlabel(\"X (m)\") ylabel(\"Y (m)\") zlabel(\"Z (m)\") title(\"Local Plot\") subplot(1, 2, 2); plotTransforms([vehiclePoseWorld.position; objectPoseWorld.position], [vehiclePoseWorld.orientation; objectPoseWorld.orientation], FrameLabel=[\"Vehicle\"; finalName], AxisLabels=\"on\") axis equal; grid on; xlabel(\"X (m)\") ylabel(\"Y (m)\") zlabel(\"Z (m)\") title(\"World Plot\") drawnow %% Set vehicle pose airSimClient.setVehiclePose(airSimClient.getVehiclePose().position + [1 1 0], airSimClient.getVehiclePose().orientation) IMU sensor Data imuSensorName = \"imu\"; [imuData, imuTimestamp] = airSimClient.getIMUData(imuSensorName); Echo sensor data % Example plots passive echo pointcloud % and its reflection directions as 3D quivers echoSensorName = \"echo\"; enablePassive = true; [activePointCloud, activeData, passivePointCloud, passiveData , echoTimestamp, echoSensorPose] = airSimClient.getEchoData(echoSensorName, enablePassive); figure; subplot(1, 2, 1); if ~isempty(activePointCloud) pcshow(activePointCloud, color=\"X\", MarkerSize=50); else pcshow(pointCloud([0, 0, 0])); end title('Active Echo Sensor Pointcloud') xlabel(\"X (m)\") ylabel(\"Y (m)\") zlabel(\"Z (m)\") xlim([0 10]) ylim([-10 10]) zlim([-10 10]) subplot(1, 2, 2); if ~isempty(passivePointCloud) pcshow(passivePointCloud, color=\"X\", MarkerSize=50); hold on; quiver3(passivePointCloud.Location(:, 1), passivePointCloud.Location(:, 2), passivePointCloud.Location(:, 3),...
passivePointCloud.Normal(:, 1), passivePointCloud.Normal(:, 2), passivePointCloud.Normal(:, 3), 2); hold off else pcshow(pointCloud([0, 0, 0])); end title('Passive Echo Sensor Pointcloud') xlabel(\"X (m)\") ylabel(\"Y (m)\") zlabel(\"Z (m)\") xlim([0 10]) ylim([-10 10]) zlim([-10 10]) drawnow LiDAR sensor data % Example plots lidar pointcloud and getting the groundtruth labels lidarSensorName = \"lidar\"; enableLabels = true; [lidarPointCloud, lidarLabels, LidarTimestamp, LidarSensorPose] = airSimClient.getLidarData(lidarSensorName, enableLabels); figure; if ~isempty(lidarPointCloud) pcshow(lidarPointCloud, MarkerSize=50); else pcshow(pointCloud([0, 0, 0])); end title('LiDAR Pointcloud') xlabel(\"X (m)\") ylabel(\"Y (m)\") zlabel(\"Z (m)\") xlim([0 10]) ylim([-10 10]) zlim([-10 10]) drawnow GPU LiDAR sensor data % Example plots GPU lidar pointcloud with its RGB segmentation colors gpuLidarSensorName = \"gpulidar\"; enableLabels = true; [gpuLidarPointCloud, gpuLidarTimestamp, gpuLidarSensorPose] = airSimClient.getGPULidarData(gpuLidarSensorName); figure; if ~isempty(gpuLidarPointCloud) pcshow(gpuLidarPointCloud, MarkerSize=50); else pcshow(pointCloud([0, 0, 0])); end title('GPU-Accelerated LiDAR Pointcloud') xlabel(\"X (m)\") ylabel(\"Y (m)\") zlabel(\"Z (m)\") xlim([0 10]) ylim([-10 10]) zlim([-10 10]) drawnow Cameras %% Get camera info cameraSensorName = \"frontcamera\"; [intrinsics, cameraSensorPose] = airSimClient.getCameraInfo(cameraSensorName); %% Get single camera images % Get images sequentially cameraSensorName = \"front_center\"; [rgbImage, rgbCameraIimestamp] = airSimClient.getCameraImage(cameraSensorName, AirSimCameraTypes.Scene); [segmentationImage, segmentationCameraIimestamp] = airSimClient.getCameraImage(cameraSensorName, AirSimCameraTypes.Segmentation); [depthImage, depthCameraIimestamp] = airSimClient.getCameraImage(cameraSensorName, AirSimCameraTypes.DepthPlanar); [annotationImage, annotationCameraIimestamp] = airSimClient.getCameraImage(cameraSensorName, AirSimCameraTypes.Annotation, \"TextureTestDirect\"); figure; subplot(4, 1, 1); imshow(rgbImage) title(\"RGB Camera Image\") subplot(4, 1, 2); imshow(segmentationImage) title(\"Segmentation Camera Image\") subplot(4, 1, 3); imshow(depthImage ./ max(max(depthImage)).* 255, gray) title(\"Depth Camera Image\") subplot(4, 1, 4); imshow(annotationImage) title(\"Annotation Camera Image\") %% Get synced camera images % By combining the image requests they will be synced % and taken in the same frame cameraSensorName = \"front_center\"; [images, cameraIimestamp] = airSimClient.getCameraImages(cameraSensorName, ... [AirSimCameraTypes.Scene, AirSimCameraTypes.Segmentation, AirSimCameraTypes.DepthPlanar, AirSimCameraTypes.Annotation], ... [\"\", \"\", \"\", \"GreyscaleTest\"]); figure; subplot(4, 1, 1); imshow(images{1}) title(\"Synced RGB Camera Image\") subplot(4, 1, 2); imshow(images{2}) title(\"Synced Segmentation Camera Image\") subplot(4, 1, 3); imshow(images{3} ./ max(max(images{3})).* 255, gray) title(\"Synced Depth Camera Image\") subplot(4, 1, 4); imshow(images{4}) title(\"Synced Annotation Camera Image\")","title":"Cameras"},{"location":"matlab/#how-to-use-airsim-with-matlab","text":"AirSim and Matlab can be integrated using Python. An example Matlab client is provided demonstrating how to interact with AirSim from Matlab.
This can be used from source or installed as a toolbox (install from File Exchange , or from source by double-clicking or dragging into Matlab the file Cosys-AirSim Matlab API Client.mltbx )","title":"How to use AirSim with Matlab"},{"location":"matlab/#prerequisites","text":"These instructions are for Matlab 2024a (with toolboxes for the client: Computer Vision, Aerospace, Signal Processing Toolbox), UE 5.3 and the latest AirSim release. It also requires the AirSim python package to be installed. For this, go into the PythonClient folder and use pip to install it to the Python environment that is also used in Matlab, with pip install . You can find out in Matlab what Python version is used with pe = pyenv; pe.Version You should have these components installed and working before proceeding.","title":"Prerequisites"},{"location":"matlab/#usage","text":"This is a client implementation of the RPC API for Matlab for the Cosys-AirSim simulation framework. A main class AirSimClient is available which implements all API calls. Do note that at this point not all functions have been tested and most function documentation was auto-generated. This is still a WIP client.","title":"Usage"},{"location":"matlab/#initial-setup","text":"When starting with this wrapper, first try to make a connection to the Cosys-AirSim simulation. vehicle_name = \"airsimvehicle\"; airSimClient = AirSimClient(IsDrone=false, ApiControl=false, IP=\"127.0.0.1\", port=41451, vehicleName=vehicle_name); Now the client object can be used to run API methods from. All functions have some help text written for more information on them.","title":"Initial setup"},{"location":"matlab/#example","text":"This example will: Connect to AirSim Get/set vehicle pose Get instance segmentation groundtruth table Get object pose(s) Get sensor data (imu, echo (active/passive), (gpu)LiDAR, camera (info, rgb, depth, segmentation, annotation)) Do note that the AirSim Matlab client has almost all API functions available but not all are listed in this test script. For a full list see the source code of the AirSimClient class. Do note that, next to the toolboxes listed above in the Prerequisites, the test script requires the following Matlab toolboxes: Lidar Toolbox Navigation Toolbox Robotics System Toolbox ROS Toolbox UAV Toolbox","title":"Example"},{"location":"matlab/#setup-connection","text":"%Define client vehicle_name = \"airsimvehicle\"; airSimClient = AirSimClient(IsDrone=false, ApiControl=false, IP=\"127.0.0.1\", port=41451, vehicleName=vehicle_name);","title":"Setup connection"},{"location":"matlab/#groundtruth-labels","text":"% Get groundtruth look-up-table of all objects and their instance % segmentation colors for the cameras and GPU LiDAR groundtruthLUT = airSimClient.getInstanceSegmentationLUT();","title":"Groundtruth labels"},{"location":"matlab/#get-some-poses","text":"% All poses are right handed coordinate system X Y Z and % orientations are defined as quaternions W X Y Z.
% Get poses of all objects in the scene, this takes a while for large % scenes so it is commented out by default poses = airSimClient.getAllObjectPoses(false, false); % Get vehicle pose vehiclePoseLocal = airSimClient.getVehiclePose(); vehiclePoseWorld = airSimClient.getObjectPose(vehicle_name, false); % Get a random object pose or choose one if you know its name useChosenObject = false; chosenObject = \"Cylinder3\"; if useChosenObject finalName = chosenObject; else randomIndex = randi(size(groundtruthLUT, 1), 1); randomName = groundtruthLUT.name(randomIndex); finalName = randomName; end objectPoseLocal = airSimClient.getObjectPose(finalName, true); objectPoseWorld = airSimClient.getObjectPose(finalName, false); figure; subplot(1, 2, 1); plotTransforms([vehiclePoseLocal.position; objectPoseLocal.position], [vehiclePoseLocal.orientation; objectPoseLocal.orientation], FrameLabel=[\"Vehicle\"; finalName], AxisLabels=\"on\") axis equal; grid on; xlabel(\"X (m)\") ylabel(\"Y (m)\") zlabel(\"Z (m)\") title(\"Local Plot\") subplot(1, 2, 2); plotTransforms([vehiclePoseWorld.position; objectPoseWorld.position], [vehiclePoseWorld.orientation; objectPoseWorld.orientation], FrameLabel=[\"Vehicle\"; finalName], AxisLabels=\"on\") axis equal; grid on; xlabel(\"X (m)\") ylabel(\"Y (m)\") zlabel(\"Z (m)\") title(\"World Plot\") drawnow %% Set vehicle pose airSimClient.setVehiclePose(airSimClient.getVehiclePose().position + [1 1 0], airSimClient.getVehiclePose().orientation)","title":"Get some poses"},{"location":"matlab/#imu-sensor-data","text":"imuSensorName = \"imu\"; [imuData, imuTimestamp] = airSimClient.getIMUData(imuSensorName);","title":"IMU sensor Data"},{"location":"matlab/#echo-sensor-data","text":"% Example plots passive echo pointcloud % and its reflection directions as 3D quivers echoSensorName = \"echo\"; enablePassive = true; [activePointCloud, activeData, passivePointCloud, passiveData , echoTimestamp, echoSensorPose] = airSimClient.getEchoData(echoSensorName, enablePassive); figure; subplot(1, 2, 1); if ~isempty(activePointCloud) pcshow(activePointCloud, color=\"X\", MarkerSize=50); else pcshow(pointCloud([0, 0, 0])); end title('Active Echo Sensor Pointcloud') xlabel(\"X (m)\") ylabel(\"Y (m)\") zlabel(\"Z (m)\") xlim([0 10]) ylim([-10 10]) zlim([-10 10]) subplot(1, 2, 2); if ~isempty(passivePointCloud) pcshow(passivePointCloud, color=\"X\", MarkerSize=50); hold on; quiver3(passivePointCloud.Location(:, 1), passivePointCloud.Location(:, 2), passivePointCloud.Location(:, 3),...
passivePointCloud.Normal(:, 1), passivePointCloud.Normal(:, 2), passivePointCloud.Normal(:, 3), 2); hold off else pcshow(pointCloud([0, 0, 0])); end title('Passive Echo Sensor Pointcloud') xlabel(\"X (m)\") ylabel(\"Y (m)\") zlabel(\"Z (m)\") xlim([0 10]) ylim([-10 10]) zlim([-10 10]) drawnow","title":"Echo sensor data"},{"location":"matlab/#lidar-sensor-data","text":"% Example plots lidar pointcloud and getting the groundtruth labels lidarSensorName = \"lidar\"; enableLabels = true; [lidarPointCloud, lidarLabels, LidarTimestamp, LidarSensorPose] = airSimClient.getLidarData(lidarSensorName, enableLabels); figure; if ~isempty(lidarPointCloud) pcshow(lidarPointCloud, MarkerSize=50); else pcshow(pointCloud([0, 0, 0])); end title('LiDAR Pointcloud') xlabel(\"X (m)\") ylabel(\"Y (m)\") zlabel(\"Z (m)\") xlim([0 10]) ylim([-10 10]) zlim([-10 10]) drawnow","title":"LiDAR sensor data"},{"location":"matlab/#gpu-lidar-sensor-data","text":"% Example plots GPU lidar pointcloud with its RGB segmentation colors gpuLidarSensorName = \"gpulidar\"; enableLabels = true; [gpuLidarPointCloud, gpuLidarTimestamp, gpuLidarSensorPose] = airSimClient.getGPULidarData(gpuLidarSensorName); figure; if ~isempty(gpuLidarPointCloud) pcshow(gpuLidarPointCloud, MarkerSize=50); else pcshow(pointCloud([0, 0, 0])); end title('GPU-Accelerated LiDAR Pointcloud') xlabel(\"X (m)\") ylabel(\"Y (m)\") zlabel(\"Z (m)\") xlim([0 10]) ylim([-10 10]) zlim([-10 10]) drawnow","title":"GPU LiDAR sensor data"},{"location":"matlab/#cameras","text":"%% Get camera info cameraSensorName = \"frontcamera\"; [intrinsics, cameraSensorPose] = airSimClient.getCameraInfo(cameraSensorName); %% Get single camera images % Get images sequentially cameraSensorName = \"front_center\"; [rgbImage, rgbCameraIimestamp] = airSimClient.getCameraImage(cameraSensorName, AirSimCameraTypes.Scene); [segmentationImage, segmentationCameraIimestamp] = airSimClient.getCameraImage(cameraSensorName, AirSimCameraTypes.Segmentation); [depthImage, depthCameraIimestamp] = airSimClient.getCameraImage(cameraSensorName, AirSimCameraTypes.DepthPlanar); [annotationImage, annotationCameraIimestamp] = airSimClient.getCameraImage(cameraSensorName, AirSimCameraTypes.Annotation, \"TextureTestDirect\"); figure; subplot(4, 1, 1); imshow(rgbImage) title(\"RGB Camera Image\") subplot(4, 1, 2); imshow(segmentationImage) title(\"Segmentation Camera Image\") subplot(4, 1, 3); imshow(depthImage ./ max(max(depthImage)).* 255, gray) title(\"Depth Camera Image\") subplot(4, 1, 4); imshow(annotationImage) title(\"Annotation Camera Image\") %% Get synced camera images % By combining the image requests they will be synced % and taken in the same frame cameraSensorName = \"front_center\"; [images, cameraIimestamp] = airSimClient.getCameraImages(cameraSensorName, ... [AirSimCameraTypes.Scene, AirSimCameraTypes.Segmentation, AirSimCameraTypes.DepthPlanar, AirSimCameraTypes.Annotation], ... [\"\", \"\", \"\", \"GreyscaleTest\"]); figure; subplot(4, 1, 1); imshow(images{1}) title(\"Synced RGB Camera Image\") subplot(4, 1, 2); imshow(images{2}) title(\"Synced Segmentation Camera Image\") subplot(4, 1, 3); imshow(images{3} ./ max(max(images{3})).* 255, gray) title(\"Synced Depth Camera Image\") subplot(4, 1, 4); imshow(images{4}) title(\"Synced Annotation Camera Image\")","title":"Cameras"},{"location":"mavlinkcom/","text":"Welcome to MavLinkCom MavLinkCom is a cross-platform C++ library that helps connect to and communicate with MavLink based vehicles. 
Specifically, this library is designed to work well with PX4 based drones. Design You can view and edit the Design.dgml diagram in Visual Studio. The following are the most important classes in this library. MavLinkNode This is the base class for all MavLinkNodes (subclasses include MavLinkVehicle, MavLinkVideoClient and MavLinkVideoServer). The node connects to your mavlink enabled vehicle via a MavLinkConnection and provides methods for sending MavLinkMessages and MavLinkCommands and for subscribing to receive messages. This base class also stores the local system id and component id your app wants to use to identify itself to your remote vehicle. You can also call startHeartbeat to send regular heartbeat messages to keep the connection alive. MavLinkMessage This is the encoded MavLinkMessage. For those who have used the mavlink.h C API, this is similar to mavlink_message_t. You do not create these manually; they are encoded from a strongly typed MavLinkMessageBase subclass. Strongly typed message and command classes The MavLinkComGenerator parses the mavlink common.xml message definitions and generates all the MavLink* MavLinkMessageBase subclasses as well as a bunch of handy mavlink enums and a bunch of strongly typed MavLinkCommand subclasses. MavLinkMessageBase This is the base class for a set of strongly typed message classes that are code generated by the MavLinkComGenerator project. This replaces the C messages defined in the mavlink C API and provides a slightly more object oriented way to send and receive messages via sendMessage on MavLinkNode. These classes have encode/decode methods that convert to and from the MavLinkMessage class. MavLinkCommand This is the base class for a set of strongly typed command classes that are code generated by the MavLinkComGenerator project. This replaces the C definitions defined in the mavlink C API and provides a more object oriented way to send commands via the sendCommand method on MavLinkNode. The MavLinkNode takes care of turning these into the underlying mavlink COMMAND_LONG message. MavLinkConnection This class provides static helper methods for creating connections to remote MavLink nodes over serial ports, UDP, or TCP sockets. This class provides a way to subscribe to receive messages from that node in a pub/sub way so you can have multiple subscribers on the same connection. MavLinkVehicle uses this to track various messages that define the overall vehicle state. MavLinkVehicle MavLinkVehicle is a MavLinkNode that tracks various messages that define the overall vehicle state and provides a VehicleState struct containing a snapshot of that state, including home position, current orientation, local position, global position, and so on. This class also provides a bunch of helper methods that wrap commonly used commands providing simple method calls to do things like arm, disarm, takeoff, land, go to a local coordinate, and fly under offboard control either by position or velocity control. MavLinkTcpServer This helper class provides a way to set up a \"server\" that accepts MavLinkConnections from remote nodes. You can use this class to get a connection that you can then give to MavLinkVideoServer to serve images over MavLink. MavLinkFtpClient This helper class takes a given MavLinkConnection and provides FTP client support for the MAVLINK_MSG_ID_FILE_TRANSFER_PROTOCOL for vehicles that support the FTP capability. This class provides simple methods to list directory contents, and to get and put files.
MavLinkVideoClient This helper class takes a given MavLinkConnection and provides helper methods for requesting video from a remote node and packaging up the MAVLINK_MSG_ID_DATA_TRANSMISSION_HANDSHAKE and MAVLINK_MSG_ID_ENCAPSULATED_DATA messages into simple-to-use MavLinkVideoFrames. MavLinkVideoServer This helper class takes a given MavLinkConnection and provides the server side of the MavLinkVideoClient protocol, including helper methods for notifying when there is a video request to process (hasVideoRequest) and a method to send video frames (sendFrame) which will generate the right MAVLINK_MSG_ID_DATA_TRANSMISSION_HANDSHAKE and MAVLINK_MSG_ID_ENCAPSULATED_DATA sequence. Examples The following code from the UnitTest project shows how to connect to a Pixhawk flight controller over USB serial port, then wait for the first heartbeat message to be received: auto connection = MavLinkConnection::connectSerial(\"drone\", \"/dev/ttyACM0\", 115200, \"sh /etc/init.d/rc.usb\\n\"); MavLinkHeartbeat heartbeat; if (!waitForHeartbeat(10000, heartbeat)) { throw std::runtime_error(\"Received no heartbeat from PX4 after 10 seconds\"); } The following code connects to the serial port, and then forwards all messages to and from QGroundControl to that drone using another connection that is joined to the drone stream. auto droneConnection = MavLinkConnection::connectSerial(\"drone\", \"/dev/ttyACM0\", 115200, \"sh /etc/init.d/rc.usb\\n\"); auto proxyConnection = MavLinkConnection::connectRemoteUdp(\"qgc\", \"127.0.0.1\", \"127.0.0.1\", 14550); droneConnection->join(proxyConnection); The following code then takes that connection and turns on heartbeats and starts tracking vehicle information using local system id 166 and component id 1. auto vehicle = std::make_shared<MavLinkVehicle>(166, 1); vehicle->connect(connection); vehicle->startHeartbeat(); std::this_thread::sleep_for(std::chrono::seconds(5)); VehicleState state = vehicle->getVehicleState(); printf(\"Home position is %s, %f,%f,%f\\n\", state.home.is_set ? \"set\" : \"not set\", state.home.global_pos.lat, state.home.global_pos.lon, state.home.global_pos.alt); The following code uses the vehicle object to arm the drone and take off and wait for the takeoff altitude to be reached: bool rc = false; if (!vehicle->armDisarm(true).wait(3000, &rc) || !rc) { printf(\"arm command failed\\n\"); return; } if (!vehicle->takeoff(targetAlt).wait(3000, &rc) || !rc) { printf(\"takeoff command failed\\n\"); return; } int version = vehicle->getVehicleStateVersion(); while (true) { int newVersion = vehicle->getVehicleStateVersion(); if (version != newVersion) { VehicleState state = vehicle->getVehicleState(); float alt = state.local_est.pos.z; if (alt >= targetAlt - delta && alt <= targetAlt + delta) { reached = true; printf(\"Target altitude reached\\n\"); break; } } else { std::this_thread::sleep_for(std::chrono::milliseconds(10)); } } The following code uses offboard control to make the drone fly in a circle with the camera pointed at the center. Here we use the subscribe method to check each new local position message so that we can compute the new velocity vector as soon as that new position is received. We request a high rate for those messages using setMessageInterval to ensure a smooth circular orbit.
vehicle->setMessageInterval((int)MavLinkMessageIds::MAVLINK_MSG_ID_LOCAL_POSITION_NED, 30); vehicle->requestControl(); int subscription = vehicle->getConnection()->subscribe( [&](std::shared_ptr<MavLinkConnection> connection, const MavLinkMessage& m) { if (m.msgid == (int)MavLinkMessageIds::MAVLINK_MSG_ID_LOCAL_POSITION_NED) { // convert generic msg to strongly typed message. MavLinkLocalPositionNed localPos; localPos.decode(m); float x = localPos.x; float y = localPos.y; float dx = x - cx; float dy = y - cy; float angle = atan2(dy, dx); if (angle < 0) angle += M_PI * 2; float tangent = angle + M_PI_2; double newvx = orbitSpeed * cos(tangent); double newvy = orbitSpeed * sin(tangent); float heading = angle + M_PI; vehicle->moveByLocalVelocityWithAltHold(newvx, newvy, altitude, true, heading); } }); The following code stops flying the drone in offboard mode and tells the drone to loiter at its current location. This version of the code shows how to use the AsyncResult without blocking on a wait call. vehicle->releaseControl(); vehicle->loiter().then([=](bool rc) { printf(\"loiter command %s\\n\", rc ? \"succeeded\" : \"failed\"); }); The following code gets all configurable parameters from the drone and prints them: auto list = vehicle->getParamList(); auto end = list.end(); int count = 0; for (auto iter = list.begin(); iter < end; iter++) { count++; MavLinkParameter p = *iter; if (p.type == MAV_PARAM_TYPE_REAL32 || p.type == MAV_PARAM_TYPE_REAL64) { printf(\"%s=%f\\n\", p.name.c_str(), p.value); } else { printf(\"%s=%d\\n\", p.name.c_str(), static_cast<int>(p.value)); } } The following code sets a parameter on the Pixhawk to disable the USB safety check (this is handy if you are controlling the Pixhawk over USB using another onboard computer that is part of the drone itself). You should NOT do this if you are connecting your PC or laptop to the drone over USB. MavLinkParameter p; p.name = \"CBRK_USB_CHK\"; p.value = 197848; if (!vehicle->setParameter(p).wait(3000,&rc) || !rc) { printf(\"Setting the CBRK_USB_CHK failed\"); } MavLinkVehicle actually has a helper method for this called allowFlightControlOverUsb, so now you know how it is implemented :-) Advanced Connections You can wire up different configurations of mavlink pipelines using the MavLinkConnection class \"join\" method as shown below. Example 1, we connect to PX4 over serial, and proxy those messages through to QGroundControl and the LogViewer who are listening on remote ports. Example 2: simulation can talk to jMavSim and jMavSim connects to PX4. jMavSim can also manage multiple connections, so it can talk to the unreal simulator. Another MavLinkConnection can be joined to proxy connections that jMavSim doesn't support, like the LogViewer or a remote camera node. Example 3: we use MavLinkConnection to connect to PX4 over serial, then join additional connections for all our remote nodes including jMavSim. Example 4: We can also do distributed systems to control the drone remotely:","title":"MavLinkCom"},{"location":"mavlinkcom/#welcome-to-mavlinkcom","text":"MavLinkCom is a cross-platform C++ library that helps connect to and communicate with MavLink based vehicles. Specifically, this library is designed to work well with PX4 based drones.","title":"Welcome to MavLinkCom"},{"location":"mavlinkcom/#design","text":"You can view and edit the Design.dgml diagram in Visual Studio.
The following are the most important classes in this library.","title":"Design"},{"location":"mavlinkcom/#mavlinknode","text":"This is the base class for all MavLinkNodes (subclasses include MavLinkVehicle, MavLinkVideoClient and MavLinkVideoServer). The node connects to your mavlink enabled vehicle via a MavLinkConnection and provides methods for sending MavLinkMessages and MavLinkCommands and for subscribing to receive messages. This base class also stores the local system id and component id your app wants to use to identify itself to your remote vehicle. You can also call startHeartbeat to send regular heartbeat messages to keep the connection alive.","title":"MavLinkNode"},{"location":"mavlinkcom/#mavlinkmessage","text":"This is the encoded MavLinkMessage. For those who have used the mavlink.h C API, this is similar to mavlink_message_t. You do not create these manually; they are encoded from a strongly typed MavLinkMessageBase subclass.","title":"MavLinkMessage"},{"location":"mavlinkcom/#strongly-typed-message-and-command-classes","text":"The MavLinkComGenerator parses the mavlink common.xml message definitions and generates all the MavLink* MavLinkMessageBase subclasses as well as a bunch of handy mavlink enums and a bunch of strongly typed MavLinkCommand subclasses.","title":"Strongly typed message and command classes"},{"location":"mavlinkcom/#mavlinkmessagebase","text":"This is the base class for a set of strongly typed message classes that are code generated by the MavLinkComGenerator project. This replaces the C messages defined in the mavlink C API and provides a slightly more object oriented way to send and receive messages via sendMessage on MavLinkNode. These classes have encode/decode methods that convert to and from the MavLinkMessage class.","title":"MavLinkMessageBase"},{"location":"mavlinkcom/#mavlinkcommand","text":"This is the base class for a set of strongly typed command classes that are code generated by the MavLinkComGenerator project. This replaces the C definitions defined in the mavlink C API and provides a more object oriented way to send commands via the sendCommand method on MavLinkNode. The MavLinkNode takes care of turning these into the underlying mavlink COMMAND_LONG message.","title":"MavLinkCommand"},{"location":"mavlinkcom/#mavlinkconnection","text":"This class provides static helper methods for creating connections to remote MavLink nodes over serial ports, UDP, or TCP sockets. This class provides a way to subscribe to receive messages from that node in a pub/sub way so you can have multiple subscribers on the same connection. MavLinkVehicle uses this to track various messages that define the overall vehicle state.","title":"MavLinkConnection"},{"location":"mavlinkcom/#mavlinkvehicle","text":"MavLinkVehicle is a MavLinkNode that tracks various messages that define the overall vehicle state and provides a VehicleState struct containing a snapshot of that state, including home position, current orientation, local position, global position, and so on. This class also provides a bunch of helper methods that wrap commonly used commands providing simple method calls to do things like arm, disarm, takeoff, land, go to a local coordinate, and fly under offboard control either by position or velocity control.","title":"MavLinkVehicle"},{"location":"mavlinkcom/#mavlinktcpserver","text":"This helper class provides a way to set up a \"server\" that accepts MavLinkConnections from remote nodes.
You can use this class to get a connection that you can then give to MavLinkVideoServer to serve images over MavLink.","title":"MavLinkTcpServer"},{"location":"mavlinkcom/#mavlinkftpclient","text":"This helper class takes a given MavLinkConnection and provides FTP client support for the MAVLINK_MSG_ID_FILE_TRANSFER_PROTOCOL for vehicles that support the FTP capability. This class provides simple methods to list directory contents, and to get and put files.","title":"MavLinkFtpClient"},{"location":"mavlinkcom/#mavlinkvideoclient","text":"This helper class takes a given MavLinkConnection and provides helper methods for requesting video from a remote node and packaging up the MAVLINK_MSG_ID_DATA_TRANSMISSION_HANDSHAKE and MAVLINK_MSG_ID_ENCAPSULATED_DATA messages into simple-to-use MavLinkVideoFrames.","title":"MavLinkVideoClient"},{"location":"mavlinkcom/#mavlinkvideoserver","text":"This helper class takes a given MavLinkConnection and provides the server side of the MavLinkVideoClient protocol, including helper methods for notifying when there is a video request to process (hasVideoRequest) and a method to send video frames (sendFrame) which will generate the right MAVLINK_MSG_ID_DATA_TRANSMISSION_HANDSHAKE and MAVLINK_MSG_ID_ENCAPSULATED_DATA sequence.","title":"MavLinkVideoServer"},{"location":"mavlinkcom/#examples","text":"The following code from the UnitTest project shows how to connect to a Pixhawk flight controller over USB serial port, then wait for the first heartbeat message to be received: auto connection = MavLinkConnection::connectSerial(\"drone\", \"/dev/ttyACM0\", 115200, \"sh /etc/init.d/rc.usb\\n\"); MavLinkHeartbeat heartbeat; if (!waitForHeartbeat(10000, heartbeat)) { throw std::runtime_error(\"Received no heartbeat from PX4 after 10 seconds\"); } The following code connects to the serial port, and then forwards all messages to and from QGroundControl to that drone using another connection that is joined to the drone stream. auto droneConnection = MavLinkConnection::connectSerial(\"drone\", \"/dev/ttyACM0\", 115200, \"sh /etc/init.d/rc.usb\\n\"); auto proxyConnection = MavLinkConnection::connectRemoteUdp(\"qgc\", \"127.0.0.1\", \"127.0.0.1\", 14550); droneConnection->join(proxyConnection); The following code then takes that connection and turns on heartbeats and starts tracking vehicle information using local system id 166 and component id 1. auto vehicle = std::make_shared<MavLinkVehicle>(166, 1); vehicle->connect(connection); vehicle->startHeartbeat(); std::this_thread::sleep_for(std::chrono::seconds(5)); VehicleState state = vehicle->getVehicleState(); printf(\"Home position is %s, %f,%f,%f\\n\", state.home.is_set ?
\"set\" : \"not set\", state.home.global_pos.lat, state.home.global_pos.lon, state.home.global_pos.alt); The following code uses the vehicle object to arm the drone and take off and wait for the takeoff altitude to be reached: bool rc = false; if (!vehicle->armDisarm(true).wait(3000, &rc) || !rc) { printf(\"arm command failed\\n\"); return; } if (!vehicle->takeoff(targetAlt).wait(3000, &rc) || !rc) { printf(\"takeoff command failed\\n\"); return; } int version = vehicle->getVehicleStateVersion(); while (true) { int newVersion = vehicle->getVehicleStateVersion(); if (version != newVersion) { VehicleState state = vehicle->getVehicleState(); float alt = state.local_est.pos.z; if (alt >= targetAlt - delta && alt <= targetAlt + delta) { reached = true; printf(\"Target altitude reached\\n\"); break; } } else { std::this_thread::sleep_for(std::chrono::milliseconds(10)); } } The following code uses offboard control to make the drone fly in a circle with the camera pointed at the center. Here we use the subscribe method to check each new local position message so that we can compute the new velocity vector as soon as that new position is received. We request a high rate for those messages using setMessageInterval to ensure a smooth circular orbit. vehicle->setMessageInterval((int)MavLinkMessageIds::MAVLINK_MSG_ID_LOCAL_POSITION_NED, 30); vehicle->requestControl(); int subscription = vehicle->getConnection()->subscribe( [&](std::shared_ptr<MavLinkConnection> connection, const MavLinkMessage& m) { if (m.msgid == (int)MavLinkMessageIds::MAVLINK_MSG_ID_LOCAL_POSITION_NED) { // convert generic msg to strongly typed message. MavLinkLocalPositionNed localPos; localPos.decode(m); float x = localPos.x; float y = localPos.y; float dx = x - cx; float dy = y - cy; float angle = atan2(dy, dx); if (angle < 0) angle += M_PI * 2; float tangent = angle + M_PI_2; double newvx = orbitSpeed * cos(tangent); double newvy = orbitSpeed * sin(tangent); float heading = angle + M_PI; vehicle->moveByLocalVelocityWithAltHold(newvx, newvy, altitude, true, heading); } }); The following code stops flying the drone in offboard mode and tells the drone to loiter at its current location. This version of the code shows how to use the AsyncResult without blocking on a wait call. vehicle->releaseControl(); vehicle->loiter().then([=](bool rc) { printf(\"loiter command %s\\n\", rc ? \"succeeded\" : \"failed\"); }); The following code gets all configurable parameters from the drone and prints them: auto list = vehicle->getParamList(); auto end = list.end(); int count = 0; for (auto iter = list.begin(); iter < end; iter++) { count++; MavLinkParameter p = *iter; if (p.type == MAV_PARAM_TYPE_REAL32 || p.type == MAV_PARAM_TYPE_REAL64) { printf(\"%s=%f\\n\", p.name.c_str(), p.value); } else { printf(\"%s=%d\\n\", p.name.c_str(), static_cast<int>(p.value)); } } The following code sets a parameter on the Pixhawk to disable the USB safety check (this is handy if you are controlling the Pixhawk over USB using another onboard computer that is part of the drone itself). You should NOT do this if you are connecting your PC or laptop to the drone over USB.
MavLinkParameter p; p.name = \"CBRK_USB_CHK\"; p.value = 197848; if (!vehicle->setParameter(p).wait(3000,&rc) || !rc) { printf(\"Setting the CBRK_USB_CHK failed\"); } MavLinkVehicle actually has a helper method for this called allowFlightControlOverUsb, so now you know how it is implemented :-)","title":"Examples"},{"location":"mavlinkcom/#advanced-connections","text":"You can wire up different configurations of mavlink pipelines using the MavLinkConnection class \"join\" method as shown below. Example 1, we connect to PX4 over serial, and proxy those messages through to QGroundControl and the LogViewer who are listening on remote ports. Example 2: simulation can talk to jMavSim and jMavSim connects to PX4. jMavSim can also manage multiple connections, so it can talk to the unreal simulator. Another MavLinkConnection can be joined to proxy connections that jMavSim doesn't support, like the LogViewer or a remote camera node. Example 3: we use MavLinkConnection to connect to PX4 over serial, then join additional connections for all our remote nodes including jMavSim. Example 4: We can also do distributed systems to control the drone remotely:","title":"Advanced Connections"},{"location":"mavlinkcom_mocap/","text":"Welcome to MavLinkMoCap This folder contains the MavLinkMoCap library which connects to an OptiTrack camera system for accurate indoor location. Dependencies: OptiTrack Motive . MavLinkCom . Setup RigidBody First you need to define a RigidBody named 'Quadrocopter' using Motive. See Rigid_Body_Tracking . MavLinkTest Use MavLinkTest to talk to your PX4 drone, with \"-server:addr:port\", for example, when connected to the drone wifi use: MavLinkMoCap -server:10.42.0.228:14590 \"-project:D:\\OptiTrack\\Motive Project 2016-12-19 04.09.42 PM.ttp\" This publishes the ATT_POS_MOCAP messages and you can proxy those through to the PX4 by running MavLinkTest on the dronebrain using: MavLinkTest -serial:/dev/ttyACM0,115200 -proxy:10.42.0.228:14590 Now the drone will get the ATT_POS_MOCAP and you should see the light turn green, meaning it now has a home position and is ready to fly.","title":"MavLink MoCap"},{"location":"mavlinkcom_mocap/#welcome-to-mavlinkmocap","text":"This folder contains the MavLinkMoCap library which connects to an OptiTrack camera system for accurate indoor location.","title":"Welcome to MavLinkMoCap"},{"location":"mavlinkcom_mocap/#dependencies","text":"OptiTrack Motive . MavLinkCom .","title":"Dependencies:"},{"location":"mavlinkcom_mocap/#setup-rigidbody","text":"First you need to define a RigidBody named 'Quadrocopter' using Motive. See Rigid_Body_Tracking .","title":"Setup RigidBody"},{"location":"mavlinkcom_mocap/#mavlinktest","text":"Use MavLinkTest to talk to your PX4 drone, with \"-server:addr:port\", for example, when connected to the drone wifi use: MavLinkMoCap -server:10.42.0.228:14590 \"-project:D:\\OptiTrack\\Motive Project 2016-12-19 04.09.42 PM.ttp\" This publishes the ATT_POS_MOCAP messages and you can proxy those through to the PX4 by running MavLinkTest on the dronebrain using: MavLinkTest -serial:/dev/ttyACM0,115200 -proxy:10.42.0.228:14590 Now the drone will get the ATT_POS_MOCAP and you should see the light turn green, meaning it now has a home position and is ready to fly.","title":"MavLinkTest"},{"location":"meshes/","text":"How to Access Meshes in AIRSIM Cosys-AirSim supports the ability to access the static meshes that make up the scene. Mesh structure Each mesh is represented with the below struct.
struct MeshPositionVertexBuffersResponse { Vector3r position; Quaternionr orientation; std::vector<float> vertices; std::vector<int> indices; std::string name; }; The position and orientation are in the Unreal coordinate system. The mesh itself is a triangular mesh represented by the vertices and the indices. This triangular mesh type is typically called a Face-Vertex Mesh: every triplet of indices holds the indices of the vertices that make up one triangle/face. The x,y,z coordinates of the vertices are all stored in a single flat vector, so the vertices vector has length 3N where N is the number of vertices. The positions of the vertices are the global positions in the Unreal coordinate system, meaning they have already been transformed by the position and orientation. How to use The API to get the meshes in the scene is quite simple. However, one should note that the function call is very expensive and should very rarely be called. In general this is fine because this function only accesses the static meshes, which for most applications do not change during the duration of your program. Note that you will have to use a third-party library or your own custom code to actually interact with the received meshes. Below I utilize the Python bindings of libigl to visualize the received meshes. import cosysairsim as airsim import numpy as np AIRSIM_HOST_IP='127.0.0.1' client = airsim.VehicleClient(ip=AIRSIM_HOST_IP) client.confirmConnection() # The list of returned meshes is received via this function meshes=client.simGetMeshPositionVertexBuffers() index=0 for m in meshes: # Finds one of the cube meshes in the Blocks environment if 'cube' in m.name: # Code from here on relies on libigl. Libigl uses pybind11 to wrap C++ code. So here the built pyigl.so # library is in the same directory as this example code. # This is here as code for your own mesh library should require something similar from pyigl import * from iglhelpers import * # Convert the lists to numpy arrays vertex_list=np.array(m.vertices,dtype=np.float32) indices=np.array(m.indices,dtype=np.uint32) num_vertices=int(len(vertex_list)/3) num_indices=len(indices) # Libigl requires the shape to be Nx3 where N is number of vertices or indices # It also requires the actual type to be double(float64) for vertices and int64 for the triangles/indices vertices_reshaped=vertex_list.reshape((num_vertices,3)) indices_reshaped=indices.reshape((int(num_indices/3),3)) vertices_reshaped=vertices_reshaped.astype(np.float64) indices_reshaped=indices_reshaped.astype(np.int64) # Libigl function to convert to internal Eigen format v_eig=p2e(vertices_reshaped) i_eig=p2e(indices_reshaped) # View the mesh viewer = igl.glfw.Viewer() viewer.data().set_mesh(v_eig,i_eig) viewer.launch() break","title":"Mesh Vertex Buffers"},{"location":"meshes/#how-to-access-meshes-in-airsim","text":"Cosys-AirSim supports the ability to access the static meshes that make up the scene.","title":"How to Access Meshes in AirSim"},{"location":"meshes/#mesh-structure","text":"Each mesh is represented with the struct below. struct MeshPositionVertexBuffersResponse { Vector3r position; Quaternionr orientation; std::vector<float> vertices; std::vector<int> indices; std::string name; }; The position and orientation are in the Unreal coordinate system. The mesh itself is a triangular mesh represented by the vertices and the indices. This triangular mesh type is typically called a Face-Vertex Mesh: every triplet of indices holds the indices of the vertices that make up one triangle/face. The x,y,z coordinates of the vertices are all stored in a single flat vector, so the vertices vector has length 3N where N is the number of vertices. The positions of the vertices are the global positions in the Unreal coordinate system, meaning they have already been transformed by the position and orientation.
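Because the layout just described (a flat 3N vertex vector plus index triplets) maps directly onto the Wavefront OBJ format, here is a minimal sketch, assuming only numpy, that dumps one received mesh to an OBJ file instead of using libigl. The 'cube' filter and the out.obj path are illustrative placeholders, not part of the API:

```python
import numpy as np
import cosysairsim as airsim

client = airsim.VehicleClient()
client.confirmConnection()

for m in client.simGetMeshPositionVertexBuffers():
    if 'cube' in m.name:  # illustrative filter, as in the libigl example above
        # Flat [x0, y0, z0, x1, ...] vector -> Nx3 vertex array
        verts = np.array(m.vertices, dtype=np.float64).reshape(-1, 3)
        # Each triplet of indices is one triangle (Face-Vertex layout)
        faces = np.array(m.indices, dtype=np.int64).reshape(-1, 3)
        with open('out.obj', 'w') as f:  # illustrative output path
            for v in verts:
                f.write('v {} {} {}\n'.format(*v))
            for t in faces:
                # OBJ face indices are 1-based
                f.write('f {} {} {}\n'.format(t[0] + 1, t[1] + 1, t[2] + 1))
        break
```

The resulting OBJ can be opened in any standard mesh viewer, which is often enough for a quick sanity check of the returned geometry.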
","title":"Mesh structure"},{"location":"meshes/#how-to-use","text":"The API to get the meshes in the scene is quite simple. However, one should note that the function call is very expensive and should very rarely be called. In general this is fine because this function only accesses the static meshes, which for most applications do not change during the duration of your program. Note that you will have to use a third-party library or your own custom code to actually interact with the received meshes. Below I utilize the Python bindings of libigl to visualize the received meshes. import cosysairsim as airsim import numpy as np AIRSIM_HOST_IP='127.0.0.1' client = airsim.VehicleClient(ip=AIRSIM_HOST_IP) client.confirmConnection() # The list of returned meshes is received via this function meshes=client.simGetMeshPositionVertexBuffers() index=0 for m in meshes: # Finds one of the cube meshes in the Blocks environment if 'cube' in m.name: # Code from here on relies on libigl. Libigl uses pybind11 to wrap C++ code. So here the built pyigl.so # library is in the same directory as this example code. # This is here as code for your own mesh library should require something similar from pyigl import * from iglhelpers import * # Convert the lists to numpy arrays vertex_list=np.array(m.vertices,dtype=np.float32) indices=np.array(m.indices,dtype=np.uint32) num_vertices=int(len(vertex_list)/3) num_indices=len(indices) # Libigl requires the shape to be Nx3 where N is number of vertices or indices # It also requires the actual type to be double(float64) for vertices and int64 for the triangles/indices vertices_reshaped=vertex_list.reshape((num_vertices,3)) indices_reshaped=indices.reshape((int(num_indices/3),3)) vertices_reshaped=vertices_reshaped.astype(np.float64) indices_reshaped=indices_reshaped.astype(np.int64) # Libigl function to convert to internal Eigen format v_eig=p2e(vertices_reshaped) i_eig=p2e(indices_reshaped) # View the mesh viewer = igl.glfw.Viewer() viewer.data().set_mesh(v_eig,i_eig) viewer.launch() break","title":"How to use"},{"location":"modify_recording_data/","text":"Modifying Recording Data Cosys-AirSim has a Recording feature to easily collect data and images. The Recording APIs also allow starting and stopping the recording via the API. However, the data recorded by default might not be sufficient for your use cases, and it might be preferable to record additional data such as IMU, GPS sensors, rotor speed for copters, etc. You can use the existing Python and C++ APIs to get the information and store it as required, especially for Lidar. Another option is to add small fields such as GPS or internal data such as the Unreal position by modifying the recording methods inside Cosys-AirSim. This page describes the specific methods which you might need to change. The recorded data is written to an airsim_rec.txt file in a tab-separated format, with images in an images/ folder. The entire folder is by default placed in the Documents folder (or the location specified in settings) with the timestamp of when the recording started in %Y-%M-%D-%H-%M-%S format.
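Because airsim_rec.txt is tab-separated with a single header row (the per-vehicle fields are listed next), it can be loaded in one call with pandas; a minimal sketch, where the recording folder name is a hypothetical example:

```python
import pandas as pd

# airsim_rec.txt is tab-separated with one header row
rec = pd.read_csv('2024-01-01-12-00-00/airsim_rec.txt', sep='\t')  # hypothetical path
print(rec.columns.tolist())                     # VehicleName, TimeStamp, POS_X, ...
print(rec[['POS_X', 'POS_Y', 'POS_Z']].head())  # pose trajectory samples
```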
Car vehicles record the following fields - VehicleName TimeStamp POS_X POS_Y POS_Z Q_W Q_X Q_Y Q_Z Throttle Steering Brake Gear Handbrake RPM Speed ImageFile For Multirotor - VehicleName TimeStamp POS_X POS_Y POS_Z Q_W Q_X Q_Y Q_Z ImageFile Code Changes Note that this requires building and using Cosys-AirSim from source. You can compile a binary yourself after modifying if needed. The primary method which fills the data to be stored is PawnSimApi::getRecordFileLine ; it's the base method for all the vehicles, and Car overrides it to log additional data, as can be seen in CarPawnSimApi::getRecordFileLine . To record additional data for multirotor, you can add a similar method in the MultirotorPawnSimApi.cpp/h files which overrides the base class implementation and appends other data. The currently logged data can also be modified and removed as needed. E.g. recording GPS, IMU and Barometer data also for multirotor - // MultirotorPawnSimApi.cpp std::string MultirotorPawnSimApi::getRecordFileLine(bool is_header_line) const { std::string common_line = PawnSimApi::getRecordFileLine(is_header_line); if (is_header_line) { return common_line + \"Latitude\\tLongitude\\tAltitude\\tPressure\\tAccX\\tAccY\\tAccZ\\t\"; } const auto& state = vehicle_api_->getMultirotorState(); const auto& bar_data = vehicle_api_->getBarometerData(\"\"); const auto& imu_data = vehicle_api_->getImuData(\"\"); std::ostringstream ss; ss << common_line; ss << state.gps_location.latitude << \"\\t\" << state.gps_location.longitude << \"\\t\" << state.gps_location.altitude << \"\\t\"; ss << bar_data.pressure << \"\\t\"; ss << imu_data.linear_acceleration.x() << \"\\t\" << imu_data.linear_acceleration.y() << \"\\t\" << imu_data.linear_acceleration.z() << \"\\t\"; return ss.str(); } // MultirotorPawnSimApi.h virtual std::string getRecordFileLine(bool is_header_line) const override;","title":"Modifying Recording Data"},{"location":"modify_recording_data/#modifying-recording-data","text":"Cosys-AirSim has a Recording feature to easily collect data and images. The Recording APIs also allow starting and stopping the recording via the API. However, the data recorded by default might not be sufficient for your use cases, and it might be preferable to record additional data such as IMU, GPS sensors, rotor speed for copters, etc. You can use the existing Python and C++ APIs to get the information and store it as required, especially for Lidar. Another option is to add small fields such as GPS or internal data such as the Unreal position by modifying the recording methods inside Cosys-AirSim. This page describes the specific methods which you might need to change. The recorded data is written to an airsim_rec.txt file in a tab-separated format, with images in an images/ folder. The entire folder is by default placed in the Documents folder (or the location specified in settings) with the timestamp of when the recording started in %Y-%M-%D-%H-%M-%S format. Car vehicles record the following fields - VehicleName TimeStamp POS_X POS_Y POS_Z Q_W Q_X Q_Y Q_Z Throttle Steering Brake Gear Handbrake RPM Speed ImageFile For Multirotor - VehicleName TimeStamp POS_X POS_Y POS_Z Q_W Q_X Q_Y Q_Z ImageFile","title":"Modifying Recording Data"},{"location":"modify_recording_data/#code-changes","text":"Note that this requires building and using Cosys-AirSim from source. You can compile a binary yourself after modifying if needed.
The primary method which fills the data to be stored is PawnSimApi::getRecordFileLine ; it's the base method for all the vehicles, and Car overrides it to log additional data, as can be seen in CarPawnSimApi::getRecordFileLine . To record additional data for multirotor, you can add a similar method in the MultirotorPawnSimApi.cpp/h files which overrides the base class implementation and appends other data. The currently logged data can also be modified and removed as needed. E.g. recording GPS, IMU and Barometer data also for multirotor - // MultirotorPawnSimApi.cpp std::string MultirotorPawnSimApi::getRecordFileLine(bool is_header_line) const { std::string common_line = PawnSimApi::getRecordFileLine(is_header_line); if (is_header_line) { return common_line + \"Latitude\\tLongitude\\tAltitude\\tPressure\\tAccX\\tAccY\\tAccZ\\t\"; } const auto& state = vehicle_api_->getMultirotorState(); const auto& bar_data = vehicle_api_->getBarometerData(\"\"); const auto& imu_data = vehicle_api_->getImuData(\"\"); std::ostringstream ss; ss << common_line; ss << state.gps_location.latitude << \"\\t\" << state.gps_location.longitude << \"\\t\" << state.gps_location.altitude << \"\\t\"; ss << bar_data.pressure << \"\\t\"; ss << imu_data.linear_acceleration.x() << \"\\t\" << imu_data.linear_acceleration.y() << \"\\t\" << imu_data.linear_acceleration.z() << \"\\t\"; return ss.str(); } // MultirotorPawnSimApi.h virtual std::string getRecordFileLine(bool is_header_line) const override;","title":"Code Changes"},{"location":"multi_vehicle/","text":"Multiple Vehicles in AirSim Since release 1.2, AirSim is fully enabled for multiple vehicles. This capability allows you to create multiple vehicles easily and use APIs to control them. Creating Multiple Vehicles It's as easy as specifying them in settings.json . The Vehicles element allows you to specify a list of vehicles you want to create along with their initial positions and orientations. The positions are specified in NED coordinates in SI units with the origin set at the Player Start component in the Unreal environment. The orientation is specified as Yaw, Pitch and Roll in degrees. Creating Multiple Cars { \"SettingsVersion\": 2.0, \"SimMode\": \"Car\", \"Vehicles\": { \"Car1\": { \"VehicleType\": \"PhysXCar\", \"X\": 4, \"Y\": 0, \"Z\": -2 }, \"Car2\": { \"VehicleType\": \"PhysXCar\", \"X\": -4, \"Y\": 0, \"Z\": -2, \"Yaw\": 90 } } } Creating Multiple Drones { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"Vehicles\": { \"Drone1\": { \"VehicleType\": \"SimpleFlight\", \"X\": 4, \"Y\": 0, \"Z\": -2, \"Yaw\": -180 }, \"Drone2\": { \"VehicleType\": \"SimpleFlight\", \"X\": 8, \"Y\": 0, \"Z\": -2 } } } Using APIs for Multiple Vehicles The new APIs since AirSim 1.2 allow you to specify vehicle_name . This name corresponds to keys in json settings (for example, Car1 or Drone2 above). Example code for cars Example code for multirotors Using APIs for multi-vehicles requires specifying the vehicle_name , which needs to be hardcoded in the script or requires parsing of the settings file. There's also a simple API listVehicles() which returns a list (vector in C++) of strings containing the names of the current vehicles (see the minimal control sketch below). For example, with the above settings for 2 Cars - >>> client.listVehicles() ['Car1', 'Car2'] Demo Creating vehicles at runtime through API In the latest main branch of AirSim, the simAddVehicle API can be used to create vehicles at runtime. This is useful to create many such vehicles without needing to specify them in the settings.
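Here is the minimal control sketch referenced above: it drives each settings-defined vehicle by vehicle_name, assuming the two-drone multirotor settings shown earlier. The concurrent-futures pattern (start every async call, then join each) is a sketch of how multi-vehicle control is typically written, and the same pattern works for vehicles spawned at runtime as described next:

```python
import cosysairsim as airsim

client = airsim.MultirotorClient()
client.confirmConnection()

names = client.listVehicles()  # e.g. ['Drone1', 'Drone2']
for name in names:
    client.enableApiControl(True, vehicle_name=name)
    client.armDisarm(True, vehicle_name=name)

# Kick off both takeoffs concurrently, then wait for each to finish
futures = [client.takeoffAsync(vehicle_name=name) for name in names]
for future in futures:
    future.join()
```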
There are some limitations of this currently, described below - simAddVehicle takes in the following arguments: vehicle_name : Name of the vehicle to be created; this should be unique for each vehicle, including any existing ones defined in the settings.json vehicle_type : Type of vehicle, e.g. \"simpleflight\". Currently only SimpleFlight, PhysXCar, ComputerVision are supported, in their respective SimModes. Other vehicle types including PX4 and ArduPilot-related aren't supported pose : Initial pose of the vehicle pawn_path : Vehicle blueprint path, default empty which uses the default blueprint for the vehicle type Returns: bool Whether vehicle was created The usual APIs can be used to control and interact with the vehicle once created, with the vehicle_name parameter. Specifying other settings such as additional cameras, etc. isn't possible currently; a future enhancement could be passing a JSON string of settings for the vehicle. It also works with the listVehicles() API described above, so the vehicles spawned would be included in the list. For some examples, check out HelloSpawnedDrones.cpp - And runtime_car.py -","title":"Multiple Vehicles"},{"location":"multi_vehicle/#multiple-vehicles-in-airsim","text":"Since release 1.2, AirSim is fully enabled for multiple vehicles. This capability allows you to create multiple vehicles easily and use APIs to control them.","title":"Multiple Vehicles in AirSim"},{"location":"multi_vehicle/#creating-multiple-vehicles","text":"It's as easy as specifying them in settings.json . The Vehicles element allows you to specify a list of vehicles you want to create along with their initial positions and orientations. The positions are specified in NED coordinates in SI units with the origin set at the Player Start component in the Unreal environment. The orientation is specified as Yaw, Pitch and Roll in degrees.","title":"Creating Multiple Vehicles"},{"location":"multi_vehicle/#creating-multiple-cars","text":"{ \"SettingsVersion\": 2.0, \"SimMode\": \"Car\", \"Vehicles\": { \"Car1\": { \"VehicleType\": \"PhysXCar\", \"X\": 4, \"Y\": 0, \"Z\": -2 }, \"Car2\": { \"VehicleType\": \"PhysXCar\", \"X\": -4, \"Y\": 0, \"Z\": -2, \"Yaw\": 90 } } }","title":"Creating Multiple Cars"},{"location":"multi_vehicle/#creating-multiple-drones","text":"{ \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"Vehicles\": { \"Drone1\": { \"VehicleType\": \"SimpleFlight\", \"X\": 4, \"Y\": 0, \"Z\": -2, \"Yaw\": -180 }, \"Drone2\": { \"VehicleType\": \"SimpleFlight\", \"X\": 8, \"Y\": 0, \"Z\": -2 } } }","title":"Creating Multiple Drones"},{"location":"multi_vehicle/#using-apis-for-multiple-vehicles","text":"The new APIs since AirSim 1.2 allow you to specify vehicle_name . This name corresponds to keys in json settings (for example, Car1 or Drone2 above). Example code for cars Example code for multirotors Using APIs for multi-vehicles requires specifying the vehicle_name , which needs to be hardcoded in the script or requires parsing of the settings file. There's also a simple API listVehicles() which returns a list (vector in C++) of strings containing the names of the current vehicles. For example, with the above settings for 2 Cars - >>> client.listVehicles() ['Car1', 'Car2']","title":"Using APIs for Multiple Vehicles"},{"location":"multi_vehicle/#demo","text":"","title":"Demo"},{"location":"multi_vehicle/#creating-vehicles-at-runtime-through-api","text":"In the latest main branch of AirSim, the simAddVehicle API can be used to create vehicles at runtime.
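A minimal sketch of spawning and flying a drone at runtime with simAddVehicle, using the arguments listed above; the vehicle name 'DroneX' and the pose are made up for illustration, and a Multirotor SimMode is assumed:

```python
import cosysairsim as airsim

client = airsim.MultirotorClient()
client.confirmConnection()

# Spawn a new simpleflight drone 2 m to the right of the origin (arbitrary pose)
pose = airsim.Pose(airsim.Vector3r(0, 2, 0), airsim.to_quaternion(0, 0, 0))
if client.simAddVehicle('DroneX', 'simpleflight', pose):  # 'DroneX' is a made-up name
    client.enableApiControl(True, vehicle_name='DroneX')
    client.armDisarm(True, vehicle_name='DroneX')
    client.takeoffAsync(vehicle_name='DroneX').join()
    print(client.listVehicles())  # the spawned vehicle is included in this list too
```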
This is useful to create many such vehicles without needing to specify them in the settings. There are some limitations of this currently, described below - simAddVehicle takes in the following arguments: vehicle_name : Name of the vehicle to be created; this should be unique for each vehicle, including any existing ones defined in the settings.json vehicle_type : Type of vehicle, e.g. \"simpleflight\". Currently only SimpleFlight, PhysXCar, ComputerVision are supported, in their respective SimModes. Other vehicle types including PX4 and ArduPilot-related aren't supported pose : Initial pose of the vehicle pawn_path : Vehicle blueprint path, default empty which uses the default blueprint for the vehicle type Returns: bool Whether vehicle was created The usual APIs can be used to control and interact with the vehicle once created, with the vehicle_name parameter. Specifying other settings such as additional cameras, etc. isn't possible currently; a future enhancement could be passing a JSON string of settings for the vehicle. It also works with the listVehicles() API described above, so the vehicles spawned would be included in the list. For some examples, check out HelloSpawnedDrones.cpp - And runtime_car.py -","title":"Creating vehicles at runtime through API"},{"location":"object_detection/","text":"Object Detection About This feature lets you generate object detections using existing cameras in Cosys-AirSim, similar to a detection DNN. Using the API you can control which objects to detect by name and radius from the camera. One can control these settings for each camera, image type and vehicle combination separately. API Set the mesh name to detect in wildcard format simAddDetectionFilterMeshName(camera_name, image_type, mesh_name, vehicle_name = '') Clear all mesh names previously added simClearDetectionMeshNames(camera_name, image_type, vehicle_name = '') Set the detection radius in cm simSetDetectionFilterRadius(camera_name, image_type, radius_cm, vehicle_name = '') Get detections simGetDetections(camera_name, image_type, vehicle_name = '') Note that if using the Annotation camera one also has to give the annotation_name argument to choose the right annotation camera. For example: simGetDetections(camera_name, image_type, vehicle_name = '', annotation_name=\"mygreyscaleannotation\") The return value of simGetDetections is a DetectionInfo array: DetectionInfo name = '' geo_point = GeoPoint() box2D = Box2D() box3D = Box3D() relative_pose = Pose() Usage example The Python script detection.py shows how to set detection parameters and shows the result in an OpenCV capture.
A minimal example using the API with the Blocks environment to detect Cylinder objects: import cosysairsim as airsim camera_name = \"0\" image_type = airsim.ImageType.Scene client = airsim.MultirotorClient() client.confirmConnection() client.simSetDetectionFilterRadius(camera_name, image_type, 80 * 100) # in [cm] client.simAddDetectionFilterMeshName(camera_name, image_type, \"Cylinder_*\") detections = client.simGetDetections(camera_name, image_type) client.simClearDetectionMeshNames(camera_name, image_type) Output result: Cylinder: { 'box2D': { 'max': { 'x_val': 617.025634765625, 'y_val': 583.5487060546875}, 'min': { 'x_val': 485.74359130859375, 'y_val': 438.33465576171875}}, 'box3D': { 'max': { 'x_val': 4.900000095367432, 'y_val': 0.7999999523162842, 'z_val': 0.5199999809265137}, 'min': { 'x_val': 3.8999998569488525, 'y_val': -0.19999998807907104, 'z_val': 1.5199999809265137}}, 'geo_point': { 'altitude': 16.979999542236328, 'latitude': 32.28772183970703, 'longitude': 34.864785008379876}, 'name': 'Cylinder9_2', 'relative_pose': { 'orientation': { 'w_val': 0.9929741621017456, 'x_val': 0.0038591264747083187, 'y_val': -0.11333247274160385, 'z_val': 0.03381215035915375}, 'position': { 'x_val': 4.400000095367432, 'y_val': 0.29999998211860657, 'z_val': 1.0199999809265137}}}","title":"Object Detection"},{"location":"object_detection/#object-detection","text":"","title":"Object Detection"},{"location":"object_detection/#about","text":"This feature lets you generate object detections using existing cameras in Cosys-AirSim, similar to a detection DNN. Using the API you can control which objects to detect by name and radius from the camera. One can control these settings for each camera, image type and vehicle combination separately.","title":"About"},{"location":"object_detection/#api","text":"Set the mesh name to detect in wildcard format simAddDetectionFilterMeshName(camera_name, image_type, mesh_name, vehicle_name = '') Clear all mesh names previously added simClearDetectionMeshNames(camera_name, image_type, vehicle_name = '') Set the detection radius in cm simSetDetectionFilterRadius(camera_name, image_type, radius_cm, vehicle_name = '') Get detections simGetDetections(camera_name, image_type, vehicle_name = '') Note that if using the Annotation camera one also has to give the annotation_name argument to choose the right annotation camera. For example: simGetDetections(camera_name, image_type, vehicle_name = '', annotation_name=\"mygreyscaleannotation\") The return value of simGetDetections is a DetectionInfo array: DetectionInfo name = '' geo_point = GeoPoint() box2D = Box2D() box3D = Box3D() relative_pose = Pose()","title":"API"},{"location":"object_detection/#usage-example","text":"The Python script detection.py shows how to set detection parameters and shows the result in an OpenCV capture.
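Complementing the minimal example that follows, here is a hedged sketch of iterating over the returned DetectionInfo objects; the field names follow the structure and output shown above, while the radius and mesh filter values are arbitrary:

```python
import cosysairsim as airsim

client = airsim.MultirotorClient()
client.confirmConnection()

camera_name, image_type = "0", airsim.ImageType.Scene
client.simSetDetectionFilterRadius(camera_name, image_type, 200 * 100)  # 200 m, in cm
client.simAddDetectionFilterMeshName(camera_name, image_type, "Cylinder_*")

for det in client.simGetDetections(camera_name, image_type):
    # box2D is in image pixels; relative_pose is relative to the camera
    width = det.box2D.max.x_val - det.box2D.min.x_val
    height = det.box2D.max.y_val - det.box2D.min.y_val
    print(f"{det.name}: {width:.0f}x{height:.0f} px, "
          f"{det.relative_pose.position.x_val:.1f} m ahead")
```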
A minimal example using the API with the Blocks environment to detect Cylinder objects: import cosysairsim as airsim camera_name = \"0\" image_type = airsim.ImageType.Scene client = airsim.MultirotorClient() client.confirmConnection() client.simSetDetectionFilterRadius(camera_name, image_type, 80 * 100) # in [cm] client.simAddDetectionFilterMeshName(camera_name, image_type, \"Cylinder_*\") detections = client.simGetDetections(camera_name, image_type) client.simClearDetectionMeshNames(camera_name, image_type) Output result: Cylinder: { 'box2D': { 'max': { 'x_val': 617.025634765625, 'y_val': 583.5487060546875}, 'min': { 'x_val': 485.74359130859375, 'y_val': 438.33465576171875}}, 'box3D': { 'max': { 'x_val': 4.900000095367432, 'y_val': 0.7999999523162842, 'z_val': 0.5199999809265137}, 'min': { 'x_val': 3.8999998569488525, 'y_val': -0.19999998807907104, 'z_val': 1.5199999809265137}}, 'geo_point': { 'altitude': 16.979999542236328, 'latitude': 32.28772183970703, 'longitude': 34.864785008379876}, 'name': 'Cylinder9_2', 'relative_pose': { 'orientation': { 'w_val': 0.9929741621017456, 'x_val': 0.0038591264747083187, 'y_val': -0.11333247274160385, 'z_val': 0.03381215035915375}, 'position': { 'x_val': 4.400000095367432, 'y_val': 0.29999998211860657, 'z_val': 1.0199999809265137}}}","title":"Usage example"},{"location":"pfm/","text":"pfm Format The Pfm (or Portable FloatMap) image format stores images as floating-point pixels and hence is not restricted to the usual 0-255 pixel value range. This is useful for HDR images or images that describe something other than colors, like depth. One good viewer for this file format is PfmPad . We don't recommend the Maverick photo viewer because it doesn't seem to show depth images properly. AirSim has code to write pfm files for C++ and to read as well as write them for Python .","title":"pfm format"},{"location":"pfm/#pfm-format","text":"The Pfm (or Portable FloatMap) image format stores images as floating-point pixels and hence is not restricted to the usual 0-255 pixel value range. This is useful for HDR images or images that describe something other than colors, like depth. One good viewer for this file format is PfmPad . We don't recommend the Maverick photo viewer because it doesn't seem to show depth images properly. AirSim has code to write pfm files for C++ and to read as well as write them for Python .","title":"pfm Format"},{"location":"playback/","text":"Playback AirSim supports playing back the high-level commands in a *.mavlink log file that were recorded using the MavLinkTest app for the purpose of comparing real and simulated flight. The recording.mavlink file is an example of a log captured from a real drone using the following command line: MavLinkTest -serial:/dev/ttyACM0,115200 -logdir:. The log file contains the commands performed, which included several \"orbit\" commands; the resulting GPS map of the flight looks like this: Side-by-side comparison Now we can copy the *.mavlink log file recorded by MavLinkTest to the PC running the Unreal simulator with the AirSim plugin. When the Simulator is running and the drone is parked in a place in a map that has room to do the same maneuvers, we can run this MavLinkTest command line: MavLinkTest -server:127.0.0.1:14550 This should connect to the simulator. Now you can enter this command: PlayLog recording.mavlink The same commands you performed on the real drone will now play again in the simulator. You can then press 't' to see the trace, and it will show you the trace of the real drone and the simulated drone.
Every time you press 't' again you can reset the lines so they are sync'd to the current position; this way I was able to capture a side-by-side trace of the \"orbit\" command performed in this recording, which generates the picture below. The pink line is the simulated flight and the red line is the real flight: Note: I'm using the ';' key in the simulator to take control of the camera position using the keyboard to get this shot. Parameters It may help to set the simulator up with some of the same flight parameters that your real drone is using; for example, in my case I was using a lower-than-normal cruise speed and a slow takeoff speed, and it helps to tell the simulator to wait a long time before disarming (COM_DISARM_LAND) and to turn off the safety switches NAV_RCL_ACT and NAV_DLL_ACT (don't do that on a real drone). param MPC_XY_CRUISE 2 param MPC_XY_VEL_MAX 2 param MPC_TKO_SPEED 1 param COM_DISARM_LAND 60 param NAV_RCL_ACT 0 param NAV_DLL_ACT 0","title":"Playing Logs"},{"location":"playback/#playback","text":"AirSim supports playing back the high-level commands in a *.mavlink log file that were recorded using the MavLinkTest app for the purpose of comparing real and simulated flight. The recording.mavlink file is an example of a log captured from a real drone using the following command line: MavLinkTest -serial:/dev/ttyACM0,115200 -logdir:. The log file contains the commands performed, which included several \"orbit\" commands; the resulting GPS map of the flight looks like this:","title":"Playback"},{"location":"playback/#side-by-side-comparison","text":"Now we can copy the *.mavlink log file recorded by MavLinkTest to the PC running the Unreal simulator with the AirSim plugin. When the Simulator is running and the drone is parked in a place in a map that has room to do the same maneuvers, we can run this MavLinkTest command line: MavLinkTest -server:127.0.0.1:14550 This should connect to the simulator. Now you can enter this command: PlayLog recording.mavlink The same commands you performed on the real drone will now play again in the simulator. You can then press 't' to see the trace, and it will show you the trace of the real drone and the simulated drone. Every time you press 't' again you can reset the lines so they are sync'd to the current position; this way I was able to capture a side-by-side trace of the \"orbit\" command performed in this recording, which generates the picture below. The pink line is the simulated flight and the red line is the real flight: Note: I'm using the ';' key in the simulator to take control of the camera position using the keyboard to get this shot.","title":"Side-by-side comparison"},{"location":"playback/#parameters","text":"It may help to set the simulator up with some of the same flight parameters that your real drone is using; for example, in my case I was using a lower-than-normal cruise speed and a slow takeoff speed, and it helps to tell the simulator to wait a long time before disarming (COM_DISARM_LAND) and to turn off the safety switches NAV_RCL_ACT and NAV_DLL_ACT (don't do that on a real drone). param MPC_XY_CRUISE 2 param MPC_XY_VEL_MAX 2 param MPC_TKO_SPEED 1 param COM_DISARM_LAND 60 param NAV_RCL_ACT 0 param NAV_DLL_ACT 0","title":"Parameters"},{"location":"px4_build/","text":"Building PX4 Source code Getting the PX4 source code is easy: sudo apt-get install git git clone https://github.com/PX4/PX4-Autopilot.git --recursive bash ./PX4-Autopilot/Tools/setup/ubuntu.sh --no-sim-tools cd PX4-Autopilot Now to build it you will need the right tools.
PX4 Build tools The full instructions are available on the dev.px4.io website, but we've copied the relevant subset of those instructions here for your convenience. (Note that BashOnWindows can be used to build the PX4 firmware; just follow the BashOnWindows instructions at the bottom of this page.) Then proceed with the Ubuntu setup for PX4. Build SITL version Now you can make the SITL version that runs in POSIX, from the Firmware folder you created above: make px4_sitl_default none_iris Note: this build system is quite special; it knows how to update git submodules (and there are a lot of them), then it runs cmake (if necessary), then it runs the build itself. So in a way the root Makefile is a meta-meta makefile :-) You might see prompts like this: ******************************************************************************* * IF YOU DID NOT CHANGE THIS FILE (OR YOU DON'T KNOW WHAT A SUBMODULE IS): * * Hit 'u' and <ENTER> to update ALL submodules and resolve this. * * (performs git submodule sync --recursive * * and git submodule update --init --recursive ) * ******************************************************************************* Every time you see this prompt type 'u' on your keyboard. It shouldn't take long, about 2 minutes. If all succeeds, the last line will link the px4 app, which you can then run using the following: make px4_sitl_default none_iris And you should see output that looks like this: creating new parameters file creating new dataman file ______ __ __ ___ | ___ \\ \\ \\ / / / | | |_/ / \\ V / / /| | | __/ / \\ / /_| | | | / /^\\ \\ \\___ | \\_| \\/ \\/ |_/ px4 starting. 18446744073709551615 WARNING: setRealtimeSched failed (not run as root?) ERROR [param] importing from 'rootfs/eeprom/parameters' failed (-1) Command 'param' failed, returned 1 SYS_AUTOSTART: curr: 0 -> new: 4010 SYS_MC_EST_GROUP: curr: 2 -> new: 1 INFO [dataman] Unkown restart, data manager file 'rootfs/fs/microsd/dataman' size is 11797680 bytes BAT_N_CELLS: curr: 0 -> new: 3 CAL_GYRO0_ID: curr: 0 -> new: 2293768 CAL_ACC0_ID: curr: 0 -> new: 1376264 CAL_ACC1_ID: curr: 0 -> new: 1310728 CAL_MAG0_ID: curr: 0 -> new: 196616 So this is good: the first run sets up the px4 parameters for SITL mode; a second run has less output. This app is also an interactive console where you can type commands. Type 'help' to see what they are and just type ctrl-C to kill it. You can do that and restart it any time; that's a great way to reset any wonky state if you need to (it's equivalent to a Pixhawk hardware reboot). ARM embedded tools If you plan to build the PX4 firmware for real Pixhawk hardware then you will need the gcc cross-compiler for the ARM Cortex-M4 chipset. You can get this compiler by following the PX4 DevGuide; specifically, it is installed by their ubuntu_sim_nuttx.sh setup script. After following those setup instructions you can verify the install by entering this command: arm-none-eabi-gcc --version . You should see the following output: arm-none-eabi-gcc (GNU Tools for Arm Embedded Processors 7-2017-q4-major) 7.2.1 20170904 (release) [ARM/embedded-7-branch revision 255204] Copyright (C) 2017 Free Software Foundation, Inc. This is free software; see the source for copying conditions. There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
Build PX4 for ARM hardware Now you can build the PX4 firmware for running on real Pixhawk hardware: make px4_fmu-v4 This build will take a little longer because it is building a lot more, including the NuttX real-time OS, all the drivers for the sensors in the Pixhawk flight controller, and more. It is also running the compiler in super size-squeezing mode so it can fit all that in a 1 megabyte ROM! One nice tidbit is you can plug in your Pixhawk USB, and type make px4_fmu-v4_default upload to flash the hardware with these brand new bits, so you don't need to use QGroundControl for that. Some Useful Parameters PX4 has many customizable parameters (over 700 of them, in fact) and to get the best results with Cosys-AirSim we have found the following parameters are handy: // be sure to enable the new position estimator module: param set SYS_MC_EST_GROUP 2 // increase default limits on cruise speed so you can move around a large map more quickly. param MPC_XY_CRUISE 10 param MPC_XY_VEL_MAX 10 param MPC_Z_VEL_MAX_DN 2 // increase timeout for auto-disarm on landing so that any long running app doesn't have to worry about it param COM_DISARM_LAND 60 // make it possible to fly without radio control attached (do NOT do this one on a real drone) param NAV_RCL_ACT 0 // enable new syslogger to get more information from PX4 logs param set SYS_LOGGER 1 Using BashOnWindows See Bash on Windows Toolchain .","title":"Building PX4"},{"location":"px4_build/#building-px4","text":"","title":"Building PX4"},{"location":"px4_build/#source-code","text":"Getting the PX4 source code is easy: sudo apt-get install git git clone https://github.com/PX4/PX4-Autopilot.git --recursive bash ./PX4-Autopilot/Tools/setup/ubuntu.sh --no-sim-tools cd PX4-Autopilot Now to build it you will need the right tools.","title":"Source code"},{"location":"px4_build/#px4-build-tools","text":"The full instructions are available on the dev.px4.io website, but we've copied the relevant subset of those instructions here for your convenience. (Note that BashOnWindows can be used to build the PX4 firmware; just follow the BashOnWindows instructions at the bottom of this page.) Then proceed with the Ubuntu setup for PX4.","title":"PX4 Build tools"},{"location":"px4_build/#build-sitl-version","text":"Now you can make the SITL version that runs in POSIX, from the Firmware folder you created above: make px4_sitl_default none_iris Note: this build system is quite special; it knows how to update git submodules (and there's a lot of them), then it runs cmake (if necessary), then it runs the build itself. So in a way the root Makefile is a meta-meta makefile :-) You might see prompts like this: ******************************************************************************* * IF YOU DID NOT CHANGE THIS FILE (OR YOU DON'T KNOW WHAT A SUBMODULE IS): * * Hit 'u' and <ENTER> to update ALL submodules and resolve this. * * (performs git submodule sync --recursive * * and git submodule update --init --recursive ) * ******************************************************************************* Every time you see this prompt type 'u' on your keyboard. It shouldn't take long, about 2 minutes. If all succeeds, the last line will link the px4 app, which you can then run using the following: make px4_sitl_default none_iris And you should see output that looks like this: creating new parameters file creating new dataman file ______ __ __ ___ | ___ \\ \\ \\ / / / | | |_/ / \\ V / / /| | | __/ / \\ / /_| | | | / /^\\ \\ \\___ | \\_| \\/ \\/ |_/ px4 starting.
18446744073709551615 WARNING: setRealtimeSched failed (not run as root?) ERROR [param] importing from 'rootfs/eeprom/parameters' failed (-1) Command 'param' failed, returned 1 SYS_AUTOSTART: curr: 0 -> new: 4010 SYS_MC_EST_GROUP: curr: 2 -> new: 1 INFO [dataman] Unkown restart, data manager file 'rootfs/fs/microsd/dataman' size is 11797680 bytes BAT_N_CELLS: curr: 0 -> new: 3 CAL_GYRO0_ID: curr: 0 -> new: 2293768 CAL_ACC0_ID: curr: 0 -> new: 1376264 CAL_ACC1_ID: curr: 0 -> new: 1310728 CAL_MAG0_ID: curr: 0 -> new: 196616 So this is good: the first run sets up the px4 parameters for SITL mode; a second run has less output. This app is also an interactive console where you can type commands. Type 'help' to see what they are and just type ctrl-C to kill it. You can do that and restart it any time; that's a great way to reset any wonky state if you need to (it's equivalent to a Pixhawk hardware reboot).","title":"Build SITL version"},{"location":"px4_build/#arm-embedded-tools","text":"If you plan to build the PX4 firmware for real Pixhawk hardware then you will need the gcc cross-compiler for the ARM Cortex-M4 chipset. You can get this compiler by following the PX4 DevGuide; specifically, it is installed by their ubuntu_sim_nuttx.sh setup script. After following those setup instructions you can verify the install by entering this command: arm-none-eabi-gcc --version . You should see the following output: arm-none-eabi-gcc (GNU Tools for Arm Embedded Processors 7-2017-q4-major) 7.2.1 20170904 (release) [ARM/embedded-7-branch revision 255204] Copyright (C) 2017 Free Software Foundation, Inc. This is free software; see the source for copying conditions. There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.","title":"ARM embedded tools"},{"location":"px4_build/#build-px4-for-arm-hardware","text":"Now you can build the PX4 firmware for running on real Pixhawk hardware: make px4_fmu-v4 This build will take a little longer because it is building a lot more, including the NuttX real-time OS, all the drivers for the sensors in the Pixhawk flight controller, and more. It is also running the compiler in super size-squeezing mode so it can fit all that in a 1 megabyte ROM! One nice tidbit is you can plug in your Pixhawk USB, and type make px4_fmu-v4_default upload to flash the hardware with these brand new bits, so you don't need to use QGroundControl for that.","title":"Build PX4 for ARM hardware"},{"location":"px4_build/#some-useful-parameters","text":"PX4 has many customizable parameters (over 700 of them, in fact) and to get the best results with Cosys-AirSim we have found the following parameters are handy: // be sure to enable the new position estimator module: param set SYS_MC_EST_GROUP 2 // increase default limits on cruise speed so you can move around a large map more quickly. param MPC_XY_CRUISE 10 param MPC_XY_VEL_MAX 10 param MPC_Z_VEL_MAX_DN 2 // increase timeout for auto-disarm on landing so that any long running app doesn't have to worry about it param COM_DISARM_LAND 60 // make it possible to fly without radio control attached (do NOT do this one on a real drone) param NAV_RCL_ACT 0 // enable new syslogger to get more information from PX4 logs param set SYS_LOGGER 1","title":"Some Useful Parameters"},{"location":"px4_build/#using-bashonwindows","text":"See Bash on Windows Toolchain .","title":"Using BashOnWindows"},{"location":"px4_lockstep/","text":"LockStep The latest version of PX4 supports a new lockstep feature when communicating with the simulator over TCP.
Lockstep is an important feature because it synchronizes PX4 and the simulator so they essentially use the same clock time. This makes PX4 behave normally even during unusually long delays in Simulator performance. It is recommended that when you are running a lockstep-enabled version of PX4 in SITL mode, you tell AirSim to use a SteppableClock , and set UseTcp to true and LockStep to true . { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"ClockType\": \"SteppableClock\", \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\", \"UseTcp\": true, \"LockStep\": true, ... This causes AirSim to not use a \"realtime\" clock, but instead it advances the clock in step with each sensor update sent to PX4. This way PX4 thinks time is progressing smoothly no matter how long it takes AirSim to really process that update loop. This has the following advantages: AirSim can be used on slow machines that cannot process updates quickly. You can debug AirSim and hit a breakpoint, and when you resume PX4 will behave normally. You can enable very slow sensors like the Lidar with a large number of simulated points, and PX4 will still behave normally. There will be some side effects to lockstep , namely, slower update loops caused by running AirSim on an underpowered machine or from expensive sensors (like Lidar) will create some visible jerkiness in the simulated flight if you look at the updates on screen in realtime. Disabling LockStep If you are running PX4 in cygwin, there is an open issue with lockstep . PX4 is configured to use lockstep by default. To disable this feature, first disable it in PX4 : Navigate to boards/px4/sitl/ in your local PX4 repository Edit default.cmake and find the following line: set(ENABLE_LOCKSTEP_SCHEDULER yes) Change this line to: set(ENABLE_LOCKSTEP_SCHEDULER no) Disable it in AirSim by setting LockStep to false and either removing any \"ClockType\": \"SteppableClock\" setting or resetting ClockType back to default: { ... \"ClockType\": \"\", \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\", \"LockStep\": false, ... Now you can run PX4 SITL as you normally would ( make px4_sitl_default none_iris ) and it will use the host system time without waiting on AirSim.","title":"PX4 Lockstep"},{"location":"px4_lockstep/#lockstep","text":"The latest version of PX4 supports a new lockstep feature when communicating with the simulator over TCP. Lockstep is an important feature because it synchronizes PX4 and the simulator so they essentially use the same clock time. This makes PX4 behave normally even during unusually long delays in Simulator performance. It is recommended that when you are running a lockstep-enabled version of PX4 in SITL mode, you tell AirSim to use a SteppableClock , and set UseTcp to true and LockStep to true . { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"ClockType\": \"SteppableClock\", \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\", \"UseTcp\": true, \"LockStep\": true, ... This causes AirSim to not use a \"realtime\" clock, but instead it advances the clock in step with each sensor update sent to PX4. This way PX4 thinks time is progressing smoothly no matter how long it takes AirSim to really process that update loop. This has the following advantages: AirSim can be used on slow machines that cannot process updates quickly. You can debug AirSim and hit a breakpoint, and when you resume PX4 will behave normally.
You can enable very slow sensors like the Lidar with a large number of simulated points, and PX4 will still behave normally. There will be some side effects to lockstep , namely, slower update loops caused by running AirSim on an underpowered machine or from expensive sensors (like Lidar) will create some visible jerkiness in the simulated flight if you look at the updates on screen in realtime.","title":"LockStep"},{"location":"px4_lockstep/#disabling-lockstep","text":"If you are running PX4 in cygwin, there is an open issue with lockstep . PX4 is configured to use lockstep by default. To disable this feature, first disable it in PX4 : Navigate to boards/px4/sitl/ in your local PX4 repository Edit default.cmake and find the following line: set(ENABLE_LOCKSTEP_SCHEDULER yes) Change this line to: set(ENABLE_LOCKSTEP_SCHEDULER no) Disable it in AirSim by setting LockStep to false and either removing any \"ClockType\": \"SteppableClock\" setting or resetting ClockType back to default: { ... \"ClockType\": \"\", \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\", \"LockStep\": false, ... Now you can run PX4 SITL as you normally would ( make px4_sitl_default none_iris ) and it will use the host system time without waiting on AirSim.","title":"Disabling LockStep"},{"location":"px4_logging/","text":"PX4/MavLink Logging Thanks to Chris Lovett for developing various tools for PX4/MavLink logging mentioned on this page! Logging MavLink Messages AirSim can capture mavlink log files if you add the following to the PX4 section of your settings.json file: { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"Vehicles\": { \"PX4\": { ..., \"Logs\": \"c:/temp/mavlink\" } } } AirSim will create a timestamped log file in this folder for each \"armed/disarmed\" flight session. You will then see log files organized by date in the folder you specified ( c:/temp/mavlink above), specifically input.mavlink and output.mavlink files. MavLink LogViewer For MavLink-enabled drones, you can also use our Log Viewer to visualize the streams of data. If you enable this form of realtime logging you should not use the \"Logs\" setting above; these two forms of logging are mutually exclusive. PX4 Log in SITL Mode In SITL mode, a log file is produced when the drone is armed. The SITL terminal will contain the path to the log file; it should look something like this: INFO [logger] Opened log file: rootfs/fs/microsd/log/2017-03-27/20_02_49.ulg PX4 Log in HITL Mode If you are using Pixhawk hardware in HIL mode, then set the parameter SYS_LOGGER=1 using QGroundControl. PX4 will write the log file on the device, which you can download at a later date using QGroundControl. Debugging a bad flight You can use these *.mavlink log files to debug a bad flight using the LogViewer . For example, AirSim/PX4 flight may misbehave if you run it on an underpowered computer. The following shows what might happen in that situation. In this flight we ran a simple commander takeoff test as performed by PythonClient/multirotor/stability_test.py and the flight started off fine, but then went crazy at the end and the drone crashed. So why is that? What can the log file show? Here we've plotted the following 5 metrics: - hil_gps.alt - the simulated altitude sent from AirSim to PX4 - telemetry.update_rate - the rate AirSim is performing the critical drone update loop in updates per second. - telemetry.update_time - the average time taken inside AirSim performing the critical drone update loop.
- telemetry.actuation_delay - this is a very interesting metric measuring how long it takes PX4 to send back the updated actuator controls message (motor power) - actuator_controls.0 - the actuator controls signal from PX4 for the first rotor. What we see then with these metrics is that things started off nicely, with a nice flat altitude, a high update rate in the 275 to 300 fps range, a nice low update time inside AirSim around 113 microseconds, and a nice low actuation delay in the round trip from PX4. The actuator controls also stabilize quickly to a nice flat line. But then the update_time starts to climb; at the same time the actuation_delay climbs and we see a little dip in actuator_controls. This dip should not happen; the PX4 is panicking over loss of update rate, but it recovers. But then we see actuator controls go crazy, a huge spike in actuation delay, and around this time we see a message from AirSim saying lockstep disabled . A delay over 100 milliseconds triggers AirSim into jumping out of lockstep mode, and the PX4 goes nuts and the drone crashes. The bottom line is that if a simple takeoff cannot maintain steady smooth flight and you see these kinds of spikes and uneven update rates then it means you are running AirSim on a computer that does not have enough horsepower. This is what a simple takeoff and hover and land should look like: Here you see the update_rate sticking to the target of 333 updates per second. You also see the update_time at a nice flat 39 microseconds and the actuator_delay somewhere between 1.1 and 1.7 milliseconds, and the resulting actuator_controls a lovely flat line.","title":"PX4/MavLink Logging"},{"location":"px4_logging/#px4mavlink-logging","text":"Thanks to Chris Lovett for developing various tools for PX4/MavLink logging mentioned on this page!","title":"PX4/MavLink Logging"},{"location":"px4_logging/#logging-mavlink-messages","text":"AirSim can capture mavlink log files if you add the following to the PX4 section of your settings.json file: { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"Vehicles\": { \"PX4\": { ..., \"Logs\": \"c:/temp/mavlink\" } } } AirSim will create a timestamped log file in this folder for each \"armed/disarmed\" flight session. You will then see log files organized by date in the folder you specified ( c:/temp/mavlink above), specifically input.mavlink and output.mavlink files.","title":"Logging MavLink Messages"},{"location":"px4_logging/#mavlink-logviewer","text":"For MavLink-enabled drones, you can also use our Log Viewer to visualize the streams of data. If you enable this form of realtime logging you should not use the \"Logs\" setting above; these two forms of logging are mutually exclusive.","title":"MavLink LogViewer"},{"location":"px4_logging/#px4-log-in-sitl-mode","text":"In SITL mode, a log file is produced when the drone is armed. The SITL terminal will contain the path to the log file; it should look something like this: INFO [logger] Opened log file: rootfs/fs/microsd/log/2017-03-27/20_02_49.ulg","title":"PX4 Log in SITL Mode"},{"location":"px4_logging/#px4-log-in-hitl-mode","text":"If you are using Pixhawk hardware in HIL mode, then set the parameter SYS_LOGGER=1 using QGroundControl. PX4 will write the log file on the device, which you can download at a later date using QGroundControl.","title":"PX4 Log in HITL Mode"},{"location":"px4_logging/#debugging-a-bad-flight","text":"You can use these *.mavlink log files to debug a bad flight using the LogViewer . For example, AirSim/PX4 flight may misbehave if you run it on an underpowered computer.
The following shows what might happen in that situation. In this flight we ran a simple commander takeoff test as performed by PythonClient/multirotor/stability_test.py and the flight started off fine, but then went crazy at the end and the drone crashed. So why is that? What can the log file show? Here we've plotted the following 5 metrics: - hil_gps.alt - the simulated altitude sent from AirSim to PX4 - telemetry.update_rate - the rate AirSim is performing the critical drone update loop in updates per second. - telemetry.update_time - the average time taken inside AirSim performing the critical drone update loop. - telemetry.actuation_delay - this is a very interesting metric measuring how long it takes PX4 to send back the updated actuator controls message (motor power) - actuator_controls.0 - the actuator controls signal from PX4 for the first rotor. What we see then with these metrics is that things started off nicely, with a nice flat altitude, a high update rate in the 275 to 300 fps range, a nice low update time inside AirSim around 113 microseconds, and a nice low actuation delay in the round trip from PX4. The actuator controls also stabilize quickly to a nice flat line. But then the update_time starts to climb; at the same time the actuation_delay climbs and we see a little dip in actuator_controls. This dip should not happen; the PX4 is panicking over loss of update rate, but it recovers. But then we see actuator controls go crazy, a huge spike in actuation delay, and around this time we see a message from AirSim saying lockstep disabled . A delay over 100 milliseconds triggers AirSim into jumping out of lockstep mode, and the PX4 goes nuts and the drone crashes. The bottom line is that if a simple takeoff cannot maintain steady smooth flight and you see these kinds of spikes and uneven update rates then it means you are running AirSim on a computer that does not have enough horsepower. This is what a simple takeoff and hover and land should look like: Here you see the update_rate sticking to the target of 333 updates per second. You also see the update_time at a nice flat 39 microseconds and the actuator_delay somewhere between 1.1 and 1.7 milliseconds, and the resulting actuator_controls a lovely flat line.","title":"Debugging a bad flight"},{"location":"px4_multi_vehicle/","text":"Setting up multi-vehicle PX4 simulation The PX4 SITL stack comes with a sitl_multiple_run.sh shell script that runs multiple instances of the PX4 binary. This would allow the SITL stack to listen to connections from multiple Cosys-AirSim vehicles on multiple TCP ports starting from 4560. However, the provided script does not let us view the PX4 console. If you want to run the instances manually while being able to view each instance's console ( Recommended ), see this section Setting up multiple instances of PX4 Software-in-Loop Note you have to build PX4 with make px4_sitl_default none_iris as shown here before trying to run multiple PX4 instances.
From your bash (or Cygwin) terminal go to the PX4 Firmware directory and run the sitl_multiple_run.sh script while specifying the number of vehicles you need cd PX4-Autopilot ./Tools/sitl_multiple_run.sh 2 # 2 here is the number of vehicles/instances This starts multiple instances that listen on TCP ports 4560, 4561, and so on, one port per vehicle/instance You should get a confirmation message that says that old instances have been stopped and new instances have been started killing running instances starting instance 0 in /cygdrive/c/PX4/home/PX4/Firmware/build/px4_sitl_default/instance_0 starting instance 1 in /cygdrive/c/PX4/home/PX4/Firmware/build/px4_sitl_default/instance_1 Now edit the AirSim settings file to make sure you have matching TCP port settings for the set number of vehicles and to make sure that both vehicles do not spawn on the same point. For example, these settings would spawn two PX4Multirotors where one of them would try to connect to PX4 SITL at port 4560 and the other at port 4561 . It also makes sure the vehicles spawn at 0,1,0 and 0,-1,0 to avoid collision: { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"Vehicles\": { \"Drone1\": { \"VehicleType\": \"PX4Multirotor\", \"UseSerial\": false, \"UseTcp\": true, \"TcpPort\": 4560, \"ControlPortLocal\": 14540, \"ControlPortRemote\": 14580, \"X\": 0, \"Y\": 1, \"Z\": 0 }, \"Drone2\": { \"VehicleType\": \"PX4Multirotor\", \"UseSerial\": false, \"UseTcp\": true, \"TcpPort\": 4561, \"ControlPortLocal\": 14541, \"ControlPortRemote\": 14581, \"X\": 0, \"Y\": -1, \"Z\": 0 } } } You can add more than two vehicles but you will need to make sure you adjust the TCP port for each (i.e. vehicle 3's port would be 4562, and so on) and adjust the spawn point. Now run your Unreal Cosys-AirSim environment and it should connect to SITL PX4 via TCP. If you are running the instances with the PX4 console visible , you should see a bunch of messages from each SITL PX4 window. Specifically, the following messages tell you that Cosys-AirSim is connected properly and GPS fusion is stable: INFO [simulator] Simulator connected on UDP port 14560 INFO [mavlink] partner IP: 127.0.0.1 INFO [ecl/EKF] EKF GPS checks passed (WGS-84 origin set) INFO [ecl/EKF] EKF commencing GPS fusion If you do not see these messages then check your port settings. You should also be able to use QGroundControl with SITL mode. Make sure there is no Pixhawk hardware plugged in, otherwise QGroundControl will choose to use that instead. Note that as we don't have a physical board, an RC cannot be connected directly to it. So the alternatives are to either use an XBox 360 controller or connect your RC using USB (for example, in the case of the FrSky Taranis X9D Plus) or using a trainer USB cable to your PC. This makes your RC look like a joystick. You will need to do extra setup in QGroundControl to use a virtual joystick for RC control. You do not need to do this unless you plan to fly a drone manually in Cosys-AirSim. Autonomous flight using the Python API does not require RC; see No Remote Control . Starting SITL instances with PX4 console If you want to start your SITL instances while being able to view the PX4 console, you will need to run the shell scripts found here rather than sitl_multiple_run.sh . Here is how you would do so: Note This script also assumes PX4 is built with make px4_sitl_default none_iris as shown here before trying to run multiple PX4 instances.
From your bash (or Cygwin) terminal go to the PX4 directory and get the scripts (place them in a subdirectory called Scripts within the PX4 directory as shown) cd PX4 mkdir -p Scripts cd Scripts wget https://github.com/Cosys-Lab/Cosys-AirSim/raw/main/PX4Scripts/sitl_kill.sh wget https://github.com/Cosys-Lab/Cosys-AirSim/raw/main/PX4Scripts/run_airsim_sitl.sh Note the shell scripts expect the Scripts and Firmware directories to be within the same parent directory. Also, you may need to make the scripts executable by running chmod +x sitl_kill.sh and chmod +x run_airsim_sitl.sh . Run the sitl_kill.sh script to kill all active PX4 SITL instances ./sitl_kill.sh Run the run_airsim_sitl.sh script while specifying which instance you would like to run in the current terminal window (the first instance would be numbered 0) ./run_airsim_sitl.sh 0 # first instance = 0 You should see the PX4 instance starting and waiting for Cosys-AirSim's connection as it would with a single instance. ``` _ _ ___ | ___ \\ \\ \\ / / / | | |_/ / \\ V / / /| | | / / \\ / / | | | | / /^\\ \\ ___ | _| \\/ \\/ | / px4 starting. INFO [px4] Calling startup script: /bin/sh /cygdrive/c/PX4/home/PX4/Firmware/etc/init.d-posix/rcS 0 INFO [dataman] Unknown restart, data manager file './dataman' size is 11798680 bytes INFO [simulator] Waiting for simulator to connect on TCP port 4560 4. Open a new terminal and go to the Scripts directory and start the next instance cd PX4 cd Scripts ./run_airsim_sitl.sh 1 # ,2,3,4,..,etc ``` Repeat step 4 for as many instances as you would like to start. Run your Unreal Cosys-AirSim environment and it should connect to SITL PX4 via TCP (assuming your settings.json file has the right ports).","title":"PX4 Multi-vehicle in SITL"},{"location":"px4_multi_vehicle/#setting-up-multi-vehicle-px4-simulation","text":"The PX4 SITL stack comes with a sitl_multiple_run.sh shell script that runs multiple instances of the PX4 binary. This allows the SITL stack to listen to connections from multiple Cosys-AirSim vehicles on multiple TCP ports starting from 4560. However, the provided script does not let us view the PX4 console. If you want to run the instances manually while being able to view each instance's console ( Recommended ), see this section","title":"Setting up multi-vehicle PX4 simulation"},{"location":"px4_multi_vehicle/#setting-up-multiple-instances-of-px4-software-in-loop","text":"Note you have to build PX4 with make px4_sitl_default none_iris as shown here before trying to run multiple PX4 instances. From your bash (or Cygwin) terminal go to the PX4 Firmware directory and run the sitl_multiple_run.sh script while specifying the number of vehicles you need cd PX4-Autopilot ./Tools/sitl_multiple_run.sh 2 # 2 here is the number of vehicles/instances This starts multiple instances that listen on TCP ports 4560 to 4560+i-1, where 'i' is the number of vehicles/instances specified You should get a confirmation message that says that old instances have been stopped and new instances have been started killing running instances starting instance 0 in /cygdrive/c/PX4/home/PX4/Firmware/build/px4_sitl_default/instance_0 starting instance 1 in /cygdrive/c/PX4/home/PX4/Firmware/build/px4_sitl_default/instance_1 Now edit the AirSim settings file to make sure you have matching TCP port settings for the set number of vehicles and to make sure that both vehicles do not spawn at the same point.
For example, these settings would spawn two PX4Multirotors where one of them would try to connect to PX4 SITL at port 4560 and the other at port 4561 . It also makes sure the vehicles spawn at 0,1,0 and 0,-1,0 to avoid collision: json { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"Vehicles\": { \"Drone1\": { \"VehicleType\": \"PX4Multirotor\", \"UseSerial\": false, \"UseTcp\": true, \"TcpPort\": 4560, \"ControlPortLocal\": 14540, \"ControlPortRemote\": 14580, \"X\": 0, \"Y\": 1, \"Z\": 0 }, \"Drone2\": { \"VehicleType\": \"PX4Multirotor\", \"UseSerial\": false, \"UseTcp\": true, \"TcpPort\": 4561, \"ControlPortLocal\": 14541, \"ControlPortRemote\": 14581, \"X\": 0, \"Y\": -1, \"Z\": 0 } } } You can add more than two vehicles but you will need to make sure you adjust the TCP port for each (i.e. vehicle 3's port would be 4562 , and so on) and adjust the spawn point. Now run your Unreal Cosys-AirSim environment and it should connect to SITL PX4 via TCP. If you are running the instances with the PX4 console visible , you should see a bunch of messages from each SITL PX4 window. Specifically, the following messages tell you that Cosys-AirSim is connected properly and GPS fusion is stable: INFO [simulator] Simulator connected on UDP port 14560 INFO [mavlink] partner IP: 127.0.0.1 INFO [ecl/EKF] EKF GPS checks passed (WGS-84 origin set) INFO [ecl/EKF] EKF commencing GPS fusion If you do not see these messages then check your port settings. You should also be able to use QGroundControl with SITL mode. Make sure there is no Pixhawk hardware plugged in, otherwise QGroundControl will choose to use that instead. Note that as we don't have a physical board, an RC cannot be connected directly to it. So the alternatives are to either use an XBox 360 Controller or connect your RC using USB (for example, in the case of the FrSky Taranis X9D Plus) or using a trainer USB cable to your PC. This makes your RC look like a joystick. You will need to do extra setup in QGroundControl to use a virtual joystick for RC control. You do not need to do this unless you plan to fly a drone manually in Cosys-AirSim. Autonomous flight using the Python API does not require RC, see No Remote Control .","title":"Setting up multiple instances of PX4 Software-in-Loop"},{"location":"px4_multi_vehicle/#starting-sitl-instances-with-px4-console","text":"If you want to start your SITL instances while being able to view the PX4 console, you will need to run the shell scripts found here rather than sitl_multiple_run.sh . Here is how you would do so: Note This script also assumes PX4 is built with make px4_sitl_default none_iris as shown here before trying to run multiple PX4 instances. From your bash (or Cygwin) terminal go to the PX4 directory and get the scripts (place them in a subdirectory called Scripts within the PX4 directory as shown) cd PX4 mkdir -p Scripts cd Scripts wget https://github.com/Cosys-Lab/Cosys-AirSim/raw/main/PX4Scripts/sitl_kill.sh wget https://github.com/Cosys-Lab/Cosys-AirSim/raw/main/PX4Scripts/run_airsim_sitl.sh Note the shell scripts expect the Scripts and Firmware directories to be within the same parent directory. Also, you may need to make the scripts executable by running chmod +x sitl_kill.sh and chmod +x run_airsim_sitl.sh .
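As noted above, every additional vehicle needs its own TcpPort , control ports, and spawn point. A small helper sketch (not part of Cosys-AirSim; the make_vehicles name and the Y-axis spacing are just illustrative) can generate the Vehicles block instead of hand-editing it:

```python
import json

def make_vehicles(n):
    """Build a Vehicles block for n PX4 SITL instances, bumping the
    TCP/control ports and the Y spawn offset for each one."""
    vehicles = {}
    for i in range(n):
        vehicles[f"Drone{i + 1}"] = {
            "VehicleType": "PX4Multirotor",
            "UseSerial": False,
            "UseTcp": True,
            "TcpPort": 4560 + i,
            "ControlPortLocal": 14540 + i,
            "ControlPortRemote": 14580 + i,
            "X": 0, "Y": 2 * i, "Z": 0,  # spread the vehicles out along Y
        }
    return vehicles

print(json.dumps({"SettingsVersion": 2.0, "SimMode": "Multirotor",
                  "Vehicles": make_vehicles(3)}, indent=2))
```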
Run the sitl_kill.sh script to kill all active PX4 SITL instances ./sitl_kill.sh Run the run_airsim_sitl.sh script while specifying which instance you would like to run in the current terminal window (the first instance would be numbered 0) ./run_airsim_sitl.sh 0 # first instance = 0 You should see the PX4 instance starting and waiting for Cosys-AirSim's connection as it would with a single instance. ``` _ _ ___ | ___ \\ \\ \\ / / / | | |_/ / \\ V / / /| | | / / \\ / / | | | | / /^\\ \\ ___ | _| \\/ \\/ | / px4 starting. INFO [px4] Calling startup script: /bin/sh /cygdrive/c/PX4/home/PX4/Firmware/etc/init.d-posix/rcS 0 INFO [dataman] Unknown restart, data manager file './dataman' size is 11798680 bytes INFO [simulator] Waiting for simulator to connect on TCP port 4560 4. Open a new terminal and go to the Scripts directory and start the next instance cd PX4 cd Scripts ./run_airsim_sitl.sh 1 # ,2,3,4,..,etc ``` Repeat step 4 for as many instances as you would like to start. Run your Unreal Cosys-AirSim environment and it should connect to SITL PX4 via TCP (assuming your settings.json file has the right ports).","title":"Starting SITL instances with PX4 console"},{"location":"px4_setup/","text":"PX4 Setup for AirSim The PX4 software stack is a very popular open source flight controller with support for a wide variety of boards and sensors, as well as built-in capability for higher-level tasks such as mission planning. Please visit px4.io for more information. Warning : While all releases of AirSim are always tested with PX4 to ensure support, setting up PX4 is not a trivial task. Unless you have at least an intermediate level of experience with the PX4 stack, we recommend you use simple_flight , which is now the default in AirSim. Supported Hardware The following Pixhawk hardware has been tested with AirSim: Pixhawk PX4 2.4.8 PixFalcon PixRacer Pixhawk 2.1 Pixhawk 4 mini from Holybro Pixhawk 4 from Holybro Version 1.11.2 of the PX4 firmware also works on the Pixhawk 4 devices. Setting up PX4 Hardware-in-Loop For this you will need one of the supported devices listed above. For manual flight you will also need an RC receiver + transmitter. Make sure your RC receiver is bound with its RC transmitter. Connect the RC transmitter to the flight controller's RC port. Refer to your RC manual and PX4 docs for more information. Download QGroundControl , launch it and connect your flight controller to the USB port. Use QGroundControl to flash the latest PX4 Flight Stack. See also initial firmware setup video . In QGroundControl, configure your Pixhawk for HIL simulation by selecting the HIL Quadrocopter X airframe. After PX4 reboots, check that \"HIL Quadrocopter X\" is indeed selected. In QGroundControl, go to the Radio tab and calibrate (make sure the remote control is on and the receiver is showing the indicator for the binding). Go to the Flight Mode tab and choose one of the remote control switches as \"Mode Channel\". Then set (for example) Stabilized and Attitude flight modes for two positions of the switch. Go to the Tuning section of QGroundControl and set appropriate values.
In the AirSim settings file, specify PX4 for your vehicle config like this: { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"ClockType\": \"SteppableClock\", \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\", \"UseSerial\": true, \"LockStep\": true, \"Sensors\":{ \"Barometer\":{ \"SensorType\": 1, \"Enabled\": true, \"PressureFactorSigma\": 0.0001825 } }, \"Parameters\": { \"NAV_RCL_ACT\": 0, \"NAV_DLL_ACT\": 0, \"COM_OBL_ACT\": 1, \"LPE_LAT\": 47.641468, \"LPE_LON\": -122.140165 } } } } Notice that this hardware-in-loop config connects over the serial port ( \"UseSerial\": true ); the \"UseTcp\": true setting is only needed for the SITL setup. Notice we are also enabling LockStep , see PX4 LockStep for more information. The Barometer setting keeps PX4 happy because the default AirSim barometer has a bit too much noise generation. This setting clamps that down a bit, which allows PX4 to achieve GPS lock more quickly. After the above setup you should be able to use a remote control (RC) to fly with AirSim. You can usually arm the vehicle by bringing the two RC sticks down and inward together. You don't need QGroundControl after the initial setup. Typically the Stabilized (instead of Manual) mode gives a better experience for beginners. See PX4 Basic Flying Guide . You can also control the drone from Python APIs . See Walkthrough Demo Video and Unreal AirSim Setup Video that show you all the setup steps in this document. Setting up PX4 Software-in-Loop The PX4 SITL mode doesn't require you to have a separate device such as a Pixhawk or Pixracer. This is in fact the way the PX4 team recommends using PX4 with simulators. However, this is indeed harder to set up. Please see this dedicated page for setting up PX4 in SITL mode. FAQ Drone doesn't fly properly, it just goes \"crazy\". There are a few reasons that can cause this. First, make sure your drone doesn't fall a large distance when starting the simulator. This might happen if you have created a custom Unreal environment and Player Start is placed too high above the ground. It seems that when this happens the internal calibration in PX4 gets confused. You should also use QGroundControl and make sure you can arm and takeoff in QGroundControl properly. Finally, in some rare cases this can also be a machine performance issue; check your hard drive performance . Can I use Arducopter or other MavLink implementations? Our code is tested with the PX4 firmware . We have not tested Arducopter or other mavlink implementations. Some of the flight APIs do use the PX4 custom modes in the MAV_CMD_DO_SET_MODE messages (like PX4_CUSTOM_MAIN_MODE_AUTO). It is not finding my Pixhawk hardware Check your settings.json file for this line \"SerialPort\":\"*,115200\". The asterisk here means \"find any serial port that looks like a Pixhawk device\", but this doesn't always work for all types of Pixhawk hardware. So on Windows you can find the actual COM port using Device Manager: look under \"Ports (COM & LPT)\", plug the device in and see what new COM port shows up. Let's say you see a new port named \"USB Serial Port (COM5)\". Well, then change the SerialPort setting to this: \"SerialPort\":\"COM5,115200\". On Linux, the device can be found by running \"ls /dev/serial/by-id\". If you see a device name listed that looks like usb-3D_Robotics_PX4_FMU_v2.x_0-if00 then you can use that name to connect, like this: \"SerialPort\":\"/dev/serial/by-id/usb-3D_Robotics_PX4_FMU_v2.x_0-if00\".
Note this long name is actually a symbolic link to the real name; if you use \"ls -l ...\" you can find that symbolic link. It is usually something like \"/dev/ttyACM0\", so \"SerialPort\":\"/dev/ttyACM0,115200\" will also work. But as on Windows, that mapping is automatically assigned and can change, whereas the long name will work even if the actual TTY serial device mapping changes. WARN [commander] Takeoff denied, disarm and re-try This happens if you try to take off when PX4 still has not computed the home position. PX4 will report the home position once it is happy with the GPS signal, and you will see these messages: INFO [commander] home: 47.6414680, -122.1401672, 119.99 INFO [tone_alarm] home_set Until then, however, PX4 will reject takeoff commands. When I tell the drone to do something it always lands For example, you use DroneShell moveToPosition -z -20 -x 50 -y 0 which it does, but when it gets to the target location the drone starts to land. This is the default behavior of PX4 when offboard mode completes. To set the drone to hover instead, set this PX4 parameter: param set COM_OBL_ACT 1 I get message length mismatches errors You might need to set the MAV_PROTO_VER parameter in QGC to \"Always use version 1\". Please see this issue for more details.","title":"PX4 Setup for AirSim"},{"location":"px4_setup/#px4-setup-for-airsim","text":"The PX4 software stack is a very popular open source flight controller with support for a wide variety of boards and sensors, as well as built-in capability for higher-level tasks such as mission planning. Please visit px4.io for more information. Warning : While all releases of AirSim are always tested with PX4 to ensure support, setting up PX4 is not a trivial task. Unless you have at least an intermediate level of experience with the PX4 stack, we recommend you use simple_flight , which is now the default in AirSim.","title":"PX4 Setup for AirSim"},{"location":"px4_setup/#supported-hardware","text":"The following Pixhawk hardware has been tested with AirSim: Pixhawk PX4 2.4.8 PixFalcon PixRacer Pixhawk 2.1 Pixhawk 4 mini from Holybro Pixhawk 4 from Holybro Version 1.11.2 of the PX4 firmware also works on the Pixhawk 4 devices.","title":"Supported Hardware"},{"location":"px4_setup/#setting-up-px4-hardware-in-loop","text":"For this you will need one of the supported devices listed above. For manual flight you will also need an RC receiver + transmitter. Make sure your RC receiver is bound with its RC transmitter. Connect the RC transmitter to the flight controller's RC port. Refer to your RC manual and PX4 docs for more information. Download QGroundControl , launch it and connect your flight controller to the USB port. Use QGroundControl to flash the latest PX4 Flight Stack. See also initial firmware setup video . In QGroundControl, configure your Pixhawk for HIL simulation by selecting the HIL Quadrocopter X airframe. After PX4 reboots, check that \"HIL Quadrocopter X\" is indeed selected. In QGroundControl, go to the Radio tab and calibrate (make sure the remote control is on and the receiver is showing the indicator for the binding). Go to the Flight Mode tab and choose one of the remote control switches as \"Mode Channel\". Then set (for example) Stabilized and Attitude flight modes for two positions of the switch. Go to the Tuning section of QGroundControl and set appropriate values.
For example, for Fly Sky's FS-TH9X remote control, the following settings give a more realistic feel: Hover Throttle = mid+1 mark, Roll and pitch sensitivity = mid-3 mark, Altitude and position control sensitivity = mid-2 mark. In the AirSim settings file, specify PX4 for your vehicle config like this: { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"ClockType\": \"SteppableClock\", \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\", \"UseSerial\": true, \"LockStep\": true, \"Sensors\":{ \"Barometer\":{ \"SensorType\": 1, \"Enabled\": true, \"PressureFactorSigma\": 0.0001825 } }, \"Parameters\": { \"NAV_RCL_ACT\": 0, \"NAV_DLL_ACT\": 0, \"COM_OBL_ACT\": 1, \"LPE_LAT\": 47.641468, \"LPE_LON\": -122.140165 } } } } Notice that this hardware-in-loop config connects over the serial port ( \"UseSerial\": true ); the \"UseTcp\": true setting is only needed for the SITL setup. Notice we are also enabling LockStep , see PX4 LockStep for more information. The Barometer setting keeps PX4 happy because the default AirSim barometer has a bit too much noise generation. This setting clamps that down a bit, which allows PX4 to achieve GPS lock more quickly. After the above setup you should be able to use a remote control (RC) to fly with AirSim. You can usually arm the vehicle by bringing the two RC sticks down and inward together. You don't need QGroundControl after the initial setup. Typically the Stabilized (instead of Manual) mode gives a better experience for beginners. See PX4 Basic Flying Guide . You can also control the drone from Python APIs . See Walkthrough Demo Video and Unreal AirSim Setup Video that show you all the setup steps in this document.","title":"Setting up PX4 Hardware-in-Loop"},{"location":"px4_setup/#setting-up-px4-software-in-loop","text":"The PX4 SITL mode doesn't require you to have a separate device such as a Pixhawk or Pixracer. This is in fact the way the PX4 team recommends using PX4 with simulators. However, this is indeed harder to set up. Please see this dedicated page for setting up PX4 in SITL mode.","title":"Setting up PX4 Software-in-Loop"},{"location":"px4_setup/#faq","text":"","title":"FAQ"},{"location":"px4_setup/#drone-doesnt-fly-properly-it-just-goes-crazy","text":"There are a few reasons that can cause this. First, make sure your drone doesn't fall a large distance when starting the simulator. This might happen if you have created a custom Unreal environment and Player Start is placed too high above the ground. It seems that when this happens the internal calibration in PX4 gets confused. You should also use QGroundControl and make sure you can arm and takeoff in QGroundControl properly. Finally, in some rare cases this can also be a machine performance issue; check your hard drive performance .","title":"Drone doesn't fly properly, it just goes \"crazy\"."},{"location":"px4_setup/#can-i-use-arducopter-or-other-mavlink-implementations","text":"Our code is tested with the PX4 firmware . We have not tested Arducopter or other mavlink implementations. Some of the flight APIs do use the PX4 custom modes in the MAV_CMD_DO_SET_MODE messages (like PX4_CUSTOM_MAIN_MODE_AUTO).","title":"Can I use Arducopter or other MavLink implementations?"},{"location":"px4_setup/#it-is-not-finding-my-pixhawk-hardware","text":"Check your settings.json file for this line \"SerialPort\":\"*,115200\". The asterisk here means \"find any serial port that looks like a Pixhawk device\", but this doesn't always work for all types of Pixhawk hardware.
So on Windows you can find the actual COM port using Device Manager: look under \"Ports (COM & LPT)\", plug the device in and see what new COM port shows up. Let's say you see a new port named \"USB Serial Port (COM5)\". Well, then change the SerialPort setting to this: \"SerialPort\":\"COM5,115200\". On Linux, the device can be found by running \"ls /dev/serial/by-id\". If you see a device name listed that looks like usb-3D_Robotics_PX4_FMU_v2.x_0-if00 then you can use that name to connect, like this: \"SerialPort\":\"/dev/serial/by-id/usb-3D_Robotics_PX4_FMU_v2.x_0-if00\". Note this long name is actually a symbolic link to the real name; if you use \"ls -l ...\" you can find that symbolic link. It is usually something like \"/dev/ttyACM0\", so \"SerialPort\":\"/dev/ttyACM0,115200\" will also work. But as on Windows, that mapping is automatically assigned and can change, whereas the long name will work even if the actual TTY serial device mapping changes.","title":"It is not finding my Pixhawk hardware"},{"location":"px4_setup/#warn-commander-takeoff-denied-disarm-and-re-try","text":"This happens if you try to take off when PX4 still has not computed the home position. PX4 will report the home position once it is happy with the GPS signal, and you will see these messages: INFO [commander] home: 47.6414680, -122.1401672, 119.99 INFO [tone_alarm] home_set Until then, however, PX4 will reject takeoff commands.","title":"WARN [commander] Takeoff denied, disarm and re-try"},{"location":"px4_setup/#when-i-tell-the-drone-to-do-something-it-always-lands","text":"For example, you use DroneShell moveToPosition -z -20 -x 50 -y 0 which it does, but when it gets to the target location the drone starts to land. This is the default behavior of PX4 when offboard mode completes. To set the drone to hover instead, set this PX4 parameter: param set COM_OBL_ACT 1","title":"When I tell the drone to do something it always lands"},{"location":"px4_setup/#i-get-message-length-mismatches-errors","text":"You might need to set the MAV_PROTO_VER parameter in QGC to \"Always use version 1\". Please see this issue for more details.","title":"I get message length mismatches errors"},{"location":"px4_sitl/","text":"Setting up PX4 Software-in-Loop The PX4 software provides a \"software-in-loop\" simulation (SITL) version of their stack that runs in Linux. If you are on Windows then you can use the Cygwin Toolchain or you can use the Windows subsystem for Linux and follow the PX4 Linux toolchain setup. If you are using WSL2 please read these additional instructions . Note that every time you stop the Unreal app you have to restart the PX4 app. From your bash terminal follow these steps for Linux and follow all the instructions under NuttX based hardware to install prerequisites. We've also included our own copy of the PX4 build instructions which is a bit more concise about exactly what we need. Get the PX4 source code and build the posix SITL version of PX4: mkdir -p PX4 cd PX4 git clone https://github.com/PX4/PX4-Autopilot.git --recursive bash ./PX4-Autopilot/Tools/setup/ubuntu.sh --no-nuttx --no-sim-tools cd PX4-Autopilot And find the latest stable release from https://github.com/PX4/PX4-Autopilot/releases and check out the source code matching that release, for example: git checkout v1.11.3 Use the following command to build and start PX4 firmware in SITL mode: make px4_sitl_default none_iris If you are using an older version v1.8.* use this command instead: make posix_sitl_ekf2 none_iris .
You should see a message saying the SITL PX4 app is waiting for the simulator (Cosys-AirSim) to connect. You will also see information about which ports are configured for mavlink connection to the PX4 app. The default ports have changed recently, so check them closely to make sure the Cosys-AirSim settings are correct. INFO [simulator] Waiting for simulator to connect on TCP port 4560 INFO [init] Mixer: etc/mixers/quad_w.main.mix on /dev/pwm_output0 INFO [mavlink] mode: Normal, data rate: 4000000 B/s on udp port 14570 remote port 14550 INFO [mavlink] mode: Onboard, data rate: 4000000 B/s on udp port 14580 remote port 14540 Note: this is also an interactive PX4 console, type help to see the list of commands you can enter here. They are mostly low-level PX4 commands, but some of them can be useful for debugging. Now edit the Cosys-AirSim settings file to make sure you have matching UDP and TCP port settings: json { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"ClockType\": \"SteppableClock\", \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\", \"UseSerial\": false, \"LockStep\": true, \"UseTcp\": true, \"TcpPort\": 4560, \"ControlPortLocal\": 14540, \"ControlPortRemote\": 14580, \"Sensors\":{ \"Barometer\":{ \"SensorType\": 1, \"Enabled\": true, \"PressureFactorSigma\": 0.0001825 } }, \"Parameters\": { \"NAV_RCL_ACT\": 0, \"NAV_DLL_ACT\": 0, \"COM_OBL_ACT\": 1, \"LPE_LAT\": 47.641468, \"LPE_LON\": -122.140165 } } } } Notice the PX4 [simulator] is using TCP, which is why we need to add: \"UseTcp\": true, . Notice we are also enabling LockStep , see PX4 LockStep for more information. The Barometer setting keeps PX4 happy because the default Cosys-AirSim barometer has a bit too much noise generation. This setting clamps that down a bit, which allows PX4 to achieve GPS lock more quickly. Open incoming TCP port 4560 and incoming UDP port 14540 using your firewall configuration. Now run your Unreal Cosys-AirSim environment and it should connect to SITL PX4 via TCP. You should see a bunch of messages from the SITL PX4 window. Specifically, the following messages tell you that Cosys-AirSim is connected properly and GPS fusion is stable: INFO [simulator] Simulator connected on UDP port 14560 INFO [mavlink] partner IP: 127.0.0.1 INFO [ecl/EKF] EKF GPS checks passed (WGS-84 origin set) INFO [ecl/EKF] EKF commencing GPS fusion If you do not see these messages then check your port settings. You should also be able to use QGroundControl with SITL mode. Make sure there is no Pixhawk hardware plugged in, otherwise QGroundControl will choose to use that instead. Note that as we don't have a physical board, an RC cannot be connected directly to it. So the alternatives are to either use an XBox 360 Controller or connect your RC using USB (for example, in the case of the FrSky Taranis X9D Plus) or using a trainer USB cable to your PC. This makes your RC look like a joystick. You will need to do extra setup in QGroundControl to use a virtual joystick for RC control. You do not need to do this unless you plan to fly a drone manually in Cosys-AirSim. Autonomous flight using the Python API does not require RC, see No Remote Control below. Setting GPS origin Notice the above settings are provided in the Parameters section of the settings.json file: \"LPE_LAT\": 47.641468, \"LPE_LON\": -122.140165, PX4 SITL mode needs to be configured to get the home location correct. The home location needs to be set to the same coordinates defined in OriginGeopoint .
You can also run the following in the SITL PX4 console window to check that these values are set correctly. param show LPE_LAT param show LPE_LON Smooth Offboard Transitions Notice the above setting is provided in the Parameters section of the settings.json file: \"COM_OBL_ACT\": 1 This tells the drone to automatically hover after each offboard control command finishes (the default setting is to land). Hovering is a smoother transition between multiple offboard commands. You can check this setting by running the following PX4 console command: param show COM_OBL_ACT Check the Home Position If you are using DroneShell to execute commands (arm, takeoff, etc) then you should wait until the Home position is set. You will see the PX4 SITL console output this message: INFO [commander] home: 47.6414680, -122.1401672, 119.99 INFO [tone_alarm] home_set Now the DroneShell 'pos' command should report this position and the commands should be accepted by PX4. If you attempt to takeoff without a home position you will see the message: WARN [commander] Takeoff denied, disarm and re-try After the home position is set, check the local position reported by the 'pos' command: Local position: x=-0.0326988, y=0.00656854, z=5.48506 If the z coordinate is large like this then takeoff might not work as expected. Resetting the SITL and simulation should fix that problem. WSL 2 Windows Subsystem for Linux version 2 operates in a Virtual Machine. This requires additional setup - see additional instructions . No Remote Control Notice the above setting is provided in the Parameters section of the settings.json file: \"NAV_RCL_ACT\": 0, \"NAV_DLL_ACT\": 0, This is required if you plan to fly the SITL mode PX4 with no remote control, just using Python scripts, for example. These parameters stop the PX4 from triggering \"failsafe mode on\" every time a move command is finished. You can use the following PX4 command to check these values are set correctly: param show NAV_RCL_ACT param show NAV_DLL_ACT NOTE: Do NOT do this on a real drone as it is too dangerous to fly without these failsafe measures. Manually set parameters You can also run the following in the PX4 console to set all these parameters manually: param set NAV_RCL_ACT 0 param set NAV_DLL_ACT 0 Setting up multi-vehicle simulation You can simulate multiple drones in SITL mode using Cosys-AirSim. However, this requires setting up multiple instances of the PX4 firmware simulator to be able to listen for each vehicle's connection on a separate TCP port (4560, 4561, etc). Please see this dedicated page for instructions on setting up multiple instances of PX4 in SITL mode. Using VirtualBox Ubuntu If you want to run the above posix_sitl in a VirtualBox Ubuntu machine then it will have a different IP address from localhost. So in this case you need to edit the settings file, change the UdpIp and SitlIp to the IP address of your virtual machine, and set the LocalIpAddress to the address of your host machine running the Unreal engine. Remote Controller There are several options for flying the simulated drone using a remote control or joystick like an Xbox gamepad. See remote controllers","title":"PX4 in SITL"},{"location":"px4_sitl/#setting-up-px4-software-in-loop","text":"The PX4 software provides a \"software-in-loop\" simulation (SITL) version of their stack that runs in Linux. If you are on Windows then you can use the Cygwin Toolchain or you can use the Windows subsystem for Linux and follow the PX4 Linux toolchain setup. If you are using WSL2 please read these additional instructions .
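Since autonomous flight does not require an RC, once PX4 reports home_set you can exercise the whole loop from Python. A minimal sketch using the Cosys-AirSim Python client, mirroring the DroneShell moveToPosition example from the FAQ above:

```python
import cosysairsim as airsim

client = airsim.client.MultirotorClient()
client.confirmConnection()
client.enableApiControl(True)
client.armDisarm(True)
client.takeoffAsync().join()
# Fly to x=50, y=0, z=-20 (NED, so negative z is up) at 5 m/s, then land.
client.moveToPositionAsync(50, 0, -20, 5).join()
client.landAsync().join()
```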
Note that every time you stop the Unreal app you have to restart the PX4 app. From your bash terminal follow these steps for Linux and follow all the instructions under NuttX based hardware to install prerequisites. We've also included our own copy of the PX4 build instructions which is a bit more concise about exactly what we need. Get the PX4 source code and build the posix SITL version of PX4: mkdir -p PX4 cd PX4 git clone https://github.com/PX4/PX4-Autopilot.git --recursive bash ./PX4-Autopilot/Tools/setup/ubuntu.sh --no-nuttx --no-sim-tools cd PX4-Autopilot And find the latest stable release from https://github.com/PX4/PX4-Autopilot/releases and check out the source code matching that release, for example: git checkout v1.11.3 Use the following command to build and start PX4 firmware in SITL mode: make px4_sitl_default none_iris If you are using an older version v1.8.* use this command instead: make posix_sitl_ekf2 none_iris . You should see a message saying the SITL PX4 app is waiting for the simulator (Cosys-AirSim) to connect. You will also see information about which ports are configured for mavlink connection to the PX4 app. The default ports have changed recently, so check them closely to make sure the Cosys-AirSim settings are correct. INFO [simulator] Waiting for simulator to connect on TCP port 4560 INFO [init] Mixer: etc/mixers/quad_w.main.mix on /dev/pwm_output0 INFO [mavlink] mode: Normal, data rate: 4000000 B/s on udp port 14570 remote port 14550 INFO [mavlink] mode: Onboard, data rate: 4000000 B/s on udp port 14580 remote port 14540 Note: this is also an interactive PX4 console, type help to see the list of commands you can enter here. They are mostly low-level PX4 commands, but some of them can be useful for debugging. Now edit the Cosys-AirSim settings file to make sure you have matching UDP and TCP port settings: json { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"ClockType\": \"SteppableClock\", \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\", \"UseSerial\": false, \"LockStep\": true, \"UseTcp\": true, \"TcpPort\": 4560, \"ControlPortLocal\": 14540, \"ControlPortRemote\": 14580, \"Sensors\":{ \"Barometer\":{ \"SensorType\": 1, \"Enabled\": true, \"PressureFactorSigma\": 0.0001825 } }, \"Parameters\": { \"NAV_RCL_ACT\": 0, \"NAV_DLL_ACT\": 0, \"COM_OBL_ACT\": 1, \"LPE_LAT\": 47.641468, \"LPE_LON\": -122.140165 } } } } Notice the PX4 [simulator] is using TCP, which is why we need to add: \"UseTcp\": true, . Notice we are also enabling LockStep , see PX4 LockStep for more information. The Barometer setting keeps PX4 happy because the default Cosys-AirSim barometer has a bit too much noise generation. This setting clamps that down a bit, which allows PX4 to achieve GPS lock more quickly. Open incoming TCP port 4560 and incoming UDP port 14540 using your firewall configuration. Now run your Unreal Cosys-AirSim environment and it should connect to SITL PX4 via TCP. You should see a bunch of messages from the SITL PX4 window. Specifically, the following messages tell you that Cosys-AirSim is connected properly and GPS fusion is stable: INFO [simulator] Simulator connected on UDP port 14560 INFO [mavlink] partner IP: 127.0.0.1 INFO [ecl/EKF] EKF GPS checks passed (WGS-84 origin set) INFO [ecl/EKF] EKF commencing GPS fusion If you do not see these messages then check your port settings. You should also be able to use QGroundControl with SITL mode. Make sure there is no Pixhawk hardware plugged in, otherwise QGroundControl will choose to use that instead.
Note that as we don't have a physical board, an RC cannot be connected directly to it. So the alternatives are to either use an XBox 360 Controller or connect your RC using USB (for example, in the case of the FrSky Taranis X9D Plus) or using a trainer USB cable to your PC. This makes your RC look like a joystick. You will need to do extra setup in QGroundControl to use a virtual joystick for RC control. You do not need to do this unless you plan to fly a drone manually in Cosys-AirSim. Autonomous flight using the Python API does not require RC, see No Remote Control below.","title":"Setting up PX4 Software-in-Loop"},{"location":"px4_sitl/#setting-gps-origin","text":"Notice the above settings are provided in the Parameters section of the settings.json file: \"LPE_LAT\": 47.641468, \"LPE_LON\": -122.140165, PX4 SITL mode needs to be configured to get the home location correct. The home location needs to be set to the same coordinates defined in OriginGeopoint . You can also run the following in the SITL PX4 console window to check that these values are set correctly. param show LPE_LAT param show LPE_LON","title":"Setting GPS origin"},{"location":"px4_sitl/#smooth-offboard-transitions","text":"Notice the above setting is provided in the Parameters section of the settings.json file: \"COM_OBL_ACT\": 1 This tells the drone to automatically hover after each offboard control command finishes (the default setting is to land). Hovering is a smoother transition between multiple offboard commands. You can check this setting by running the following PX4 console command: param show COM_OBL_ACT","title":"Smooth Offboard Transitions"},{"location":"px4_sitl/#check-the-home-position","text":"If you are using DroneShell to execute commands (arm, takeoff, etc) then you should wait until the Home position is set. You will see the PX4 SITL console output this message: INFO [commander] home: 47.6414680, -122.1401672, 119.99 INFO [tone_alarm] home_set Now the DroneShell 'pos' command should report this position and the commands should be accepted by PX4. If you attempt to takeoff without a home position you will see the message: WARN [commander] Takeoff denied, disarm and re-try After the home position is set, check the local position reported by the 'pos' command: Local position: x=-0.0326988, y=0.00656854, z=5.48506 If the z coordinate is large like this then takeoff might not work as expected. Resetting the SITL and simulation should fix that problem.","title":"Check the Home Position"},{"location":"px4_sitl/#wsl-2","text":"Windows Subsystem for Linux version 2 operates in a Virtual Machine. This requires additional setup - see additional instructions .","title":"WSL 2"},{"location":"px4_sitl/#no-remote-control","text":"Notice the above setting is provided in the Parameters section of the settings.json file: \"NAV_RCL_ACT\": 0, \"NAV_DLL_ACT\": 0, This is required if you plan to fly the SITL mode PX4 with no remote control, just using Python scripts, for example. These parameters stop the PX4 from triggering \"failsafe mode on\" every time a move command is finished.
You can use the following PX4 command to check these values are set correctly: param show NAV_RCL_ACT param show NAV_DLL_ACT NOTE: Do NOT do this on a real drone as it is too dangerous to fly without these failsafe measures.","title":"No Remote Control"},{"location":"px4_sitl/#manually-set-parameters","text":"You can also run the following in the PX4 console to set all these parameters manually: param set NAV_RCL_ACT 0 param set NAV_DLL_ACT 0","title":"Manually set parameters"},{"location":"px4_sitl/#setting-up-multi-vehicle-simulation","text":"You can simulate multiple drones in SITL mode using Cosys-AirSim. However, this requires setting up multiple instances of the PX4 firmware simulator to be able to listen for each vehicle's connection on a separate TCP port (4560, 4561, etc). Please see this dedicated page for instructions on setting up multiple instances of PX4 in SITL mode.","title":"Setting up multi-vehicle simulation"},{"location":"px4_sitl/#using-virtualbox-ubuntu","text":"If you want to run the above posix_sitl in a VirtualBox Ubuntu machine then it will have a different IP address from localhost. So in this case you need to edit the settings file, change the UdpIp and SitlIp to the IP address of your virtual machine, and set the LocalIpAddress to the address of your host machine running the Unreal engine.","title":"Using VirtualBox Ubuntu"},{"location":"px4_sitl/#remote-controller","text":"There are several options for flying the simulated drone using a remote control or joystick like an Xbox gamepad. See remote controllers","title":"Remote Controller"},{"location":"px4_sitl_wsl2/","text":"PX4 Software-in-Loop with WSL 2 The Windows subsystem for Linux version 2 uses a Virtual Machine which has a separate IP address from your Windows host machine. This means PX4 cannot find AirSim using \"localhost\", which is the default behavior for PX4. You will notice that on Windows ipconfig returns a new ethernet adapter for WSL like this (notice the vEthernet adapter has (WSL) in the name): Ethernet adapter vEthernet (WSL): Connection-specific DNS Suffix . : Link-local IPv6 Address . . . . . : fe80::1192:f9a5:df88:53ba%44 IPv4 Address. . . . . . . . . . . : 172.31.64.1 Subnet Mask . . . . . . . . . . . : 255.255.240.0 Default Gateway . . . . . . . . . : This address 172.31.64.1 is the address that WSL 2 can use to reach your Windows host machine. Starting with this PX4 Change Request (which correlates to version v1.12.0-beta1 or newer), PX4 in SITL mode can now connect to AirSim on a different (remote) IP address. To enable this make sure you have a version of PX4 containing this fix and set the following environment variable in Linux: export PX4_SIM_HOST_ADDR=172.31.64.1 Note: Be sure to update the above address 172.31.64.1 to match what you see from your ipconfig command. Open incoming TCP port 4560 and incoming UDP port 14540 using your firewall configuration. Now on the Linux side run ip address show and copy the eth0 inet address, it should be something like 172.31.66.156 . This is the address Windows needs to know in order to find PX4. Edit your AirSim settings file and add LocalHostIp to tell AirSim to use the WSL ethernet adapter address instead of the default localhost . This will cause AirSim to open the TCP port on that adapter, which is the address that the PX4 app will be looking for. Also tell AirSim to connect the ControlIp UDP channel by setting ControlIp to the magic string remote . This resolves to the WSL 2 remote IP address found in the TCP socket.
{ \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"ClockType\": \"SteppableClock\", \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\", \"UseSerial\": false, \"LockStep\": true, \"UseTcp\": true, \"TcpPort\": 4560, \"ControlIp\": \"remote\", \"ControlPortLocal\": 14540, \"ControlPortRemote\": 14580, \"LocalHostIp\": \"172.31.64.1\", \"Sensors\":{ \"Barometer\":{ \"SensorType\": 1, \"Enabled\": true, \"PressureFactorSigma\": 0.0001825 } }, \"Parameters\": { \"NAV_RCL_ACT\": 0, \"NAV_DLL_ACT\": 0, \"COM_OBL_ACT\": 1, \"LPE_LAT\": 47.641468, \"LPE_LON\": -122.140165 } } } } See PX4 LockStep for more information. The \"Barometer\" setting keeps PX4 happy because the default AirSim barometer has a bit too much noise generation. This setting clamps that down a bit. If your local repo does not include this PX4 commit , please edit the Linux file in ROMFS/px4fmu_common/init.d-posix/rcS and make sure it is looking for the PX4_SIM_HOST_ADDR environment variable and is passing that through to the PX4 simulator like this: # If PX4_SIM_HOST_ADDR environment variable is empty use localhost. if [ -z \"${PX4_SIM_HOST_ADDR}\" ]; then echo \"PX4 SIM HOST: localhost\" simulator start -c $simulator_tcp_port else echo \"PX4 SIM HOST: $PX4_SIM_HOST_ADDR\" simulator start -t $PX4_SIM_HOST_ADDR $simulator_tcp_port fi Note: this code might already be there depending on the version of PX4 you are using. Note: please be patient when waiting for the message: INFO [simulator] Simulator connected on TCP port 4560. It can take a little longer to establish the remote connection than it does with localhost . Now you can proceed with the steps shown in Setting up PX4 Software-in-Loop .","title":"PX4 SITL with WSL 2"},{"location":"px4_sitl_wsl2/#px4-software-in-loop-with-wsl-2","text":"The Windows subsystem for Linux version 2 uses a Virtual Machine which has a separate IP address from your Windows host machine. This means PX4 cannot find AirSim using \"localhost\" which is the default behavior for PX4. You will notice that on Windows ipconfig returns a new ethernet adapter for WSL like this (notice the vEthernet has (WSL) in the name: Ethernet adapter vEthernet (WSL): Connection-specific DNS Suffix . : Link-local IPv6 Address . . . . . : fe80::1192:f9a5:df88:53ba%44 IPv4 Address. . . . . . . . . . . : 172.31.64.1 Subnet Mask . . . . . . . . . . . : 255.255.240.0 Default Gateway . . . . . . . . . : This address 172.31.64.1 is the address that WSL 2 can use to reach your Windows host machine. Starting with this PX4 Change Request (which correlates to version v1.12.0-beta1 or newer) PX4 in SITL mode can now connect to AirSim on a different (remote) IP address. To enable this make sure you have a version of PX4 containing this fix and set the following environment variable in linux: export PX4_SIM_HOST_ADDR=172.31.64.1 Note: Be sure to update the above address 172.31.64.1 to match what you see from your ipconfig command. Open incoming TCP port 4560 and incoming UDP port 14540 using your firewall configuration. Now on the linux side run ip address show and copy the eth0 inet address, it should be something like 172.31.66.156 . This is the address Windows needs to know in order to find PX4. Edit your AirSim settings file and add LocalHostIp to tell AirSim to use the WSL ethernet adapter address instead of the default localhost . This will cause AirSim to open the TCP port on that adapter which is the address that the PX4 app will be looking for. 
Also tell AirSim to connect the ControlIp UDP channel by setting ControlIp to the magic string remote . This resolves to the WSL 2 remote IP address found in the TCP socket. { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"ClockType\": \"SteppableClock\", \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\", \"UseSerial\": false, \"LockStep\": true, \"UseTcp\": true, \"TcpPort\": 4560, \"ControlIp\": \"remote\", \"ControlPortLocal\": 14540, \"ControlPortRemote\": 14580, \"LocalHostIp\": \"172.31.64.1\", \"Sensors\":{ \"Barometer\":{ \"SensorType\": 1, \"Enabled\": true, \"PressureFactorSigma\": 0.0001825 } }, \"Parameters\": { \"NAV_RCL_ACT\": 0, \"NAV_DLL_ACT\": 0, \"COM_OBL_ACT\": 1, \"LPE_LAT\": 47.641468, \"LPE_LON\": -122.140165 } } } } See PX4 LockStep for more information. The \"Barometer\" setting keeps PX4 happy because the default AirSim barometer has a bit too much noise generation. This setting clamps that down a bit. If your local repo does not include this PX4 commit , please edit the Linux file in ROMFS/px4fmu_common/init.d-posix/rcS and make sure it is looking for the PX4_SIM_HOST_ADDR environment variable and is passing that through to the PX4 simulator like this: # If PX4_SIM_HOST_ADDR environment variable is empty use localhost. if [ -z \"${PX4_SIM_HOST_ADDR}\" ]; then echo \"PX4 SIM HOST: localhost\" simulator start -c $simulator_tcp_port else echo \"PX4 SIM HOST: $PX4_SIM_HOST_ADDR\" simulator start -t $PX4_SIM_HOST_ADDR $simulator_tcp_port fi Note: this code might already be there depending on the version of PX4 you are using. Note: please be patient when waiting for the message: INFO [simulator] Simulator connected on TCP port 4560. It can take a little longer to establish the remote connection than it does with localhost . Now you can proceed with the steps shown in Setting up PX4 Software-in-Loop .","title":"PX4 Software-in-Loop with WSL 2"},{"location":"remote_control/","text":"Remote Control To fly manually, you need a remote control (RC). If you don't have one then you can use APIs to fly programmatically or use the so-called Computer Vision mode to move around using the keyboard. RC Setup for Default Config By default AirSim uses simple_flight as its flight controller, which connects to the RC via a USB port on your computer. You can use either an XBox controller or the FrSky Taranis X9D Plus . Note that the XBox 360 controller is not precise enough and is not recommended if you want a more real-world experience. See the FAQ below if things are not working. Other Devices AirSim can detect a large variety of devices; however, devices other than the above might need extra configuration. In the future we will add the ability to set this config through settings.json. For now, if things are not working then you might want to try workarounds such as x360ce or change code in the SimJoystick.cpp file . Note on FrSky Taranis X9D Plus The FrSky Taranis X9D Plus is a real UAV remote control with the advantage that it has a USB port, so it can be connected directly to a PC. You can download the AirSim config file and follow this tutorial to import it in your RC. You should then see the \"sim\" model in the RC with all channels configured properly. Note on Linux Currently the default config on Linux is for the Xbox controller. This means other devices might not work properly. In the future we will add the ability to configure the RC in settings.json, but for now you might have to change code in the SimJoystick.cpp file to use other devices. RC Setup for PX4 AirSim supports the PX4 flight controller; however, it requires a different setup.
There are many remote control options that you can use with quadrotors. We have successfully used the FrSky Taranis X9D Plus, FlySky FS-TH9X and Futaba 14SG with AirSim. Following are the high-level steps to configure your RC: If you are going to use Hardware-in-Loop mode, you need a transmitter for your specific brand of RC and need to bind it. You can find this information in your RC's user guide. For Hardware-in-Loop mode, you connect the transmitter to the Pixhawk. Usually you can find an online doc or YouTube video tutorial on how to do that. Calibrate your RC in QGroundControl . See PX4 RC configuration and this guide for more information. Using XBox 360 USB Gamepad You can also use an Xbox controller in SITL mode; it just won't be as precise as a real RC controller. See xbox controller for details on how to set that up. Using Playstation 3 controller A Playstation 3 controller is confirmed to work as an AirSim controller. On Windows, however, an emulator to make it look like an Xbox 360 controller is required. Many different solutions are available online, for example x360ce Xbox 360 Controller Emulator . DJI Controller Nils Tijtgat wrote an excellent blog on how to get the DJI controller working with AirSim . FAQ I'm using default config and AirSim says my RC is not detected on USB. This typically happens if you have multiple RCs and/or XBox/Playstation gamepads etc. connected. In Windows, hit the Windows+S key and search for \"Set up USB Game controllers\" (in older versions of Windows try \"joystick\"). This will show you all game controllers connected to your PC. If you don't see yours then Windows hasn't detected it, and you need to solve that issue first. If you do see yours but not at the top of the list (i.e. index 0) then you need to tell AirSim, because AirSim by default tries to use the RC at index 0. To do this, navigate to your ~/Documents/AirSim folder, open up settings.json and add/modify the following setting. The setting below tells AirSim to use the RC at index 2. { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"Vehicles\": { \"SimpleFlight\": { \"VehicleType\": \"SimpleFlight\", \"RC\": { \"RemoteControlID\": 2 } } } } Vehicle seems unstable when using XBox/PS3 controller Regular gamepads are not very precise and have a lot of random noise. Most of the time you may see significant offsets as well (i.e. output is not zero when sticks are at zero). So this behavior is expected. Where is RC calibration in AirSim? We haven't implemented it yet. This means your RC firmware will need to have the capability to do calibration for now. My RC is not working with PX4 setup. First you want to make sure your RC is working in QGroundControl . If it doesn't, it surely won't work in AirSim. The PX4 mode is suitable for folks who have at least an intermediate level of experience to deal with various issues related to PX4 and we would generally refer you to get help from the PX4 forums.","title":"Remote Control"},{"location":"remote_control/#remote-control","text":"To fly manually, you need a remote control (RC). If you don't have one then you can use APIs to fly programmatically or use the so-called Computer Vision mode to move around using the keyboard.","title":"Remote Control"},{"location":"remote_control/#rc-setup-for-default-config","text":"By default AirSim uses simple_flight as its flight controller, which connects to the RC via a USB port on your computer. You can use either an XBox controller or the FrSky Taranis X9D Plus .
Note that the XBox 360 controller is not precise enough and is not recommended if you want a more real-world experience. See the FAQ below if things are not working.","title":"RC Setup for Default Config"},{"location":"remote_control/#other-devices","text":"AirSim can detect a large variety of devices; however, devices other than the above might need extra configuration. In the future we will add the ability to set this config through settings.json. For now, if things are not working then you might want to try workarounds such as x360ce or change code in the SimJoystick.cpp file .","title":"Other Devices"},{"location":"remote_control/#note-on-frsky-taranis-x9d-plus","text":"The FrSky Taranis X9D Plus is a real UAV remote control with the advantage that it has a USB port, so it can be connected directly to a PC. You can download the AirSim config file and follow this tutorial to import it in your RC. You should then see the \"sim\" model in the RC with all channels configured properly.","title":"Note on FrSky Taranis X9D Plus"},{"location":"remote_control/#note-on-linux","text":"Currently the default config on Linux is for the Xbox controller. This means other devices might not work properly. In the future we will add the ability to configure the RC in settings.json, but for now you might have to change code in the SimJoystick.cpp file to use other devices.","title":"Note on Linux"},{"location":"remote_control/#rc-setup-for-px4","text":"AirSim supports the PX4 flight controller; however, it requires a different setup. There are many remote control options that you can use with quadrotors. We have successfully used the FrSky Taranis X9D Plus, FlySky FS-TH9X and Futaba 14SG with AirSim. Following are the high-level steps to configure your RC: If you are going to use Hardware-in-Loop mode, you need a transmitter for your specific brand of RC and need to bind it. You can find this information in your RC's user guide. For Hardware-in-Loop mode, you connect the transmitter to the Pixhawk. Usually you can find an online doc or YouTube video tutorial on how to do that. Calibrate your RC in QGroundControl . See PX4 RC configuration and this guide for more information.","title":"RC Setup for PX4"},{"location":"remote_control/#using-xbox-360-usb-gamepad","text":"You can also use an Xbox controller in SITL mode; it just won't be as precise as a real RC controller. See xbox controller for details on how to set that up.","title":"Using XBox 360 USB Gamepad"},{"location":"remote_control/#using-playstation-3-controller","text":"A Playstation 3 controller is confirmed to work as an AirSim controller. On Windows, however, an emulator to make it look like an Xbox 360 controller is required. Many different solutions are available online, for example x360ce Xbox 360 Controller Emulator .","title":"Using Playstation 3 controller"},{"location":"remote_control/#dji-controller","text":"Nils Tijtgat wrote an excellent blog on how to get the DJI controller working with AirSim .","title":"DJI Controller"},{"location":"remote_control/#faq","text":"","title":"FAQ"},{"location":"remote_control/#im-using-default-config-and-airsim-says-my-rc-is-not-detected-on-usb","text":"This typically happens if you have multiple RCs and/or XBox/Playstation gamepads etc. connected. In Windows, hit the Windows+S key and search for \"Set up USB Game controllers\" (in older versions of Windows try \"joystick\"). This will show you all game controllers connected to your PC. If you don't see yours then Windows hasn't detected it, and you need to solve that issue first. If you do see yours but not at the top of the list (i.e.
index 0) then you need to tell AirSim, because AirSim by default tries to use the RC at index 0. To do this, navigate to your ~/Documents/AirSim folder, open up settings.json and add/modify the following setting. The setting below tells AirSim to use the RC at index 2. { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"Vehicles\": { \"SimpleFlight\": { \"VehicleType\": \"SimpleFlight\", \"RC\": { \"RemoteControlID\": 2 } } } }","title":"I'm using default config and AirSim says my RC is not detected on USB."},{"location":"remote_control/#vehicle-seems-unstable-when-using-xboxps3-controller","text":"Regular gamepads are not very precise and have a lot of random noise. Most of the time you may see significant offsets as well (i.e. output is not zero when sticks are at zero). So this behavior is expected.","title":"Vehicle seems unstable when using XBox/PS3 controller"},{"location":"remote_control/#where-is-rc-calibration-in-airsim","text":"We haven't implemented it yet. This means your RC firmware will need to have the capability to do calibration for now.","title":"Where is RC calibration in AirSim?"},{"location":"remote_control/#my-rc-is-not-working-with-px4-setup","text":"First you want to make sure your RC is working in QGroundControl . If it doesn't, it surely won't work in AirSim. The PX4 mode is suitable for folks who have at least an intermediate level of experience to deal with various issues related to PX4 and we would generally refer you to get help from the PX4 forums.","title":"My RC is not working with PX4 setup."},{"location":"retexturing/","text":"Runtime Texture Swapping How to Make An Actor Retexturable To be made texture-swappable, an actor must derive from the parent class TextureShuffleActor. The parent class can be set via the settings tab in the actor's blueprint. After setting the parent class to TextureShuffleActor, the object gains the member DynamicMaterial. DynamicMaterial needs to be set--on all actor instances in the scene--to TextureSwappableMaterial. Warning: Statically setting the Dynamic Material in the blueprint class may cause rendering errors. It seems to work better to set it on all the actor instances in the scene, using the details panel. How to Define the Set(s) of Textures to Choose From Typically, certain subsets of actors will share a set of texture options with each other. (e.g. walls that are part of the same building) It's easy to set up these groupings by using Unreal Engine's group editing functionality. Select all the instances that should have the same texture selection, and add the textures to all of them simultaneously via the Details panel. Use the same technique to add descriptive tags to groups of actors, which will be used to address them in the API. It's ideal to work from larger groupings to smaller groupings, simply deselecting actors to narrow down the grouping as you go, and applying any individual actor properties last. How to Swap Textures from the API The following API is available in C++ and Python. (C++ shown) std::vector<std::string> simSwapTextures(const std::string& tags, int tex_id); The string of \",\" or \", \" delimited tags identifies on which actors to perform the swap. The tex_id indexes the array of textures assigned to each actor undergoing a swap. The function will return the list of objects which matched the provided tags and had the texture swap performed. If tex_id is out-of-bounds for some object's texture set, it will be taken modulo the number of textures that were available.
Demo (Python): import cosysairsim as airsim import time c = airsim.client.MultirotorClient() print(c.simSwapTextures(\"furniture\", 0)) time.sleep(2) print(c.simSwapTextures(\"chair\", 1)) time.sleep(2) print(c.simSwapTextures(\"table\", 1)) time.sleep(2) print(c.simSwapTextures(\"chair, right\", 0)) Results: ['RetexturableChair', 'RetexturableChair2', 'RetexturableTable'] ['RetexturableChair', 'RetexturableChair2'] ['RetexturableTable'] ['RetexturableChair2'] Note that in this example, different textures were chosen on each actor for the same index value. You can also use the simSetObjectMaterial and simSetObjectMaterialFromTexture APIs to set an object's material to any material asset or filepath of a texture. For more information on using these APIs, see Texture APIs .","title":"Domain Randomization"},{"location":"retexturing/#runtime-texture-swapping","text":"","title":"Runtime Texture Swapping"},{"location":"retexturing/#how-to-make-an-actor-retexturable","text":"To be made texture-swappable, an actor must derive from the parent class TextureShuffleActor. The parent class can be set via the settings tab in the actor's blueprint. After setting the parent class to TextureShuffleActor, the object gains the member DynamicMaterial. DynamicMaterial needs to be set to TextureSwappableMaterial on all actor instances in the scene. Warning: Statically setting the Dynamic Material in the blueprint class may cause rendering errors. It seems to work better to set it on all the actor instances in the scene, using the details panel.","title":"How to Make An Actor Retexturable"},{"location":"retexturing/#how-to-define-the-sets-of-textures-to-choose-from","text":"Typically, certain subsets of actors will share a set of texture options with each other. (e.g. walls that are part of the same building) It's easy to set up these groupings by using Unreal Engine's group editing functionality. Select all the instances that should have the same texture selection, and add the textures to all of them simultaneously via the Details panel. Use the same technique to add descriptive tags to groups of actors, which will be used to address them in the API. It's ideal to work from larger groupings to smaller groupings, simply deselecting actors to narrow down the grouping as you go, and applying any individual actor properties last.","title":"How to Define the Set(s) of Textures to Choose From"},{"location":"retexturing/#how-to-swap-textures-from-the-api","text":"The following API is available in C++ and python. (C++ shown) std::vector<std::string> simSwapTextures(const std::string& tags, int tex_id); The string of \",\" or \", \" delimited tags identifies on which actors to perform the swap. The tex_id indexes the array of textures assigned to each actor undergoing a swap. The function will return the list of objects which matched the provided tags and had the texture swap performed. If tex_id is out-of-bounds for some object's texture set, it will be taken modulo the number of textures that were available. 
Demo (Python): import cosysairsim as airsim import time c = airsim.client.MultirotorClient() print(c.simSwapTextures(\"furniture\", 0)) time.sleep(2) print(c.simSwapTextures(\"chair\", 1)) time.sleep(2) print(c.simSwapTextures(\"table\", 1)) time.sleep(2) print(c.simSwapTextures(\"chair, right\", 0)) Results: ['RetexturableChair', 'RetexturableChair2', 'RetexturableTable'] ['RetexturableChair', 'RetexturableChair2'] ['RetexturableTable'] ['RetexturableChair2'] Note that in this example, different textures were chosen on each actor for the same index value. You can also use the simSetObjectMaterial and simSetObjectMaterialFromTexture APIs to set an object's material to any material asset or filepath of a texture. For more information on using these APIs, see Texture APIs .","title":"How to Swap Textures from the API"},{"location":"ros_cplusplus/","text":"airsim_ros_pkgs A ROS2 wrapper over the Cosys-AirSim C++ client library. All coordinates and data are in the right-handed coordinate frame of the ROS standard and not in NED except for geo points. The following was tested on Ubuntu 22.04 with ROS2 Iron. Build Build Cosys-AirSim as per the instructions. Make sure that you have set up the environment variables for ROS. Add the source command to your .bashrc for convenience (replace iron with specific version name) - echo \"source /opt/ros/iron/setup.bash\" >> ~/.bashrc source ~/.bashrc -- Install dependencies with rosdep, if not already installed - apt-get install python3-rosdep sudo rosdep init rosdep update rosdep install --from-paths src -y --ignore-src --skip-keys pcl --skip-keys message_runtime --skip-keys message_generation Build ROS package colcon build --cmake-args -DCMAKE_BUILD_TYPE=Release Running source install/setup.bash ros2 launch airsim_ros_pkgs airsim_node.launch.py Using Cosys-Airsim ROS wrapper The ROS wrapper is composed of two ROS nodes - the first is a wrapper over Cosys-AirSim's multirotor C++ client library, and the second is a simple PD position controller. Let's look at the ROS API for both nodes: Cosys-Airsim ROS Wrapper Node Publishers: The publishers will be automatically created based on the settings in the settings.json file for all vehicles and the sensors. /airsim_node/VEHICLE-NAME/car_state airsim_interfaces::CarState The state of the car if the vehicle is of this sim-mode type. /airsim_node/VEHICLE-NAME/computervision_state airsim_interfaces::ComputerVisionState The state of the computer vision actor if the vehicle is of this sim-mode type. /airsim_node/origin_geo_point airsim_interfaces::GPSYaw GPS coordinates corresponding to the global frame. This is set in airsim's settings.json file under the OriginGeopoint key. /airsim_node/VEHICLE-NAME/global_gps sensor_msgs::NavSatFix This is the current GPS coordinates of the drone in airsim. /airsim_node/VEHICLE-NAME/environment airsim_interfaces::Environment /airsim_node/VEHICLE-NAME/odom_local nav_msgs::Odometry Odometry frame (default name: odom_local, launch name and frame type are configurable) wrt take-off point. /airsim_node/VEHICLE-NAME/CAMERA-NAME_IMAGE-TYPE/camera_info sensor_msgs::CameraInfo Optionally, if the image type is annotation, the annotation layer name is also included in the topic name. /airsim_node/VEHICLE-NAME/CAMERA-NAME_IMAGE-TYPE/image sensor_msgs::Image RGB or float image depending on the image type requested in settings.json. Optionally, if the image type is annotation, the annotation layer name is also included in the topic name. 
/tf tf2_msgs::TFMessage /airsim_node/VEHICLE-NAME/altimeter/SENSOR_NAME airsim_interfaces::Altimeter This is the current altimeter reading for altitude, pressure, and QNH /airsim_node/VEHICLE-NAME/imu/SENSOR_NAME sensor_msgs::Imu IMU sensor data. /airsim_node/VEHICLE-NAME/magnetometer/SENSOR_NAME sensor_msgs::MagneticField Measurement of magnetic field vector/compass. /airsim_node/VEHICLE-NAME/distance/SENSOR_NAME sensor_msgs::Range Measurement of distance from an active ranger, such as infrared /airsim_node/VEHICLE-NAME/lidar/points/SENSOR_NAME/ sensor_msgs::PointCloud2 LIDAR pointcloud /airsim_node/VEHICLE-NAME/lidar/labels/SENSOR_NAME/ airsim_interfaces::StringArray Custom message type with an array of strings that are the labels for each point in the pointcloud of the lidar sensor /airsim_node/VEHICLE-NAME/gpulidar/points/SENSOR_NAME/ sensor_msgs::PointCloud2 GPU LIDAR pointcloud. The instance segmentation/annotation color data is stored in the rgb field of the pointcloud. The intensity data is stored as well in the intensity field /airsim_node/VEHICLE-NAME/echo/active/points/SENSOR_NAME/ sensor_msgs::PointCloud2 Echo sensor pointcloud for active sensing /airsim_node/VEHICLE-NAME/echo/passive/points/SENSOR_NAME/ sensor_msgs::PointCloud2 Echo sensor pointcloud for passive sensing /airsim_node/VEHICLE-NAME/echo/active/labels/SENSOR_NAME/ airsim_interfaces::StringArray Custom message type with an array of strings that are the labels for each point in the pointcloud for the active echo pointcloud /airsim_node/VEHICLE-NAME/echo/passive/labels/SENSOR_NAME/ airsim_interfaces::StringArray Custom message type with an array of strings that are the labels for each point in the pointcloud for the passive echo pointcloud /airsim_node/instance_segmentation_labels airsim_interfaces::InstanceSegmentationList Custom message type with an array of custom messages giving the name, color and index of the instance segmentation system for each object in the world. /airsim_node/object_transforms airsim_interfaces::ObjectTransformsList Custom message type with an array of geometry_msgs::TransformStamped that are the transforms of all objects in the world, each child frame ID is the object name. Subscribers: /airsim_node/VEHICLE-NAME/vel_cmd_body_frame airsim_interfaces::VelCmd /airsim_node/VEHICLE-NAME/vel_cmd_world_frame airsim_interfaces::VelCmd /airsim_node/all_robots/vel_cmd_body_frame airsim_interfaces::VelCmd Set velocity command for all drones. /airsim_node/all_robots/vel_cmd_world_frame airsim_interfaces::VelCmd /airsim_node/group_of_robots/vel_cmd_body_frame airsim_interfaces::VelCmdGroup Set velocity command for a specific set of drones. /airsim_node/group_of_robots/vel_cmd_world_frame airsim_interfaces::VelCmdGroup Set velocity command for a specific set of drones. /gimbal_angle_euler_cmd airsim_interfaces::GimbalAngleEulerCmd Gimbal set point in euler angles. /gimbal_angle_quat_cmd airsim_interfaces::GimbalAngleQuatCmd Gimbal set point in quaternion. /airsim_node/VEHICLE-NAME/car_cmd airsim_interfaces::CarControls Throttle, brake, steering and gear selections for control. Both automatic and manual transmission control are possible; see the car_joy.py script for use. 
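As an illustration of the velocity command interface above, here is a minimal ROS2 sketch that publishes a constant forward velocity to a vehicle named Drone1. It assumes the airsim_interfaces package is built and sourced, and that VelCmd wraps a single geometry_msgs/Twist field named twist; the vehicle name is illustrative and must match your settings.json. Remember that enable_api_control (see the parameters below) must be set to true for such commands to have any effect.
```python
# Minimal sketch: publish a body-frame velocity command to the wrapper node.
# Assumes airsim_interfaces is built/sourced and a vehicle "Drone1" exists.
import rclpy
from rclpy.node import Node
from airsim_interfaces.msg import VelCmd


class VelCmdExample(Node):
    def __init__(self):
        super().__init__('vel_cmd_example')
        self.pub = self.create_publisher(
            VelCmd, '/airsim_node/Drone1/vel_cmd_body_frame', 1)
        self.timer = self.create_timer(0.1, self.send_cmd)  # 10 Hz

    def send_cmd(self):
        msg = VelCmd()
        msg.twist.linear.x = 1.0  # fly forward at 1 m/s in the body frame
        self.pub.publish(msg)


def main():
    rclpy.init()
    rclpy.spin(VelCmdExample())


if __name__ == '__main__':
    main()
```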
Services: /airsim_node/VEHICLE-NAME/land airsim_interfaces::Land /airsim_node/VEHICLE-NAME/takeoff airsim_interfaces::Takeoff /airsim_node/all_robots/land airsim_interfaces::Land land all drones /airsim_node/all_robots/takeoff airsim_interfaces::Takeoff take-off all drones /airsim_node/group_of_robots/land airsim_interfaces::LandGroup land a specific set of drones /airsim_node/group_of_robots/takeoff airsim_interfaces::TakeoffGroup take-off a specific set of drones /airsim_node/reset airsim_interfaces::Reset Resets all vehicles /airsim_node/instance_segmentation_refresh airsim_interfaces::RefreshInstanceSegmentation Refresh the instance segmentation list /airsim_node/object_transforms_refresh airsim_interfaces::RefreshObjectTransforms Refresh the object transforms list Parameters: /airsim_node/host_ip [string] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: localhost The IP of the machine running the airsim RPC API server. /airsim_node/host_port [string] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: 41451 The port of the machine running the airsim RPC API server. /airsim_node/enable_api_control [string] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: false Set the API control and arm the drones on startup. If not set to true, no control is available. /airsim_node/enable_object_transforms_list [string] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: true Retrieve the object transforms list from the airsim API at the start or with the service to refresh. If disabled, this is not available, but it can save time on startup. /airsim_node/is_vulkan [string] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: True If using Vulkan, the image encoding is switched from rgb8 to bgr8. /airsim_node/world_frame_id [string] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: world /airsim_node/odom_frame_id [string] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: odom_local /airsim_node/update_airsim_control_every_n_sec [double] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: 0.01 seconds. Timer callback frequency for updating drone odom and state from airsim, and sending in control commands. The current RPClib interface to unreal engine maxes out at 50 Hz. Timer callbacks in ROS run at the maximum rate possible, so it's best to not touch this parameter. /airsim_node/update_airsim_img_response_every_n_sec [double] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: 0.01 seconds. Timer callback frequency for receiving images from all cameras in airsim. The speed will depend on the number of images requested and their resolution. Timer callbacks in ROS run at the maximum rate possible, so it's best to not touch this parameter. /airsim_node/update_lidar_every_n_sec [double] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: 0.01 seconds. Timer callback frequency for receiving data from all Lidar sensors in airsim. Timer callbacks in ROS run at the maximum rate possible, so it's best to not touch this parameter. /airsim_node/update_gpulidar_every_n_sec [double] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: 0.01 seconds. Timer callback frequency for receiving data from all GPU-Lidar sensors in airsim. Timer callbacks in ROS run at the maximum rate possible, so it's best to not touch this parameter. 
/airsim_node/update_echo_every_n_sec [double] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: 0.01 seconds. Timer callback frequency for receiving data from all echo sensors in airsim. Timer callbacks in ROS run at the maximum rate possible, so it's best to not touch this parameter. /airsim_node/publish_clock [bool] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: false Will publish the ros /clock topic if set to true. Simple PID Position Controller Node Parameters: PD controller parameters: /pd_position_node/kp_x [double], /pd_position_node/kp_y [double], /pd_position_node/kp_z [double], /pd_position_node/kp_yaw [double] Proportional gains /pd_position_node/kd_x [double], /pd_position_node/kd_y [double], /pd_position_node/kd_z [double], /pd_position_node/kd_yaw [double] Derivative gains /pd_position_node/reached_thresh_xyz [double] Threshold Euclidean distance (meters) from current position to setpoint position /pd_position_node/reached_yaw_degrees [double] Threshold yaw distance (degrees) from current position to setpoint position /pd_position_node/update_control_every_n_sec [double] Default: 0.01 seconds Services: /airsim_node/VEHICLE-NAME/gps_goal [Request: airsim_interfaces::SetGPSPosition ] Target gps position + yaw. In absolute altitude. /airsim_node/VEHICLE-NAME/local_position_goal [Request: airsim_interfaces::SetLocalPosition ] Target local position + yaw in global frame. Subscribers: /airsim_node/origin_geo_point airsim_interfaces::GPSYaw Listens to home geo coordinates published by airsim_node . /airsim_node/VEHICLE-NAME/odom_local nav_msgs::Odometry Listens to odometry published by airsim_node Publishers: /vel_cmd_world_frame airsim_interfaces::VelCmd Sends velocity command to airsim_node /vel_cmd_body_frame airsim_interfaces::VelCmd Sends velocity command to airsim_node Global params Dynamic constraints. These can be changed in dynamic_constraints.launch : /max_vel_horz_abs [double] Maximum horizontal velocity of the drone (meters/second) /max_vel_vert_abs [double] Maximum vertical velocity of the drone (meters/second) /max_yaw_rate_degree [double] Maximum yaw rate (degrees/second)","title":"ROS2: AirSim ROS C++ Wrapper"},{"location":"ros_cplusplus/#airsim_ros_pkgs","text":"A ROS2 wrapper over the Cosys-AirSim C++ client library. All coordinates and data are in the right-handed coordinate frame of the ROS standard and not in NED except for geo points. The following was tested on Ubuntu 22.04 with ROS2 Iron.","title":"airsim_ros_pkgs"},{"location":"ros_cplusplus/#build","text":"Build Cosys-AirSim as per the instructions. Make sure that you have set up the environment variables for ROS. 
Add the source command to your .bashrc for convenience (replace iron with specific version name) - echo \"source /opt/ros/iron/setup.bash\" >> ~/.bashrc source ~/.bashrc -- Install dependencies with rosdep, if not already installed - apt-get install python3-rosdep sudo rosdep init rosdep update rosdep install --from-paths src -y --ignore-src --skip-keys pcl --skip-keys message_runtime --skip-keys message_generation Build ROS package colcon build --cmake-args -DCMAKE_BUILD_TYPE=Release","title":"Build"},{"location":"ros_cplusplus/#running","text":"source install/setup.bash ros2 launch airsim_ros_pkgs airsim_node.launch.py","title":"Running"},{"location":"ros_cplusplus/#using-cosys-airsim-ros-wrapper","text":"The ROS wrapper is composed of two ROS nodes - the first is a wrapper over Cosys-AirSim's multirotor C++ client library, and the second is a simple PD position controller. Let's look at the ROS API for both nodes:","title":"Using Cosys-Airsim ROS wrapper"},{"location":"ros_cplusplus/#cosys-airsim-ros-wrapper-node","text":"","title":"Cosys-Airsim ROS Wrapper Node"},{"location":"ros_cplusplus/#publishers","text":"The publishers will be automatically created based on the settings in the settings.json file for all vehicles and the sensors. /airsim_node/VEHICLE-NAME/car_state airsim_interfaces::CarState The state of the car if the vehicle is of this sim-mode type. /airsim_node/VEHICLE-NAME/computervision_state airsim_interfaces::ComputerVisionState The state of the computer vision actor if the vehicle is of this sim-mode type. /airsim_node/origin_geo_point airsim_interfaces::GPSYaw GPS coordinates corresponding to the global frame. This is set in airsim's settings.json file under the OriginGeopoint key. /airsim_node/VEHICLE-NAME/global_gps sensor_msgs::NavSatFix This is the current GPS coordinates of the drone in airsim. /airsim_node/VEHICLE-NAME/environment airsim_interfaces::Environment /airsim_node/VEHICLE-NAME/odom_local nav_msgs::Odometry Odometry frame (default name: odom_local, launch name and frame type are configurable) wrt take-off point. /airsim_node/VEHICLE-NAME/CAMERA-NAME_IMAGE-TYPE/camera_info sensor_msgs::CameraInfo Optionally, if the image type is annotation, the annotation layer name is also included in the topic name. /airsim_node/VEHICLE-NAME/CAMERA-NAME_IMAGE-TYPE/image sensor_msgs::Image RGB or float image depending on the image type requested in settings.json. Optionally, if the image type is annotation, the annotation layer name is also included in the topic name. /tf tf2_msgs::TFMessage /airsim_node/VEHICLE-NAME/altimeter/SENSOR_NAME airsim_interfaces::Altimeter This is the current altimeter reading for altitude, pressure, and QNH /airsim_node/VEHICLE-NAME/imu/SENSOR_NAME sensor_msgs::Imu IMU sensor data. /airsim_node/VEHICLE-NAME/magnetometer/SENSOR_NAME sensor_msgs::MagneticField Measurement of magnetic field vector/compass. /airsim_node/VEHICLE-NAME/distance/SENSOR_NAME sensor_msgs::Range Measurement of distance from an active ranger, such as infrared /airsim_node/VEHICLE-NAME/lidar/points/SENSOR_NAME/ sensor_msgs::PointCloud2 LIDAR pointcloud /airsim_node/VEHICLE-NAME/lidar/labels/SENSOR_NAME/ airsim_interfaces::StringArray Custom message type with an array of strings that are the labels for each point in the pointcloud of the lidar sensor /airsim_node/VEHICLE-NAME/gpulidar/points/SENSOR_NAME/ sensor_msgs::PointCloud2 GPU LIDAR pointcloud. The instance segmentation/annotation color data is stored in the rgb field of the pointcloud. 
The intensity data is stored as well in the intensity field /airsim_node/VEHICLE-NAME/echo/active/points/SENSOR_NAME/ sensor_msgs::PointCloud2 Echo sensor pointcloud for active sensing /airsim_node/VEHICLE-NAME/echo/passive/points/SENSOR_NAME/ sensor_msgs::PointCloud2 Echo sensor pointcloud for passive sensing /airsim_node/VEHICLE-NAME/echo/active/labels/SENSOR_NAME/ airsim_interfaces::StringArray Custom message type with an array of strings that are the labels for each point in the pointcloud for the active echo pointcloud /airsim_node/VEHICLE-NAME/echo/passive/labels/SENSOR_NAME/ airsim_interfaces::StringArray Custom message type with an array of strings that are the labels for each point in the pointcloud for the passive echo pointcloud /airsim_node/instance_segmentation_labels airsim_interfaces::InstanceSegmentationList Custom message type with an array of custom messages giving the name, color and index of the instance segmentation system for each object in the world. /airsim_node/object_transforms airsim_interfaces::ObjectTransformsList Custom message type with an array of geometry_msgs::TransformStamped that are the transforms of all objects in the world, each child frame ID is the object name.","title":"Publishers:"},{"location":"ros_cplusplus/#subscribers","text":"/airsim_node/VEHICLE-NAME/vel_cmd_body_frame airsim_interfaces::VelCmd /airsim_node/VEHICLE-NAME/vel_cmd_world_frame airsim_interfaces::VelCmd /airsim_node/all_robots/vel_cmd_body_frame airsim_interfaces::VelCmd Set velocity command for all drones. /airsim_node/all_robots/vel_cmd_world_frame airsim_interfaces::VelCmd /airsim_node/group_of_robots/vel_cmd_body_frame airsim_interfaces::VelCmdGroup Set velocity command for a specific set of drones. /airsim_node/group_of_robots/vel_cmd_world_frame airsim_interfaces::VelCmdGroup Set velocity command for a specific set of drones. /gimbal_angle_euler_cmd airsim_interfaces::GimbalAngleEulerCmd Gimbal set point in euler angles. /gimbal_angle_quat_cmd airsim_interfaces::GimbalAngleQuatCmd Gimbal set point in quaternion. /airsim_node/VEHICLE-NAME/car_cmd airsim_interfaces::CarControls Throttle, brake, steering and gear selections for control. Both automatic and manual transmission control are possible; see the car_joy.py script for use.","title":"Subscribers:"},{"location":"ros_cplusplus/#services","text":"/airsim_node/VEHICLE-NAME/land airsim_interfaces::Land /airsim_node/VEHICLE-NAME/takeoff airsim_interfaces::Takeoff /airsim_node/all_robots/land airsim_interfaces::Land land all drones /airsim_node/all_robots/takeoff airsim_interfaces::Takeoff take-off all drones /airsim_node/group_of_robots/land airsim_interfaces::LandGroup land a specific set of drones /airsim_node/group_of_robots/takeoff airsim_interfaces::TakeoffGroup take-off a specific set of drones /airsim_node/reset airsim_interfaces::Reset Resets all vehicles /airsim_node/instance_segmentation_refresh airsim_interfaces::RefreshInstanceSegmentation Refresh the instance segmentation list /airsim_node/object_transforms_refresh airsim_interfaces::RefreshObjectTransforms Refresh the object transforms list","title":"Services:"},{"location":"ros_cplusplus/#parameters","text":"/airsim_node/host_ip [string] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: localhost The IP of the machine running the airsim RPC API server. /airsim_node/host_port [string] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: 41451 The port of the machine running the airsim RPC API server. 
/airsim_node/enable_api_control [string] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: false Set the API control and arm the drones on startup. If not set to true, no control is available. /airsim_node/enable_object_transforms_list [string] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: true Retrieve the object transforms list from the airsim API at the start or with the service to refresh. If disabled, this is not available, but it can save time on startup. /airsim_node/is_vulkan [string] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: True If using Vulkan, the image encoding is switched from rgb8 to bgr8. /airsim_node/world_frame_id [string] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: world /airsim_node/odom_frame_id [string] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: odom_local /airsim_node/update_airsim_control_every_n_sec [double] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: 0.01 seconds. Timer callback frequency for updating drone odom and state from airsim, and sending in control commands. The current RPClib interface to unreal engine maxes out at 50 Hz. Timer callbacks in ROS run at the maximum rate possible, so it's best to not touch this parameter. /airsim_node/update_airsim_img_response_every_n_sec [double] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: 0.01 seconds. Timer callback frequency for receiving images from all cameras in airsim. The speed will depend on the number of images requested and their resolution. Timer callbacks in ROS run at the maximum rate possible, so it's best to not touch this parameter. /airsim_node/update_lidar_every_n_sec [double] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: 0.01 seconds. Timer callback frequency for receiving data from all Lidar sensors in airsim. Timer callbacks in ROS run at the maximum rate possible, so it's best to not touch this parameter. /airsim_node/update_gpulidar_every_n_sec [double] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: 0.01 seconds. Timer callback frequency for receiving data from all GPU-Lidar sensors in airsim. Timer callbacks in ROS run at the maximum rate possible, so it's best to not touch this parameter. /airsim_node/update_echo_every_n_sec [double] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: 0.01 seconds. Timer callback frequency for receiving data from all echo sensors in airsim. Timer callbacks in ROS run at the maximum rate possible, so it's best to not touch this parameter. 
/airsim_node/publish_clock [bool] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: false Will publish the ros /clock topic if set to true.","title":"Parameters:"},{"location":"ros_cplusplus/#simple-pid-position-controller-node","text":"","title":"Simple PID Position Controller Node"},{"location":"ros_cplusplus/#parameters_1","text":"PD controller parameters: /pd_position_node/kp_x [double], /pd_position_node/kp_y [double], /pd_position_node/kp_z [double], /pd_position_node/kp_yaw [double] Proportional gains /pd_position_node/kd_x [double], /pd_position_node/kd_y [double], /pd_position_node/kd_z [double], /pd_position_node/kd_yaw [double] Derivative gains /pd_position_node/reached_thresh_xyz [double] Threshold Euclidean distance (meters) from current position to setpoint position /pd_position_node/reached_yaw_degrees [double] Threshold yaw distance (degrees) from current position to setpoint position /pd_position_node/update_control_every_n_sec [double] Default: 0.01 seconds","title":"Parameters:"},{"location":"ros_cplusplus/#services_1","text":"/airsim_node/VEHICLE-NAME/gps_goal [Request: airsim_interfaces::SetGPSPosition ] Target gps position + yaw. In absolute altitude. /airsim_node/VEHICLE-NAME/local_position_goal [Request: airsim_interfaces::SetLocalPosition ] Target local position + yaw in global frame.","title":"Services:"},{"location":"ros_cplusplus/#subscribers_1","text":"/airsim_node/origin_geo_point airsim_interfaces::GPSYaw Listens to home geo coordinates published by airsim_node . /airsim_node/VEHICLE-NAME/odom_local nav_msgs::Odometry Listens to odometry published by airsim_node","title":"Subscribers:"},{"location":"ros_cplusplus/#publishers_1","text":"/vel_cmd_world_frame airsim_interfaces::VelCmd Sends velocity command to airsim_node /vel_cmd_body_frame airsim_interfaces::VelCmd Sends velocity command to airsim_node","title":"Publishers:"},{"location":"ros_cplusplus/#global-params","text":"Dynamic constraints. These can be changed in dynamic_constraints.launch : /max_vel_horz_abs [double] Maximum horizontal velocity of the drone (meters/second) /max_vel_vert_abs [double] Maximum vertical velocity of the drone (meters/second) /max_yaw_rate_degree [double] Maximum yaw rate (degrees/second)","title":"Global params"},{"location":"ros_python/","text":"How to use AirSim with Robot Operating System (ROS) AirSim and ROS can be integrated using Python. Some example ROS nodes are provided, demonstrating how to publish data from AirSim as ROS topics. Prerequisites These instructions are for Ubuntu 20.04, ROS Noetic, UE 5.4 and the latest Cosys-AirSim release. You should have these components installed and working before proceeding. Note that you need to install the Python module first for this to work. More information here in the section 'Installing AirSim Package'. Publish node There is a single Python script, airsim_publish.py , that can be used as a ROS node. It can be used in two ways: - Get and publish the entire TF tree of map, vehicle and sensors; vehicle movement groundtruth ; all sensor data as well as the poses of world objects. - Replay a route rosbag that holds an existing trajectory of a vehicle. The script will then replay this trajectory while recording all sensor data for each pose of the trajectory. It generates a new rosbag holding both the route and sensor data as well as all TF information. This allows for better performance and deterministic datasets over the same route. 
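The overall pattern such a publish node follows can be summarized in a small, hypothetical sketch. This is not the actual airsim_publish.py; the topic and frame names are made up for illustration. It simply polls the Cosys-AirSim Python client and republishes the vehicle groundtruth pose as a ROS topic:
```python
#!/usr/bin/env python3
# Illustrative sketch of the AirSim-to-ROS publishing pattern (not the
# actual airsim_publish.py). Topic and frame names are made up.
import rospy
from geometry_msgs.msg import PoseStamped
import cosysairsim as airsim


def main():
    rospy.init_node('airsim_pose_publisher')
    pub = rospy.Publisher('airsim/groundtruth_pose', PoseStamped, queue_size=1)
    client = airsim.MultirotorClient()
    client.confirmConnection()
    rate = rospy.Rate(30)  # poll the simulator at 30 Hz
    while not rospy.is_shutdown():
        pose = client.simGetVehiclePose()  # groundtruth pose (NED)
        msg = PoseStamped()
        msg.header.stamp = rospy.Time.now()
        msg.header.frame_id = 'world'
        # A full node would also convert NED into the ROS right-handed frame.
        msg.pose.position.x = pose.position.x_val
        msg.pose.position.y = pose.position.y_val
        msg.pose.position.z = pose.position.z_val
        msg.pose.orientation.x = pose.orientation.x_val
        msg.pose.orientation.y = pose.orientation.y_val
        msg.pose.orientation.z = pose.orientation.z_val
        msg.pose.orientation.w = pose.orientation.w_val
        pub.publish(msg)
        rate.sleep()


if __name__ == '__main__':
    main()
```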
Example launch files Some basic launch files are available for the ROS node in the two configurations mentioned above. - airsim_publish.launch : This shows all available parameters for the node. It also shows how to use the node in the first configuration. - record_route.launch : This is a variant of the one above, but only exposing and enabling the parameters needed to create a route rosbag for the second configuration. It will automatically record a rosbag as well. - replay_route_record_sensors.launch : This launch file takes a route rosbag created with the previous launch file, replays it, and records all sensor and TF data, creating a single merged rosbag. Setup Setup workspace and Airsim package Option A: Create a new ROS package in your catkin workspace following these instructions. Create a new ROS package called AirSim or whatever you like. If you don't already have a catkin workspace, you should first work through the ROS beginner tutorials. In the ROS package directory you made, copy the ROS node scripts from the AirSim/ros/python_ws/src/airsimros directory to your ROS package. Change the code below to match your AirSim and catkin workspace paths. cp AirSim/ros/python_ws/src/airsim/scripts ../catkin_ws/src/airsimros Option B: Use provided workspace Airsim/ros/python_ws itself is already a workspace which can be used out of the box after building. For building see below. Build ROS AirSim package Change directory to your top level catkin workspace folder i.e. cd ~/catkin_ws and run catkin_make . This will build the AirSim package. Next, run source devel/setup.bash so ROS can find the new package. You can add this command to your ~/.bashrc to load your catkin workspace automatically. How to run ROS AirSim nodes First make sure you are running an AirSim project and that the simulation is playing. The implemented AirSim node can be run using rosrun airsimros airsim_publish.py . Or alternatively you can use launch files such as the example ones that can be found in AirSim/ros/python_ws/src/airsim/launch like roslaunch airsimros airsim_publish.launch .","title":"ROS: AirSim ROS Python Wrapper"},{"location":"ros_python/#how-to-use-airsim-with-robot-operating-system-ros","text":"AirSim and ROS can be integrated using Python. Some example ROS nodes are provided, demonstrating how to publish data from AirSim as ROS topics.","title":"How to use AirSim with Robot Operating System (ROS)"},{"location":"ros_python/#prerequisites","text":"These instructions are for Ubuntu 20.04, ROS Noetic, UE 5.4 and the latest Cosys-AirSim release. You should have these components installed and working before proceeding. Note that you need to install the Python module first for this to work. More information here in the section 'Installing AirSim Package'.","title":"Prerequisites"},{"location":"ros_python/#publish-node","text":"There is a single Python script, airsim_publish.py , that can be used as a ROS node. It can be used in two ways: - Get and publish the entire TF tree of map, vehicle and sensors; vehicle movement groundtruth ; all sensor data as well as the poses of world objects. - Replay a route rosbag that holds an existing trajectory of a vehicle. The script will then replay this trajectory while recording all sensor data for each pose of the trajectory. It generates a new rosbag holding both the route and sensor data as well as all TF information. 
This allows for better performance and deterministic datasets over the same route.","title":"Publish node"},{"location":"ros_python/#example-launch-files","text":"Some basic launch files are available for the ROS node in the two configurations mentioned above. - airsim_publish.launch : This shows all available parameters for the node. It also shows how to use the node in the first configuration. - record_route.launch : This is a variant of the one above, but only exposing and enabling the parameters needed to create a route rosbag for the second configuration. It will automatically record a rosbag as well. - replay_route_record_sensors.launch : This launch file takes a route rosbag created with the previous launch file, replays it, and records all sensor and TF data, creating a single merged rosbag.","title":"Example launch files"},{"location":"ros_python/#setup","text":"","title":"Setup"},{"location":"ros_python/#setup-workspace-and-airsim-package","text":"","title":"Setup workspace and Airsim package"},{"location":"ros_python/#option-a-create-a-new-ros-package-in-your-catkin-workspace-following-these-instructions","text":"Create a new ROS package called AirSim or whatever you like. If you don't already have a catkin workspace, you should first work through the ROS beginner tutorials. In the ROS package directory you made, copy the ROS node scripts from the AirSim/ros/python_ws/src/airsimros directory to your ROS package. Change the code below to match your AirSim and catkin workspace paths. cp AirSim/ros/python_ws/src/airsim/scripts ../catkin_ws/src/airsimros","title":"Option A: Create a new ROS package in your catkin workspace following these instructions."},{"location":"ros_python/#option-b-use-provided-workspace","text":"Airsim/ros/python_ws itself is already a workspace which can be used out of the box after building. For building see below.","title":"Option B: Use provided workspace"},{"location":"ros_python/#build-ros-airsim-package","text":"Change directory to your top level catkin workspace folder i.e. cd ~/catkin_ws and run catkin_make . This will build the AirSim package. Next, run source devel/setup.bash so ROS can find the new package. You can add this command to your ~/.bashrc to load your catkin workspace automatically. How to run ROS AirSim nodes First make sure you are running an AirSim project and that the simulation is playing. The implemented AirSim node can be run using rosrun airsimros airsim_publish.py . Or alternatively you can use launch files such as the example ones that can be found in AirSim/ros/python_ws/src/airsim/launch like roslaunch airsimros airsim_publish.launch .","title":"Build ROS AirSim package"},{"location":"sensors/","text":"Sensors in Cosys-AirSim Cosys-AirSim currently supports the following sensors. Each sensor is associated with an integer enum specifying its sensor type. Camera Barometer = 1 Imu = 2 Gps = 3 Magnetometer = 4 Distance Sensor = 5 Lidar = 6 Echo = 7 GPULidar = 8 Uwb = 10 Wi-Fi = 11 Note : Cameras are configured differently than the other sensors and do not have an enum associated with them. Look at general settings and image API for camera config and API. Default sensors If no sensors are specified in the settings.json , then the following sensors are enabled by default based on the sim mode. 
Multirotor Imu Magnetometer Gps Barometer Car Gps ComputerVision None Behind the scenes, the 'createDefaultSensorSettings' method in AirSimSettings.hpp sets up the above sensors with their default parameters, depending on the sim mode specified in the settings.json file. Configuring the default sensor list The default sensor list can be configured in the settings json: \"DefaultSensors\": { \"Barometer\": { \"SensorType\": 1, \"Enabled\" : true, \"PressureFactorSigma\": 0.001825, \"PressureFactorTau\": 3600, \"UncorrelatedNoiseSigma\": 2.7, \"UpdateLatency\": 0, \"UpdateFrequency\": 50, \"StartupDelay\": 0 }, \"Imu\": { \"SensorType\": 2, \"Enabled\" : true, \"GenerateNoise\": false, \"AngularRandomWalk\": 0.3, \"GyroBiasStabilityTau\": 500, \"GyroBiasStability\": 4.6, \"VelocityRandomWalk\": 0.24, \"AccelBiasStabilityTau\": 800, \"AccelBiasStability\": 36 }, \"Gps\": { \"SensorType\": 3, \"Enabled\" : true, \"EphTimeConstant\": 0.9, \"EpvTimeConstant\": 0.9, \"EphInitial\": 25, \"EpvInitial\": 25, \"EphFinal\": 0.1, \"EpvFinal\": 0.1, \"EphMin3d\": 3, \"EphMin2d\": 4, \"UpdateLatency\": 0.2, \"UpdateFrequency\": 50, \"StartupDelay\": 1 }, \"Magnetometer\": { \"SensorType\": 4, \"Enabled\" : true, \"NoiseSigma\": 0.005, \"ScaleFactor\": 1, \"NoiseBias\": 0, \"UpdateLatency\": 0, \"UpdateFrequency\": 50, \"StartupDelay\": 0 }, \"Distance\": { \"SensorType\": 5, \"Enabled\" : true, \"MinDistance\": 0.2, \"MaxDistance\": 40, \"X\": 0, \"Y\": 0, \"Z\": -1, \"Yaw\": 0, \"Pitch\": 0, \"Roll\": 0, \"DrawDebugPoints\": false } }, Configuring vehicle-specific sensor list A vehicle can override a subset of the default sensors listed above. A Lidar and Distance sensor are not added to a vehicle by default, so you need to add those this way. Each sensor must have a valid \"SensorType\". A subset of the properties can be defined to override the default values shown above, and you can set Enabled to false to disable a specific type of sensor. \"Vehicles\": { \"Drone1\": { \"VehicleType\": \"SimpleFlight\", \"AutoCreate\": true, ... \"Sensors\": { \"MyLidar1\": { \"SensorType\": 6, \"Enabled\" : true, \"NumberOfChannels\": 16, \"PointsPerSecond\": 10000, \"X\": 0, \"Y\": 0, \"Z\": -1, \"DrawDebugPoints\": true }, \"MyLidar2\": { \"SensorType\": 6, \"Enabled\" : true, \"NumberOfChannels\": 4, \"PointsPerSecond\": 10000, \"X\": 0, \"Y\": 0, \"Z\": -1, \"DrawDebugPoints\": true } } } } Sensor specific settings Each sensor-type has its own set of settings as well. Please see lidar for an example of Lidar-specific settings. Please see echo for an example of Echo-specific settings. Please see GPU lidar for an example of GPU Lidar-specific settings. 
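As a quick sanity check that a configured sensor is producing data, a minimal sketch like the following polls the MyLidar1 sensor defined on Drone1 in the example above; the sensor and vehicle names must match your own settings.json.
```python
# Minimal sketch: poll the "MyLidar1" lidar configured on "Drone1" above.
import cosysairsim as airsim

client = airsim.MultirotorClient()
client.confirmConnection()

lidar_data = client.getLidarData(lidar_name='MyLidar1', vehicle_name='Drone1')
# point_cloud is a flat list of floats; every 3 values form one x, y, z point.
num_points = len(lidar_data.point_cloud) // 3
print(f'Received {num_points} lidar points at timestamp {lidar_data.time_stamp}')
```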
Sensor APIs Barometer msr::airlib::BarometerBase::Output getBarometerData(const std::string& barometer_name, const std::string& vehicle_name); barometer_data = client.getBarometerData(barometer_name = \"\", vehicle_name = \"\") IMU msr::airlib::ImuBase::Output getImuData(const std::string& imu_name = \"\", const std::string& vehicle_name = \"\"); imu_data = client.getImuData(imu_name = \"\", vehicle_name = \"\") GPS msr::airlib::GpsBase::Output getGpsData(const std::string& gps_name = \"\", const std::string& vehicle_name = \"\"); gps_data = client.getGpsData(gps_name = \"\", vehicle_name = \"\") Magnetometer msr::airlib::MagnetometerBase::Output getMagnetometerData(const std::string& magnetometer_name = \"\", const std::string& vehicle_name = \"\"); magnetometer_data = client.getMagnetometerData(magnetometer_name = \"\", vehicle_name = \"\") Distance sensor msr::airlib::DistanceSensorData getDistanceSensorData(const std::string& distance_sensor_name = \"\", const std::string& vehicle_name = \"\"); distance_sensor_data = client.getDistanceSensorData(distance_sensor_name = \"\", vehicle_name = \"\") Lidar See lidar for the Lidar API. Echo See echo for the Echo API. GPU Lidar See GPU Lidar for the GPU Lidar API. UWB/Wi-Fi These sensors are still experimental and are currently not documented. Please refer to the source code for more information.","title":"Sensors"},{"location":"sensors/#sensors-in-cosys-airsim","text":"Cosys-AirSim currently supports the following sensors. Each sensor is associated with an integer enum specifying its sensor type. Camera Barometer = 1 Imu = 2 Gps = 3 Magnetometer = 4 Distance Sensor = 5 Lidar = 6 Echo = 7 GPULidar = 8 Uwb = 10 Wi-Fi = 11 Note : Cameras are configured differently than the other sensors and do not have an enum associated with them. 
Look at general settings and image API for camera config and API.","title":"Sensors in Cosys-AirSim"},{"location":"sensors/#default-sensors","text":"If no sensors are specified in the settings.json , then the following sensors are enabled by default based on the sim mode.","title":"Default sensors"},{"location":"sensors/#multirotor","text":"Imu Magnetometer Gps Barometer","title":"Multirotor"},{"location":"sensors/#car","text":"Gps","title":"Car"},{"location":"sensors/#computervision","text":"None Behind the scenes, the 'createDefaultSensorSettings' method in AirSimSettings.hpp sets up the above sensors with their default parameters, depending on the sim mode specified in the settings.json file.","title":"ComputerVision"},{"location":"sensors/#configuring-the-default-sensor-list","text":"The default sensor list can be configured in the settings json: \"DefaultSensors\": { \"Barometer\": { \"SensorType\": 1, \"Enabled\" : true, \"PressureFactorSigma\": 0.001825, \"PressureFactorTau\": 3600, \"UncorrelatedNoiseSigma\": 2.7, \"UpdateLatency\": 0, \"UpdateFrequency\": 50, \"StartupDelay\": 0 }, \"Imu\": { \"SensorType\": 2, \"Enabled\" : true, \"GenerateNoise\": false, \"AngularRandomWalk\": 0.3, \"GyroBiasStabilityTau\": 500, \"GyroBiasStability\": 4.6, \"VelocityRandomWalk\": 0.24, \"AccelBiasStabilityTau\": 800, \"AccelBiasStability\": 36 }, \"Gps\": { \"SensorType\": 3, \"Enabled\" : true, \"EphTimeConstant\": 0.9, \"EpvTimeConstant\": 0.9, \"EphInitial\": 25, \"EpvInitial\": 25, \"EphFinal\": 0.1, \"EpvFinal\": 0.1, \"EphMin3d\": 3, \"EphMin2d\": 4, \"UpdateLatency\": 0.2, \"UpdateFrequency\": 50, \"StartupDelay\": 1 }, \"Magnetometer\": { \"SensorType\": 4, \"Enabled\" : true, \"NoiseSigma\": 0.005, \"ScaleFactor\": 1, \"NoiseBias\": 0, \"UpdateLatency\": 0, \"UpdateFrequency\": 50, \"StartupDelay\": 0 }, \"Distance\": { \"SensorType\": 5, \"Enabled\" : true, \"MinDistance\": 0.2, \"MaxDistance\": 40, \"X\": 0, \"Y\": 0, \"Z\": -1, \"Yaw\": 0, \"Pitch\": 0, \"Roll\": 0, \"DrawDebugPoints\": false } },","title":"Configuring the default sensor list"},{"location":"sensors/#configuring-vehicle-specific-sensor-list","text":"A vehicle can override a subset of the default sensors listed above. A Lidar and Distance sensor are not added to a vehicle by default, so you need to add those this way. Each sensor must have a valid \"SensorType\". A subset of the properties can be defined to override the default values shown above, and you can set Enabled to false to disable a specific type of sensor. \"Vehicles\": { \"Drone1\": { \"VehicleType\": \"SimpleFlight\", \"AutoCreate\": true, ... \"Sensors\": { \"MyLidar1\": { \"SensorType\": 6, \"Enabled\" : true, \"NumberOfChannels\": 16, \"PointsPerSecond\": 10000, \"X\": 0, \"Y\": 0, \"Z\": -1, \"DrawDebugPoints\": true }, \"MyLidar2\": { \"SensorType\": 6, \"Enabled\" : true, \"NumberOfChannels\": 4, \"PointsPerSecond\": 10000, \"X\": 0, \"Y\": 0, \"Z\": -1, \"DrawDebugPoints\": true } } } }","title":"Configuring vehicle-specific sensor list"},{"location":"sensors/#sensor-specific-settings","text":"Each sensor-type has its own set of settings as well. Please see lidar for an example of Lidar-specific settings. Please see echo for an example of Echo-specific settings. 
Please see GPU lidar for an example of GPU Lidar-specific settings.","title":"Sensor specific settings"},{"location":"sensors/#sensor-apis","text":"","title":"Sensor APIs"},{"location":"sensors/#barometer","text":"msr::airlib::BarometerBase::Output getBarometerData(const std::string& barometer_name, const std::string& vehicle_name); barometer_data = client.getBarometerData(barometer_name = \"\", vehicle_name = \"\")","title":"Barometer"},{"location":"sensors/#imu","text":"msr::airlib::ImuBase::Output getImuData(const std::string& imu_name = \"\", const std::string& vehicle_name = \"\"); imu_data = client.getImuData(imu_name = \"\", vehicle_name = \"\")","title":"IMU"},{"location":"sensors/#gps","text":"msr::airlib::GpsBase::Output getGpsData(const std::string& gps_name = \"\", const std::string& vehicle_name = \"\"); gps_data = client.getGpsData(gps_name = \"\", vehicle_name = \"\")","title":"GPS"},{"location":"sensors/#magnetometer","text":"msr::airlib::MagnetometerBase::Output getMagnetometerData(const std::string& magnetometer_name = \"\", const std::string& vehicle_name = \"\"); magnetometer_data = client.getMagnetometerData(magnetometer_name = \"\", vehicle_name = \"\")","title":"Magnetometer"},{"location":"sensors/#distance-sensor","text":"msr::airlib::DistanceSensorData getDistanceSensorData(const std::string& distance_sensor_name = \"\", const std::string& vehicle_name = \"\"); distance_sensor_data = client.getDistanceSensorData(distance_sensor_name = \"\", vehicle_name = \"\") Lidar See lidar for the Lidar API. Echo See echo for the Echo API. GPU Lidar See GPU Lidar for the GPU Lidar API. UWB/Wi-Fi These sensors are still experimental and are currently not documented. Please refer to the source code for more information.","title":"Distance sensor"},{"location":"settings/","text":"Cosys-AirSim Settings A good basic settings file that works with many of the examples can be found here as settings_example.json . It shows many of the custom sensors and vehicles that were added by Cosys-Lab. Where are Settings Stored? Cosys-AirSim searches for the settings definition in the following order; the first match will be used: Looking at the (absolute) path specified by the -settings command line argument. For example, in Windows: AirSim.exe -settings=\"C:\\path\\to\\settings.json\" In Linux ./Blocks.sh -settings=\"/home/$USER/path/to/settings.json\" Looking for a json document passed as a command line argument by the -settings argument. For example, in Windows: AirSim.exe -settings={\"foo\":\"bar\"} In Linux ./Blocks.sh -settings={\"foo\":\"bar\"} Looking in the folder of the executable for a file called settings.json . This will be a deep location where the actual executable of the Editor or binary is stored. For example, with the Blocks binary, the location searched is /LinuxNoEditor/Blocks/Binaries/Linux/settings.json . Searching for settings.json in the folder from where the executable is launched. This is a top-level directory containing the launch script or executable. For example, Linux: /LinuxNoEditor/settings.json , Windows: /WindowsNoEditor/settings.json Note that this path changes depending on where it's invoked from. On Linux, if executing the Blocks.sh script from inside the LinuxNoEditor folder like ./Blocks.sh , then the previously mentioned path is used. However, if launched from outside the LinuxNoEditor folder such as ./LinuxNoEditor/Blocks.sh , then /settings.json will be used. Looking in the AirSim subfolder for a file called settings.json . 
The AirSim subfolder is located at Documents\AirSim on Windows and ~/Documents/AirSim on Linux systems. The file is in the usual json format . On first startup, Cosys-AirSim creates a settings.json file with no settings in the user's home folder. To avoid problems, always use ASCII format to save the json file. How to Choose Between Car/SkidVehicle/Multirotor? The default is to use multirotor. To use a car, simply set \"SimMode\": \"Car\" like this: { \"SettingsVersion\": 2.0, \"SimMode\": \"Car\" } To choose multirotor or skid vehicle, set \"SimMode\": \"Multirotor\" or \"SimMode\": \"SkidVehicle\" respectively. If you want to prompt the user to select the vehicle type, then use \"SimMode\": \"\" . Available Settings and Their Defaults Below is a complete list of the settings available along with their default values. If any of the settings are missing from the json file, then the default value is used. Some default values are simply specified as \"\" which means the actual value may be chosen based on the vehicle you are using. For example, the ViewMode setting has default value \"\" which translates to \"FlyWithMe\" for drones and \"SpringArmChase\" for cars. Note this does not include most sensor types. WARNING: Do not copy-paste all of the below into your settings.json. We strongly recommend adding only those settings for which you don't want the default values. The only required element is \"SettingsVersion\" . { \"SimMode\": \"\", \"ClockType\": \"\", \"ClockSpeed\": 1, \"LocalHostIp\": \"127.0.0.1\", \"ApiServerPort\": 41451, \"RecordUIVisible\": true, \"MoveWorldOrigin\": false, \"LogMessagesVisible\": true, \"ShowLosDebugLines\": false, \"ViewMode\": \"\", \"RpcEnabled\": true, \"EngineSound\": true, \"PhysicsEngineName\": \"\", \"SpeedUnitFactor\": 1.0, \"SpeedUnitLabel\": \"m/s\", \"Wind\": { \"X\": 0, \"Y\": 0, \"Z\": 0 }, \"CameraDirector\": { \"FollowDistance\": -3, \"X\": NaN, \"Y\": NaN, \"Z\": NaN, \"Pitch\": NaN, \"Roll\": NaN, \"Yaw\": NaN }, \"Recording\": { \"RecordOnMove\": false, \"RecordInterval\": 0.05, \"Folder\": \"\", \"Enabled\": false, \"Cameras\": [ { \"CameraName\": \"0\", \"ImageType\": 0, \"PixelsAsFloat\": false, \"VehicleName\": \"\", \"Compress\": true } ] }, \"CameraDefaults\": { \"CaptureSettings\": [ { \"ImageType\": 0, \"Width\": 256, \"Height\": 144, \"FOV_Degrees\": 90, \"AutoExposureSpeed\": 100, \"AutoExposureBias\": 0, \"AutoExposureMaxBrightness\": 0.64, \"AutoExposureMinBrightness\": 0.03, \"MotionBlurAmount\": 0, \"TargetGamma\": 1.0, \"ProjectionMode\": \"\", \"OrthoWidth\": 5.12, \"MotionBlurMax\": 10, \"ChromaticAberrationScale\": 2, \"IgnoreMarked\": false, \"LumenGIEnable\": true, \"LumenReflectionEnable\": true, \"LumenFinalQuality\": 1, \"LumenSceneDetail\": 1, \"LumenSceneLightningDetail\": 1 } ], \"NoiseSettings\": [ { \"Enabled\": false, \"ImageType\": 0, \"RandContrib\": 0.2, \"RandSpeed\": 100000.0, \"RandSize\": 500.0, \"RandDensity\": 2, \"HorzWaveContrib\":0.03, \"HorzWaveStrength\": 0.08, \"HorzWaveVertSize\": 1.0, \"HorzWaveScreenSize\": 1.0, \"HorzNoiseLinesContrib\": 1.0, \"HorzNoiseLinesDensityY\": 0.01, \"HorzNoiseLinesDensityXY\": 0.5, \"HorzDistortionContrib\": 1.0, \"HorzDistortionStrength\": 0.002, \"LensDistortionEnable\": true, \"LensDistortionAreaFalloff\": 2, \"LensDistortionAreaRadius\": 1, \"LensDistortionInvert\": false } ], \"Gimbal\": { \"Stabilization\": 0, \"Pitch\": NaN, \"Roll\": NaN, \"Yaw\": NaN }, \"X\": NaN, \"Y\": NaN, \"Z\": NaN, \"Pitch\": NaN, \"Roll\": NaN, \"Yaw\": NaN, \"UnrealEngine\": { \"PixelFormatOverride\": [ { 
\"ImageType\": 0, \"PixelFormat\": 0 } ] } }, \"OriginGeopoint\": { \"Latitude\": 47.641468, \"Longitude\": -122.140165, \"Altitude\": 122 }, \"TimeOfDay\": { \"Enabled\": false, \"StartDateTime\": \"\", \"CelestialClockSpeed\": 1, \"StartDateTimeDst\": false, \"UpdateIntervalSecs\": 60 }, \"SubWindows\": [ {\"WindowID\": 0, \"CameraName\": \"0\", \"ImageType\": 3, \"VehicleName\": \"\", \"Visible\": false}, {\"WindowID\": 1, \"CameraName\": \"0\", \"ImageType\": 5, \"VehicleName\": \"\", \"Visible\": false}, {\"WindowID\": 2, \"CameraName\": \"0\", \"ImageType\": 0, \"VehicleName\": \"\", \"Visible\": false} ], \"PawnPaths\": { \"BareboneCar\": {\"PawnBP\": \"Class'/AirSim/VehicleAdv/Vehicle/VehicleAdvPawn.VehicleAdvPawn_C'\"}, \"DefaultCar\": {\"PawnBP\": \"Class'/AirSim/VehicleAdv/SUV/SuvCarPawn.SuvCarPawn_C'\"}, \"DefaultQuadrotor\": {\"PawnBP\": \"Class'/AirSim/Blueprints/BP_FlyingPawn.BP_FlyingPawn_C'\"}, \"DefaultComputerVision\": {\"PawnBP\": \"Class'/AirSim/Blueprints/BP_ComputerVisionPawn.BP_ComputerVisionPawn_C'\"} }, \"Vehicles\": { \"SimpleFlight\": { \"VehicleType\": \"SimpleFlight\", \"DefaultVehicleState\": \"Armed\", \"AutoCreate\": true, \"PawnPath\": \"\", \"EnableCollisionPassthrough\": false, \"EnableCollisions\": true, \"AllowAPIAlways\": true, \"EnableTrace\": false, \"RC\": { \"RemoteControlID\": 0, \"AllowAPIWhenDisconnected\": false }, \"Cameras\": { //same elements as CameraDefaults above, key as name }, \"X\": NaN, \"Y\": NaN, \"Z\": NaN, \"Pitch\": NaN, \"Roll\": NaN, \"Yaw\": NaN }, \"PhysXCar\": { \"VehicleType\": \"PhysXCar\", \"DefaultVehicleState\": \"\", \"AutoCreate\": true, \"PawnPath\": \"\", \"EnableCollisionPassthrough\": false, \"EnableCollisions\": true, \"RC\": { \"RemoteControlID\": -1 }, \"Cameras\": { \"MyCamera1\": { //same elements as elements inside CameraDefaults above }, \"MyCamera2\": { //same elements as elements inside CameraDefaults above }, }, \"X\": NaN, \"Y\": NaN, \"Z\": NaN, \"Pitch\": NaN, \"Roll\": NaN, \"Yaw\": NaN } } } SimMode SimMode determines which simulation mode will be used. Below are currently supported values: - \"\" : prompt user to select vehicle type multirotor or car - \"Multirotor\" : Use multirotor simulation - \"Car\" : Use car simulation - \"ComputerVision\" : Use only camera, no vehicle or physics - \"SkidVehicle\" : use skid-steering vehicle simulation ViewMode The ViewMode determines which camera to use as default and how camera will follow the vehicle. For multirotors, the default ViewMode is \"FlyWithMe\" while for cars the default ViewMode is \"SpringArmChase\" . FlyWithMe : Chase the vehicle from behind with 6 degrees of freedom GroundObserver : Chase the vehicle from 6' above the ground but with full freedom in XY plane. Fpv : View the scene from front camera of vehicle Manual : Don't move camera automatically. Use arrow keys and ASWD keys for move camera manually. SpringArmChase : Chase the vehicle with camera mounted on (invisible) arm that is attached to the vehicle via spring (so it has some latency in movement). NoDisplay : This will freeze rendering for main screen however rendering for subwindows, recording and APIs remain active. This mode is useful to save resources in \"headless\" mode where you are only interested in getting images and don't care about what gets rendered on main screen. This may also improve FPS for recording images. Annotation The annotation system allows you to choose different groundtruth labeling techniques to create more data from your simulation. 
Find more info here for defining the settings. TimeOfDay This setting controls the position of the Sun in the environment. By default Enabled is false, which means the Sun's position is left at whatever the default was in the environment and it doesn't change over time. If Enabled is true then the Sun's position is computed using the longitude, latitude and altitude specified in the OriginGeopoint section for the date specified in StartDateTime in the string format %Y-%m-%d %H:%M:%S , for example, 2018-02-12 15:20:00 . If this string is empty then the current date and time is used. If StartDateTimeDst is true then we adjust for daylight saving time. The Sun's position is then continuously updated at the interval specified in UpdateIntervalSecs . In some cases, it might be desirable to have the celestial clock run faster or slower than the simulation clock. This can be specified using CelestialClockSpeed ; for example, a value of 100 means that for every 1 second of simulation clock, the Sun's position is advanced by 100 seconds, so the Sun will move across the sky much faster. Also see Time of Day API . OriginGeopoint This setting specifies the latitude, longitude and altitude of the Player Start component placed in the Unreal environment. The vehicle's home point is computed using this transformation. Note that all coordinates exposed via APIs use the NED system in SI units, which means each vehicle starts at (0, 0, 0) in the NED system. Time of Day settings are computed for the geographical coordinates specified in OriginGeopoint . SubWindows This setting determines what is shown in each of the 3 subwindows which are visible when you press the 1, 2, 3 keys. WindowID : Can be 0 to 2 CameraName : is any available camera on the vehicle ImageType : integer value determines what kind of image gets shown according to the ImageType enum . VehicleName : string allows you to specify the vehicle to use the camera from, used when multiple vehicles are specified in the settings. The first vehicle's camera will be used if there is a mistake such as an incorrect vehicle name, or if only a single vehicle is present. Annotation : string allows you to specify the annotation layer to use for the camera. This applies only when using the Annotation camera type for ImageType (value is 10). For example, for a single car vehicle, the below shows the driver view, front bumper view and rear view as scene, depth and surface normals respectively. \"SubWindows\": [ {\"WindowID\": 0, \"ImageType\": 0, \"CameraName\": \"3\", \"Visible\": true}, {\"WindowID\": 1, \"ImageType\": 3, \"CameraName\": \"0\", \"Visible\": true}, {\"WindowID\": 2, \"ImageType\": 6, \"CameraName\": \"4\", \"Visible\": true} ] In case of multiple vehicles, different vehicles can be specified as follows: \"SubWindows\": [ {\"WindowID\": 0, \"CameraName\": \"0\", \"ImageType\": 3, \"VehicleName\": \"Car1\", \"Visible\": false}, {\"WindowID\": 1, \"CameraName\": \"0\", \"ImageType\": 5, \"VehicleName\": \"Car2\", \"Visible\": false}, {\"WindowID\": 2, \"CameraName\": \"0\", \"ImageType\": 0, \"VehicleName\": \"Car1\", \"Visible\": false} ] Recording The recording feature allows you to record data such as position, orientation and velocity along with the captured image at specified intervals. You can start recording by pressing the red Record button on the lower right, or the R key. The data is stored in the Documents\AirSim folder (or the folder specified using Folder ), in a time stamped subfolder for each recording session, as a tab-separated file. RecordInterval : specifies the minimal interval in seconds between capturing two images. 
RecordOnMove : if true, a frame is not recorded when the vehicle's position or orientation hasn't changed. Folder : Parent folder where the timestamped subfolders with recordings are created. The absolute path of the directory must be specified. If not used, then the Documents/AirSim folder will be used. E.g. \"Folder\": \"/home//Documents\" Enabled : Whether recording should start from the beginning itself; setting it to true will start recording automatically when the simulation starts. By default, it's set to false Cameras : this element controls which cameras are used to capture images. By default the scene image from camera 0 is recorded in compressed png format. This setting is a json array so you can specify multiple cameras to capture images, each with potentially different image types . When PixelsAsFloat is true, the image is saved as a pfm file instead of a png file. The VehicleName option allows you to specify separate cameras for individual vehicles. If the Cameras element isn't present, the Scene image from the default camera of each vehicle will be recorded. If you don't want to record any images and just the vehicle's physics data, then specify the Cameras element but leave it empty, like this: \"Cameras\": [] You can also add the field Annotation , a string allowing you to specify the annotation layer to use for the camera. This applies only if using the Annotation camera type for ImageType . For example, the Cameras element below records scene & segmentation images for Car1 & scene for Car2: \"Cameras\": [ { \"CameraName\": \"0\", \"ImageType\": 0, \"PixelsAsFloat\": false, \"VehicleName\": \"Car1\", \"Compress\": true }, { \"CameraName\": \"0\", \"ImageType\": 5, \"PixelsAsFloat\": false, \"VehicleName\": \"Car1\", \"Compress\": true }, { \"CameraName\": \"0\", \"ImageType\": 0, \"PixelsAsFloat\": false, \"VehicleName\": \"Car2\", \"Compress\": true } ] Check out Modifying Recording Data for details on how to modify the kinematics data being recorded. ClockSpeed This setting allows you to set the speed of the simulation clock with respect to the wall clock. For example, a value of 5.0 means the simulation clock has 5 seconds elapsed when the wall clock has 1 second elapsed (i.e. the simulation is running faster). A value of 0.1 means that the simulation clock is 10X slower than the wall clock. A value of 1 means the simulation is running in real time. It is important to realize that the quality of simulation may decrease as the simulation clock runs faster. You might see artifacts like objects moving past obstacles because collisions are not detected. However, slowing down the simulation clock (i.e. values < 1.0) generally improves the quality of simulation. Wind Settings This setting specifies the wind speed in the World frame, in NED direction. Values are in m/s. By default, the speed is 0, i.e. no wind. Camera Director Settings This element specifies the settings used for the camera following the vehicle in the ViewPort. FollowDistance : Distance at which the camera follows the vehicle, default is -8 (8 meters) for Car, -3 for others. X, Y, Z, Yaw, Roll, Pitch : These elements allow you to specify the position and orientation of the camera relative to the vehicle. Position is in NED coordinates in SI units with the origin set to the Player Start location in the Unreal environment. The orientation is specified in degrees. The CameraDefaults element at root level specifies the defaults used for all cameras. These defaults can be overridden for individual cameras in the Cameras element inside Vehicles as described later.
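For example, a minimal sketch of a camera director override (the -10 follow distance is an illustrative value, not a default) that keeps the chase camera further back while leaving the pose elements unset: \"CameraDirector\": { \"FollowDistance\": -10, \"X\": NaN, \"Y\": NaN, \"Z\": NaN, \"Pitch\": NaN, \"Roll\": NaN, \"Yaw\": NaN }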
Main settings Like other sensors, the pose of the sensor in the vehicle frame can be defined by the X, Y, Z, Roll, Pitch, Yaw parameters. Furthermore, there are some other settings available: * DrawSensor : Draw the physical sensor in the world on the vehicle with a 3D axes shown where the sensor is. * External : Uncouple the sensor from the vehicle. If enabled, the position and orientation will be relative to Unreal world coordinates. Note that if MoveWorldOrigin in the settings.json is set to true the Unreal coordinates will be moved to have the same origin as the player start location and as such this may affect where the sensor will spawn. * ExternalLocal : When in external mode, if this is enabled the retrieved pose of the sensor will be in local NED coordinates (from the starting position of the vehicle) and not the converted Unreal NED coordinates, which is the default. Note that if MoveWorldOrigin in the settings.json is set to true the Unreal coordinates will be moved to have the same origin as the player start location and as such this may affect what coordinates are returned if set to false . Note on ImageType element The ImageType element in a JSON array determines which image type the settings apply to. The valid values are described in the ImageType section . For example, the CaptureSettings element is a json array so you can add settings for multiple image types easily. CaptureSettings The CaptureSettings determines how different image types such as scene, depth, disparity, surface normals and segmentation views are rendered. The Width, Height and FOV settings should be self-explanatory. The AutoExposureSpeed decides how fast eye adaptation works. We set it to a generally high value such as 100 to avoid artifacts in image capture. Similarly we set MotionBlurAmount to 0 by default to avoid artifacts in ground truth images. The ProjectionMode decides the projection used by the capture camera and can take the value \"perspective\" (default) or \"orthographic\". If the projection mode is \"orthographic\" then OrthoWidth determines the width of the projected area captured in meters. To disable the rendering of certain objects on specific cameras or all of them, use the IgnoreMarked boolean setting. This requires marking the individual objects that have to be ignored using an Unreal Tag called MarkedIgnore . You can also tweak the motion blur and chromatic aberration here. Unreal 5 introduces Lumen lighting. Because the cameras use scene capture components, enabling Lumen for them can be costly in performance. Settings have been added specifically for the scene camera to customize the usage of Lumen for Global Illumination and Reflections. The LumenGIEnable and LumenReflectionEnable settings enable or disable Lumen for the camera. The LumenFinalQuality (0.25-2) setting determines the quality of the final image. The LumenSceneDetail (0.25-4) setting determines the quality of the scene. The LumenSceneLightningDetail (0.25-2) setting determines the quality of the lighting in the scene. For explanation of other settings, please see this article . NoiseSettings The NoiseSettings allows you to add noise to the specified image type with the goal of simulating camera sensor noise, interference and other artifacts. By default no noise is added, i.e., Enabled: false . If you set Enabled: true then the following different types of noise and interference artifacts are enabled, each of which can be further tuned using its settings. The noise effects are implemented as a shader created as a post-processing material in Unreal Engine called CameraSensorNoise .
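As a sketch (parameter values are copied from the defaults listed earlier; only Enabled is changed), turning on the noise shader for scene images could look like: \"NoiseSettings\": [ { \"Enabled\": true, \"ImageType\": 0, \"RandContrib\": 0.2, \"RandSpeed\": 100000.0, \"RandSize\": 500.0, \"RandDensity\": 2 } ]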
Demo of camera noise and interference simulation: Random noise This adds random noise blobs with the following parameters. * RandContrib : This determines the blend ratio of noise pixel with image pixel, 0 means no noise and 1 means only noise. * RandSpeed : This determines how fast noise fluctuates, 1 means no fluctuation and higher values like 1E6 mean full fluctuation. * RandSize : This determines how coarse noise is, 1 means every pixel has its own noise while a higher value means more than 1 pixel shares the same noise value. * RandDensity : This determines how many pixels out of the total will have noise, 1 means all pixels while a higher value means a lesser number of pixels (exponentially). Horizontal bump distortion This adds a horizontal bumps / flickering / ghosting effect. * HorzWaveContrib : This determines the blend ratio of noise pixel with image pixel, 0 means no noise and 1 means only noise. * HorzWaveStrength : This determines the overall strength of the effect. * HorzWaveVertSize : This determines how many vertical pixels would be affected by the effect. * HorzWaveScreenSize : This determines how much of the screen is affected by the effect. Horizontal noise lines This adds regions of noise on horizontal lines. * HorzNoiseLinesContrib : This determines the blend ratio of noise pixel with image pixel, 0 means no noise and 1 means only noise. * HorzNoiseLinesDensityY : This determines how many pixels in a horizontal line get affected. * HorzNoiseLinesDensityXY : This determines how many lines on the screen get affected. Horizontal line distortion This adds fluctuations on horizontal lines. * HorzDistortionContrib : This determines the blend ratio of noise pixel with image pixel, 0 means no noise and 1 means only noise. * HorzDistortionStrength : This determines how large the distortion is. Radial Lens Distortion This adds radial lens distortion to the camera sensor. * LensDistortionEnable : Enable or disable this feature * LensDistortionAreaFalloff : The size of the area to distort * LensDistortionAreaRadius : The distortion radius * LensDistortionInvert : Set to true to invert and create 'pincushion distortion' or false for 'barrel distortion' Gimbal The Gimbal element allows you to freeze the camera orientation for pitch, roll and/or yaw. This setting is ignored unless ImageType is -1. The Stabilization is defaulted to 0, meaning no gimbal, i.e. the camera orientation changes with the body orientation on all axes. The value of 1 means full stabilization. A value between 0 and 1 acts as a weight between the fixed angles specified (in degrees, in world-frame) in the Pitch , Roll and Yaw elements and the orientation of the vehicle body. When any of the angles is omitted from the json or set to NaN, that angle is not stabilized (i.e. it moves along with the vehicle body). UnrealEngine This element contains settings specific to the Unreal Engine. These will be ignored in the Unity project. * PixelFormatOverride : This contains a list of elements that have both an ImageType and a PixelFormat setting. Each element allows you to override the default pixel format of the UTextureRenderTarget2D object instantiated for the capture specified by the ImageType setting. Specifying this element allows you to prevent crashes caused by unexpected pixel formats (see #4120 and #4339 for examples of these crashes). A full list of pixel formats can be viewed here . Vehicles Settings Each simulation mode will go through the list of vehicles specified in this setting and create the ones that have \"AutoCreate\": true .
Each vehicle specified in this setting has a key which becomes the name of the vehicle. If the \"Vehicles\" element is missing then this list is populated with a default car named \"PhysXCar\" and a default multirotor named \"SimpleFlight\". Common Vehicle Setting VehicleType : This could be either PhysXCar , ArduRover or BoxCar for the Car SimMode, SimpleFlight , ArduCopter or PX4Multirotor for the MultiRotor SimMode, ComputerVision for the ComputerVision SimMode and CPHusky or Pioneer for the SkidVehicle SimMode. There is no default value, therefore this element must be specified. PawnPath : This allows you to override the pawn blueprint to use for the vehicle. For example, you may create a new pawn blueprint derived from ACarPawn for a warehouse robot in your own project outside the Cosys-AirSim code and then specify its path here. See also PawnPaths . Note that you have to specify your custom pawn blueprint class path inside the global PawnPaths object using your own defined object name, and quote that name inside the Vehicles setting. For example, { ... \"PawnPaths\": { \"CustomPawn\": {\"PawnBP\": \"Class'/Game/Assets/Blueprints/MyPawn.MyPawn_C'\"} }, \"Vehicles\": { \"MyVehicle\": { \"VehicleType\": ..., \"PawnPath\": \"CustomPawn\", ... } } } DefaultVehicleState : Possible values for multirotors are Armed or Disarmed . AutoCreate : If true then this vehicle will be spawned (if supported by the selected sim mode). RC : This sub-element allows you to specify which remote controller to use for the vehicle using RemoteControlID . The value of -1 means use the keyboard (not supported yet for multirotors). A value >= 0 specifies one of many remote controllers connected to the system. The list of available RCs can be seen in the Game Controllers panel in Windows, for example. X, Y, Z, Yaw, Roll, Pitch : These elements allow you to specify the initial position and orientation of the vehicle. Position is in NED coordinates in SI units with the origin set to the Player Start location in the Unreal environment. The orientation is specified in degrees. Sensors : This element specifies the sensors associated with the vehicle, see the Sensors page for details. IsFpvVehicle : This setting allows you to specify which vehicle the camera will follow and whose view will be shown when ViewMode is set to Fpv. By default, Cosys-AirSim selects the first vehicle in the settings as the FPV vehicle. Cameras : This element specifies the camera settings for the vehicle. The key in this element is the name of the available camera and the value is the same as CameraDefaults as described above. For example, to change the FOV for the front center camera to 120 degrees, you can use this for the Vehicles setting: \"Vehicles\": { \"FishEyeDrone\": { \"VehicleType\": \"SimpleFlight\", \"Cameras\": { \"front-center\": { \"CaptureSettings\": [ { \"ImageType\": 0, \"FOV_Degrees\": 120 } ] } } } } Using PX4 By default we use simple_flight so you don't have to do separate HITL or SITL setups. We also support \"PX4\" for advanced users. To use PX4 with Cosys-AirSim, you can use the following for the Vehicles setting: \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\" } } Additional PX4 Settings The default for PX4 is to enable the hardware-in-loop setup.
There are various other settings available for PX4 as follows with their default values: \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\", \"Lockstep\": true, \"ControlIp\": \"127.0.0.1\", \"ControlPortLocal\": 14540, \"ControlPortRemote\": 14580, \"LogViewerHostIp\": \"127.0.0.1\", \"LogViewerPort\": 14388, \"OffboardCompID\": 1, \"OffboardSysID\": 134, \"QgcHostIp\": \"127.0.0.1\", \"QgcPort\": 14550, \"SerialBaudRate\": 115200, \"SerialPort\": \"*\", \"SimCompID\": 42, \"SimSysID\": 142, \"TcpPort\": 4560, \"UdpIp\": \"127.0.0.1\", \"UdpPort\": 14560, \"UseSerial\": true, \"UseTcp\": false, \"VehicleCompID\": 1, \"VehicleSysID\": 135, \"Model\": \"Generic\", \"LocalHostIp\": \"127.0.0.1\", \"Logs\": \"d:\\\\temp\\\\mavlink\", \"Sensors\": { ... }, \"Parameters\": { ... } } } These settings define the MavLink SystemId and ComponentId for the Simulator (SimSysID, SimCompID), for the vehicle (VehicleSysID, VehicleCompID) and for the node that allows remote control of the drone from another app, called the offboard node (OffboardSysID, OffboardCompID). If you want the simulator to also forward mavlink messages to your ground control app (like QGroundControl), you can also set the UDP address for that in case you want to run it on a different machine (QgcHostIp, QgcPort). The default is local host so QGroundControl should \"just work\" if it is running on the same machine. You can connect the simulator to the LogViewer app, provided in this repo, by setting the UDP address for that (LogViewerHostIp, LogViewerPort). And for each flying drone added to the simulator there is a named block of additional settings. In the above you see the default name \"PX4\". You can change this name from the Unreal Editor when you add a new BP_FlyingPawn asset. You will see these properties grouped under the category \"MavLink\". The MavLink node for this pawn can be remote over UDP or it can be connected to a local serial port. If using serial, set UseSerial to true; otherwise set UseSerial to false. For serial connections you also need to set the appropriate SerialBaudRate. The default of 115200 works with Pixhawk version 2 over USB. When communicating with the PX4 drone over a serial port, both the HIL_ messages and vehicle control messages share the same serial port. When communicating over UDP or TCP, PX4 requires two separate channels. If UseTcp is false, then UdpIp and UdpPort are used to send HIL_ messages, otherwise the TcpPort is used. TCP support in PX4 was added in 1.9.2 with the lockstep feature because the guarantee of message delivery that TCP provides is required for the proper functioning of lockstep. Cosys-AirSim becomes a TCP server in that case, and waits for a connection from the PX4 app. The second channel for controlling the vehicle is defined by (ControlIp, ControlPort) and is always a UDP channel. The Sensors section can provide customized settings for simulated sensors, see Sensors . The Parameters section can set PX4 parameters during initialization of the PX4 connection. See Setting up PX4 Software-in-Loop for an example. Using ArduPilot ArduPilot Copter & Rover vehicles are supported in the latest Cosys-AirSim main branch & releases v1.3.0 and later. For settings and how to use, please see ArduPilot SITL with Cosys-AirSim Other Settings EngineSound To turn off the engine sound use the setting \"EngineSound\": false . Currently this setting applies only to cars.
PawnPaths This allows you to specify your own vehicle pawn blueprints, for example, you can replace the default car in AirSim with your own car. Your vehicle BP can reside in the Content folder of your own Unreal project (i.e. outside of the AirSim plugin folder). For example, if you have a car BP located in the file Content\\MyCar\\MySedanBP.uasset in your project then you can set \"DefaultCar\": {\"PawnBP\":\"Class'/Game/MyCar/MySedanBP.MySedanBP_C'\"} . The XYZ.XYZ_C is a special notation required to specify the class for BP XYZ . Please note that your BP must be derived from the CarPawn class. By default this is not the case but you can re-parent the BP using the \"Class Settings\" button in the toolbar of the UE editor after you open the BP, and then choosing \"Car Pawn\" for the Parent Class setting in Class Options. It is also a good idea to disable \"Auto Possess Player\" and \"Auto Possess AI\" as well as set AI Controller Class to None in the BP details. Please make sure your asset is included for cooking in the packaging options if you are creating a binary. PhysicsEngineName For cars, we support only PhysX for now (regardless of the value in this setting). For multirotors, we support \"FastPhysicsEngine\" and \"ExternalPhysicsEngine\" . \"ExternalPhysicsEngine\" allows the drone to be controlled via setVehiclePose() , keeping the drone in place until the next call. It is especially useful for moving the AirSim drone using an external simulator or on a saved path. LocalHostIp Setting When connecting to remote machines you may need to pick a specific Ethernet adapter to reach those machines, for example, it might be over Ethernet or over Wi-Fi, or some other special virtual adapter or a VPN. Your PC may have multiple networks, and those networks might not be allowed to talk to each other, in which case the UDP messages from one network will not get through to the others. So the LocalHostIp allows you to configure how you are reaching those machines. The default of 127.0.0.1 is not able to reach external machines; this default is only used when everything you are talking to is contained on a single PC. ApiServerPort This setting determines the server port used by AirSim clients; the default port is 41451. By specifying different ports, the user can run multiple environments in parallel to accelerate the data collection process. SpeedUnitFactor Unit conversion factor for speed relative to m/s , default is 1. Used in conjunction with SpeedUnitLabel. This is only used for display purposes, for example the on-display speed when the car is being driven. For example, to get speed in miles/hr use a factor of 2.23694. SpeedUnitLabel Unit label for speed, default is m/s . Used in conjunction with SpeedUnitFactor.","title":"Settings"},{"location":"settings/#cosys-airsim-settings","text":"A good basic settings file that works with many of the examples can be found here as settings_example.json . It shows many of the custom sensors and vehicles that were added by Cosys-Lab.","title":"Cosys-AirSim Settings"},{"location":"settings/#where-are-settings-stored","text":"Cosys-AirSim searches for the settings definition in the following order. The first match will be used: Looking at the (absolute) path specified by the -settings command line argument. For example, in Windows: AirSim.exe -settings=\"C:\\path\\to\\settings.json\" In Linux ./Blocks.sh -settings=\"/home/$USER/path/to/settings.json\" Looking for a json document passed as a command line argument by the -settings argument.
For example, in Windows: AirSim.exe -settings={\"foo\":\"bar\"} In Linux ./Blocks.sh -settings={\"foo\":\"bar\"} Looking in the folder of the executable for a file called settings.json . This will be a deep location where the actual executable of the Editor or binary is stored. For example, with the Blocks binary, the location searched is /LinuxNoEditor/Blocks/Binaries/Linux/settings.json . Searching for settings.json in the folder from where the executable is launched This is a top-level directory containing the launch script or executable. For example, Linux: /LinuxNoEditor/settings.json , Windows: /WindowsNoEditor/settings.json Note that this path changes depending on where it's invoked from. On Linux, if executing the Blocks.sh script from inside the LinuxNoEditor folder like ./Blocks.sh , then the previously mentioned path is used. However, if launched from outside the LinuxNoEditor folder such as ./LinuxNoEditor/Blocks.sh , then /settings.json will be used. Looking in the AirSim subfolder for a file called settings.json . The AirSim subfolder is located at Documents\\AirSim on Windows and ~/Documents/AirSim on Linux systems. The file is in the usual json format . On first startup Cosys-AirSim creates a settings.json file with no settings in the user's home folder. To avoid problems, always use ASCII format to save the json file.","title":"Where are Settings Stored?"},{"location":"settings/#how-to-chose-between-carskidvehiclemultirotor","text":"The default is to use multirotor. To use a car, simply set \"SimMode\": \"Car\" like this: { \"SettingsVersion\": 2.0, \"SimMode\": \"Car\" } To choose multirotor or skid vehicle, set \"SimMode\": \"Multirotor\" or \"SimMode\": \"SkidVehicle\" respectively. If you want to prompt the user to select the vehicle type, use \"SimMode\": \"\" .","title":"How to Choose Between Car/SkidVehicle/Multirotor?"},{"location":"settings/#available-settings-and-their-defaults","text":"Below is a complete list of settings available along with their default values. If any of the settings is missing from the json file, its default value is used. Some default values are simply specified as \"\" which means the actual value may be chosen based on the vehicle you are using. For example, the ViewMode setting has the default value \"\" which translates to \"FlyWithMe\" for drones and \"SpringArmChase\" for cars. Note this does not include most sensor types. WARNING: Do not copy-paste all of the below into your settings.json. We strongly recommend adding only those settings for which you don't want the default values. The only required element is \"SettingsVersion\" .
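Since \"SettingsVersion\" is the only required element, a minimal valid settings.json (shown as a reference sketch, using the same version number as the Car example above) is just: { \"SettingsVersion\": 2.0 } The full list of defaults follows: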
{ \"SimMode\": \"\", \"ClockType\": \"\", \"ClockSpeed\": 1, \"LocalHostIp\": \"127.0.0.1\", \"ApiServerPort\": 41451, \"RecordUIVisible\": true, \"MoveWorldOrigin\": false, \"LogMessagesVisible\": true, \"ShowLosDebugLines\": false, \"ViewMode\": \"\", \"RpcEnabled\": true, \"EngineSound\": true, \"PhysicsEngineName\": \"\", \"SpeedUnitFactor\": 1.0, \"SpeedUnitLabel\": \"m/s\", \"Wind\": { \"X\": 0, \"Y\": 0, \"Z\": 0 }, \"CameraDirector\": { \"FollowDistance\": -3, \"X\": NaN, \"Y\": NaN, \"Z\": NaN, \"Pitch\": NaN, \"Roll\": NaN, \"Yaw\": NaN }, \"Recording\": { \"RecordOnMove\": false, \"RecordInterval\": 0.05, \"Folder\": \"\", \"Enabled\": false, \"Cameras\": [ { \"CameraName\": \"0\", \"ImageType\": 0, \"PixelsAsFloat\": false, \"VehicleName\": \"\", \"Compress\": true } ] }, \"CameraDefaults\": { \"CaptureSettings\": [ { \"ImageType\": 0, \"Width\": 256, \"Height\": 144, \"FOV_Degrees\": 90, \"AutoExposureSpeed\": 100, \"AutoExposureBias\": 0, \"AutoExposureMaxBrightness\": 0.64, \"AutoExposureMinBrightness\": 0.03, \"MotionBlurAmount\": 0, \"TargetGamma\": 1.0, \"ProjectionMode\": \"\", \"OrthoWidth\": 5.12, \"MotionBlurAmount\": 1, \"MotionBlurMax\": 10, \"ChromaticAberrationScale\": 2, \"IgnoreMarked\": false, \"LumenGIEnable\": true, \"LumenReflectionEnable\": true, \"LumenFinalQuality\": 1, \"LumenSceneDetail\": 1, \"LumenSceneLightningDetail\": 1 } ], \"NoiseSettings\": [ { \"Enabled\": false, \"ImageType\": 0, \"RandContrib\": 0.2, \"RandSpeed\": 100000.0, \"RandSize\": 500.0, \"RandDensity\": 2, \"HorzWaveContrib\":0.03, \"HorzWaveStrength\": 0.08, \"HorzWaveVertSize\": 1.0, \"HorzWaveScreenSize\": 1.0, \"HorzNoiseLinesContrib\": 1.0, \"HorzNoiseLinesDensityY\": 0.01, \"HorzNoiseLinesDensityXY\": 0.5, \"HorzDistortionContrib\": 1.0, \"HorzDistortionStrength\": 0.002, \"LensDistortionEnable\": true, \"LensDistortionAreaFalloff\": 2, \"LensDistortionAreaRadius\": 1, \"LensDistortionInvert\": false } ], \"Gimbal\": { \"Stabilization\": 0, \"Pitch\": NaN, \"Roll\": NaN, \"Yaw\": NaN }, \"X\": NaN, \"Y\": NaN, \"Z\": NaN, \"Pitch\": NaN, \"Roll\": NaN, \"Yaw\": NaN, \"UnrealEngine\": { \"PixelFormatOverride\": [ { \"ImageType\": 0, \"PixelFormat\": 0 } ] } }, \"OriginGeopoint\": { \"Latitude\": 47.641468, \"Longitude\": -122.140165, \"Altitude\": 122 }, \"TimeOfDay\": { \"Enabled\": false, \"StartDateTime\": \"\", \"CelestialClockSpeed\": 1, \"StartDateTimeDst\": false, \"UpdateIntervalSecs\": 60 }, \"SubWindows\": [ {\"WindowID\": 0, \"CameraName\": \"0\", \"ImageType\": 3, \"VehicleName\": \"\", \"Visible\": false}, {\"WindowID\": 1, \"CameraName\": \"0\", \"ImageType\": 5, \"VehicleName\": \"\", \"Visible\": false}, {\"WindowID\": 2, \"CameraName\": \"0\", \"ImageType\": 0, \"VehicleName\": \"\", \"Visible\": false} ], \"PawnPaths\": { \"BareboneCar\": {\"PawnBP\": \"Class'/AirSim/VehicleAdv/Vehicle/VehicleAdvPawn.VehicleAdvPawn_C'\"}, \"DefaultCar\": {\"PawnBP\": \"Class'/AirSim/VehicleAdv/SUV/SuvCarPawn.SuvCarPawn_C'\"}, \"DefaultQuadrotor\": {\"PawnBP\": \"Class'/AirSim/Blueprints/BP_FlyingPawn.BP_FlyingPawn_C'\"}, \"DefaultComputerVision\": {\"PawnBP\": \"Class'/AirSim/Blueprints/BP_ComputerVisionPawn.BP_ComputerVisionPawn_C'\"} }, \"Vehicles\": { \"SimpleFlight\": { \"VehicleType\": \"SimpleFlight\", \"DefaultVehicleState\": \"Armed\", \"AutoCreate\": true, \"PawnPath\": \"\", \"EnableCollisionPassthrough\": false, \"EnableCollisions\": true, \"AllowAPIAlways\": true, \"EnableTrace\": false, \"RC\": { \"RemoteControlID\": 0, \"AllowAPIWhenDisconnected\": false }, 
\"Cameras\": { //same elements as CameraDefaults above, key as name }, \"X\": NaN, \"Y\": NaN, \"Z\": NaN, \"Pitch\": NaN, \"Roll\": NaN, \"Yaw\": NaN }, \"PhysXCar\": { \"VehicleType\": \"PhysXCar\", \"DefaultVehicleState\": \"\", \"AutoCreate\": true, \"PawnPath\": \"\", \"EnableCollisionPassthrough\": false, \"EnableCollisions\": true, \"RC\": { \"RemoteControlID\": -1 }, \"Cameras\": { \"MyCamera1\": { //same elements as elements inside CameraDefaults above }, \"MyCamera2\": { //same elements as elements inside CameraDefaults above }, }, \"X\": NaN, \"Y\": NaN, \"Z\": NaN, \"Pitch\": NaN, \"Roll\": NaN, \"Yaw\": NaN } } }","title":"Available Settings and Their Defaults"},{"location":"settings/#simmode","text":"SimMode determines which simulation mode will be used. Below are currently supported values: - \"\" : prompt user to select vehicle type multirotor or car - \"Multirotor\" : Use multirotor simulation - \"Car\" : Use car simulation - \"ComputerVision\" : Use only camera, no vehicle or physics - \"SkidVehicle\" : use skid-steering vehicle simulation","title":"SimMode"},{"location":"settings/#viewmode","text":"The ViewMode determines which camera to use as default and how camera will follow the vehicle. For multirotors, the default ViewMode is \"FlyWithMe\" while for cars the default ViewMode is \"SpringArmChase\" . FlyWithMe : Chase the vehicle from behind with 6 degrees of freedom GroundObserver : Chase the vehicle from 6' above the ground but with full freedom in XY plane. Fpv : View the scene from front camera of vehicle Manual : Don't move camera automatically. Use arrow keys and ASWD keys for move camera manually. SpringArmChase : Chase the vehicle with camera mounted on (invisible) arm that is attached to the vehicle via spring (so it has some latency in movement). NoDisplay : This will freeze rendering for main screen however rendering for subwindows, recording and APIs remain active. This mode is useful to save resources in \"headless\" mode where you are only interested in getting images and don't care about what gets rendered on main screen. This may also improve FPS for recording images.","title":"ViewMode"},{"location":"settings/#annotation","text":"The annotation system allows you to choose different groundtruth labeling techniques to create more data from your simulation. Find more info here for defining the settings.","title":"Annotation"},{"location":"settings/#timeofday","text":"This setting controls the position of Sun in the environment. By default Enabled is false which means Sun's position is left at whatever was the default in the environment and it doesn't change over the time. If Enabled is true then Sun position is computed using longitude, latitude and altitude specified in OriginGeopoint section for the date specified in StartDateTime in the string format as %Y-%m-%d %H:%M:%S , for example, 2018-02-12 15:20:00 . If this string is empty then current date and time is used. If StartDateTimeDst is true then we adjust for day light savings time. The Sun's position is then continuously updated at the interval specified in UpdateIntervalSecs . In some cases, it might be desirable to have celestial clock run faster or slower than simulation clock. This can be specified using CelestialClockSpeed , for example, value 100 means for every 1 second of simulation clock, Sun's position is advanced by 100 seconds so Sun will move in sky much faster. 
Also see Time of Day API .","title":"TimeOfDay"},{"location":"settings/#origingeopoint","text":"This setting specifies the latitude, longitude and altitude of the Player Start component placed in the Unreal environment. The vehicle's home point is computed using this transformation. Note that all coordinates exposed via APIs use the NED system in SI units, which means each vehicle starts at (0, 0, 0) in the NED system. Time of Day settings are computed for the geographical coordinates specified in OriginGeopoint .","title":"OriginGeopoint"},{"location":"settings/#subwindows","text":"This setting determines what is shown in each of the 3 subwindows which are visible when you press the 1, 2 or 3 keys. WindowID : Can be 0 to 2 CameraName : name of any available camera on the vehicle ImageType : integer value that determines what kind of image gets shown according to the ImageType enum . VehicleName : string that allows you to specify the vehicle to use the camera from, used when multiple vehicles are specified in the settings. The first vehicle's camera will be used if there are any mistakes, such as an incorrect vehicle name, or if there is only a single vehicle. Annotation : string that allows you to specify the annotation layer to use for the camera. This applies only if using the Annotation camera type for ImageType (value is 10). For example, for a single car vehicle, the below shows the driver view, front bumper view and rear view as scene, depth and surface normals respectively. \"SubWindows\": [ {\"WindowID\": 0, \"ImageType\": 0, \"CameraName\": \"3\", \"Visible\": true}, {\"WindowID\": 1, \"ImageType\": 3, \"CameraName\": \"0\", \"Visible\": true}, {\"WindowID\": 2, \"ImageType\": 6, \"CameraName\": \"4\", \"Visible\": true} ] In case of multiple vehicles, different vehicles can be specified as follows: \"SubWindows\": [ {\"WindowID\": 0, \"CameraName\": \"0\", \"ImageType\": 3, \"VehicleName\": \"Car1\", \"Visible\": false}, {\"WindowID\": 1, \"CameraName\": \"0\", \"ImageType\": 5, \"VehicleName\": \"Car2\", \"Visible\": false}, {\"WindowID\": 2, \"CameraName\": \"0\", \"ImageType\": 0, \"VehicleName\": \"Car1\", \"Visible\": false} ]","title":"SubWindows"},{"location":"settings/#recording","text":"The recording feature allows you to record data such as position, orientation and velocity along with the captured image at specified intervals. You can start recording by pressing the red Record button on the lower right or the R key. The data is stored in the Documents\\AirSim folder (or the folder specified using Folder ), in a timestamped subfolder for each recording session, as tab-separated files. RecordInterval : specifies the minimal interval in seconds between capturing two images. RecordOnMove : if true, a frame is not recorded when the vehicle's position or orientation hasn't changed. Folder : Parent folder where the timestamped subfolders with recordings are created. The absolute path of the directory must be specified. If not used, then the Documents/AirSim folder will be used. E.g. \"Folder\": \"/home//Documents\" Enabled : Whether recording should start from the beginning itself; setting it to true will start recording automatically when the simulation starts. By default, it's set to false Cameras : this element controls which cameras are used to capture images. By default the scene image from camera 0 is recorded in compressed png format. This setting is a json array so you can specify multiple cameras to capture images, each with potentially different image types . When PixelsAsFloat is true, the image is saved as a pfm file instead of a png file.
The VehicleName option allows you to specify separate cameras for individual vehicles. If the Cameras element isn't present, the Scene image from the default camera of each vehicle will be recorded. If you don't want to record any images and just the vehicle's physics data, then specify the Cameras element but leave it empty, like this: \"Cameras\": [] You can also add the field Annotation , a string allowing you to specify the annotation layer to use for the camera. This applies only if using the Annotation camera type for ImageType . For example, the Cameras element below records scene & segmentation images for Car1 & scene for Car2: \"Cameras\": [ { \"CameraName\": \"0\", \"ImageType\": 0, \"PixelsAsFloat\": false, \"VehicleName\": \"Car1\", \"Compress\": true }, { \"CameraName\": \"0\", \"ImageType\": 5, \"PixelsAsFloat\": false, \"VehicleName\": \"Car1\", \"Compress\": true }, { \"CameraName\": \"0\", \"ImageType\": 0, \"PixelsAsFloat\": false, \"VehicleName\": \"Car2\", \"Compress\": true } ] Check out Modifying Recording Data for details on how to modify the kinematics data being recorded.","title":"Recording"},{"location":"settings/#clockspeed","text":"This setting allows you to set the speed of the simulation clock with respect to the wall clock. For example, a value of 5.0 would mean the simulation clock has 5 seconds elapsed when the wall clock has 1 second elapsed (i.e. the simulation is running faster). A value of 0.1 means that the simulation clock is 10X slower than the wall clock. A value of 1 means the simulation is running in real time. It is important to realize that the quality of simulation may decrease as the simulation clock runs faster. You might see artifacts like objects moving past obstacles because collisions are not detected. However, slowing down the simulation clock (i.e. values < 1.0) generally improves the quality of simulation.","title":"ClockSpeed"},{"location":"settings/#wind-settings","text":"This setting specifies the wind speed in the World frame, in NED direction. Values are in m/s. By default, the speed is 0, i.e. no wind.","title":"Wind Settings"},{"location":"settings/#camera-director-settings","text":"This element specifies the settings used for the camera following the vehicle in the ViewPort. FollowDistance : Distance at which the camera follows the vehicle, default is -8 (8 meters) for Car, -3 for others. X, Y, Z, Yaw, Roll, Pitch : These elements allow you to specify the position and orientation of the camera relative to the vehicle. Position is in NED coordinates in SI units with the origin set to the Player Start location in the Unreal environment. The orientation is specified in degrees. The CameraDefaults element at root level specifies the defaults used for all cameras. These defaults can be overridden for individual cameras in the Cameras element inside Vehicles as described later.","title":"Camera Director Settings"},{"location":"settings/#main-settings","text":"Like other sensors, the pose of the sensor in the vehicle frame can be defined by the X, Y, Z, Roll, Pitch, Yaw parameters. Furthermore, there are some other settings available: * DrawSensor : Draw the physical sensor in the world on the vehicle with a 3D axes shown where the sensor is. * External : Uncouple the sensor from the vehicle. If enabled, the position and orientation will be relative to Unreal world coordinates. Note that if MoveWorldOrigin in the settings.json is set to true the Unreal coordinates will be moved to have the same origin as the player start location and as such this may affect where the sensor will spawn.
* ExternalLocal : When in external mode, if this is enabled the retrieved pose of the sensor will be in local NED coordinates (from the starting position of the vehicle) and not the converted Unreal NED coordinates, which is the default. Note that if MoveWorldOrigin in the settings.json is set to true the Unreal coordinates will be moved to have the same origin as the player start location and as such this may affect what coordinates are returned if set to false .","title":"Main settings"},{"location":"settings/#note-on-imagetype-element","text":"The ImageType element in a JSON array determines which image type the settings apply to. The valid values are described in the ImageType section . For example, the CaptureSettings element is a json array so you can add settings for multiple image types easily.","title":"Note on ImageType element"},{"location":"settings/#capturesettings","text":"The CaptureSettings determines how different image types such as scene, depth, disparity, surface normals and segmentation views are rendered. The Width, Height and FOV settings should be self-explanatory. The AutoExposureSpeed decides how fast eye adaptation works. We set it to a generally high value such as 100 to avoid artifacts in image capture. Similarly we set MotionBlurAmount to 0 by default to avoid artifacts in ground truth images. The ProjectionMode decides the projection used by the capture camera and can take the value \"perspective\" (default) or \"orthographic\". If the projection mode is \"orthographic\" then OrthoWidth determines the width of the projected area captured in meters. To disable the rendering of certain objects on specific cameras or all of them, use the IgnoreMarked boolean setting. This requires marking the individual objects that have to be ignored using an Unreal Tag called MarkedIgnore . You can also tweak the motion blur and chromatic aberration here. Unreal 5 introduces Lumen lighting. Because the cameras use scene capture components, enabling Lumen for them can be costly in performance. Settings have been added specifically for the scene camera to customize the usage of Lumen for Global Illumination and Reflections. The LumenGIEnable and LumenReflectionEnable settings enable or disable Lumen for the camera. The LumenFinalQuality (0.25-2) setting determines the quality of the final image. The LumenSceneDetail (0.25-4) setting determines the quality of the scene. The LumenSceneLightningDetail (0.25-2) setting determines the quality of the lighting in the scene. For explanation of other settings, please see this article .","title":"CaptureSettings"},{"location":"settings/#noisesettings","text":"The NoiseSettings allows you to add noise to the specified image type with the goal of simulating camera sensor noise, interference and other artifacts. By default no noise is added, i.e., Enabled: false . If you set Enabled: true then the following different types of noise and interference artifacts are enabled, each of which can be further tuned using its settings. The noise effects are implemented as a shader created as a post-processing material in Unreal Engine called CameraSensorNoise . Demo of camera noise and interference simulation:","title":"NoiseSettings"},{"location":"settings/#random-noise","text":"This adds random noise blobs with the following parameters. * RandContrib : This determines the blend ratio of noise pixel with image pixel, 0 means no noise and 1 means only noise. * RandSpeed : This determines how fast noise fluctuates, 1 means no fluctuation and higher values like 1E6 mean full fluctuation.
* RandSize : This determines how coarse noise is, 1 means every pixel has its own noise while a higher value means more than 1 pixel shares the same noise value. * RandDensity : This determines how many pixels out of the total will have noise, 1 means all pixels while a higher value means a lesser number of pixels (exponentially).","title":"Random noise"},{"location":"settings/#horizontal-bump-distortion","text":"This adds a horizontal bumps / flickering / ghosting effect. * HorzWaveContrib : This determines the blend ratio of noise pixel with image pixel, 0 means no noise and 1 means only noise. * HorzWaveStrength : This determines the overall strength of the effect. * HorzWaveVertSize : This determines how many vertical pixels would be affected by the effect. * HorzWaveScreenSize : This determines how much of the screen is affected by the effect.","title":"Horizontal bump distortion"},{"location":"settings/#horizontal-noise-lines","text":"This adds regions of noise on horizontal lines. * HorzNoiseLinesContrib : This determines the blend ratio of noise pixel with image pixel, 0 means no noise and 1 means only noise. * HorzNoiseLinesDensityY : This determines how many pixels in a horizontal line get affected. * HorzNoiseLinesDensityXY : This determines how many lines on the screen get affected.","title":"Horizontal noise lines"},{"location":"settings/#horizontal-line-distortion","text":"This adds fluctuations on horizontal lines. * HorzDistortionContrib : This determines the blend ratio of noise pixel with image pixel, 0 means no noise and 1 means only noise. * HorzDistortionStrength : This determines how large the distortion is.","title":"Horizontal line distortion"},{"location":"settings/#radial-lens-distortion","text":"This adds radial lens distortion to the camera sensor. * LensDistortionEnable : Enable or disable this feature * LensDistortionAreaFalloff : The size of the area to distort * LensDistortionAreaRadius : The distortion radius * LensDistortionInvert : Set to true to invert and create 'pincushion distortion' or false for 'barrel distortion'","title":"Radial Lens Distortion"},{"location":"settings/#gimbal","text":"The Gimbal element allows you to freeze the camera orientation for pitch, roll and/or yaw. This setting is ignored unless ImageType is -1. The Stabilization is defaulted to 0, meaning no gimbal, i.e. the camera orientation changes with the body orientation on all axes. The value of 1 means full stabilization. A value between 0 and 1 acts as a weight between the fixed angles specified (in degrees, in world-frame) in the Pitch , Roll and Yaw elements and the orientation of the vehicle body. When any of the angles is omitted from the json or set to NaN, that angle is not stabilized (i.e. it moves along with the vehicle body).","title":"Gimbal"},{"location":"settings/#unrealengine","text":"This element contains settings specific to the Unreal Engine. These will be ignored in the Unity project. * PixelFormatOverride : This contains a list of elements that have both an ImageType and a PixelFormat setting. Each element allows you to override the default pixel format of the UTextureRenderTarget2D object instantiated for the capture specified by the ImageType setting. Specifying this element allows you to prevent crashes caused by unexpected pixel formats (see #4120 and #4339 for examples of these crashes).
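As an illustration (this mirrors the default shown earlier; the integer is an index into Unreal's pixel format list linked below, and which value you need depends on the crash you are avoiding): \"UnrealEngine\": { \"PixelFormatOverride\": [ { \"ImageType\": 0, \"PixelFormat\": 0 } ] }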
A full list of pixel formats can be viewed here .","title":"UnrealEngine"},{"location":"settings/#vehicles-settings","text":"Each simulation mode will go through the list of vehicles specified in this setting and create the ones that have \"AutoCreate\": true . Each vehicle specified in this setting has a key which becomes the name of the vehicle. If the \"Vehicles\" element is missing then this list is populated with a default car named \"PhysXCar\" and a default multirotor named \"SimpleFlight\".","title":"Vehicles Settings"},{"location":"settings/#common-vehicle-setting","text":"VehicleType : This could be either PhysXCar , ArduRover or BoxCar for the Car SimMode, SimpleFlight , ArduCopter or PX4Multirotor for the MultiRotor SimMode, ComputerVision for the ComputerVision SimMode and CPHusky or Pioneer for the SkidVehicle SimMode. There is no default value, therefore this element must be specified. PawnPath : This allows you to override the pawn blueprint to use for the vehicle. For example, you may create a new pawn blueprint derived from ACarPawn for a warehouse robot in your own project outside the Cosys-AirSim code and then specify its path here. See also PawnPaths . Note that you have to specify your custom pawn blueprint class path inside the global PawnPaths object using your own defined object name, and quote that name inside the Vehicles setting. For example, { ... \"PawnPaths\": { \"CustomPawn\": {\"PawnBP\": \"Class'/Game/Assets/Blueprints/MyPawn.MyPawn_C'\"} }, \"Vehicles\": { \"MyVehicle\": { \"VehicleType\": ..., \"PawnPath\": \"CustomPawn\", ... } } } DefaultVehicleState : Possible values for multirotors are Armed or Disarmed . AutoCreate : If true then this vehicle will be spawned (if supported by the selected sim mode). RC : This sub-element allows you to specify which remote controller to use for the vehicle using RemoteControlID . The value of -1 means use the keyboard (not supported yet for multirotors). A value >= 0 specifies one of many remote controllers connected to the system. The list of available RCs can be seen in the Game Controllers panel in Windows, for example. X, Y, Z, Yaw, Roll, Pitch : These elements allow you to specify the initial position and orientation of the vehicle. Position is in NED coordinates in SI units with the origin set to the Player Start location in the Unreal environment. The orientation is specified in degrees. Sensors : This element specifies the sensors associated with the vehicle, see the Sensors page for details. IsFpvVehicle : This setting allows you to specify which vehicle the camera will follow and whose view will be shown when ViewMode is set to Fpv. By default, Cosys-AirSim selects the first vehicle in the settings as the FPV vehicle. Cameras : This element specifies the camera settings for the vehicle. The key in this element is the name of the available camera and the value is the same as CameraDefaults as described above. For example, to change the FOV for the front center camera to 120 degrees, you can use this for the Vehicles setting: \"Vehicles\": { \"FishEyeDrone\": { \"VehicleType\": \"SimpleFlight\", \"Cameras\": { \"front-center\": { \"CaptureSettings\": [ { \"ImageType\": 0, \"FOV_Degrees\": 120 } ] } } } }","title":"Common Vehicle Setting"},{"location":"settings/#using-px4","text":"By default we use simple_flight so you don't have to do separate HITL or SITL setups. We also support \"PX4\" for advanced users.
To use PX4 with Cosys-AirSim, you can use the following for the Vehicles setting: \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\" } }","title":"Using PX4"},{"location":"settings/#additional-px4-settings","text":"The default for PX4 is to enable the hardware-in-loop setup. There are various other settings available for PX4 as follows with their default values: \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\", \"Lockstep\": true, \"ControlIp\": \"127.0.0.1\", \"ControlPortLocal\": 14540, \"ControlPortRemote\": 14580, \"LogViewerHostIp\": \"127.0.0.1\", \"LogViewerPort\": 14388, \"OffboardCompID\": 1, \"OffboardSysID\": 134, \"QgcHostIp\": \"127.0.0.1\", \"QgcPort\": 14550, \"SerialBaudRate\": 115200, \"SerialPort\": \"*\", \"SimCompID\": 42, \"SimSysID\": 142, \"TcpPort\": 4560, \"UdpIp\": \"127.0.0.1\", \"UdpPort\": 14560, \"UseSerial\": true, \"UseTcp\": false, \"VehicleCompID\": 1, \"VehicleSysID\": 135, \"Model\": \"Generic\", \"LocalHostIp\": \"127.0.0.1\", \"Logs\": \"d:\\\\temp\\\\mavlink\", \"Sensors\": { ... }, \"Parameters\": { ... } } } These settings define the MavLink SystemId and ComponentId for the Simulator (SimSysID, SimCompID), for the vehicle (VehicleSysID, VehicleCompID) and for the node that allows remote control of the drone from another app, called the offboard node (OffboardSysID, OffboardCompID). If you want the simulator to also forward mavlink messages to your ground control app (like QGroundControl), you can also set the UDP address for that in case you want to run it on a different machine (QgcHostIp, QgcPort). The default is local host so QGroundControl should \"just work\" if it is running on the same machine. You can connect the simulator to the LogViewer app, provided in this repo, by setting the UDP address for that (LogViewerHostIp, LogViewerPort). And for each flying drone added to the simulator there is a named block of additional settings. In the above you see the default name \"PX4\". You can change this name from the Unreal Editor when you add a new BP_FlyingPawn asset. You will see these properties grouped under the category \"MavLink\". The MavLink node for this pawn can be remote over UDP or it can be connected to a local serial port. If using serial, set UseSerial to true; otherwise set UseSerial to false. For serial connections you also need to set the appropriate SerialBaudRate. The default of 115200 works with Pixhawk version 2 over USB. When communicating with the PX4 drone over a serial port, both the HIL_ messages and vehicle control messages share the same serial port. When communicating over UDP or TCP, PX4 requires two separate channels. If UseTcp is false, then UdpIp and UdpPort are used to send HIL_ messages, otherwise the TcpPort is used. TCP support in PX4 was added in 1.9.2 with the lockstep feature because the guarantee of message delivery that TCP provides is required for the proper functioning of lockstep. Cosys-AirSim becomes a TCP server in that case, and waits for a connection from the PX4 app. The second channel for controlling the vehicle is defined by (ControlIp, ControlPort) and is always a UDP channel. The Sensors section can provide customized settings for simulated sensors, see Sensors . The Parameters section can set PX4 parameters during initialization of the PX4 connection.
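For instance, a sketch that relaxes the failsafes that would otherwise trigger in simulation might look like this (NAV_RCL_ACT and NAV_DLL_ACT are standard PX4 RC-loss and datalink-loss failsafe parameters; they are shown here as an illustration of the mechanism, not as recommended values): \"Parameters\": { \"NAV_RCL_ACT\": 0, \"NAV_DLL_ACT\": 0 }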
See Setting up PX4 Software-in-Loop for an example.","title":"Additional PX4 Settings"},{"location":"settings/#using-ardupilot","text":"ArduPilot Copter & Rover vehicles are supported in the latest Cosys-AirSim main branch & releases v1.3.0 and later. For settings and how to use, please see ArduPilot SITL with Cosys-AirSim","title":"Using ArduPilot"},{"location":"settings/#other-settings","text":"","title":"Other Settings"},{"location":"settings/#enginesound","text":"To turn off the engine sound use the setting \"EngineSound\": false . Currently this setting applies only to cars.","title":"EngineSound"},{"location":"settings/#pawnpaths","text":"This allows you to specify your own vehicle pawn blueprints, for example, you can replace the default car in AirSim with your own car. Your vehicle BP can reside in the Content folder of your own Unreal project (i.e. outside of the AirSim plugin folder). For example, if you have a car BP located in the file Content\\MyCar\\MySedanBP.uasset in your project then you can set \"DefaultCar\": {\"PawnBP\":\"Class'/Game/MyCar/MySedanBP.MySedanBP_C'\"} . The XYZ.XYZ_C is a special notation required to specify the class for BP XYZ . Please note that your BP must be derived from the CarPawn class. By default this is not the case but you can re-parent the BP using the \"Class Settings\" button in the toolbar of the UE editor after you open the BP, and then choosing \"Car Pawn\" for the Parent Class setting in Class Options. It is also a good idea to disable \"Auto Possess Player\" and \"Auto Possess AI\" as well as set AI Controller Class to None in the BP details. Please make sure your asset is included for cooking in the packaging options if you are creating a binary.","title":"PawnPaths"},{"location":"settings/#physicsenginename","text":"For cars, we support only PhysX for now (regardless of the value in this setting). For multirotors, we support \"FastPhysicsEngine\" and \"ExternalPhysicsEngine\" . \"ExternalPhysicsEngine\" allows the drone to be controlled via setVehiclePose() , keeping the drone in place until the next call. It is especially useful for moving the AirSim drone using an external simulator or on a saved path.","title":"PhysicsEngineName"},{"location":"settings/#localhostip-setting","text":"When connecting to remote machines you may need to pick a specific Ethernet adapter to reach those machines, for example, it might be over Ethernet or over Wi-Fi, or some other special virtual adapter or a VPN. Your PC may have multiple networks, and those networks might not be allowed to talk to each other, in which case the UDP messages from one network will not get through to the others. So the LocalHostIp allows you to configure how you are reaching those machines. The default of 127.0.0.1 is not able to reach external machines; this default is only used when everything you are talking to is contained on a single PC.","title":"LocalHostIp Setting"},{"location":"settings/#apiserverport","text":"This setting determines the server port used by AirSim clients; the default port is 41451. By specifying different ports, the user can run multiple environments in parallel to accelerate the data collection process.","title":"ApiServerPort"},{"location":"settings/#speedunitfactor","text":"Unit conversion factor for speed relative to m/s , default is 1. Used in conjunction with SpeedUnitLabel. This is only used for display purposes, for example the on-display speed when the car is being driven.
For example, to get speed in miles/hr use a factor of 2.23694.","title":"SpeedUnitFactor"},{"location":"settings/#speedunitlabel","text":"Unit label for speed, default is m/s . Used in conjunction with SpeedUnitFactor.","title":"SpeedUnitLabel"},{"location":"simple_flight/","text":"simple_flight If you don't know what the flight controller does, see What is Flight Controller? . AirSim has a built-in flight controller called simple_flight and it is used by default. You don't need to do anything to use or configure it. AirSim also supports PX4 as another flight controller for advanced users. In the future, we also plan to support ROSFlight and Hackflight . Advantages The advantage of using simple_flight is that it requires zero additional setup and \"just works\". Also, simple_flight uses a steppable clock which means you can pause the simulation and things are not at the mercy of the high-variance, low-precision clock that the operating system provides. Furthermore, simple_flight is simple, cross-platform and consists of 100% header-only, dependency-free C++ code, which means you can literally switch between the simulator and the flight controller code within the same code base! Design Normally flight controllers are designed to run on the vehicle's actual hardware and their support for running in a simulator varies widely. They are often fairly difficult to configure for non-expert users and typically have a complex build, usually lacking cross-platform support. All these problems have played a significant part in the design of simple_flight. simple_flight is designed from the ground up as a library with a clean interface that can work onboard the vehicle as well as in the simulator. The core principle is that the flight controller has no way to specify a special simulation mode and therefore it has no way to know if it is running as a simulation or as a real vehicle. We thus view flight controllers simply as a collection of algorithms packaged in a library. Another key emphasis is to develop this code as dependency-free, header-only, pure standard C++11 code. This means there is no special build required to compile simple_flight. You just copy its source code to any project you wish and it just works. Control simple_flight can control vehicles by taking in the desired input as angle rate, angle level, velocity or position. Each axis of control can be specified with one of these modes. Internally, simple_flight uses a cascade of PID controllers to finally generate actuator signals. This means that the position PID drives the velocity PID, which in turn drives the angle level PID, which finally drives the angle rate PID. State Estimation In the current release, we are using the ground truth from the simulator for our state estimation. We plan to add a complementary filter-based state estimator for angular velocity and orientation using 2 sensors (gyroscope, accelerometer) in the near future. Over the longer term, we plan to integrate another library to perform velocity and position estimation using 4 sensors (gyroscope, accelerometer, magnetometer and barometer) using an Extended Kalman Filter (EKF). If you have experience in this area, we encourage you to engage with us and contribute! Supported Boards Currently, we have implemented simple_flight interfaces for the simulated board. We plan to implement it for the Pixhawk V2 board and possibly the Naze32 board.
We expect all our code to remain unchanged and the implementation would mainly involve adding drivers for various sensors, handling ISRs and managing other board-specific details. If you have experience in this area, we encourage you to engage with us and contribute! Configuration To have AirSim use simple_flight, you can specify it in settings.json as shown below. Note that this is the default, so you don't have to do it explicitly. \"Vehicles\": { \"SimpleFlight\": { \"VehicleType\": \"SimpleFlight\", } } By default, a vehicle using simple_flight is already armed, which is why you would see its propellers spinning. However, if you don't want that, then set DefaultVehicleState to Inactive like this: \"Vehicles\": { \"SimpleFlight\": { \"VehicleType\": \"SimpleFlight\", \"DefaultVehicleState\": \"Inactive\" } } In this case, you will need to either arm manually by placing the RC sticks in the down-inward position or arm using the APIs. For safety reasons, flight controllers disallow API control unless a human operator has consented to its use via a switch on their RC. Also, when RC control is lost, the vehicle should disable API control and enter hover mode for safety reasons. To simplify things a bit, simple_flight by default enables API control without RC consent, even when no RC is detected. However, you can change this using the following setting: \"Vehicles\": { \"SimpleFlight\": { \"VehicleType\": \"SimpleFlight\", \"AllowAPIAlways\": true, \"RC\": { \"RemoteControlID\": 0, \"AllowAPIWhenDisconnected\": true } } } Finally, simple_flight uses a steppable clock by default, which means that the clock advances when the simulator tells it to advance (unlike the wall clock, which advances strictly according to the passage of time). This means the clock can be paused, for example, if code hits a breakpoint, and there is zero variance in the clock (clock APIs provided by operating systems might have significant variance unless it is a \"real time\" OS). If you want simple_flight to use a wall clock instead, then use the following setting: \"ClockType\": \"ScalableClock\"","title":"Simple Flight"},{"location":"simple_flight/#simple_flight","text":"If you don't know what the flight controller does, see What is Flight Controller? . AirSim has a built-in flight controller called simple_flight and it is used by default. You don't need to do anything to use or configure it. AirSim also supports PX4 as another flight controller for advanced users. In the future, we also plan to support ROSFlight and Hackflight .","title":"simple_flight"},{"location":"simple_flight/#advantages","text":"The advantage of using simple_flight is that it requires zero additional setup and \"just works\". Also, simple_flight uses a steppable clock, which means you can pause the simulation and things are not at the mercy of the high-variance, low-precision clock that the operating system provides. Furthermore, simple_flight is simple, cross-platform and consists of 100% header-only, dependency-free C++ code, which means you can literally switch between the simulator and the flight controller code within the same code base!","title":"Advantages"},{"location":"simple_flight/#design","text":"Normally flight controllers are designed to run on the actual vehicle hardware, and their support for running in a simulator varies widely. They are often fairly difficult to configure for non-expert users and typically have a complex build, usually lacking cross-platform support. 
All these problems have played a significant part in the design of simple_flight. simple_flight is designed from the ground up as a library with a clean interface that can work onboard the vehicle as well as in the simulator. The core principle is that the flight controller has no way to specify a special simulation mode and therefore it has no way to know if it is running as a simulation or as a real vehicle. We thus view flight controllers simply as a collection of algorithms packaged in a library. Another key emphasis is to develop this code as dependency-free, header-only, pure standard C++11 code. This means there is no special build required to compile simple_flight. You just copy its source code to any project you wish and it just works.","title":"Design"},{"location":"simple_flight/#control","text":"simple_flight can control vehicles by taking in the desired input as angle rate, angle level, velocity or position. Each axis of control can be specified with one of these modes. Internally, simple_flight uses a cascade of PID controllers to finally generate actuator signals. This means that the position PID drives the velocity PID, which in turn drives the angle level PID, which finally drives the angle rate PID.","title":"Control"},{"location":"simple_flight/#state-estimation","text":"In the current release, we are using the ground truth from the simulator for our state estimation. We plan to add a complementary filter-based state estimator for angular velocity and orientation using 2 sensors (gyroscope, accelerometer) in the near future. In the longer term, we plan to integrate another library to perform velocity and position estimation using 4 sensors (gyroscope, accelerometer, magnetometer and barometer) using an Extended Kalman Filter (EKF). If you have experience in this area, we encourage you to engage with us and contribute!","title":"State Estimation"},{"location":"simple_flight/#supported-boards","text":"Currently, we have implemented simple_flight interfaces for the simulated board. We plan to implement it for the Pixhawk V2 board and possibly the Naze32 board. We expect all our code to remain unchanged and the implementation would mainly involve adding drivers for various sensors, handling ISRs and managing other board-specific details. If you have experience in this area, we encourage you to engage with us and contribute!","title":"Supported Boards"},{"location":"simple_flight/#configuration","text":"To have AirSim use simple_flight, you can specify it in settings.json as shown below. Note that this is the default, so you don't have to do it explicitly. \"Vehicles\": { \"SimpleFlight\": { \"VehicleType\": \"SimpleFlight\", } } By default, a vehicle using simple_flight is already armed, which is why you would see its propellers spinning. However, if you don't want that, then set DefaultVehicleState to Inactive like this: \"Vehicles\": { \"SimpleFlight\": { \"VehicleType\": \"SimpleFlight\", \"DefaultVehicleState\": \"Inactive\" } } In this case, you will need to either arm manually by placing the RC sticks in the down-inward position or arm using the APIs. For safety reasons, flight controllers disallow API control unless a human operator has consented to its use via a switch on their RC. Also, when RC control is lost, the vehicle should disable API control and enter hover mode for safety reasons. To simplify things a bit, simple_flight by default enables API control without RC consent, even when no RC is detected. 
However, you can change this using the following setting: \"Vehicles\": { \"SimpleFlight\": { \"VehicleType\": \"SimpleFlight\", \"AllowAPIAlways\": true, \"RC\": { \"RemoteControlID\": 0, \"AllowAPIWhenDisconnected\": true } } } Finally, simple_flight uses a steppable clock by default, which means that the clock advances when the simulator tells it to advance (unlike the wall clock, which advances strictly according to the passage of time). This means the clock can be paused, for example, if code hits a breakpoint, and there is zero variance in the clock (clock APIs provided by operating systems might have significant variance unless it is a \"real time\" OS). If you want simple_flight to use a wall clock instead, then use the following setting: \"ClockType\": \"ScalableClock\"","title":"Configuration"},{"location":"skid_steer_vehicle/","text":"Unreal Skid Steering Vehicle Model For vehicles that can't use the normal WheeledVehicle setup with steering and non-steering wheels, but instead use skid steering/differential steering (like a tank), an alternative vehicle model was created. It is built using Unreal's Chaos engine, which does not support this vehicle type natively; as such, it can behave unrealistically at times. http://www.robotplatform.com/knowledge/Classification_of_Robots/wheel_control_theory.html Creating a new skid steer vehicle The steps to set up the vehicle are largely the same as for a WheeledVehiclePawn, with some slight adjustments. 1. Follow this guide to create the skeletal mesh and physics asset. 2. For the wheels setup, the vehicle should have 4 wheels, 2 for the left side and 2 for the right side. Please use SkidWheel as the wheel class. 3. Creating the vehicle blueprint for the pawn is also largely the same as in that tutorial; however, one should use the SkidVehiclePawn sub-class as the class. The vehicle setup parameters are simpler. 4. To have animated wheels, proper physics and correct steering behavior, please take a look at how the CPHusky is configured in the AirSim plugin. The Husky is a skid steer vehicle and can be used as a reference. Skid steer model within AirSim The skid steer model is a separate SimMode within AirSim. It is fully implemented in a similar fashion to the normal Car SimMode. There are already two vehicle types implemented, the ClearPath Husky and Pioneer P3DX robots. To configure the SimMode and vehicle type, see the settings.json file documentation (a configuration sketch follows below). If you create a new vehicle using the Unreal skid steering vehicle model as described above, you can use the PawnPaths setting in the Common Vehicle Settings in the settings.json file to link the custom vehicle pawn. Note that due to a bug in ChaosVehicles, setting raw YawInput values when rotating on its axis to the left will also cause a small forward movement.","title":"Skid Steer Vehicles"},{"location":"skid_steer_vehicle/#unreal-skid-steering-vehicle-model","text":"For vehicles that can't use the normal WheeledVehicle setup with steering and non-steering wheels, but instead use skid steering/differential steering (like a tank), an alternative vehicle model was created. It is built using Unreal's Chaos engine, which does not support this vehicle type natively; as such, it can behave unrealistically at times. http://www.robotplatform.com/knowledge/Classification_of_Robots/wheel_control_theory.html
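To make the skid-steer configuration mentioned above concrete, here is a minimal sketch that writes a settings.json selecting the skid-steer SimMode with a Husky. The SimMode string "SkidVehicle" and the VehicleType string "CPHusky" are assumptions based on the description above; verify both against the settings.json documentation before relying on them.

```python
import json
import os

# Assumed names: "SkidVehicle" for the SimMode and "CPHusky" for the
# VehicleType; check the settings.json documentation to confirm them.
settings = {
    "SettingsVersion": 2.0,
    "SimMode": "SkidVehicle",
    "Vehicles": {
        "Husky": {"VehicleType": "CPHusky"}
    }
}

settings_path = os.path.expanduser("~/Documents/AirSim/settings.json")
with open(settings_path, "w") as f:
    json.dump(settings, f, indent=2)  # overwrites any existing settings.json
```

Since the skid-steer SimMode is implemented in a similar fashion to the Car SimMode, the regular car client APIs can then be used to drive the vehicle.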
","title":"Unreal Skid Steering Vehicle Model"},{"location":"skid_steer_vehicle/#creating-a-new-skid-steer-vehicle","text":"The steps to set up the vehicle are largely the same as for a WheeledVehiclePawn, with some slight adjustments. 1. Follow this guide to create the skeletal mesh and physics asset. 2. For the wheels setup, the vehicle should have 4 wheels, 2 for the left side and 2 for the right side. Please use SkidWheel as the wheel class. 3. Creating the vehicle blueprint for the pawn is also largely the same as in that tutorial; however, one should use the SkidVehiclePawn sub-class as the class. The vehicle setup parameters are simpler. 4. To have animated wheels, proper physics and correct steering behavior, please take a look at how the CPHusky is configured in the AirSim plugin. The Husky is a skid steer vehicle and can be used as a reference.","title":"Creating a new skid steer vehicle"},{"location":"skid_steer_vehicle/#skid-steer-model-within-airsim","text":"The skid steer model is a separate SimMode within AirSim. It is fully implemented in a similar fashion to the normal Car SimMode. There are already two vehicle types implemented, the ClearPath Husky and Pioneer P3DX robots. To configure the SimMode and vehicle type, see the settings.json file documentation . If you create a new vehicle using the Unreal skid steering vehicle model as described above, you can use the PawnPaths setting in the Common Vehicle Settings in the settings.json file to link the custom vehicle pawn. Note that due to a bug in ChaosVehicles, setting raw YawInput values when rotating on its axis to the left will also cause a small forward movement.","title":"Skid steer model within AirSim"},{"location":"steering_wheel_installation/","text":"Logitech G920 Steering Wheel Installation To use a Logitech G920 steering wheel with Cosys-AirSim, follow these steps: Connect the steering wheel to the computer and wait until the driver installation completes. Install Logitech Gaming Software from here Before debugging, you\u2019ll have to normalize the values in the Cosys-AirSim code. Perform these changes in CarPawn.cpp (according to the current update in the git): In line 382, change \u201cVal\u201d to \u201c1 \u2013 Val\u201d (the complementary value in the range [0.0,1.0]). In line 388, change \u201cVal\u201d to \u201c5Val - 2.5\u201d (changes the range of the given input from [0.0,1.0] to [-1.0,1.0]). In line 404, change \u201cVal\u201d to \u201c4(1 \u2013 Val)\u201d (the complementary value in the range [0.0,1.0]). Debug the Cosys-AirSim project (while the steering wheel is connected \u2013 this is important). In the Unreal Editor, go to Edit->Plugins->Input Devices and enable \u201cWindows RawInput\u201d. Go to Edit->Project Settings->Raw Input, and add a new device configuration: Vendor ID: 0x046d (in the case of the Logitech G920; otherwise you might need to check it). Product ID: 0xc261 (in the case of the Logitech G920; otherwise you might need to check it). Under \u201cAxis Properties\u201d, make sure that \u201cGenericUSBController Axis 2\u201d, \u201cGenericUSBController Axis 4\u201d and \u201cGenericUSBController Axis 5\u201d are all enabled with an offset of 1.0. Explanation: axis 2 is responsible for the steering movement, axis 4 is for the brake and axis 5 is for the gas. If you need to configure the clutch, it\u2019s on axis 3. 
Go to Edit->Project Settings->Input. Under Bindings, in \u201cAxis Mappings\u201d: Remove the existing mappings from the groups \u201cMoveRight\u201d and \u201cMoveForward\u201d. Add a new axis mapping to the group \u201cMoveRight\u201d; use GenericUSBController axis 2 with a scale of 1.0. Add a new axis mapping to the group \u201cMoveForward\u201d; use GenericUSBController axis 5 with a scale of 1.0. Add a new group of axis mappings, name it \u201cFootBrake\u201d, and add a new axis mapping to this group; use GenericUSBController axis 4 with a scale of 1.0. Play and drive! Pay Attention Notice that the first time we \"play\" after debugging, we need to touch the wheel to \u201creset\u201d the values. Tip In the gaming software, you can configure buttons as keyboard shortcuts; we used this to configure a shortcut to record a dataset or to play in full screen.","title":"Steering Wheel"},{"location":"steering_wheel_installation/#logitech-g920-steering-wheel-installation","text":"To use a Logitech G920 steering wheel with Cosys-AirSim, follow these steps: Connect the steering wheel to the computer and wait until the driver installation completes. Install Logitech Gaming Software from here Before debugging, you\u2019ll have to normalize the values in the Cosys-AirSim code. Perform these changes in CarPawn.cpp (according to the current update in the git): In line 382, change \u201cVal\u201d to \u201c1 \u2013 Val\u201d (the complementary value in the range [0.0,1.0]). In line 388, change \u201cVal\u201d to \u201c5Val - 2.5\u201d (changes the range of the given input from [0.0,1.0] to [-1.0,1.0]). In line 404, change \u201cVal\u201d to \u201c4(1 \u2013 Val)\u201d (the complementary value in the range [0.0,1.0]). Debug the Cosys-AirSim project (while the steering wheel is connected \u2013 this is important). In the Unreal Editor, go to Edit->Plugins->Input Devices and enable \u201cWindows RawInput\u201d. Go to Edit->Project Settings->Raw Input, and add a new device configuration: Vendor ID: 0x046d (in the case of the Logitech G920; otherwise you might need to check it). Product ID: 0xc261 (in the case of the Logitech G920; otherwise you might need to check it). Under \u201cAxis Properties\u201d, make sure that \u201cGenericUSBController Axis 2\u201d, \u201cGenericUSBController Axis 4\u201d and \u201cGenericUSBController Axis 5\u201d are all enabled with an offset of 1.0. Explanation: axis 2 is responsible for the steering movement, axis 4 is for the brake and axis 5 is for the gas. If you need to configure the clutch, it\u2019s on axis 3. Go to Edit->Project Settings->Input. Under Bindings, in \u201cAxis Mappings\u201d: Remove the existing mappings from the groups \u201cMoveRight\u201d and \u201cMoveForward\u201d. Add a new axis mapping to the group \u201cMoveRight\u201d; use GenericUSBController axis 2 with a scale of 1.0. Add a new axis mapping to the group \u201cMoveForward\u201d; use GenericUSBController axis 5 with a scale of 1.0. Add a new group of axis mappings, name it \u201cFootBrake\u201d, and add a new axis mapping to this group; use GenericUSBController axis 4 with a scale of 1.0. 
Play and drive!","title":"Logitech G920 Steering Wheel Installation"},{"location":"steering_wheel_installation/#pay-attention","text":"Notice that the first time we \"play\" after debugging, we need to touch the wheel to \u201creset\u201d the values.","title":"Pay Attention"},{"location":"steering_wheel_installation/#tip","text":"In the gaming software, you can configure buttons as keyboard shortcuts; we used this to configure a shortcut to record a dataset or to play in full screen.","title":"Tip"},{"location":"unreal_blocks/","text":"Setup Blocks Environment for AirSim The Blocks environment is available in the repo in the folder Unreal/Environments/Blocks and is designed to be lightweight in size. That means it is very basic, but fast. Here are the quick steps to get the Blocks environment up and running: Windows from Source Make sure you have built or installed Unreal and built AirSim . Navigate to the folder AirSim\\Unreal\\Environments\\Blocks and run update_from_git.bat . Double-click the generated .sln file to open it in Visual Studio. Make sure the Blocks project is the startup project and the build configuration is set to DevelopmentEditor_Editor and Win64 . Hit F5 to run. Press the Play button in the Unreal Editor. Also see the other documentation for how to use it. Changing Code and Rebuilding For Windows, you can just change the code in Visual Studio, press F5 and re-run. There are a few batch files available in the folder AirSim\\Unreal\\Environments\\Blocks that let you sync code, clean, etc. Linux from Source Make sure you have built or installed the Unreal Engine and AirSim . Navigate to the folder AirSim\\Unreal\\Environments\\Blocks and run update_from_git.sh . Navigate to your UnrealEngine repo folder and run Engine/Binaries/Linux/UE4Editor , which will start the Unreal Editor. On first start you might not see any projects in the UE4 editor. Click on the Projects tab, then the Browse button, and navigate to AirSim/Unreal/Environments/Blocks/Blocks.uproject . If you get prompted about an incompatible version and conversion, select In-place conversion , which is usually under \"More\" options. If you get prompted about missing modules, make sure to select No , so you don't exit. Finally, when prompted to build AirSim, select Yes. Now it might take a while, so go get some coffee :). Press the Play button in the Unreal Editor. Also see the other documentation for how to use it. Changing Code and Rebuilding For Linux, make code changes in the AirLib or Unreal/Plugins folder and then run ./build.sh to rebuild. This step also copies the build output to the Blocks sample project. You can then follow the above steps again to re-run. Choosing Your Vehicle: Car or Multirotor By default, AirSim spawns a multirotor. You can easily change this to a car and use all of the AirSim goodies. Please see the using car guide. FAQ I see warnings like \"_BuiltData\" file is missing. These are intermediate files, and you can safely ignore them.","title":"Blocks Environment"},{"location":"unreal_blocks/#setup-blocks-environment-for-airsim","text":"The Blocks environment is available in the repo in the folder Unreal/Environments/Blocks and is designed to be lightweight in size. That means it is very basic, but fast. Here are the quick steps to get the Blocks environment up and running:","title":"Setup Blocks Environment for AirSim"},{"location":"unreal_blocks/#windows-from-source","text":"Make sure you have built or installed Unreal and built AirSim . Navigate to the folder AirSim\\Unreal\\Environments\\Blocks and run update_from_git.bat . Double-click the generated .sln file to open it in Visual Studio. 
Make sure the Blocks project is the startup project and the build configuration is set to DevelopmentEditor_Editor and Win64 . Hit F5 to run. Press the Play button in the Unreal Editor. Also see the other documentation for how to use it.","title":"Windows from Source"},{"location":"unreal_blocks/#changing-code-and-rebuilding","text":"For Windows, you can just change the code in Visual Studio, press F5 and re-run. There are a few batch files available in the folder AirSim\\Unreal\\Environments\\Blocks that let you sync code, clean, etc.","title":"Changing Code and Rebuilding"},{"location":"unreal_blocks/#linux-from-source","text":"Make sure you have built or installed the Unreal Engine and AirSim . Navigate to the folder AirSim\\Unreal\\Environments\\Blocks and run update_from_git.sh . Navigate to your UnrealEngine repo folder and run Engine/Binaries/Linux/UE4Editor , which will start the Unreal Editor. On first start you might not see any projects in the UE4 editor. Click on the Projects tab, then the Browse button, and navigate to AirSim/Unreal/Environments/Blocks/Blocks.uproject . If you get prompted about an incompatible version and conversion, select In-place conversion , which is usually under \"More\" options. If you get prompted about missing modules, make sure to select No , so you don't exit. Finally, when prompted to build AirSim, select Yes. Now it might take a while, so go get some coffee :). Press the Play button in the Unreal Editor. Also see the other documentation for how to use it.","title":"Linux from Source"},{"location":"unreal_blocks/#changing-code-and-rebuilding_1","text":"For Linux, make code changes in the AirLib or Unreal/Plugins folder and then run ./build.sh to rebuild. This step also copies the build output to the Blocks sample project. You can then follow the above steps again to re-run.","title":"Changing Code and Rebuilding"},{"location":"unreal_blocks/#choosing-your-vehicle-car-or-multirotor","text":"By default, AirSim spawns a multirotor. You can easily change this to a car and use all of the AirSim goodies. Please see the using car guide.","title":"Choosing Your Vehicle: Car or Multirotor"},{"location":"unreal_blocks/#faq","text":"","title":"FAQ"},{"location":"unreal_blocks/#i-see-warnings-about-like-_builtdata-file-is-missing","text":"These are intermediate files, and you can safely ignore them.","title":"I see warnings like \"_BuiltData\" file is missing."},{"location":"unreal_custenv/","text":"Creating and Setting Up Unreal Environment This page contains the complete instructions, start to finish, for setting up an Unreal environment with AirSim. The Unreal Marketplace has several environments available that you can start using in just a few minutes. It is also possible to use environments available on websites such as turbosquid.com or cgtrader.com with a bit more effort (here's a tutorial video ). In addition, there are also several free environments available. Below we will use a freely downloadable environment from the Unreal Marketplace called Landscape Mountains, but the steps are the same for any other environment. Note for Linux Users There is no Epic Games Launcher for Linux, which means that if you need to create a custom environment, you will need a Windows machine to do that. Once you have the Unreal project folder, just copy it over to your Linux machine. Step-by-Step Instructions when using Cosys-AirSim from Precompiled Binaries It is assumed you downloaded the right precompiled Cosys-AirSim plugin from the GitHub releases page for the right Unreal version. 
In Epic Games Launcher click the Samples tab then scroll down and find Landscape Mountains . Click the Create Project and download this content (~2GB download). Open LandscapeMountains.uproject , it should launch the Unreal Editor. !!!note The Landscape Mountains project is supported up to Unreal Engine version 4.24. If you do not have 4.24 installed, you should see a dialog titled `Select Unreal Engine Version` with a dropdown to select from installed versions. Select 5.4 to migrate the project to a supported engine version. If you have 4.24 installed, you can manually migrate the project by navigating to the corresponding .uproject file in Windows Explorer, right-clicking it, and selecting the `Switch Unreal Engine version...` option. Go to the LandscapeMountains project folder and create a new subfolder called Plugins . Now copy the precompiled AirSim Plugin folder into this newly created folder. This way now your own Unreal project has AirSim plugin. Edit the LandscapeMountains.uproject so that you add the AirSim plugin to the list of plugins to load. json { ... \"Plugins\": [ { \"Name\": \"AirSim\", \"Enabled\": true } ] ... } Edit the Config\\DefaultGame.ini to add the following lines at the end: +MapsToCook=(FilePath=\"/AirSim/AirSimAssets\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/HUDAssets\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Beacons\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Blueprints\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Models\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Sensors\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/StarterContent\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/VehicleAdv\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Weather\") Doing this forces Unreal to include all necessary AirSim content in packaged builds of your project. If using Unreal Engine 5.3/5.4 check here for a fix to the camera scene rendering bug in these engine versions! Close the Unreal Editor and restart it by opening the uproject file again. In Window/World Settings as shown below, set the GameMode Override to AirSimGameMode : Go to 'Edit->Editor Preferences' in Unreal Editor, in the 'Search' box type 'CPU' and ensure that the 'Use Less CPU when in Background' is unchecked. If you don't do this then UE will be slowed down dramatically when UE window loses focus. Be sure to Save these edits. Hit the Play button in the Unreal Editor. See the other documentation pages for how to use it. Step-by-Step Instructions when using Cosys-AirSim from Source Build Make sure AirSim is built and Unreal 5.4 is installed as described in the installation instructions . In Epic Games Launcher click the Samples tab then scroll down and find Landscape Mountains . Click the Create Project and download this content (~2GB download). Open LandscapeMountains.uproject , it should launch the Unreal Editor. !!!note The Landscape Mountains project is supported up to Unreal Engine version 4.24. If you do not have 4.24 installed, you should see a dialog titled `Select Unreal Engine Version` with a dropdown to select from installed versions. Select 5.4 to migrate the project to a supported engine version. If you have 4.24 installed, you can manually migrate the project by navigating to the corresponding .uproject file in Windows Explorer, right-clicking it, and selecting the `Switch Unreal Engine version...` option. From the File menu select New C++ class , leave default None on the type of class, click Next , leave default name MyClass , and click Create Class . 
We need to do this because Unreal requires at least one source file in project. It should trigger compile and open up Visual Studio solution LandscapeMountains.sln . Go to your folder for AirSim repo and copy Unreal\\Plugins folder in to your LandscapeMountains folder. This way now your own Unreal project has AirSim plugin. !!!note If the AirSim installation is fresh, i.e, hasn't been built before, make sure that you run `build.cmd` from the root directory once before copying `Unreal\\Plugins` folder so that `AirLib` files are also included. If you have made some changes in the Blocks environment, make sure to run `update_to_git.bat` from `Unreal\\Environments\\Blocks` to update the files in `Unreal\\Plugins`. Edit the LandscapeMountains.uproject so that it looks like this json { \"FileVersion\": 3, \"EngineAssociation\": \"\", \"Category\": \"Samples\", \"Description\": \"\", \"Modules\": [ { \"Name\": \"LandscapeMountains\", \"Type\": \"Runtime\", \"LoadingPhase\": \"Default\", \"AdditionalDependencies\": [ \"AirSim\" ] } ], \"TargetPlatforms\": [ \"MacNoEditor\", \"WindowsNoEditor\" ], \"Plugins\": [ { \"Name\": \"AirSim\", \"Enabled\": true } ] } Edit the Config\\DefaultGame.ini to add the following lines at the end: +MapsToCook=(FilePath=\"/AirSim/AirSimAssets\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/HUDAssets\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Beacons\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Blueprints\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Models\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Sensors\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/StarterContent\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/VehicleAdv\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Weather\") Doing this forces Unreal to include all necessary AirSim content in packaged builds of your project. If using Unreal Engine 5.3/5.4 check here for a fix to the camera scene rendering bug in these engine versions! Close Visual Studio and the Unreal Editor and right-click the LandscapeMountains.uproject in Windows Explorer and select Generate Visual Studio Project Files . This step detects all plugins and source files in your Unreal project and generates .sln file for Visual Studio. !!!tip If the `Generate Visual Studio Project Files` option is missing you may need to reboot your machine for the Unreal Shell extensions to take effect. If it is still missing then open the LandscapeMountains.uproject in the Unreal Editor and select `Refresh Visual Studio Project` from the `File` menu. Reopen LandscapeMountains.sln in Visual Studio, and make sure \"DebugGame Editor\" and \"Win64\" build configuration is the active build configuration. Press F5 to run . This will start the Unreal Editor. The Unreal Editor allows you to edit the environment, assets and other game related settings. First thing you want to do in your environment is set up PlayerStart object. In Landscape Mountains environment, PlayerStart object already exist, and you can find it in the World Outliner . Make sure its location is set up as shown. This is where AirSim plugin will create and place the vehicle. If its too high up then vehicle will fall down as soon as you press play giving potentially random behavior In Window/World Settings as shown below, set the GameMode Override to AirSimGameMode : Go to 'Edit->Editor Preferences' in Unreal Editor, in the 'Search' box type 'CPU' and ensure that the 'Use Less CPU when in Background' is unchecked. If you don't do this then UE will be slowed down dramatically when UE window loses focus. 
Be sure to Save these edits. Hit the Play button in the Unreal Editor. See the other documentation pages for how to use it. Congratulations! You are now running AirSim in your own Unreal environment. Updating Your Environment to Latest Version of AirSim Once you have your environment using above instructions, you should frequently update your local AirSim code to latest version from GitHub. Below are the instructions to do this: First put clean.bat (or clean.sh for Linux users) in the root folder of your environment. Run this file to clean up all intermediate files in your Unreal project. Do git pull in your AirSim repo followed by build.cmd (or ./build.sh for Linux users). Replace [your project]/Plugins folder with AirSim/Unreal/Plugins folder. Right-click on your .uproject file and chose \"Generate Visual Studio project files\" option. This is not required for Linux. Choosing Your Vehicle: Car or Multirotor By default, AirSim prompts user for which vehicle to use. You can easily change this by setting SimMode . Please see using car guide. Unreal 5.3/5.4 Scene camera bug Note that Unreal 5.3 and 5.4 breaks camera scene rendering when Effects is not set to the Epic scalability preset. You can use the console command r.DetailMode 2 to fix this at runtime! For the Blocks and other available environments we have made a fix for this. By placing a DefaultScalability.ini file in the Config folder of your Unreal project, you can set the scalability settings to custom values for each one (low, medium, high, epic, cine). As you can see in the Blocks environment, we have added the following to it to fix this bug automatically. You can find the DefaultScalability.ini file in the Unreal/Environments/Blocks folder. Copy this file to your Unreal project's Config folder. [EffectsQuality@0] r.DetailMode=2 [EffectsQuality@1] r.DetailMode=2 [EffectsQuality@2] r.DetailMode=2 [EffectsQuality@3] r.DetailMode=2 [EffectsQuality@Cine] r.DetailMode=2 FAQ What are other cool environments? Unreal Marketplace has dozens of prebuilt extraordinarily detailed environments ranging from Moon to Mars and everything in between. The one we have used for testing is called Modular Neighborhood Pack but you can use any environment. Another free environment is Infinity Blade series . Alternatively, if you look under the Learn tab in Epic Game Launcher, you will find many free samples that you can use. One of our favorites is \"A Boy and His Kite\" which is 100 square miles of highly detailed environment (caution: you will need very beefy PC to run it!). When I press Play button some kind of video starts instead of my vehicle. If the environment comes with MatineeActor, delete it to avoid any startup demo sequences. There might be other ways to remove it as well, for example, click on Blueprints button, then Level Blueprint and then look at Begin Play event in Event Graph. You might want to disconnect any connections that may be starting \"matinee\". Is there easy way to sync code in my Unreal project with code in AirSim repo? Sure, there is! You can find a bunch of .bat files (for linux, .sh ) in AirSim\\Unreal\\Environments\\Blocks . Just copy them over to your own Unreal project. Most of these are quite simple and self-explanatory. I get some error about map. You might have to set default map for your project. For example, if you are using Modular Neighborhood Pack, set the Editor Starter Map as well as Game Default Map to Demo_Map in Project Settings > Maps & Modes. 
I see \"Add to project\" option for environment but not \"Create project\" option. In this case, create a new blank C++ project with no Starter Content and add your environment in to it. I already have my own Unreal project. How do I use AirSim with it? Copy the Unreal\\Plugins folder from the build you did in the above section into the root of your Unreal project's folder. In your Unreal project's .uproject file, add the key AdditionalDependencies to the \"Modules\" object as we showed in the LandscapeMountains.uproject above. \"AdditionalDependencies\": [ \"AirSim\" ] and the Plugins section to the top level object: \"Plugins\": [ { \"Name\": \"AirSim\", \"Enabled\": true } ]","title":"Custom Unreal Environment"},{"location":"unreal_custenv/#creating-and-setting-up-unreal-environment","text":"This page contains the complete instructions start to finish for setting up Unreal environment with AirSim. The Unreal Marketplace has several environment available that you can start using in just few minutes. It is also possible to use environments available on websites such as turbosquid.com or cgtrader.com with bit more effort (here's tutorial video ). In addition, there also several free environments available. Below we will use a freely downloadable environment from Unreal Marketplace called Landscape Mountain but the steps are same for any other environments.","title":"Creating and Setting Up Unreal Environment"},{"location":"unreal_custenv/#note-for-linux-users","text":"There is no Epic Games Launcher for Linux which means that if you need to create custom environment, you will need Windows machine to do that. Once you have Unreal project folder, just copy it over to your Linux machine.","title":"Note for Linux Users"},{"location":"unreal_custenv/#step-by-step-instructions-when-using-cosys-airsim-from-precompiled-binaries","text":"It is assumed you downloaded the right precompiled Cosys-AirSim plugin from the GitHub releases page for the right Unreal version. In Epic Games Launcher click the Samples tab then scroll down and find Landscape Mountains . Click the Create Project and download this content (~2GB download). Open LandscapeMountains.uproject , it should launch the Unreal Editor. !!!note The Landscape Mountains project is supported up to Unreal Engine version 4.24. If you do not have 4.24 installed, you should see a dialog titled `Select Unreal Engine Version` with a dropdown to select from installed versions. Select 5.4 to migrate the project to a supported engine version. If you have 4.24 installed, you can manually migrate the project by navigating to the corresponding .uproject file in Windows Explorer, right-clicking it, and selecting the `Switch Unreal Engine version...` option. Go to the LandscapeMountains project folder and create a new subfolder called Plugins . Now copy the precompiled AirSim Plugin folder into this newly created folder. This way now your own Unreal project has AirSim plugin. Edit the LandscapeMountains.uproject so that you add the AirSim plugin to the list of plugins to load. json { ... \"Plugins\": [ { \"Name\": \"AirSim\", \"Enabled\": true } ] ... 
} Edit the Config\\DefaultGame.ini to add the following lines at the end: +MapsToCook=(FilePath=\"/AirSim/AirSimAssets\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/HUDAssets\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Beacons\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Blueprints\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Models\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Sensors\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/StarterContent\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/VehicleAdv\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Weather\") Doing this forces Unreal to include all necessary AirSim content in packaged builds of your project. If using Unreal Engine 5.3/5.4 check here for a fix to the camera scene rendering bug in these engine versions! Close the Unreal Editor and restart it by opening the uproject file again. In Window/World Settings as shown below, set the GameMode Override to AirSimGameMode : Go to 'Edit->Editor Preferences' in Unreal Editor, in the 'Search' box type 'CPU' and ensure that the 'Use Less CPU when in Background' is unchecked. If you don't do this then UE will be slowed down dramatically when UE window loses focus. Be sure to Save these edits. Hit the Play button in the Unreal Editor. See the other documentation pages for how to use it.","title":"Step-by-Step Instructions when using Cosys-AirSim from Precompiled Binaries"},{"location":"unreal_custenv/#step-by-step-instructions-when-using-cosys-airsim-from-source-build","text":"Make sure AirSim is built and Unreal 5.4 is installed as described in the installation instructions . In Epic Games Launcher click the Samples tab then scroll down and find Landscape Mountains . Click the Create Project and download this content (~2GB download). Open LandscapeMountains.uproject , it should launch the Unreal Editor. !!!note The Landscape Mountains project is supported up to Unreal Engine version 4.24. If you do not have 4.24 installed, you should see a dialog titled `Select Unreal Engine Version` with a dropdown to select from installed versions. Select 5.4 to migrate the project to a supported engine version. If you have 4.24 installed, you can manually migrate the project by navigating to the corresponding .uproject file in Windows Explorer, right-clicking it, and selecting the `Switch Unreal Engine version...` option. From the File menu select New C++ class , leave default None on the type of class, click Next , leave default name MyClass , and click Create Class . We need to do this because Unreal requires at least one source file in project. It should trigger compile and open up Visual Studio solution LandscapeMountains.sln . Go to your folder for AirSim repo and copy Unreal\\Plugins folder in to your LandscapeMountains folder. This way now your own Unreal project has AirSim plugin. !!!note If the AirSim installation is fresh, i.e, hasn't been built before, make sure that you run `build.cmd` from the root directory once before copying `Unreal\\Plugins` folder so that `AirLib` files are also included. If you have made some changes in the Blocks environment, make sure to run `update_to_git.bat` from `Unreal\\Environments\\Blocks` to update the files in `Unreal\\Plugins`. 
Edit the LandscapeMountains.uproject so that it looks like this json { \"FileVersion\": 3, \"EngineAssociation\": \"\", \"Category\": \"Samples\", \"Description\": \"\", \"Modules\": [ { \"Name\": \"LandscapeMountains\", \"Type\": \"Runtime\", \"LoadingPhase\": \"Default\", \"AdditionalDependencies\": [ \"AirSim\" ] } ], \"TargetPlatforms\": [ \"MacNoEditor\", \"WindowsNoEditor\" ], \"Plugins\": [ { \"Name\": \"AirSim\", \"Enabled\": true } ] } Edit the Config\\DefaultGame.ini to add the following lines at the end: +MapsToCook=(FilePath=\"/AirSim/AirSimAssets\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/HUDAssets\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Beacons\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Blueprints\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Models\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Sensors\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/StarterContent\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/VehicleAdv\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Weather\") Doing this forces Unreal to include all necessary AirSim content in packaged builds of your project. If using Unreal Engine 5.3/5.4 check here for a fix to the camera scene rendering bug in these engine versions! Close Visual Studio and the Unreal Editor and right-click the LandscapeMountains.uproject in Windows Explorer and select Generate Visual Studio Project Files . This step detects all plugins and source files in your Unreal project and generates .sln file for Visual Studio. !!!tip If the `Generate Visual Studio Project Files` option is missing you may need to reboot your machine for the Unreal Shell extensions to take effect. If it is still missing then open the LandscapeMountains.uproject in the Unreal Editor and select `Refresh Visual Studio Project` from the `File` menu. Reopen LandscapeMountains.sln in Visual Studio, and make sure \"DebugGame Editor\" and \"Win64\" build configuration is the active build configuration. Press F5 to run . This will start the Unreal Editor. The Unreal Editor allows you to edit the environment, assets and other game related settings. First thing you want to do in your environment is set up PlayerStart object. In Landscape Mountains environment, PlayerStart object already exist, and you can find it in the World Outliner . Make sure its location is set up as shown. This is where AirSim plugin will create and place the vehicle. If its too high up then vehicle will fall down as soon as you press play giving potentially random behavior In Window/World Settings as shown below, set the GameMode Override to AirSimGameMode : Go to 'Edit->Editor Preferences' in Unreal Editor, in the 'Search' box type 'CPU' and ensure that the 'Use Less CPU when in Background' is unchecked. If you don't do this then UE will be slowed down dramatically when UE window loses focus. Be sure to Save these edits. Hit the Play button in the Unreal Editor. See the other documentation pages for how to use it. Congratulations! You are now running AirSim in your own Unreal environment.","title":"Step-by-Step Instructions when using Cosys-AirSim from Source Build"},{"location":"unreal_custenv/#updating-your-environment-to-latest-version-of-airsim","text":"Once you have your environment using above instructions, you should frequently update your local AirSim code to latest version from GitHub. Below are the instructions to do this: First put clean.bat (or clean.sh for Linux users) in the root folder of your environment. Run this file to clean up all intermediate files in your Unreal project. 
Do git pull in your AirSim repo followed by build.cmd (or ./build.sh for Linux users). Replace [your project]/Plugins folder with AirSim/Unreal/Plugins folder. Right-click on your .uproject file and chose \"Generate Visual Studio project files\" option. This is not required for Linux.","title":"Updating Your Environment to Latest Version of AirSim"},{"location":"unreal_custenv/#choosing-your-vehicle-car-or-multirotor","text":"By default, AirSim prompts user for which vehicle to use. You can easily change this by setting SimMode . Please see using car guide.","title":"Choosing Your Vehicle: Car or Multirotor"},{"location":"unreal_custenv/#unreal-5354-scene-camera-bug","text":"Note that Unreal 5.3 and 5.4 breaks camera scene rendering when Effects is not set to the Epic scalability preset. You can use the console command r.DetailMode 2 to fix this at runtime! For the Blocks and other available environments we have made a fix for this. By placing a DefaultScalability.ini file in the Config folder of your Unreal project, you can set the scalability settings to custom values for each one (low, medium, high, epic, cine). As you can see in the Blocks environment, we have added the following to it to fix this bug automatically. You can find the DefaultScalability.ini file in the Unreal/Environments/Blocks folder. Copy this file to your Unreal project's Config folder. [EffectsQuality@0] r.DetailMode=2 [EffectsQuality@1] r.DetailMode=2 [EffectsQuality@2] r.DetailMode=2 [EffectsQuality@3] r.DetailMode=2 [EffectsQuality@Cine] r.DetailMode=2","title":"Unreal 5.3/5.4 Scene camera bug"},{"location":"unreal_custenv/#faq","text":"","title":"FAQ"},{"location":"unreal_custenv/#what-are-other-cool-environments","text":"Unreal Marketplace has dozens of prebuilt extraordinarily detailed environments ranging from Moon to Mars and everything in between. The one we have used for testing is called Modular Neighborhood Pack but you can use any environment. Another free environment is Infinity Blade series . Alternatively, if you look under the Learn tab in Epic Game Launcher, you will find many free samples that you can use. One of our favorites is \"A Boy and His Kite\" which is 100 square miles of highly detailed environment (caution: you will need very beefy PC to run it!).","title":"What are other cool environments?"},{"location":"unreal_custenv/#when-i-press-play-button-some-kind-of-video-starts-instead-of-my-vehicle","text":"If the environment comes with MatineeActor, delete it to avoid any startup demo sequences. There might be other ways to remove it as well, for example, click on Blueprints button, then Level Blueprint and then look at Begin Play event in Event Graph. You might want to disconnect any connections that may be starting \"matinee\".","title":"When I press Play button some kind of video starts instead of my vehicle."},{"location":"unreal_custenv/#is-there-easy-way-to-sync-code-in-my-unreal-project-with-code-in-airsim-repo","text":"Sure, there is! You can find a bunch of .bat files (for linux, .sh ) in AirSim\\Unreal\\Environments\\Blocks . Just copy them over to your own Unreal project. Most of these are quite simple and self-explanatory.","title":"Is there easy way to sync code in my Unreal project with code in AirSim repo?"},{"location":"unreal_custenv/#i-get-some-error-about-map","text":"You might have to set default map for your project. 
For example, if you are using the Modular Neighborhood Pack, set the Editor Starter Map as well as the Game Default Map to Demo_Map in Project Settings > Maps & Modes.","title":"I get some error about map."},{"location":"unreal_custenv/#i-see-add-to-project-option-for-environment-but-not-create-project-option","text":"In this case, create a new blank C++ project with no Starter Content and add your environment into it.","title":"I see \"Add to project\" option for environment but not \"Create project\" option."},{"location":"unreal_custenv/#i-already-have-my-own-unreal-project-how-do-i-use-airsim-with-it","text":"Copy the Unreal\Plugins folder from the build you did in the above section into the root of your Unreal project's folder. In your Unreal project's .uproject file, add the key AdditionalDependencies to the \"Modules\" object as we showed in the LandscapeMountains.uproject above. \"AdditionalDependencies\": [ \"AirSim\" ] and the Plugins section to the top level object: \"Plugins\": [ { \"Name\": \"AirSim\", \"Enabled\": true } ]","title":"I already have my own Unreal project. How do I use AirSim with it?"},{"location":"unreal_proj/","text":"Unreal Environment Setting Up the Unreal Project Option 1: Built-in Blocks Environment To get up and running fast, you can use the Blocks project that already comes with Cosys-AirSim. This is not a very highly detailed environment (to keep the repo size reasonable), but we use it for various testing all the time, and it is the easiest way to get your feet wet in this strange land. Follow these quick steps . Option 2: Create Your Own Unreal Environment If you want to set up photo-realistic, high-quality environments, then you will need to create your own Unreal project. This is a little bit more involved, but worthwhile! Follow this step-by-step guide .","title":"Setting up Unreal Environment"},{"location":"unreal_proj/#unreal-environment","text":"","title":"Unreal Environment"},{"location":"unreal_proj/#setting-up-the-unreal-project","text":"","title":"Setting Up the Unreal Project"},{"location":"unreal_proj/#option-1-built-in-blocks-environment","text":"To get up and running fast, you can use the Blocks project that already comes with Cosys-AirSim. This is not a very highly detailed environment (to keep the repo size reasonable), but we use it for various testing all the time, and it is the easiest way to get your feet wet in this strange land. Follow these quick steps .","title":"Option 1: Built-in Blocks Environment"},{"location":"unreal_proj/#option-2-create-your-own-unreal-environment","text":"If you want to set up photo-realistic, high-quality environments, then you will need to create your own Unreal project. This is a little bit more involved, but worthwhile! Follow this step-by-step guide .","title":"Option 2: Create Your Own Unreal Environment"},{"location":"using_car/","text":"How to Use Car in Cosys-AirSim By default, Cosys-AirSim prompts the user for which vehicle to use. You can easily change this by setting SimMode . For example, if you want to use the car instead, then just set the SimMode in your settings.json , which you can find in your ~/Documents/AirSim folder, like this: { \"SettingsVersion\": 2.0, \"SimMode\": \"Car\" } Now when you restart Cosys-AirSim, you should see the car spawned automatically. Manual Driving Please use the keyboard arrow keys to drive manually. Spacebar for the handbrake. In manual drive mode, gears are set in \"auto\". 
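As a minimal sketch of the API route described in the Using APIs section that follows, here is a short Python example in the spirit of the standard AirSim car examples: it takes API control, drives forward briefly, and reads the state back. The exact client behavior should be verified against the APIs doc.

```python
import time
import cosysairsim as airsim

# Connect to a running simulator in Car SimMode and take API control.
client = airsim.CarClient()
client.confirmConnection()
client.enableApiControl(True)

# Drive forward gently for a couple of seconds.
controls = airsim.CarControls()
controls.throttle = 0.5
client.setCarControls(controls)
time.sleep(2)

# Read the vehicle state back.
state = client.getCarState()
print(f"speed: {state.speed:.2f} m/s, gear: {state.gear}")

client.enableApiControl(False)  # hand control back to the keyboard
```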
Using APIs You can control the car, get state and images by calling APIs in a variety of client languages including C++ and Python. Please see the APIs doc for more details. Changing Views By default, the camera will chase the car from the back. You can get the FPV view by pressing the F key and switch back to the chase-from-back view by pressing the / key. More keyboard shortcuts can be seen by pressing F1. Cameras By default, the car is installed with 5 cameras: center, left and right, driver and reverse. You can choose the images from these cameras by specifying the name .","title":"Car Mode"},{"location":"using_car/#how-to-use-car-in-cosys-airsim","text":"By default, Cosys-AirSim prompts the user for which vehicle to use. You can easily change this by setting SimMode . For example, if you want to use the car instead, then just set the SimMode in your settings.json , which you can find in your ~/Documents/AirSim folder, like this: { \"SettingsVersion\": 2.0, \"SimMode\": \"Car\" } Now when you restart Cosys-AirSim, you should see the car spawned automatically.","title":"How to Use Car in Cosys-AirSim"},{"location":"using_car/#manual-driving","text":"Please use the keyboard arrow keys to drive manually. Spacebar for the handbrake. In manual drive mode, gears are set in \"auto\".","title":"Manual Driving"},{"location":"using_car/#using-apis","text":"You can control the car, get state and images by calling APIs in a variety of client languages including C++ and Python. Please see the APIs doc for more details.","title":"Using APIs"},{"location":"using_car/#changing-views","text":"By default, the camera will chase the car from the back. You can get the FPV view by pressing the F key and switch back to the chase-from-back view by pressing the / key. More keyboard shortcuts can be seen by pressing F1.","title":"Changing Views"},{"location":"using_car/#cameras","text":"By default, the car is installed with 5 cameras: center, left and right, driver and reverse. You can choose the images from these cameras by specifying the name .","title":"Cameras"},{"location":"voxel_grid/","text":"AirSim provides a feature that constructs ground truth voxel grids of the world directly from Unreal Engine. A voxel grid is a representation of the occupancy of a given world/map, obtained by discretizing it into cells of a certain size and recording a voxel if that particular location is occupied. The logic for constructing the voxel grid is in WorldSimApi.cpp->createVoxelGrid(). For now, the assumption is that the voxel grid is a cube - and the API call from Python is of the structure: simCreateVoxelGrid(self, position, x, y, z, res, of) position (Vector3r): Global position around which the voxel grid is centered in m x, y, z (float): Size of each voxel grid dimension in m res (float): Resolution of the voxel grid in m of (str): Name of the output file to save the voxel grid as Within createVoxelGrid() , the main Unreal Engine function that returns occupancy is OverlapBlockingTestByChannel . OverlapBlockingTestByChannel(position, rotation, ECollisionChannel, FCollisionShape, params); This function is called on the positions of all the 'cells' we wish to discretize the map into, and the returned occupancy result is collected into an array voxel_grid_ . The indexing of the cell occupancy values follows the convention of the binvox format. 
for (float i = 0; i < ncells_x; i++) { for (float k = 0; k < ncells_z; k++) { for (float j = 0; j < ncells_y; j++) { int idx = i + ncells_x * (k + ncells_z * j); FVector position = FVector((i - ncells_x /2) * scale_cm, (j - ncells_y /2) * scale_cm, (k - ncells_z /2) * scale_cm) + position_in_UE_frame; voxel_grid_[idx] = simmode_->GetWorld()->OverlapBlockingTestByChannel(position, FQuat::Identity, ECollisionChannel::ECC_Pawn, FCollisionShape::MakeBox(FVector(scale_cm /2)), params); } } } The occupancy of the map is calculated iteratively over all discretized cells, which can make it an intensive operation depending on the resolution of the cells and the total size of the area being measured. If the user's map of interest does not change much, it is possible to run the voxel grid operation once on this map, save the voxel grid, and reuse it. For performance, or with dynamic environments, we recommend running the voxel grid generation for a small area around the robot, and subsequently using it for local planning purposes. The voxel grids are stored in the binvox format, which can then be converted by the user into an octomap .bt or any other relevant, desired format. Subsequently, these voxel grids/octomaps can be used within mapping/planning. One nifty little utility to visualize created binvox files is viewvox . Similarly, binvox2bt can convert the binvox to an octomap file. Example voxel grid in Blocks: Blocks voxel grid converted to Octomap format (visualized in rviz): As an example, a voxel grid can be constructed as follows, once the Blocks environment is up and running: import os import cosysairsim as airsim c = airsim.VehicleClient() center = airsim.Vector3r(0, 0, 0) output_path = os.path.join(os.getcwd(), \"map.binvox\") c.simCreateVoxelGrid(center, 100, 100, 100, 0.5, output_path) And visualized through viewvox map.binvox .","title":"Voxel Grid Generator"},{"location":"voxel_grid/#example-voxel-grid-in-blocks","text":"","title":"Example voxel grid in Blocks:"},{"location":"voxel_grid/#blocks-voxel-grid-converted-to-octomap-format-visualized-in-rviz","text":"As an example, a voxel grid can be constructed as follows, once the Blocks environment is up and running: import os import cosysairsim as airsim c = airsim.VehicleClient() center = airsim.Vector3r(0, 0, 0) output_path = os.path.join(os.getcwd(), \"map.binvox\") c.simCreateVoxelGrid(center, 100, 100, 100, 0.5, output_path) And visualized through viewvox map.binvox .","title":"Blocks voxel grid converted to Octomap format (visualized in rviz):"},{"location":"working_with_plugin_contents/","text":"How to use plugin contents Plugin contents are not shown in Unreal projects by default. To view plugin content, you need to click on a few semi-hidden buttons: Caution Changes you make in the content folder are changes to binary files, so be careful.","title":"Working with UE Plugin Contents"},{"location":"working_with_plugin_contents/#how-to-use-plugin-contents","text":"Plugin contents are not shown in Unreal projects by default. To view plugin content, you need to click on a few semi-hidden buttons: Caution Changes you make in the content folder are changes to binary files, so be careful.","title":"How to use plugin contents"},{"location":"xbox_controller/","text":"XBox Controller To use an XBox controller with AirSim, follow these steps: Connect the XBox controller so it shows up in your PC Game Controllers: Launch QGroundControl and you should see a new Joystick tab under settings: Now calibrate the radio, and set up some handy button actions. 
For example, I set mine so that the 'A' button arms the drone, 'B' puts it in manual flight mode, 'X' puts it in altitude hold mode and 'Y' puts it in position hold mode. I also prefer the feel of the controller when I check the box labelled \"Use exponential curve on roll,pitch, yaw\" because this gives me more sensitivity for small movements. QGroundControl will find your Pixhawk via the UDP proxy port 14550 set up by MavLinkTest above. AirSim will find your Pixhawk via the other UDP server port 14570 also set up by MavLinkTest above. You can also use all the QGroundControl controls for autonomous flying at this point. Connect to the Pixhawk serial port using MavLinkTest.exe like this: MavLinkTest.exe -serial:*,115200 -proxy:127.0.0.1:14550 -server:127.0.0.1:14570 Run the AirSim Unreal simulator with these ~/Documents/AirSim/settings.json settings: \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\", \"SitlIp\": \"\", \"SitlPort\": 14560, \"UdpIp\": \"127.0.0.1\", \"UdpPort\": 14570, \"UseSerial\": false } } Advanced If the Joystick tab doesn't show up in QGroundControl then click on the purple \"Q\" icon on the left in the toolbar to reveal the Preferences panel. Go to the General tab and check the Virtual Joystick checkbox. Go back to the settings screen (gears icon), click on the Parameters tab, type COM_RC_IN_MODE in the search box and change its value to either Joystick/No RC Checks or Virtual RC by Joystick . Other Options See remote controller options","title":"XBox Controller"},{"location":"xbox_controller/#xbox-controller","text":"To use an XBox controller with AirSim follow these steps: Connect XBox controller so it shows up in your PC Game Controllers: Launch QGroundControl and you should see a new Joystick tab under settings: Now calibrate the radio, and set up some handy button actions. For example, I set mine so that the 'A' button arms the drone, 'B' puts it in manual flight mode, 'X' puts it in altitude hold mode and 'Y' puts it in position hold mode. I also prefer the feel of the controller when I check the box labelled \"Use exponential curve on roll,pitch, yaw\" because this gives me more sensitivity for small movements. QGroundControl will find your Pixhawk via the UDP proxy port 14550 set up by MavLinkTest above. AirSim will find your Pixhawk via the other UDP server port 14570 also set up by MavLinkTest above. You can also use all the QGroundControl controls for autonomous flying at this point. Connect to the Pixhawk serial port using MavLinkTest.exe like this: MavLinkTest.exe -serial:*,115200 -proxy:127.0.0.1:14550 -server:127.0.0.1:14570 Run the AirSim Unreal simulator with these ~/Documents/AirSim/settings.json settings: \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\", \"SitlIp\": \"\", \"SitlPort\": 14560, \"UdpIp\": \"127.0.0.1\", \"UdpPort\": 14570, \"UseSerial\": false } }","title":"XBox Controller"},{"location":"xbox_controller/#advanced","text":"If the Joystick tab doesn't show up in QGroundControl then click on the purple \"Q\" icon on the left in the toolbar to reveal the Preferences panel. Go to the General tab and check the Virtual Joystick checkbox. Go back to the settings screen (gears icon), click on the Parameters tab, type COM_RC_IN_MODE in the search box and change its value to either Joystick/No RC Checks or Virtual RC by Joystick .","title":"Advanced"},{"location":"xbox_controller/#other-options","text":"See remote controller options","title":"Other Options"}]}
\ No newline at end of file
+{"config":{"indexing":"full","lang":["en"],"min_search_length":3,"prebuild_index":false,"separator":"[\\s\\-]+"},"docs":[{"location":"","text":"Cosys-AirSim Cosys-AirSim is a simulator for drones, cars and more, with extensive API support, built on Unreal Engine . It is open-source, cross platform, and supports hardware-in-loop with popular flight controllers such as PX4 for physically and visually realistic simulations. It is developed as an Unreal plugin that can simply be dropped into any Unreal environment. This fork is based on last public AirSim release from Microsoft's GitHub. Cosys-Lab made extensive modifications to the AirSim platform to support multiple projects and research goals. Please contact a Cosys-Lab researcher to get more in depth information on our work or if you wish to collaborate. The original AirSim MIT license applies to all native AirSim source files. Please note that we use that same MIT license as which applies to all changes made by Cosys-Lab in case you plan to do anything within this repository. Do note that this repository is provided as is, will not be actively updated and comes without warranty or support. Please contact a Cosys-Lab researcher to get more in depth information on which branch or version is best for your work. Associated publications Cosys-AirSim: A Real-Time Simulation Framework Expanded for Complex Industrial Applications @inproceedings{cosysairsim2023jansen, author={Jansen, Wouter and Verreycken, Erik and Schenck, Anthony and Blanquart, Jean-Edouard and Verhulst, Connor and Huebel, Nico and Steckel, Jan}, booktitle={2023 Annual Modeling and Simulation Conference (ANNSIM)}, title={COSYS-AIRSIM: A Real-Time Simulation Framework Expanded for Complex Industrial Applications}, year={2023}, volume={}, number={}, pages={37-48}, doi={}} You can also find the presentation of the live tutorial of Cosys-AirSim at ANNSIM '23 conference here together with the associated videos. Physical LiDAR Simulation in Real-Time Engine @inproceedings{lidarsim2022jansen, author={Jansen, Wouter and Huebel, Nico and Steckel, Jan}, booktitle={2022 IEEE Sensors}, title={Physical LiDAR Simulation in Real-Time Engine}, year={2022}, volume={}, number={}, pages={1-4}, doi={10.1109/SENSORS52175.2022.9967197}} } Simulation of Pulse-Echo Radar for Vehicle Control and SLAM @Article{echosim2021schouten, author={Schouten, Girmi and Jansen, Wouter and Steckel, Jan}, title={Simulation of Pulse-Echo Radar for Vehicle Control and SLAM}, JOURNAL={Sensors}, volume={21}, year={2021}, number={2}, article-number={523}, doi={10.3390/s21020523} } Cosys-Lab Modifications Added support for Unreal up to 5.4 ( Note that Unreal 5.3/5.4 breaks camera scene rendering by default in custom environments ) Added multi-layer annotation for groundtruth label generation with RGB, greyscale and texture options. Extensive API integration and available for camera and GPU-LiDAR sensors. Added Instance Segmentation . Added Echo sensor type for simulation of sensors like sonar and radar. Added GPU LIDAR sensor type : Uses GPU acceleration to simulate a LiDAR sensor. Can support much higher point density then normal LiDAR and behaves more authentic and has realistic intensity generation. Added skid steering SimMode and vehicle type . ClearPath Husky and Pioneer P3DX implemented as vehicle types using this new vehicle model. Added Matlab API Client implementation as an easy to install Matlab toolbox. Added various random but deterministic dynamic object types and world configuration options . 
Added BoxCar vehicle model to the Car SimMode to have a smaller vehicle to use in indoor spaces. Updated ComputerVision mode : Now has full API and Simulation just like other vehicle types. It mostly means it can now have sensors attached (outside of IMU). Improved handling and camera operation. Updated LIDAR sensor type : Fixed not tracing correctly, added ground truth (point labels) generation, added range-noise generation. Improved API pointcloud delivery to be full scan instead of being frame-rate dependent and partial. Updated the camera, Echo and (GPU-)LiDAR sensors to be uncoupled from the vehicle and be placed as external world sensors. Updated sensors like cameras, Echo sensor and GPU-LiDAR to ignore certain objects with the MarkedIgnore Unreal tag when enabling the \"IgnoreMarked\" setting in the settings file . Updated the camera sensor with more distortion features such as chromatic aberration, motion blur and lens distortion. Updated Python ROS implementation with completely new implementation and feature set. Updated C++ ROS2 implementation to support custom Cosys-AirSim features. Dropped support for Unity Environments. Some more details on our changes can be found in the changelog . How to Get It Download and install from precompiled plugin - Windows/Linux Download and install it Install and use from source - Windows Install/Build it Install and use from source - Linux Install/Build it How to Use It Documentation View our detailed documentation on all aspects of Cosys-AirSim. Original AirSim Paper More technical details are available in the AirSim paper (FSR 2017 Conference) . Please cite this as: @inproceedings{airsim2017fsr, author = {Shital Shah and Debadeepta Dey and Chris Lovett and Ashish Kapoor}, title = {AirSim: High-Fidelity Visual and Physical Simulation for Autonomous Vehicles}, year = {2017}, booktitle = {Field and Service Robotics}, eprint = {arXiv:1705.05065}, url = {https://arxiv.org/abs/1705.05065} } License This project is released under the MIT License. Please review the License file for more details.","title":"Home"},{"location":"#cosys-airsim","text":"Cosys-AirSim is a simulator for drones, cars and more, with extensive API support, built on Unreal Engine . It is open-source, cross platform, and supports hardware-in-loop with popular flight controllers such as PX4 for physically and visually realistic simulations. It is developed as an Unreal plugin that can simply be dropped into any Unreal environment. This fork is based on the last public AirSim release from Microsoft's GitHub. Cosys-Lab made extensive modifications to the AirSim platform to support multiple projects and research goals. Please contact a Cosys-Lab researcher to get more in-depth information on our work or if you wish to collaborate. The original AirSim MIT license applies to all native AirSim source files. Please note that the same MIT license applies to all changes made by Cosys-Lab, in case you plan to do anything within this repository. Do note that this repository is provided as-is, will not be actively updated and comes without warranty or support.
Please contact a Cosys-Lab researcher to get more in-depth information on which branch or version is best for your work.","title":"Cosys-AirSim"},{"location":"#associated-publications","text":"Cosys-AirSim: A Real-Time Simulation Framework Expanded for Complex Industrial Applications @inproceedings{cosysairsim2023jansen, author={Jansen, Wouter and Verreycken, Erik and Schenck, Anthony and Blanquart, Jean-Edouard and Verhulst, Connor and Huebel, Nico and Steckel, Jan}, booktitle={2023 Annual Modeling and Simulation Conference (ANNSIM)}, title={COSYS-AIRSIM: A Real-Time Simulation Framework Expanded for Complex Industrial Applications}, year={2023}, volume={}, number={}, pages={37-48}, doi={}} You can also find the presentation of the live tutorial of Cosys-AirSim at the ANNSIM '23 conference here together with the associated videos. Physical LiDAR Simulation in Real-Time Engine @inproceedings{lidarsim2022jansen, author={Jansen, Wouter and Huebel, Nico and Steckel, Jan}, booktitle={2022 IEEE Sensors}, title={Physical LiDAR Simulation in Real-Time Engine}, year={2022}, volume={}, number={}, pages={1-4}, doi={10.1109/SENSORS52175.2022.9967197}} Simulation of Pulse-Echo Radar for Vehicle Control and SLAM @Article{echosim2021schouten, author={Schouten, Girmi and Jansen, Wouter and Steckel, Jan}, title={Simulation of Pulse-Echo Radar for Vehicle Control and SLAM}, JOURNAL={Sensors}, volume={21}, year={2021}, number={2}, article-number={523}, doi={10.3390/s21020523} }","title":"Associated publications"},{"location":"#cosys-lab-modifications","text":"Added support for Unreal up to 5.4 ( Note that Unreal 5.3/5.4 breaks camera scene rendering by default in custom environments ) Added multi-layer annotation for ground truth label generation with RGB, greyscale and texture options. Extensive API integration, available for camera and GPU-LiDAR sensors. Added Instance Segmentation . Added Echo sensor type for simulation of sensors like sonar and radar. Added GPU LIDAR sensor type : Uses GPU acceleration to simulate a LiDAR sensor. Can support much higher point density than normal LiDAR, behaves more authentically and has realistic intensity generation. Added skid steering SimMode and vehicle type . ClearPath Husky and Pioneer P3DX implemented as vehicle types using this new vehicle model. Added Matlab API Client implementation as an easy-to-install Matlab toolbox. Added various random but deterministic dynamic object types and world configuration options . Added BoxCar vehicle model to the Car SimMode to have a smaller vehicle to use in indoor spaces. Updated ComputerVision mode : Now has full API and Simulation just like other vehicle types. It mostly means it can now have sensors attached (outside of IMU). Improved handling and camera operation. Updated LIDAR sensor type : Fixed not tracing correctly, added ground truth (point labels) generation, added range-noise generation. Improved API pointcloud delivery to be full scan instead of being frame-rate dependent and partial. Updated the camera, Echo and (GPU-)LiDAR sensors to be uncoupled from the vehicle and be placed as external world sensors. Updated sensors like cameras, Echo sensor and GPU-LiDAR to ignore certain objects with the MarkedIgnore Unreal tag when enabling the \"IgnoreMarked\" setting in the settings file . Updated the camera sensor with more distortion features such as chromatic aberration, motion blur and lens distortion. Updated Python ROS implementation with completely new implementation and feature set.
Updated C++ ROS2 implementation to support custom Cosys-AirSim features. Dropped support for Unity Environments. Some more details on our changes can be found in the changelog .","title":"Cosys-Lab Modifications"},{"location":"#how-to-get-it","text":"","title":"How to Get It"},{"location":"#download-and-install-from-precompiled-plugin-windowslinux","text":"Download and install it","title":"Download and install from precompiled plugin - Windows/Linux"},{"location":"#install-and-use-from-source-windows","text":"Install/Build it","title":"Install and use from source - Windows"},{"location":"#install-and-use-from-source-linux","text":"Install/Build it","title":"Install and use from source - Linux"},{"location":"#how-to-use-it","text":"","title":"How to Use It"},{"location":"#documentation","text":"View our detailed documentation on all aspects of Cosys-AirSim.","title":"Documentation"},{"location":"#original-airsim-paper","text":"More technical details are available in the AirSim paper (FSR 2017 Conference) . Please cite this as: @inproceedings{airsim2017fsr, author = {Shital Shah and Debadeepta Dey and Chris Lovett and Ashish Kapoor}, title = {AirSim: High-Fidelity Visual and Physical Simulation for Autonomous Vehicles}, year = {2017}, booktitle = {Field and Service Robotics}, eprint = {arXiv:1705.05065}, url = {https://arxiv.org/abs/1705.05065} }","title":"Original AirSim Paper"},{"location":"#license","text":"This project is released under the MIT License. Please review the License file for more details.","title":"License"},{"location":"InfraredCamera/","text":"This is a tutorial for generating simulated thermal infrared (IR) images using Cosys-AirSim. To generate your own data, you may use two Python files: create_ir_segmentation_map.py and capture_ir_segmentation.py . create_ir_segmentation_map.py uses temperature, emissivity, and camera response information to estimate the thermal digital count that could be expected for the objects in the environment, and then reassigns the segmentation IDs in Cosys-AirSim to match these digital counts. It should be run before starting to capture thermal IR data. Otherwise, digital counts in the IR images will be incorrect. The camera response, temperature, and emissivity data are all included for the Africa environment. capture_ir_segmentation.py is run after the segmentation IDs have been reassigned. It tracks objects of interest and records the infrared and scene images from the multirotor. It uses Computer Vision mode. Finally, the details about how temperatures were estimated for plants and animals in the Africa environment, etc. can be found in this paper: @inproceedings{bondi2018airsim, title={AirSim-W: A Simulation Environment for Wildlife Conservation with UAVs}, author={Bondi, Elizabeth and Dey, Debadeepta and Kapoor, Ashish and Piavis, Jim and Shah, Shital and Fang, Fei and Dilkina, Bistra and Hannaford, Robert and Iyer, Arvind and Joppa, Lucas and others}, booktitle={Proceedings of the 1st ACM SIGCAS Conference on Computing and Sustainable Societies}, pages={40}, year={2018}, organization={ACM} }","title":"Infrared Camera"},{"location":"adding_new_apis/","text":"Adding New APIs to AirSim Adding new APIs requires modifying the source code. Many of the changes are mechanical and required for the various levels of abstraction that AirSim supports. The main files required to be modified are described below along with some commits and PRs for demonstration.
Specific sections of the PRs or commits might be linked in some places, but it'll be helpful to have a look at the entire diff to get a better sense of the workflow. Also, don't hesitate to open an issue or a draft PR if you're unsure how to go about making changes or want feedback. Implementing the API Before adding the wrapper code to call and handle the API, it needs to be implemented first. The exact files where this will occur vary depending on what it does. A few examples are given below which might help you get started. Vehicle-based APIs moveByVelocityBodyFrameAsync API for velocity-based movement in the multirotor's X-Y frame. The main implementation is done in MultirotorBaseApi.cpp , where most of the multirotor APIs are implemented. In some cases, additional structures might be needed for storing data; the getRotorStates API is a good example for this, here the RotorStates struct is defined in 2 places for conversion from RPC to internal code. It also requires modifications in AirLib as well as Unreal/Plugins for the implementation. Environment-related APIs These APIs need to interact with the simulation environment itself, hence it's likely that it'll be implemented inside the Unreal/Plugins folder. simCreateVoxelGrid API to generate and save a binvox-formatted grid of the environment - WorldSimApi.cpp simAddVehicle API to create vehicles at runtime - SimMode*, WorldSimApi files Physics-related APIs simSetWind API shows an example of modifying the physics behaviour and adding an API + settings field for the same. See the PR for details about the code. RPC Wrappers The APIs use the msgpack-rpc protocol over TCP/IP through rpclib developed by Tamás Szelei which allows you to use a variety of programming languages including C++, C#, Python, Java etc. When AirSim starts, it opens port 41451 (this can be changed via settings ) and listens for incoming requests. The Python or C++ client code connects to this port and sends RPC calls using the msgpack serialization format . To add the RPC code to call the new API, follow the steps below. Follow the implementation of other APIs defined in the files. Add an RPC handler in the server which calls your implemented method in RpcLibServerBase.cpp . Vehicle-specific APIs are in their respective vehicle subfolder. Add the C++ client API method in RpcClientBase.cpp Add the Python client API method in client.py . If needed, add or modify a structure definition in types.py Testing Testing is required to ensure that the API is working as expected. For this, you'll have to use the source-built AirSim and Blocks environment. Apart from this, if using the Python APIs, you'll have to use the airsim package from source rather than the PyPI package. Two ways to use the package from source are described below - Use setup_path.py . It will set up the path such that the local airsim module is used instead of the pip-installed package. This is the method used in many of the scripts since the user doesn't need to do anything other than run the script. Place your example script in one of the folders inside PythonClient like multirotor , car , etc. You can also create one to keep things separate, and copy the setup_path.py file from another folder. Add import setup_path before import cosysairsim as airsim in your files. Now the latest main API (or any branch currently checked out) will be used. Use a local project pip install .
A regular install creates a copy of the current source and uses it, whereas an editable install ( pip install -e . from inside the PythonClient folder) picks up changes whenever the Python API files are changed. An editable install is beneficial when working on several branches or when the API is not finalized. It is recommended to use a virtual environment for dealing with Python packaging so as to not break any existing setup.","title":"Adding new APIs"},{"location":"adding_new_apis/#adding-new-apis-to-airsim","text":"Adding new APIs requires modifying the source code. Many of the changes are mechanical and required for the various levels of abstraction that AirSim supports. The main files required to be modified are described below along with some commits and PRs for demonstration. Specific sections of the PRs or commits might be linked in some places, but it'll be helpful to have a look at the entire diff to get a better sense of the workflow. Also, don't hesitate to open an issue or a draft PR if you're unsure how to go about making changes or want feedback.","title":"Adding New APIs to AirSim"},{"location":"adding_new_apis/#implementing-the-api","text":"Before adding the wrapper code to call and handle the API, it needs to be implemented first. The exact files where this will occur vary depending on what it does. A few examples are given below which might help you get started.","title":"Implementing the API"},{"location":"adding_new_apis/#vehicle-based-apis","text":"moveByVelocityBodyFrameAsync API for velocity-based movement in the multirotor's X-Y frame. The main implementation is done in MultirotorBaseApi.cpp , where most of the multirotor APIs are implemented. In some cases, additional structures might be needed for storing data; the getRotorStates API is a good example for this, here the RotorStates struct is defined in 2 places for conversion from RPC to internal code. It also requires modifications in AirLib as well as Unreal/Plugins for the implementation.","title":"Vehicle-based APIs"},{"location":"adding_new_apis/#environment-related-apis","text":"These APIs need to interact with the simulation environment itself, hence it's likely that it'll be implemented inside the Unreal/Plugins folder. simCreateVoxelGrid API to generate and save a binvox-formatted grid of the environment - WorldSimApi.cpp simAddVehicle API to create vehicles at runtime - SimMode*, WorldSimApi files","title":"Environment-related APIs"},{"location":"adding_new_apis/#physics-related-apis","text":"simSetWind API shows an example of modifying the physics behaviour and adding an API + settings field for the same. See the PR for details about the code.","title":"Physics-related APIs"},{"location":"adding_new_apis/#rpc-wrappers","text":"The APIs use the msgpack-rpc protocol over TCP/IP through rpclib developed by Tamás Szelei which allows you to use a variety of programming languages including C++, C#, Python, Java etc. When AirSim starts, it opens port 41451 (this can be changed via settings ) and listens for incoming requests. The Python or C++ client code connects to this port and sends RPC calls using the msgpack serialization format . To add the RPC code to call the new API, follow the steps below. Follow the implementation of other APIs defined in the files. Add an RPC handler in the server which calls your implemented method in RpcLibServerBase.cpp . Vehicle-specific APIs are in their respective vehicle subfolder.
Add the C++ client API method in RpcClientBase.cpp Add the Python client API method in client.py . If needed, add or modify a structure definition in types.py","title":"RPC Wrappers"},{"location":"adding_new_apis/#testing","text":"Testing is required to ensure that the API is working as expected. For this, you'll have to use the source-built AirSim and Blocks environment. Apart from this, if using the Python APIs, you'll have to use the airsim package from source rather than the PyPI package. Two ways to use the package from source are described below - Use setup_path.py . It will set up the path such that the local airsim module is used instead of the pip-installed package. This is the method used in many of the scripts since the user doesn't need to do anything other than run the script. Place your example script in one of the folders inside PythonClient like multirotor , car , etc. You can also create one to keep things separate, and copy the setup_path.py file from another folder. Add import setup_path before import cosysairsim as airsim in your files. Now the latest main API (or any branch currently checked out) will be used. Use a local project pip install . A regular install creates a copy of the current source and uses it, whereas an editable install ( pip install -e . from inside the PythonClient folder) picks up changes whenever the Python API files are changed. An editable install is beneficial when working on several branches or when the API is not finalized. It is recommended to use a virtual environment for dealing with Python packaging so as to not break any existing setup.","title":"Testing"},{"location":"annotation/","text":"Annotation in Cosys-AirSim A multi-layer annotation system is implemented into Cosys-AirSim. It uses Proxy Mesh rendering to allow for each object in the world to be annotated by a greyscale value, an RGB color or a texture that fits the mesh. An annotation layer allows the user to tag individual actors and/or their child-components with a certain annotation component. This can be used to create ground truth data for machine learning models or to create a visual representation of the environment. Let's say you want to train a model to detect cars or pedestrians: you create an RGB annotation layer where you can tag all the cars and pedestrians in the environment with a certain RGB color respectively. Through the API you can then get the image of this RGB annotation layer (GPU LiDAR is also supported next to cameras). Or if you want to assign a ripeness value to all the apples in your environment, you can create a greyscale annotation layer where you can tag all the apples with a certain greyscale value between 0 and 1. Similarly, you can also load a texture to a specific mesh component only visible in the annotation layer. For example when trying to show where defects are in a mesh. The annotation system uses actor and/or component tags to set these values for the 3 modes (greyscale, RGB, texture). You can add these manually or use the APIs (RPC API, Unreal Blueprint, Unreal c++). Limitations 2744000 different RGB colors are currently available to be assigned to unique objects. If your environment during a run requires more colors, you will generate errors and new objects will be assigned color [0,0,0]. Only static and skeletal meshes are supported. Landscape objects aren't supported. This is the special object type in Unreal to make terrain with. As a work-around, StaticMesh terrain must be used. Foliage objects aren't supported.
This is the special object type in Unreal to place trees, grass and other plants that move with the wind. As a work-around, StaticMesh objects must be used. Brush objects aren't supported. This is a special object type in Unreal to create your own meshes with. As a work-around, you can convert them to a StaticMesh. These and other less common unsupported object types either will not be rendered (decals, text, foliage, ...) or will by default be given the RGB color value of [149,149,149] or [0,0,0] (brush objects, landscape, ...). Usage Settings JSON definition of layers To use the annotation system, you need to set the annotation mode in the settings.json file. You can define as many as you want and use them simultaneously. You will always have to identify them by name. Here you define each layer with a name, the type and some other settings, often specific to the type. For example: { ... \"Annotation\": [ { \"Name\": \"RGBTestDirect\", \"Type\": 0, \"Default\": true, \"SetDirect\": true, \"ViewDistance\": 10 }, { \"Name\": \"RGBTestIndex\", \"Type\": 0, \"Default\": true, \"SetDirect\": false }, { \"Name\": \"GreyscaleTest\", \"Type\": 1, \"Default\": true, \"ViewDistance\": 5 }, { \"Name\": \"TextureTestDirect\", \"Type\": 2, \"Default\": true, \"SetDirect\": true }, { \"Name\": \"TextureTestRelativePath\", \"Type\": 2, \"Default\": false, \"SetDirect\": false, \"TexturePath\": \"/Game/AnnotationTest\", \"TexturePrefix\": \"Test1\" } ], ... } The types are: RGB = 0, Greyscale = 1, Texture = 2 The Default setting applies to all types and is what happens when no tag is set for an actor/component. When set to false, the mesh will not be rendered in the annotation layer. When set to true, the mesh will be rendered in the annotation layer with the default value of the layer. The ViewDistance setting applies to all types and allows you to set the maximum distance in meters at which the annotation layer is rendered. This only applies to the camera sensor output as for LiDAR you can set the maximum range distance of the sensor differently. This value is by default set to -1 which means infinite draw distance. Type 1: RGB Similar to instance segmentation , you can use the RGB annotation layer to tag objects in the environment with a unique color. You can do this by directly setting the color yourself (direct mode), or by assigning the object an index (0-2744000 unique colors) that will be linked to the colormap. To use direct mode, set the settings of this layer with SetDirect to true . For index mode, set to false . Actor/component tags have the following format: annotationName_R_G_B for direct mode or annotationName_ID for index mode. So if for example your RGB annotation layer is called RGBTestDirect , you can tag an actor with the tag RGBTestDirect_255_0_0 to give it a red color. Or for index mode, RGBTest_5 to give it the fifth color in the colormap. When Default is set to true, all objects without a tag for this layer will be rendered in black. The instance segmentation API function to get the colormap also applies to the RGB index mode. For example in Python you can use: colorMap = client.simGetSegmentationColorMap() Several RPC API functions are available to influence or retrieve the RGB annotation layer. Currently, it is not possible to use the RPC API to add new actors or components to the annotation system, you can only update their values.
For example in Python: simSetAnnotationObjectID(annotation_name, mesh_name, object_id, is_name_regex=False/True) to update the color of an object in index mode (regex allows setting multiple objects with wildcards, for example) when it already exists in the annotation system simSetAnnotationObjectColor(annotation_name, mesh_name, r, g, b, is_name_regex=False/True) to update the color of an object in direct mode (regex allows setting multiple objects with wildcards, for example) when it already exists in the annotation system simGetAnnotationObjectID(annotation_name, mesh_name) to get the ID of an object in index mode simGetAnnotationObjectColor(annotation_name, mesh_name) to get the color of an object in direct mode simIsValidColor(r,g,b) You can check if a color is valid using this function The same is available in Unreal Blueprint and Unreal c++. You can find the functions in the Annotation category. Add RGBDirect Annotation Tag to Component/Actor(annotation_name, component/actor, color, update_annotation=true/false) to set the color of an object in direct mode Update RGBDirect Annotation Tag to Component/Actor(annotation_name, component/actor, color, update_annotation=true/false) to update the color of an object in direct mode already in the system Add RGBIndex Annotation Tag to Component/Actor(annotation_name, component/actor, object_id, update_annotation=true/false) to set the index of an object in index mode Update RGBIndex Annotation Tag to Component/Actor(annotation_name, component/actor, object_id, update_annotation=true/false) to update the index of an object in index mode already in the system Is Annotation RGB Valid(color) You can check if a color is valid using this function Note that enabling update_annotation is a relatively slow process, especially on actors with lots of annotated components. Ideally set update_annotation to false during the process of adding tags to the actor and only turn on update_annotation for the last component or actor you want to update. Alternatively, you can use the Add New Actor To Annotation() blueprint function to update the annotation layer for this actor after you have added all tags. Type 2: Greyscale You can use the greyscale annotation layer to tag objects in the environment with a float value between 0 and 1. Note that this has the precision of uint8. Actor/component tags have the following format: annotationName_value . So if for example your greyscale annotation layer is called GreyscaleTest , you can tag an actor with the tag GreyscaleTest_0.76 to give it a value of 0.76 which would result in a color of (194, 194, 194). When Default is set to true, all objects without a tag for this layer will be rendered in black. Several RPC API functions are available to influence or retrieve the greyscale annotation layer. Currently, it is not possible to use the RPC API to add new actors or components to the annotation system, you can only update their values.
Add Greyscale Annotation Tag to Component/Actor(annotation_name, component/actor, value, update_annotation=true/false) to set the value of an object Update Greyscale Annotation Tag to Component/Actor(annotation_name, component/actor, value, update_annotation=true/false) to update the value of an object when it already exists in the annotation system Note that enabling update_annotation is a relatively slow process, especially on actors with lots of annotated components. Ideally set update_annotation to false during the process of adding tags to the actor and only turn on update_annotation for the last component or actor you want to update. Alternatively, you can use the Add New Actor To Annotation() blueprint function to update the annotation layer for this actor after you have added all tags. Type 3: Texture You can use the texture annotation layer to tag objects in the environment with a specific texture. This can be a color or greyscale texture, or you can mix them. The choice is up to you. You can do this by directly setting the texture yourself (direct mode), or by assigning a texture that is loaded based on a set path and the name of the mesh. To use direct mode, set the settings of this layer with SetDirect to true . For path reference mode, set to false . Actor/component tags have the following format: annotationName_texturepath for direct mode. The Unreal texture path name has to be rather specific: - If your texture is in the environment content folder, you must add /Game/ in front of the path. - If it is in the Cosys-AirSim plugin content folder, you must add /AirSim/ in front of the path. - For Engine textures, you must add /Engine/ in front of the path. So if for example your texture annotation layer is called TextureTestDirect , and your texture TestTexture is in the game content folder under a subfolder AnnotationTest you can tag an actor with the tag TextureTest_/Game/AnnotationTest/TestTexture to give it this texture. For path reference mode, the content of the tag is not really important as long as it contains the name of the annotation layer and an underscore, for example annotationName_enable . What is important in reference mode is that you have a texture in the content folder with the name of the mesh if you do enable this object by setting a tag. You must place your textures in the folder defined by the TexturePath setting in the settings.json file for this layer. And the texture must have the same name as the mesh and start with the prefix set by the TexturePrefix setting in the settings.json file for this layer followed by a hyphen. So for example if you have a static mesh called Cylinder and your texture layer is called TextureTestDirect with the settings TexturePath set to /Game/AnnotationTest and TexturePrefix set to Test1 , you must have a texture called Test1-Cylinder in the folder /Game/AnnotationTest . When Default is set to true, all objects without a tag for this layer will be rendered in black. Several RPC API functions are available to influence or retrieve the texture annotation layer. Currently, it is not possible to use the RPC API to add new actors or components to the annotation system, you can only update their values.
For example in Python: simSetAnnotationObjectTextureByPath(annotation_name, mesh_name, texture_path, is_name_regex=False/True) to set the texture of an object in direct mode, the texture path should be in the same format as described above, for example /Game/MyTextures/TestTexture1 (regex allows setting multiple objects with wildcards, for example) simEnableAnnotationObjectTextureByPath(annotation_name, mesh_name, r, g, b, is_name_regex=False/True) to enable the texture of an object in relative path mode, this does require a texture in the relative path as described above! (regex allows setting multiple objects with wildcards, for example) simGetAnnotationObjectTexturePath(annotation_name, mesh_name) to get the texture path of an object The same is available in Unreal Blueprint and Unreal c++. You can find the functions in the Annotation category. Add Texture Direct Annotation Tag to Component/Actor By Path(annotation_name, component/actor, texture_path, update_annotation=true/false) to set the texture of an object in direct mode, the texture path should be in the same format as described above, for example /Game/MyTextures/TestTexture1 Update Texture Direct Annotation Tag to Component/Actor By Path(annotation_name, component/actor, texture_path, update_annotation=true/false) to update the texture of an object in direct mode that is already in the system, the texture path should be in the same format as described above, for example /Game/MyTextures/TestTexture1 Add Texture Direct Annotation Tag to Component/Actor(annotation_name, component/actor, texture, update_annotation=true/false) to set the texture of an object in direct mode, the texture can be directly referenced as UTexture* Object Update Texture Direct Annotation Tag to Component/Actor(annotation_name, component/actor, texture, update_annotation=true/false) to update the texture of an object in direct mode that is already in the system, the texture can be directly referenced as UTexture* Object Enable Texture By Path Annotation Tag to Component/Actor(annotation_name, component/actor, update_annotation=true/false) to enable the texture of an object in relative path mode, this does require a texture in the relative path as described above! Note that enabling update_annotation is a relatively slow process, especially on actors with lots of annotated components. Ideally set update_annotation to false during the process of adding tags to the actor and only turn on update_annotation for the last component or actor you want to update. Alternatively, you can use the Add New Actor To Annotation() blueprint function to update the annotation layer for this actor after you have added all tags. Common functionality By default, when the world loads, all meshes are checked for tags and the annotation layers are updated accordingly. With the Unreal Blueprint and c++ functions, however, you can also decide to update the annotation layer only when you want to with the update_annotation argument. If you have many objects to update, this can save a lot of time by doing it only for the last object. Some API functions exist for all types, for example in Python: simListAnnotationObjects(annotation_name) to get a list of all objects within this annotation layer. simListAnnotationPoses(annotation_name, ned=True/False, only_visible=False/True) to get the 3D poses of all objects in this annotation layer. The returned pose is in NED coordinates in SI units with its origin at Player Start by default or in Unreal NED frame if the ned boolean argument is set to false .
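As a usage illustration of the two listing calls above, here is a minimal hedged Python sketch (the layer name GreyscaleTest comes from the settings example earlier; pairing names and poses by index is an assumption made for illustration):

import cosysairsim as airsim

client = airsim.VehicleClient()
client.confirmConnection()

# list all annotated objects in one layer and fetch their poses
objects = client.simListAnnotationObjects("GreyscaleTest")
poses = client.simListAnnotationPoses("GreyscaleTest", ned=True, only_visible=False)
for name, pose in zip(objects, poses):  # index pairing assumed for illustration
    print(name, pose)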
Similarly, for Unreal Blueprint and Unreal c++. You can find the functions in the Annotation category. Does Annotation Layer Exist(annotation_name) to figure out if a layer exists or not Add New Actor To Annotation(annotation_name, actor, update_annotation=true/false) if you manually added a tag, and want to update the annotation layer with this actor. This is useful to run after adding multiple tags to the actor and its components with the other api calls, and you want to update the annotation layer only once, otherwise it will be much slower. Delete Actor From Annotation(annotation_name, actor, update_annotation=true/false) if you manually remove all tags from an actor for this layer and remove it from the annotation layer Force Update Annotation(annotation_name) to force an update of the annotation layer. Getting annotation data from sensors The easiest way to get the images from annotation cameras is through the image API. See the Image API documentation for more information. GPU LiDAR is also supported, but each GPU Lidar can only render one annotation layer. See the GPU LiDAR documentation for more information. You can also display the annotation layers in the subwindows. See the Settings documentation for more information. For example: { ... \"SubWindows\": [ { \"WindowID\": 0, \"CameraName\": \"front_center\", \"ImageType\": 10, \"VehicleName\": \"robot1\", \"Annotation\": \"GreyscaleTest\", \"Visible\": false }, ... Credits The method of using Proxy meshes to segment objects is a derivative of, and inspired by, the work of UnrealCV . Their work is licensed under the MIT License. It is made by students from Johns Hopkins University and Peking University under the supervision of Prof. Alan Yuille and Prof. Yizhou Wang. You can read the paper on their work here .","title":"Annotation"},{"location":"annotation/#annotation-in-cosys-airsim","text":"A multi-layer annotation system is implemented into Cosys-AirSim. It uses Proxy Mesh rendering to allow for each object in the world to be annotated by a greyscale value, an RGB color or a texture that fits the mesh. An annotation layer allows the user to tag individual actors and/or their child-components with a certain annotation component. This can be used to create ground truth data for machine learning models or to create a visual representation of the environment. Let's say you want to train a model to detect cars or pedestrians: you create an RGB annotation layer where you can tag all the cars and pedestrians in the environment with a certain RGB color respectively. Through the API you can then get the image of this RGB annotation layer (GPU LiDAR is also supported next to cameras). Or if you want to assign a ripeness value to all the apples in your environment, you can create a greyscale annotation layer where you can tag all the apples with a certain greyscale value between 0 and 1. Similarly, you can also load a texture to a specific mesh component only visible in the annotation layer. For example when trying to show where defects are in a mesh. The annotation system uses actor and/or component tags to set these values for the 3 modes (greyscale, RGB, texture). You can add these manually or use the APIs (RPC API, Unreal Blueprint, Unreal c++).","title":"Annotation in Cosys-AirSim"},{"location":"annotation/#limitations","text":"2744000 different RGB colors are currently available to be assigned to unique objects.
If your environment during a run requires more colors, you will generate errors and new objects will be assigned color [0,0,0]. Only static and skeletal meshes are supported. Landscape objects aren't supported. This is the special object type in Unreal to make terrain with. As a work-around, StaticMesh terrain must be used. Foliage objects aren't supported. This is the special object type in Unreal to place trees, grass and other plants that move with the wind. As a work-around, StaticMesh objects must be used. Brush objects aren't supported. This is a special object type in Unreal to create your own meshes with. As a work-around, you can convert them to a StaticMesh. These and other less common unsupported object types either will not be rendered (decals, text, foliage, ...) or will by default be given the RGB color value of [149,149,149] or [0,0,0] (brush objects, landscape, ...).","title":"Limitations"},{"location":"annotation/#usage","text":"","title":"Usage"},{"location":"annotation/#settings-json-definition-of-layers","text":"To use the annotation system, you need to set the annotation mode in the settings.json file. You can define as many as you want and use them simultaneously. You will always have to identify them by name. Here you define each layer with a name, the type and some other settings, often specific to the type. For example: { ... \"Annotation\": [ { \"Name\": \"RGBTestDirect\", \"Type\": 0, \"Default\": true, \"SetDirect\": true, \"ViewDistance\": 10 }, { \"Name\": \"RGBTestIndex\", \"Type\": 0, \"Default\": true, \"SetDirect\": false }, { \"Name\": \"GreyscaleTest\", \"Type\": 1, \"Default\": true, \"ViewDistance\": 5 }, { \"Name\": \"TextureTestDirect\", \"Type\": 2, \"Default\": true, \"SetDirect\": true }, { \"Name\": \"TextureTestRelativePath\", \"Type\": 2, \"Default\": false, \"SetDirect\": false, \"TexturePath\": \"/Game/AnnotationTest\", \"TexturePrefix\": \"Test1\" } ], ... } The types are: RGB = 0, Greyscale = 1, Texture = 2 The Default setting applies to all types and is what happens when no tag is set for an actor/component. When set to false, the mesh will not be rendered in the annotation layer. When set to true, the mesh will be rendered in the annotation layer with the default value of the layer. The ViewDistance setting applies to all types and allows you to set the maximum distance in meters at which the annotation layer is rendered. This only applies to the camera sensor output as for LiDAR you can set the maximum range distance of the sensor differently. This value is by default set to -1 which means infinite draw distance.","title":"Settings JSON definition of layers"},{"location":"annotation/#type-1-rgb","text":"Similar to instance segmentation , you can use the RGB annotation layer to tag objects in the environment with a unique color. You can do this by directly setting the color yourself (direct mode), or by assigning the object an index (0-2744000 unique colors) that will be linked to the colormap. To use direct mode, set the settings of this layer with SetDirect to true . For index mode, set to false . Actor/component tags have the following format: annotationName_R_G_B for direct mode or annotationName_ID for index mode. So if for example your RGB annotation layer is called RGBTestDirect , you can tag an actor with the tag RGBTestDirect_255_0_0 to give it a red color. Or for index mode, RGBTest_5 to give it the fifth color in the colormap.
When Default is set to true, all objects without a tag for this layer will be rendered in black. The instance segmentation API function to get the colormap also applies to the RGB index mode. For example in Python you can use: colorMap = client.simGetSegmentationColorMap() Several RPC API functions are available to influence or retrieve the RGB annotation layer. Currently, it is not possible to use the RPC API to add new actors or components to the annotation system, you can only update their values. For example in Python: simSetAnnotationObjectID(annotation_name, mesh_name, object_id, is_name_regex=False/True) to update the color of an object in index mode (regex allows setting multiple objects with wildcards, for example) when it already exists in the annotation system simSetAnnotationObjectColor(annotation_name, mesh_name, r, g, b, is_name_regex=False/True) to update the color of an object in direct mode (regex allows setting multiple objects with wildcards, for example) when it already exists in the annotation system simGetAnnotationObjectID(annotation_name, mesh_name) to get the ID of an object in index mode simGetAnnotationObjectColor(annotation_name, mesh_name) to get the color of an object in direct mode simIsValidColor(r,g,b) You can check if a color is valid using this function The same is available in Unreal Blueprint and Unreal c++. You can find the functions in the Annotation category. Add RGBDirect Annotation Tag to Component/Actor(annotation_name, component/actor, color, update_annotation=true/false) to set the color of an object in direct mode Update RGBDirect Annotation Tag to Component/Actor(annotation_name, component/actor, color, update_annotation=true/false) to update the color of an object in direct mode already in the system Add RGBIndex Annotation Tag to Component/Actor(annotation_name, component/actor, object_id, update_annotation=true/false) to set the index of an object in index mode Update RGBIndex Annotation Tag to Component/Actor(annotation_name, component/actor, object_id, update_annotation=true/false) to update the index of an object in index mode already in the system Is Annotation RGB Valid(color) You can check if a color is valid using this function Note that enabling update_annotation is a relatively slow process, especially on actors with lots of annotated components. Ideally set update_annotation to false during the process of adding tags to the actor and only turn on update_annotation for the last component or actor you want to update. Alternatively, you can use the Add New Actor To Annotation() blueprint function to update the annotation layer for this actor after you have added all tags.","title":"Type 1: RGB"},{"location":"annotation/#type-2-greyscale","text":"You can use the greyscale annotation layer to tag objects in the environment with a float value between 0 and 1. Note that this has the precision of uint8. Actor/component tags have the following format: annotationName_value . So if for example your greyscale annotation layer is called GreyscaleTest , you can tag an actor with the tag GreyscaleTest_0.76 to give it a value of 0.76 which would result in a color of (194, 194, 194). When Default is set to true, all objects without a tag for this layer will be rendered in black. Several RPC API functions are available to influence or retrieve the greyscale annotation layer. Currently, it is not possible to use the RPC API to add new actors or components to the annotation system, you can only update their values.
For example in Python: simSetAnnotationObjectValue(annotation_name, mesh_name, greyscale_value, is_name_regex=False/True) to update the value of an object (regex allows setting multiple objects with wildcards, for example) when it already exists in the annotation system simGetAnnotationObjectValue(annotation_name, mesh_name) to get the value of an object The same is available in Unreal Blueprint and Unreal c++. You can find the functions in the Annotation category. Add Greyscale Annotation Tag to Component/Actor(annotation_name, component/actor, value, update_annotation=true/false) to set the value of an object Update Greyscale Annotation Tag to Component/Actor(annotation_name, component/actor, value, update_annotation=true/false) to update the value of an object when it already exists in the annotation system Note that enabling update_annotation is a relatively slow process, especially on actors with lots of annotated components. Ideally set update_annotation to false during the process of adding tags to the actor and only turn on update_annotation for the last component or actor you want to update. Alternatively, you can use the Add New Actor To Annotation() blueprint function to update the annotation layer for this actor after you have added all tags.","title":"Type 2: Greyscale"},{"location":"annotation/#type-3-texture","text":"You can use the texture annotation layer to tag objects in the environment with a specific texture. This can be a color or greyscale texture, or you can mix them. The choice is up to you. You can do this by directly setting the texture yourself (direct mode), or by assigning a texture that is loaded based on a set path and the name of the mesh. To use direct mode, set the settings of this layer with SetDirect to true . For path reference mode, set to false . Actor/component tags have the following format: annotationName_texturepath for direct mode. The Unreal texture path name has to be rather specific: - If your texture is in the environment content folder, you must add /Game/ in front of the path. - If it is in the Cosys-AirSim plugin content folder, you must add /AirSim/ in front of the path. - For Engine textures, you must add /Engine/ in front of the path. So if for example your texture annotation layer is called TextureTestDirect , and your texture TestTexture is in the game content folder under a subfolder AnnotationTest you can tag an actor with the tag TextureTest_/Game/AnnotationTest/TestTexture to give it this texture. For path reference mode, the content of the tag is not really important as long as it contains the name of the annotation layer and an underscore, for example annotationName_enable . What is important in reference mode is that you have a texture in the content folder with the name of the mesh if you do enable this object by setting a tag. You must place your textures in the folder defined by the TexturePath setting in the settings.json file for this layer. And the texture must have the same name as the mesh and start with the prefix set by the TexturePrefix setting in the settings.json file for this layer followed by a hyphen. So for example if you have a static mesh called Cylinder and your texture layer is called TextureTestDirect with the settings TexturePath set to /Game/AnnotationTest and TexturePrefix set to Test1 , you must have a texture called Test1-Cylinder in the folder /Game/AnnotationTest . When Default is set to true, all objects without a tag for this layer will be rendered in black.
Several RPC API functions are available to influence or retrieve the texture annotation layer. Currently, it is not possible to use the RPC API to add new actors or components to the annotation system, you can only update their values. For example in Python: simSetAnnotationObjectTextureByPath(annotation_name, mesh_name, texture_path, is_name_regex=False/True) to set the texture of an object in direct mode, the texture path should be in the same format as described above, for example /Game/MyTextures/TestTexture1 (regex allows setting multiple objects with wildcards, for example) simEnableAnnotationObjectTextureByPath(annotation_name, mesh_name, r, g, b, is_name_regex=False/True) to enable the texture of an object in relative path mode, this does require a texture in the relative path as described above! (regex allows setting multiple objects with wildcards, for example) simGetAnnotationObjectTexturePath(annotation_name, mesh_name) to get the texture path of an object The same is available in Unreal Blueprint and Unreal c++. You can find the functions in the Annotation category. Add Texture Direct Annotation Tag to Component/Actor By Path(annotation_name, component/actor, texture_path, update_annotation=true/false) to set the texture of an object in direct mode, the texture path should be in the same format as described above, for example /Game/MyTextures/TestTexture1 Update Texture Direct Annotation Tag to Component/Actor By Path(annotation_name, component/actor, texture_path, update_annotation=true/false) to update the texture of an object in direct mode that is already in the system, the texture path should be in the same format as described above, for example /Game/MyTextures/TestTexture1 Add Texture Direct Annotation Tag to Component/Actor(annotation_name, component/actor, texture, update_annotation=true/false) to set the texture of an object in direct mode, the texture can be directly referenced as UTexture* Object Update Texture Direct Annotation Tag to Component/Actor(annotation_name, component/actor, texture, update_annotation=true/false) to update the texture of an object in direct mode that is already in the system, the texture can be directly referenced as UTexture* Object Enable Texture By Path Annotation Tag to Component/Actor(annotation_name, component/actor, update_annotation=true/false) to enable the texture of an object in relative path mode, this does require a texture in the relative path as described above! Note that enabling update_annotation is a relatively slow process, especially on actors with lots of annotated components. Ideally set update_annotation to false during the process of adding tags to the actor and only turn on update_annotation for the last component or actor you want to update. Alternatively, you can use the Add New Actor To Annotation() blueprint function to update the annotation layer for this actor after you have added all tags.","title":"Type 3: Texture"},{"location":"annotation/#common-functionality","text":"By default, when the world loads, all meshes are checked for tags and the annotation layers are updated accordingly. With the Unreal Blueprint and c++ functions, however, you can also decide to update the annotation layer only when you want to with the update_annotation argument. If you have many objects to update, this can save a lot of time by doing it only for the last object. Some API functions exist for all types, for example in Python: simListAnnotationObjects(annotation_name) to get a list of all objects within this annotation layer.
simListAnnotationPoses(annotation_name, ned=True/False, only_visible=False/True) to get the 3D poses of all objects in this annotation layer. The returned pose is in NED coordinates in SI units with its origin at Player Start by default, or in the Unreal NED frame if the ned boolean argument is set to false . Similarly, for Unreal Blueprint and Unreal C++, you can find the functions in the Annotation category. Does Annotation Layer Exist(annotation_name) to figure out if a layer exists or not Add New Actor To Annotation(annotation_name, actor, update_annotation=true/false) if you manually added a tag and want to update the annotation layer with this actor. This is useful to run after adding multiple tags to the actor and its components with the other API calls, when you want to update the annotation layer only once; otherwise it will be much slower. Delete Actor From Annotation(annotation_name, actor, update_annotation=true/false) if you manually removed all tags from an actor for this layer and want to remove it from the annotation layer Force Update Annotation(annotation_name) to force an update of the annotation layer.","title":"Common functionality"},{"location":"annotation/#getting-annotation-data-from-sensors","text":"The easiest way to get the images from annotation cameras is through the image API. See the Image API documentation for more information. GPU LiDAR is also supported, but each GPU LiDAR can only render one annotation layer. See the GPU LiDAR documentation for more information. You can also display the annotation layers in the subwindows. See the Settings documentation for more information. For example: { ... \"SubWindows\": [ { \"WindowID\": 0, \"CameraName\": \"front_center\", \"ImageType\": 10, \"VehicleName\": \"robot1\", \"Annotation\": \"GreyscaleTest\", \"Visible\": false }, ...","title":"Getting annotation data from sensors"},{"location":"annotation/#credits","text":"The method of using proxy meshes to segment objects is a derivative of and inspired by the work of UnrealCV . Their work is licensed under the MIT License. It is made by students from Johns Hopkins University and Peking University under the supervision of Prof. Alan Yuille and Prof. Yizhou Wang. You can read the paper on their work here .","title":"Credits"},{"location":"apis/","text":"AirSim APIs Introduction AirSim exposes APIs so you can interact with the vehicle in the simulation programmatically. You can use these APIs to retrieve images, get state, control the vehicle and so on. Python Quickstart If you want to use Python to call AirSim APIs, we recommend using Anaconda with Python 3.5 or later; however, some code may also work with Python 2.7. First install this package: pip install msgpack-rpc-python Once you can run AirSim, choose Car as vehicle and then navigate to the PythonClient\car\ folder and run: python hello_car.py If you are using Visual Studio 2019 then just open AirSim.sln, set PythonClient as startup project and choose car\hello_car.py as your startup script. Installing AirSim Package You can also install the AirSim Python module to your Python environment to use anywhere by running pip install . in the PythonClient folder. Notes 1. You may notice a file setup_path.py in our example folders. This file has simple code to detect if the airsim package is available in a parent folder; in that case we use that instead of the pip-installed package so you always use the latest code. 2. AirSim is still under heavy development, which means you might frequently need to update the package to use new APIs.
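As a worked example of the texture-layer RPC calls and the common listing helpers described in the annotation sections above, here is a minimal Python sketch; the layer names, mesh name and texture path are hypothetical, and the call signatures follow the descriptions above:

# Minimal sketch: texture annotation plus the common listing helpers (all names hypothetical).
import cosysairsim as airsim

client = airsim.VehicleClient()
client.confirmConnection()

# Direct mode: point an object at an Unreal texture asset by its path.
client.simSetAnnotationObjectTextureByPath('TextureTestDirect', 'Cylinder9', '/Game/AnnotationTest/TestTexture')

# Path-reference mode: enable an object; a texture named '<TexturePrefix>-Cylinder9'
# must already exist in the folder configured by TexturePath for this layer.
client.simEnableAnnotationObjectTextureByPath('TextureTest', 'Cylinder9')

# Verify which texture an object currently uses.
print(client.simGetAnnotationObjectTexturePath('TextureTestDirect', 'Cylinder9'))

# Common helpers: list annotated objects and their 3D poses (NED, SI units).
print(client.simListAnnotationObjects('TextureTestDirect'))
for pose in client.simListAnnotationPoses('TextureTestDirect'):
    print(pose)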
C++ Users If you want to use C++ APIs and examples, please see the C++ APIs Guide . Hello Car Here's how to use AirSim APIs using Python to control a simulated car (see also the C++ example ): # ready to run example: PythonClient/car/hello_car.py import cosysairsim as airsim import time # connect to the AirSim simulator client = airsim.CarClient() client.confirmConnection() client.enableApiControl(True) car_controls = airsim.CarControls() while True: # get state of the car car_state = client.getCarState() print(\"Speed %d, Gear %d\" % (car_state.speed, car_state.gear)) # set the controls for car car_controls.throttle = 1 car_controls.steering = 1 client.setCarControls(car_controls) # let car drive a bit time.sleep(1) # get camera images from the car responses = client.simGetImages([ airsim.ImageRequest(0, airsim.ImageType.DepthVis), airsim.ImageRequest(1, airsim.ImageType.DepthPlanar, True)]) print('Retrieved images: %d' % len(responses)) # do something with images for response in responses: if response.pixels_as_float: print(\"Type %d, size %d\" % (response.image_type, len(response.image_data_float))) airsim.write_pfm('py1.pfm', airsim.get_pfm_array(response)) else: print(\"Type %d, size %d\" % (response.image_type, len(response.image_data_uint8))) airsim.write_file('py1.png', response.image_data_uint8) Hello Drone Here's how to use AirSim APIs using Python to control a simulated quadrotor (see also the C++ example ): # ready to run example: PythonClient/multirotor/hello_drone.py import cosysairsim as airsim import os # connect to the AirSim simulator client = airsim.MultirotorClient() client.confirmConnection() client.enableApiControl(True) client.armDisarm(True) # Async methods return a Future. Call join() to wait for the task to complete. client.takeoffAsync().join() client.moveToPositionAsync(-10, 10, -10, 5).join() # take images responses = client.simGetImages([ airsim.ImageRequest(\"0\", airsim.ImageType.DepthVis), airsim.ImageRequest(\"1\", airsim.ImageType.DepthPlanar, True)]) print('Retrieved images: %d' % len(responses)) # do something with the images for response in responses: if response.pixels_as_float: print(\"Type %d, size %d\" % (response.image_type, len(response.image_data_float))) airsim.write_pfm(os.path.normpath('/temp/py1.pfm'), airsim.get_pfm_array(response)) else: print(\"Type %d, size %d\" % (response.image_type, len(response.image_data_uint8))) airsim.write_file(os.path.normpath('/temp/py1.png'), response.image_data_uint8) Common APIs reset : This resets the vehicle to its original starting state. Note that you must call enableApiControl and armDisarm again after the call to reset . confirmConnection : Checks the state of the connection every 1 sec and reports it in the console so the user can see the progress of the connection. enableApiControl : For safety reasons, by default API control for an autonomous vehicle is not enabled and the human operator has full control (usually via RC or joystick in the simulator). The client must make this call to request control via the API. The human operator might have disallowed API control, in which case enableApiControl has no effect. This can be checked by isApiControlEnabled . isApiControlEnabled : Returns true if API control is established. If false (which is the default) then API calls will be ignored. After a successful call to enableApiControl , isApiControlEnabled should return true. ping : If the connection is established then this call will return true; otherwise it will block until timeout.
simPrintLogMessage : Prints the specified message in the simulator's window. If message_param is also supplied then it's printed next to the message, and in that case if this API is called again with the same message value but a different message_param , the previous line is overwritten with the new line (instead of the API creating a new line on the display). For example, simPrintLogMessage(\"Iteration: \", to_string(i)) keeps updating the same line on the display when the API is called with different values of i. The valid values of the severity parameter are 0 to 3 inclusive, corresponding to different colors. simGetObjectPose(ned=true) , simSetObjectPose : Gets and sets the pose of the specified object in the Unreal environment. Here the object means \"actor\" in Unreal terminology. They are searched by tag as well as name. Please note that the names shown in the UE Editor are auto-generated in each run and are not permanent. So if you want to refer to an actor by name, you must change its auto-generated name in the UE Editor. Alternatively you can add a tag to the actor, which can be done by clicking on that actor in the Unreal Editor, going to the Tags property , clicking the \"+\" sign and adding some string value. If multiple actors have the same tag then the first match is returned. If no matches are found then a NaN pose is returned. The returned pose is in NED coordinates in SI units with its origin at Player Start by default, or in the Unreal NED frame if the ned boolean argument is set to false . For simSetObjectPose , the specified actor must have Mobility set to Movable or otherwise you will get undefined behavior. simSetObjectPose has a teleport parameter, which means the object is moved through any other objects in its way; it returns true if the move was successful simListSceneObjects : Provides a list of all objects in the environment. You can also use a regular expression to filter specific objects by name. For example, to match all meshes which have names starting with \"wall\" you can use simListSceneObjects(\"wall[\\w]*\") . Image/Computer Vision/Instance segmentation APIs AirSim offers comprehensive image APIs to retrieve synchronized images from multiple cameras along with ground truth including depth, disparity, surface normals and vision. You can set the resolution, FOV, motion blur etc. parameters in settings.json . There is also an API for detecting collision state. See also the complete code that generates a specified number of stereo images and ground truth depth with normalization to the camera plane, computation of the disparity image and saving it to pfm format . Furthermore, the Instance Segmentation system can also be manipulated through the API. More on image APIs, Computer Vision mode and instance segmentation configuration . Pause and Continue APIs AirSim allows you to pause and continue the simulation through the pause(is_paused) API. To pause the simulation call pause(True) and to continue the simulation call pause(False) . You may have a scenario, especially while using reinforcement learning, where you run the simulation for a specified amount of time and then automatically pause. While the simulation is paused, you may then do some expensive computation, send a new command and then again run the simulation for a specified amount of time. This can be achieved by the API continueForTime(seconds) . This API runs the simulation for the specified number of seconds and then pauses the simulation. For example usage, please see pause_continue_car.py and pause_continue_drone.py . Collision API The collision information can be obtained using the simGetCollisionInfo API.
This call returns a struct that has information not only about whether a collision occurred but also the collision position, surface normal, penetration depth and so on. Time of Day API AirSim assumes there exists a sky sphere of class EngineSky/BP_Sky_Sphere in your environment with an ADirectionalLight actor. By default, the position of the sun in the scene doesn't move with time. You can use settings to set up the latitude, longitude, date and time which AirSim uses to compute the position of the sun in the scene. You can also use the following API call to set the sun position according to a given date and time: simSetTimeOfDay(self, is_enabled, start_datetime = \"\", is_start_datetime_dst = False, celestial_clock_speed = 1, update_interval_secs = 60, move_sun = True) The is_enabled parameter must be True to enable the time of day effect. If it is False then the sun position is reset to its original position in the environment. Other parameters are the same as in settings . Line-of-sight and world extent APIs To test line-of-sight in the sim from a vehicle to a point or between two points, see simTestLineOfSightToPoint(point, vehicle_name) and simTestLineOfSightBetweenPoints(point1, point2), respectively. Sim world extent, in the form of a vector of two GeoPoints, can be retrieved using simGetWorldExtents(). Weather APIs By default all weather effects are disabled. To enable weather effects, first call: simEnableWeather(True) Various weather effects can be enabled by using the simSetWeatherParameter method which takes a WeatherParameter , for example, client.simSetWeatherParameter(airsim.WeatherParameter.Rain, 0.25); The second parameter value is from 0 to 1. The first parameter provides the following options: class WeatherParameter: Rain = 0 Roadwetness = 1 Snow = 2 RoadSnow = 3 MapleLeaf = 4 RoadLeaf = 5 Dust = 6 Fog = 7 Please note that the Roadwetness , RoadSnow and RoadLeaf effects require adding materials to your scene. Please see the example code for more details. Recording APIs Recording APIs can be used to start recording data through APIs. Data to be recorded can be specified using settings . To start recording, use - client.startRecording() Similarly, to stop recording, use client.stopRecording() . To check whether recording is running, call client.isRecording() , which returns a bool . This API works along with toggling recording using the R key; if recording is enabled using the R key, isRecording() will return True , and recording can be stopped via the API using stopRecording() . Similarly, recording started using the API will be stopped if the R key is pressed in the Viewport. A LogMessage will also appear in the top-left of the viewport if recording is started or stopped using the API. Note that this will only save the data as specified in the settings. For full freedom in storing data such as certain sensor information, or in a different format or layout, use the other APIs to fetch the data and save as desired. Check out Modifying Recording Data for details on how to modify the kinematics data being recorded. Wind API Wind can be changed during simulation using simSetWind() . Wind is specified in the world frame, in NED direction and in m/s values. E.g. to set a 20 m/s wind in the north (forward) direction - # Set wind to (20,0,0) in NED (forward direction) wind = airsim.Vector3r(20, 0, 0) client.simSetWind(wind) Also see the example script in set_wind.py Lidar APIs AirSim offers APIs to retrieve point cloud data from (GPU)Lidar sensors on vehicles. You can set the number of channels, points per second, horizontal and vertical FOV, etc. parameters in settings.json .
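Before the lidar pointers that follow, here is a minimal Python sketch tying together the pause/continue, collision, weather and wind calls described above. Method names follow this page; note that in some client versions these carry a sim prefix (e.g. simPause , simContinueForTime ), so treat the exact names as an assumption:

# Minimal sketch: pause/continue, collision info, weather and wind (method names as described above).
import cosysairsim as airsim

client = airsim.CarClient()
client.confirmConnection()

# Run the simulation for 2 seconds, after which it pauses automatically.
client.continueForTime(2)

# Inspect the collision state struct.
collision = client.simGetCollisionInfo()
print(collision.has_collided, collision.penetration_depth)

client.pause(False)  # resume the simulation

# Enable weather effects and set 25% rain.
client.simEnableWeather(True)
client.simSetWeatherParameter(airsim.WeatherParameter.Rain, 0.25)

# 20 m/s wind blowing north, specified in the world frame, NED.
client.simSetWind(airsim.Vector3r(20, 0, 0))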
More on lidar APIs and settings , GPUlidar APIs and settings and sensor settings Light Control APIs Lights that can be manipulated inside AirSim can be created via the simSpawnObject() API by passing either PointLightBP or SpotLightBP as the asset_name parameter and True as the is_blueprint parameter. Once a light has been spawned, it can be manipulated using the following API: simSetLightIntensity : This allows you to edit a light's intensity or brightness. It takes two parameters, light_name , the name of the light object returned by a previous call to simSpawnObject() , and intensity , a float value. Texture APIs Textures can be dynamically set on objects via these APIs: simSetObjectMaterial : This sets an object's material using an existing Unreal material asset. It takes two string parameters, object_name and material_name . simSetObjectMaterialFromTexture : This sets an object's material using a path to a texture. It takes two string parameters, object_name and texture_path . Multiple Vehicles AirSim supports multiple vehicles and controlling them through APIs. Please see the Multiple Vehicles doc. Coordinate System All AirSim APIs use the NED coordinate system, i.e., +X is North, +Y is East and +Z is Down. All units are in the SI system. Please note that this is different from the coordinate system used internally by Unreal Engine. In Unreal Engine, +Z is up instead of down and the length unit is centimeters instead of meters. AirSim APIs take care of the appropriate conversions. The starting point of the vehicle is always coordinates (0, 0, 0) in the NED system. Thus when converting from Unreal coordinates to NED, we first subtract the starting offset and then scale by 100 for the cm to m conversion. The vehicle is spawned in the Unreal environment where the Player Start component is placed. There is a setting called OriginGeopoint in settings.json which assigns geographic latitude, longitude and altitude to the Player Start component. If wanted, one can move the Unreal origin to the same location as the AirSim origin player start position by setting MoveWorldOrigin in the settings.json to true . Vehicle Specific APIs APIs for Car The car has the following APIs available: setCarControls : This allows you to set throttle, steering, handbrake and auto or manual gear. getCarState : This retrieves the state information including speed, current gear and 6 kinematics quantities: position, orientation, linear and angular velocity, linear and angular acceleration. All quantities are in the NED coordinate system, SI units, in the world frame except for angular velocity and accelerations which are in the body frame. Image APIs . APIs for Multirotor A multirotor can be controlled by specifying angles, velocity vector, destination position or some combination of these. There are corresponding move* APIs for this purpose. When doing position control, we need to use some path following algorithm. By default AirSim uses the carrot following algorithm. This is often referred to as \"high level control\" because you just need to specify a high level goal and the firmware takes care of the rest. Currently the lowest level control available in AirSim is the moveByAngleThrottleAsync API. getMultirotorState This API returns the state of the vehicle in one call. The state includes collision, estimated kinematics (i.e. kinematics computed by fusing sensors), and timestamp (nanoseconds since epoch). The kinematics here means 6 quantities: position, orientation, linear and angular velocity, linear and angular acceleration.
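The following Python sketch exercises the light, texture and state APIs just described; the object names, spawn pose and texture path are hypothetical, and the simSpawnObject argument order is an assumption based on the common client signature:

# Minimal sketch: spawn a controllable light, retexture an object, read the drone state.
# Object names, pose and texture path below are hypothetical.
import cosysairsim as airsim

client = airsim.MultirotorClient()
client.confirmConnection()

# Spawn a point light blueprint 2 m above the origin (NED: -Z is up) and dim it.
pose = airsim.Pose(airsim.Vector3r(0, 0, -2), airsim.Quaternionr())
light_name = client.simSpawnObject('MyLight', 'PointLightBP', pose, airsim.Vector3r(1, 1, 1), is_blueprint=True)
client.simSetLightIntensity(light_name, 100.0)

# Swap an object's material for a texture loaded from disk.
client.simSetObjectMaterialFromTexture('Cube_5', '/path/to/texture.png')

# One-call vehicle state: estimated kinematics fused from sensors.
state = client.getMultirotorState()
print(state.kinematics_estimated.position)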
Please note that simple_flight currently doesn't support a state estimator, which means the estimated and ground truth kinematics values would be the same for simple_flight. Estimated kinematics are however available for PX4 except for angular acceleration. All quantities are in the NED coordinate system, SI units, in the world frame except for angular velocity and accelerations which are in the body frame. Async methods, duration and max_wait_seconds Many API methods have parameters named duration or max_wait_seconds and have Async as a suffix, for example, takeoffAsync . These methods will return immediately after starting the task in AirSim so that your client code can do something else while that task is being executed. If you want to wait for this task to complete then you can call waitOnLastTask like this: //C++ client.takeoffAsync()->waitOnLastTask(); # Python client.takeoffAsync().join() If you start another command then it automatically cancels the previous task and starts the new command. This allows a pattern where your code continuously does the sensing, computes a new trajectory to follow and issues that path to the vehicle in AirSim. Each newly issued trajectory cancels the previous trajectory, allowing your code to continuously do the update as new sensor data arrives. All Async methods return concurrent.futures.Future in Python ( std::future in C++). Please note that these future classes currently do not allow checking status or cancelling the task; they only allow waiting for the task to complete. AirSim does provide the cancelLastTask API, however. drivetrain There are two modes in which you can fly the vehicle: the drivetrain parameter is set to airsim.DrivetrainType.ForwardOnly or airsim.DrivetrainType.MaxDegreeOfFreedom . When you specify ForwardOnly, you are saying that the vehicle's front should always point in the direction of travel. So if you want the drone to take a left turn then it would first rotate so the front points to the left. This mode is useful when you have only a front camera and you are operating the vehicle using the FPV view. This is more or less like travelling in a car where you always have the front view. MaxDegreeOfFreedom means you don't care where the front points to. So when you take a left turn, you just start going left like a crab. Quadrotors can go in any direction regardless of where the front points to. The MaxDegreeOfFreedom setting enables this mode. yaw_mode yaw_mode is a struct YawMode with two fields, yaw_or_rate and is_rate . If the is_rate field is True then the yaw_or_rate field is interpreted as angular velocity in degrees/sec, which means you want the vehicle to rotate continuously around its axis at that angular velocity while moving. If is_rate is False then yaw_or_rate is interpreted as an angle in degrees, which means you want the vehicle to rotate to the specific angle (i.e. yaw) and keep that angle while moving. You can probably see that when yaw_mode.is_rate == true , the drivetrain parameter shouldn't be set to ForwardOnly because you would be contradicting yourself by asking to keep the front pointing ahead while also rotating continuously. However if you have yaw_mode.is_rate = false in ForwardOnly mode then you can do some funky stuff. For example, you can have the drone fly circles with yaw_or_rate set to 90 so the camera is always pointed to the center (\"super cool selfie mode\"). In MaxDegreeOfFreedom you can also get some funky stuff by setting yaw_mode.is_rate = true and say yaw_mode.yaw_or_rate = 20 . This will cause the drone to go on its path while rotating, which may allow 360-degree scanning.
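As a concrete illustration of the drivetrain and yaw_mode options above (and of the lookahead parameters explained next), here is a minimal Python sketch; the coordinates and speeds are hypothetical:

# Minimal sketch: drivetrain, yaw_mode and lookahead in practice (coordinates are hypothetical).
import cosysairsim as airsim

client = airsim.MultirotorClient()
client.confirmConnection()
client.enableApiControl(True)
client.armDisarm(True)
client.takeoffAsync().join()

# ForwardOnly: the nose points along the direction of travel.
client.moveToPositionAsync(-10, 10, -10, 5, drivetrain=airsim.DrivetrainType.ForwardOnly, yaw_mode=airsim.YawMode(is_rate=False, yaw_or_rate=0)).join()

# MaxDegreeOfFreedom while spinning at 20 deg/s, e.g. for 360-degree scanning;
# lookahead=-1 with adaptive_lookahead=0 lets the carrot follower auto-decide.
path = [airsim.Vector3r(0, 20, -10), airsim.Vector3r(20, 20, -10)]
client.moveOnPathAsync(path, velocity=3, drivetrain=airsim.DrivetrainType.MaxDegreeOfFreedom, yaw_mode=airsim.YawMode(is_rate=True, yaw_or_rate=20), lookahead=-1, adaptive_lookahead=0).join()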
In most cases, you just don't want yaw to change, which you can do by setting a yaw rate of 0. The shorthand for this is airsim.YawMode.Zero() (or in C++: YawMode::Zero() ). lookahead and adaptive_lookahead When you ask the vehicle to follow a path, AirSim uses a \"carrot following\" algorithm. This algorithm operates by looking ahead on the path and adjusting its velocity vector. The parameters for this algorithm are specified by lookahead and adaptive_lookahead . Most of the time you want the algorithm to auto-decide the values by simply setting lookahead = -1 and adaptive_lookahead = 0 . Using APIs on Real Vehicles We want the same code that runs in simulation to run on the real vehicle. This allows you to test your code in the simulator and deploy it to a real vehicle. Generally speaking, APIs therefore shouldn't allow you to do something that cannot be done on a real vehicle (for example, getting the ground truth). But, of course, the simulator has much more information and it would be useful in applications that may not care about running things on a real vehicle. For this reason, we clearly delineate sim-only APIs by attaching the sim prefix, for example, simGetGroundTruthKinematics . This way you can avoid using these simulation-only APIs if you care about running your code on real vehicles. AirLib is a self-contained library that you can put on an offboard computing module such as the Gigabyte barebone Mini PC. This module can then talk to flight controllers such as PX4 using the exact same code and flight controller protocol. The code you write for testing in the simulator remains unchanged. See AirLib on custom drones . Adding New APIs to AirSim See the Adding New APIs page References and Examples C++ API Examples Car Examples Multirotor Examples Computer Vision Examples Move on Path demo showing video of fast multirotor flight through Modular Neighborhood environment Building a Hexacopter Building Point Clouds FAQ Unreal is slowed down dramatically when I run API If you see Unreal getting slowed down dramatically when the Unreal Engine window loses focus, go to 'Edit->Editor Preferences' in the Unreal Editor, type 'CPU' in the 'Search' box and ensure that 'Use Less CPU when in Background' is unchecked. Do I need anything else on Windows? You should install VS2019 with VC++, Windows SDK 10.0 and Python. To use Python APIs you will need Python 3.5 or later (install it using Anaconda). Which version of Python should I use? We recommend Anaconda to get Python tools and libraries. Our code is tested with Python 3.5.3 :: Anaconda 4.4.0. This is important because older versions have been known to have problems . I get error on import cv2 You can install OpenCV using: conda install opencv pip install opencv-python TypeError: unsupported operand type(s) for *: 'AsyncIOLoop' and 'float' This error happens if you install Jupyter, which somehow breaks the msgpackrpc library. Create a new Python environment with the minimal required packages.","title":"Core APIs"},{"location":"apis/#airsim-apis","text":"","title":"AirSim APIs"},{"location":"apis/#introduction","text":"AirSim exposes APIs so you can interact with the vehicle in the simulation programmatically. You can use these APIs to retrieve images, get state, control the vehicle and so on.","title":"Introduction"},{"location":"apis/#python-quickstart","text":"If you want to use Python to call AirSim APIs, we recommend using Anaconda with Python 3.5 or later; however, some code may also work with Python 2.7.
First install this package: pip install msgpack-rpc-python Once you can run AirSim, choose Car as vehicle and then navigate to the PythonClient\car\ folder and run: python hello_car.py If you are using Visual Studio 2019 then just open AirSim.sln, set PythonClient as startup project and choose car\hello_car.py as your startup script.","title":"Python Quickstart"},{"location":"apis/#installing-airsim-package","text":"You can also install the AirSim Python module to your Python environment to use anywhere by running pip install . in the PythonClient folder. Notes 1. You may notice a file setup_path.py in our example folders. This file has simple code to detect if the airsim package is available in a parent folder; in that case we use that instead of the pip-installed package so you always use the latest code. 2. AirSim is still under heavy development, which means you might frequently need to update the package to use new APIs.","title":"Installing AirSim Package"},{"location":"apis/#c-users","text":"If you want to use C++ APIs and examples, please see the C++ APIs Guide .","title":"C++ Users"},{"location":"apis/#hello-car","text":"Here's how to use AirSim APIs using Python to control a simulated car (see also the C++ example ): # ready to run example: PythonClient/car/hello_car.py import cosysairsim as airsim import time # connect to the AirSim simulator client = airsim.CarClient() client.confirmConnection() client.enableApiControl(True) car_controls = airsim.CarControls() while True: # get state of the car car_state = client.getCarState() print(\"Speed %d, Gear %d\" % (car_state.speed, car_state.gear)) # set the controls for car car_controls.throttle = 1 car_controls.steering = 1 client.setCarControls(car_controls) # let car drive a bit time.sleep(1) # get camera images from the car responses = client.simGetImages([ airsim.ImageRequest(0, airsim.ImageType.DepthVis), airsim.ImageRequest(1, airsim.ImageType.DepthPlanar, True)]) print('Retrieved images: %d' % len(responses)) # do something with images for response in responses: if response.pixels_as_float: print(\"Type %d, size %d\" % (response.image_type, len(response.image_data_float))) airsim.write_pfm('py1.pfm', airsim.get_pfm_array(response)) else: print(\"Type %d, size %d\" % (response.image_type, len(response.image_data_uint8))) airsim.write_file('py1.png', response.image_data_uint8)","title":"Hello Car"},{"location":"apis/#hello-drone","text":"Here's how to use AirSim APIs using Python to control a simulated quadrotor (see also the C++ example ): # ready to run example: PythonClient/multirotor/hello_drone.py import cosysairsim as airsim import os # connect to the AirSim simulator client = airsim.MultirotorClient() client.confirmConnection() client.enableApiControl(True) client.armDisarm(True) # Async methods return a Future. Call join() to wait for the task to complete.
client.takeoffAsync().join() client.moveToPositionAsync(-10, 10, -10, 5).join() # take images responses = client.simGetImages([ airsim.ImageRequest(\"0\", airsim.ImageType.DepthVis), airsim.ImageRequest(\"1\", airsim.ImageType.DepthPlanar, True)]) print('Retrieved images: %d' % len(responses)) # do something with the images for response in responses: if response.pixels_as_float: print(\"Type %d, size %d\" % (response.image_type, len(response.image_data_float))) airsim.write_pfm(os.path.normpath('/temp/py1.pfm'), airsim.get_pfm_array(response)) else: print(\"Type %d, size %d\" % (response.image_type, len(response.image_data_uint8))) airsim.write_file(os.path.normpath('/temp/py1.png'), response.image_data_uint8)","title":"Hello Drone"},{"location":"apis/#common-apis","text":"reset : This resets the vehicle to its original starting state. Note that you must call enableApiControl and armDisarm again after the call to reset . confirmConnection : Checks the state of the connection every 1 sec and reports it in the console so the user can see the progress of the connection. enableApiControl : For safety reasons, by default API control for an autonomous vehicle is not enabled and the human operator has full control (usually via RC or joystick in the simulator). The client must make this call to request control via the API. The human operator might have disallowed API control, in which case enableApiControl has no effect. This can be checked by isApiControlEnabled . isApiControlEnabled : Returns true if API control is established. If false (which is the default) then API calls will be ignored. After a successful call to enableApiControl , isApiControlEnabled should return true. ping : If the connection is established then this call will return true; otherwise it will block until timeout. simPrintLogMessage : Prints the specified message in the simulator's window. If message_param is also supplied then it's printed next to the message, and in that case if this API is called again with the same message value but a different message_param , the previous line is overwritten with the new line (instead of the API creating a new line on the display). For example, simPrintLogMessage(\"Iteration: \", to_string(i)) keeps updating the same line on the display when the API is called with different values of i. The valid values of the severity parameter are 0 to 3 inclusive, corresponding to different colors. simGetObjectPose(ned=true) , simSetObjectPose : Gets and sets the pose of the specified object in the Unreal environment. Here the object means \"actor\" in Unreal terminology. They are searched by tag as well as name. Please note that the names shown in the UE Editor are auto-generated in each run and are not permanent. So if you want to refer to an actor by name, you must change its auto-generated name in the UE Editor. Alternatively you can add a tag to the actor, which can be done by clicking on that actor in the Unreal Editor, going to the Tags property , clicking the \"+\" sign and adding some string value. If multiple actors have the same tag then the first match is returned. If no matches are found then a NaN pose is returned. The returned pose is in NED coordinates in SI units with its origin at Player Start by default, or in the Unreal NED frame if the ned boolean argument is set to false . For simSetObjectPose , the specified actor must have Mobility set to Movable or otherwise you will get undefined behavior.
simSetObjectPose has a teleport parameter, which means the object is moved through any other objects in its way; it returns true if the move was successful simListSceneObjects : Provides a list of all objects in the environment. You can also use a regular expression to filter specific objects by name. For example, to match all meshes which have names starting with \"wall\" you can use simListSceneObjects(\"wall[\\w]*\") .","title":"Common APIs"},{"location":"apis/#imagecomputer-visioninstance-segmentation-apis","text":"AirSim offers comprehensive image APIs to retrieve synchronized images from multiple cameras along with ground truth including depth, disparity, surface normals and vision. You can set the resolution, FOV, motion blur etc. parameters in settings.json . There is also an API for detecting collision state. See also the complete code that generates a specified number of stereo images and ground truth depth with normalization to the camera plane, computation of the disparity image and saving it to pfm format . Furthermore, the Instance Segmentation system can also be manipulated through the API. More on image APIs, Computer Vision mode and instance segmentation configuration .","title":"Image/Computer Vision/Instance segmentation APIs"},{"location":"apis/#pause-and-continue-apis","text":"AirSim allows you to pause and continue the simulation through the pause(is_paused) API. To pause the simulation call pause(True) and to continue the simulation call pause(False) . You may have a scenario, especially while using reinforcement learning, where you run the simulation for a specified amount of time and then automatically pause. While the simulation is paused, you may then do some expensive computation, send a new command and then again run the simulation for a specified amount of time. This can be achieved by the API continueForTime(seconds) . This API runs the simulation for the specified number of seconds and then pauses the simulation. For example usage, please see pause_continue_car.py and pause_continue_drone.py .","title":"Pause and Continue APIs"},{"location":"apis/#collision-api","text":"The collision information can be obtained using the simGetCollisionInfo API. This call returns a struct that has information not only about whether a collision occurred but also the collision position, surface normal, penetration depth and so on.","title":"Collision API"},{"location":"apis/#time-of-day-api","text":"AirSim assumes there exists a sky sphere of class EngineSky/BP_Sky_Sphere in your environment with an ADirectionalLight actor. By default, the position of the sun in the scene doesn't move with time. You can use settings to set up the latitude, longitude, date and time which AirSim uses to compute the position of the sun in the scene. You can also use the following API call to set the sun position according to a given date and time: simSetTimeOfDay(self, is_enabled, start_datetime = \"\", is_start_datetime_dst = False, celestial_clock_speed = 1, update_interval_secs = 60, move_sun = True) The is_enabled parameter must be True to enable the time of day effect. If it is False then the sun position is reset to its original position in the environment. Other parameters are the same as in settings .","title":"Time of Day API"},{"location":"apis/#line-of-sight-and-world-extent-apis","text":"To test line-of-sight in the sim from a vehicle to a point or between two points, see simTestLineOfSightToPoint(point, vehicle_name) and simTestLineOfSightBetweenPoints(point1, point2), respectively.
Sim world extent, in the form of a vector of two GeoPoints, can be retrieved using simGetWorldExtents().","title":"Line-of-sight and world extent APIs"},{"location":"apis/#weather-apis","text":"By default all weather effects are disabled. To enable weather effects, first call: simEnableWeather(True) Various weather effects can be enabled by using the simSetWeatherParameter method which takes a WeatherParameter , for example, client.simSetWeatherParameter(airsim.WeatherParameter.Rain, 0.25); The second parameter value is from 0 to 1. The first parameter provides the following options: class WeatherParameter: Rain = 0 Roadwetness = 1 Snow = 2 RoadSnow = 3 MapleLeaf = 4 RoadLeaf = 5 Dust = 6 Fog = 7 Please note that the Roadwetness , RoadSnow and RoadLeaf effects require adding materials to your scene. Please see the example code for more details.","title":"Weather APIs"},{"location":"apis/#recording-apis","text":"Recording APIs can be used to start recording data through APIs. Data to be recorded can be specified using settings . To start recording, use - client.startRecording() Similarly, to stop recording, use client.stopRecording() . To check whether recording is running, call client.isRecording() , which returns a bool . This API works along with toggling recording using the R key; if recording is enabled using the R key, isRecording() will return True , and recording can be stopped via the API using stopRecording() . Similarly, recording started using the API will be stopped if the R key is pressed in the Viewport. A LogMessage will also appear in the top-left of the viewport if recording is started or stopped using the API. Note that this will only save the data as specified in the settings. For full freedom in storing data such as certain sensor information, or in a different format or layout, use the other APIs to fetch the data and save as desired. Check out Modifying Recording Data for details on how to modify the kinematics data being recorded.","title":"Recording APIs"},{"location":"apis/#wind-api","text":"Wind can be changed during simulation using simSetWind() . Wind is specified in the world frame, in NED direction and in m/s values. E.g. to set a 20 m/s wind in the north (forward) direction - # Set wind to (20,0,0) in NED (forward direction) wind = airsim.Vector3r(20, 0, 0) client.simSetWind(wind) Also see the example script in set_wind.py","title":"Wind API"},{"location":"apis/#lidar-apis","text":"AirSim offers APIs to retrieve point cloud data from (GPU)Lidar sensors on vehicles. You can set the number of channels, points per second, horizontal and vertical FOV, etc. parameters in settings.json . More on lidar APIs and settings , GPUlidar APIs and settings and sensor settings","title":"Lidar APIs"},{"location":"apis/#light-control-apis","text":"Lights that can be manipulated inside AirSim can be created via the simSpawnObject() API by passing either PointLightBP or SpotLightBP as the asset_name parameter and True as the is_blueprint parameter. Once a light has been spawned, it can be manipulated using the following API: simSetLightIntensity : This allows you to edit a light's intensity or brightness. It takes two parameters, light_name , the name of the light object returned by a previous call to simSpawnObject() , and intensity , a float value.","title":"Light Control APIs"},{"location":"apis/#texture-apis","text":"Textures can be dynamically set on objects via these APIs: simSetObjectMaterial : This sets an object's material using an existing Unreal material asset. It takes two string parameters, object_name and material_name .
simSetObjectMaterialFromTexture : This sets an object's material using a path to a texture. It takes two string parameters, object_name and texture_path .","title":"Texture APIs"},{"location":"apis/#multiple-vehicles","text":"AirSim supports multiple vehicles and controlling them through APIs. Please see the Multiple Vehicles doc.","title":"Multiple Vehicles"},{"location":"apis/#coordinate-system","text":"All AirSim APIs use the NED coordinate system, i.e., +X is North, +Y is East and +Z is Down. All units are in the SI system. Please note that this is different from the coordinate system used internally by Unreal Engine. In Unreal Engine, +Z is up instead of down and the length unit is centimeters instead of meters. AirSim APIs take care of the appropriate conversions. The starting point of the vehicle is always coordinates (0, 0, 0) in the NED system. Thus when converting from Unreal coordinates to NED, we first subtract the starting offset and then scale by 100 for the cm to m conversion. The vehicle is spawned in the Unreal environment where the Player Start component is placed. There is a setting called OriginGeopoint in settings.json which assigns geographic latitude, longitude and altitude to the Player Start component. If wanted, one can move the Unreal origin to the same location as the AirSim origin player start position by setting MoveWorldOrigin in the settings.json to true .","title":"Coordinate System"},{"location":"apis/#vehicle-specific-apis","text":"","title":"Vehicle Specific APIs"},{"location":"apis/#apis-for-car","text":"The car has the following APIs available: setCarControls : This allows you to set throttle, steering, handbrake and auto or manual gear. getCarState : This retrieves the state information including speed, current gear and 6 kinematics quantities: position, orientation, linear and angular velocity, linear and angular acceleration. All quantities are in the NED coordinate system, SI units, in the world frame except for angular velocity and accelerations which are in the body frame. Image APIs .","title":"APIs for Car"},{"location":"apis/#apis-for-multirotor","text":"A multirotor can be controlled by specifying angles, velocity vector, destination position or some combination of these. There are corresponding move* APIs for this purpose. When doing position control, we need to use some path following algorithm. By default AirSim uses the carrot following algorithm. This is often referred to as \"high level control\" because you just need to specify a high level goal and the firmware takes care of the rest. Currently the lowest level control available in AirSim is the moveByAngleThrottleAsync API.","title":"APIs for Multirotor"},{"location":"apis/#getmultirotorstate","text":"This API returns the state of the vehicle in one call. The state includes collision, estimated kinematics (i.e. kinematics computed by fusing sensors), and timestamp (nanoseconds since epoch). The kinematics here means 6 quantities: position, orientation, linear and angular velocity, linear and angular acceleration. Please note that simple_flight currently doesn't support a state estimator, which means the estimated and ground truth kinematics values would be the same for simple_flight. Estimated kinematics are however available for PX4 except for angular acceleration.
All quantities are in the NED coordinate system, SI units, in the world frame except for angular velocity and accelerations which are in the body frame.","title":"getMultirotorState"},{"location":"apis/#async-methods-duration-and-max_wait_seconds","text":"Many API methods have parameters named duration or max_wait_seconds and have Async as a suffix, for example, takeoffAsync . These methods will return immediately after starting the task in AirSim so that your client code can do something else while that task is being executed. If you want to wait for this task to complete then you can call waitOnLastTask like this: //C++ client.takeoffAsync()->waitOnLastTask(); # Python client.takeoffAsync().join() If you start another command then it automatically cancels the previous task and starts the new command. This allows a pattern where your code continuously does the sensing, computes a new trajectory to follow and issues that path to the vehicle in AirSim. Each newly issued trajectory cancels the previous trajectory, allowing your code to continuously do the update as new sensor data arrives. All Async methods return concurrent.futures.Future in Python ( std::future in C++). Please note that these future classes currently do not allow checking status or cancelling the task; they only allow waiting for the task to complete. AirSim does provide the cancelLastTask API, however.","title":"Async methods, duration and max_wait_seconds"},{"location":"apis/#drivetrain","text":"There are two modes in which you can fly the vehicle: the drivetrain parameter is set to airsim.DrivetrainType.ForwardOnly or airsim.DrivetrainType.MaxDegreeOfFreedom . When you specify ForwardOnly, you are saying that the vehicle's front should always point in the direction of travel. So if you want the drone to take a left turn then it would first rotate so the front points to the left. This mode is useful when you have only a front camera and you are operating the vehicle using the FPV view. This is more or less like travelling in a car where you always have the front view. MaxDegreeOfFreedom means you don't care where the front points to. So when you take a left turn, you just start going left like a crab. Quadrotors can go in any direction regardless of where the front points to. The MaxDegreeOfFreedom setting enables this mode.","title":"drivetrain"},{"location":"apis/#yaw_mode","text":"yaw_mode is a struct YawMode with two fields, yaw_or_rate and is_rate . If the is_rate field is True then the yaw_or_rate field is interpreted as angular velocity in degrees/sec, which means you want the vehicle to rotate continuously around its axis at that angular velocity while moving. If is_rate is False then yaw_or_rate is interpreted as an angle in degrees, which means you want the vehicle to rotate to the specific angle (i.e. yaw) and keep that angle while moving. You can probably see that when yaw_mode.is_rate == true , the drivetrain parameter shouldn't be set to ForwardOnly because you would be contradicting yourself by asking to keep the front pointing ahead while also rotating continuously. However if you have yaw_mode.is_rate = false in ForwardOnly mode then you can do some funky stuff. For example, you can have the drone fly circles with yaw_or_rate set to 90 so the camera is always pointed to the center (\"super cool selfie mode\"). In MaxDegreeOfFreedom you can also get some funky stuff by setting yaw_mode.is_rate = true and say yaw_mode.yaw_or_rate = 20 . This will cause the drone to go on its path while rotating, which may allow 360-degree scanning. In most cases, you just don't want yaw to change, which you can do by setting a yaw rate of 0.
The shorthand for this is airsim.YawMode.Zero() (or in C++: YawMode::Zero() ).","title":"yaw_mode"},{"location":"apis/#lookahead-and-adaptive_lookahead","text":"When you ask the vehicle to follow a path, AirSim uses a \"carrot following\" algorithm. This algorithm operates by looking ahead on the path and adjusting its velocity vector. The parameters for this algorithm are specified by lookahead and adaptive_lookahead . Most of the time you want the algorithm to auto-decide the values by simply setting lookahead = -1 and adaptive_lookahead = 0 .","title":"lookahead and adaptive_lookahead"},{"location":"apis/#using-apis-on-real-vehicles","text":"We want the same code that runs in simulation to run on the real vehicle. This allows you to test your code in the simulator and deploy it to a real vehicle. Generally speaking, APIs therefore shouldn't allow you to do something that cannot be done on a real vehicle (for example, getting the ground truth). But, of course, the simulator has much more information and it would be useful in applications that may not care about running things on a real vehicle. For this reason, we clearly delineate sim-only APIs by attaching the sim prefix, for example, simGetGroundTruthKinematics . This way you can avoid using these simulation-only APIs if you care about running your code on real vehicles. AirLib is a self-contained library that you can put on an offboard computing module such as the Gigabyte barebone Mini PC. This module can then talk to flight controllers such as PX4 using the exact same code and flight controller protocol. The code you write for testing in the simulator remains unchanged. See AirLib on custom drones .","title":"Using APIs on Real Vehicles"},{"location":"apis/#adding-new-apis-to-airsim","text":"See the Adding New APIs page","title":"Adding New APIs to AirSim"},{"location":"apis/#references-and-examples","text":"C++ API Examples Car Examples Multirotor Examples Computer Vision Examples Move on Path demo showing video of fast multirotor flight through Modular Neighborhood environment Building a Hexacopter Building Point Clouds","title":"References and Examples"},{"location":"apis/#faq","text":"","title":"FAQ"},{"location":"apis/#unreal-is-slowed-down-dramatically-when-i-run-api","text":"If you see Unreal getting slowed down dramatically when the Unreal Engine window loses focus, go to 'Edit->Editor Preferences' in the Unreal Editor, type 'CPU' in the 'Search' box and ensure that 'Use Less CPU when in Background' is unchecked.","title":"Unreal is slowed down dramatically when I run API"},{"location":"apis/#do-i-need-anything-else-on-windows","text":"You should install VS2019 with VC++, Windows SDK 10.0 and Python. To use Python APIs you will need Python 3.5 or later (install it using Anaconda).","title":"Do I need anything else on Windows?"},{"location":"apis/#which-version-of-python-should-i-use","text":"We recommend Anaconda to get Python tools and libraries. Our code is tested with Python 3.5.3 :: Anaconda 4.4.0. This is important because older versions have been known to have problems .","title":"Which version of Python should I use?"},{"location":"apis/#i-get-error-on-import-cv2","text":"You can install OpenCV using: conda install opencv pip install opencv-python","title":"I get error on import cv2"},{"location":"apis/#typeerror-unsupported-operand-types-for-asyncioloop-and-float","text":"This error happens if you install Jupyter, which somehow breaks the msgpackrpc library.
Create a new Python environment with the minimal required packages.","title":"TypeError: unsupported operand type(s) for *: 'AsyncIOLoop' and 'float'"},{"location":"apis_cpp/","text":"Using C++ APIs for AirSim Please read the general API doc first if you haven't already. This document describes C++ examples and other C++ specific details. Quick Start The fastest way to get started is to open AirSim.sln in Visual Studio 2017. You will see Hello Car and Hello Drone examples in the solution. These examples will show you the include paths and lib paths you will need to set up in your VC++ projects. If you are using Linux then you will specify these paths either in your cmake file or on the compiler command line. Include and Lib Folders Include folders: $(ProjectDir)..\AirLib\deps\rpclib\include;include;$(ProjectDir)..\AirLib\deps\eigen3;$(ProjectDir)..\AirLib\include Dependencies: rpc.lib Lib folders: $(ProjectDir)\..\AirLib\deps\MavLinkCom\lib\$(Platform)\$(Configuration);$(ProjectDir)\..\AirLib\deps\rpclib\lib\$(Platform)\$(Configuration);$(ProjectDir)\..\AirLib\lib\$(Platform)\$(Configuration) Hello Car Here's how to use AirSim APIs using C++ to control a simulated car (see also the Python example ): // ready to run example: https://github.com/Cosys-Lab/Cosys-AirSim/blob/main/HelloCar/main.cpp #include <iostream> #include \"vehicles/car/api/CarRpcLibClient.hpp\" int main() { msr::airlib::CarRpcLibClient client; client.enableApiControl(true); //this disables manual control CarControllerBase::CarControls controls; std::cout << \"Press enter to drive forward\" << std::endl; std::cin.get(); controls.throttle = 1; client.setCarControls(controls); std::cout << \"Press Enter to activate handbrake\" << std::endl; std::cin.get(); controls.handbrake = true; client.setCarControls(controls); std::cout << \"Press Enter to take turn and drive backward\" << std::endl; std::cin.get(); controls.handbrake = false; controls.throttle = -1; controls.steering = 1; client.setCarControls(controls); std::cout << \"Press Enter to stop\" << std::endl; std::cin.get(); client.setCarControls(CarControllerBase::CarControls()); return 0; } Hello Drone Here's how to use AirSim APIs using C++ to control a simulated quadrotor (see also the Python example ): // ready to run example: https://github.com/Cosys-Lab/Cosys-AirSim/blob/main/HelloDrone/main.cpp #include <iostream> #include \"vehicles/multirotor/api/MultirotorRpcLibClient.hpp\" int main() { using namespace std; msr::airlib::MultirotorRpcLibClient client; cout << \"Press Enter to enable API control\" << endl; cin.get(); client.enableApiControl(true); cout << \"Press Enter to arm the drone\" << endl; cin.get(); client.armDisarm(true); cout << \"Press Enter to takeoff\" << endl; cin.get(); client.takeoffAsync(5)->waitOnLastTask(); cout << \"Press Enter to move 5 meters in x direction with 1 m/s velocity\" << endl; cin.get(); auto position = client.getMultirotorState().getPosition(); // from current location client.moveToPositionAsync(position.x() + 5, position.y(), position.z(), 1)->waitOnLastTask(); cout << \"Press Enter to land\" << endl; cin.get(); client.landAsync()->waitOnLastTask(); return 0; } See Also Examples of how to use internal infrastructure in AirSim in your other projects DroneShell app shows how to make a simple interface using C++ APIs to control drones Python APIs","title":"C++ APIs"},{"location":"apis_cpp/#using-c-apis-for-airsim","text":"Please read the general API doc first if you haven't already.
This document describes C++ examples and other C++ specific details.","title":"Using C++ APIs for AirSim"},{"location":"apis_cpp/#quick-start","text":"The fastest way to get started is to open AirSim.sln in Visual Studio 2017. You will see Hello Car and Hello Drone examples in the solution. These examples will show you the include paths and lib paths you will need to set up in your VC++ projects. If you are using Linux then you will specify these paths either in your cmake file or on the compiler command line.","title":"Quick Start"},{"location":"apis_cpp/#include-and-lib-folders","text":"Include folders: $(ProjectDir)..\AirLib\deps\rpclib\include;include;$(ProjectDir)..\AirLib\deps\eigen3;$(ProjectDir)..\AirLib\include Dependencies: rpc.lib Lib folders: $(ProjectDir)\..\AirLib\deps\MavLinkCom\lib\$(Platform)\$(Configuration);$(ProjectDir)\..\AirLib\deps\rpclib\lib\$(Platform)\$(Configuration);$(ProjectDir)\..\AirLib\lib\$(Platform)\$(Configuration)","title":"Include and Lib Folders"},{"location":"apis_cpp/#hello-car","text":"Here's how to use AirSim APIs using C++ to control a simulated car (see also the Python example ): // ready to run example: https://github.com/Cosys-Lab/Cosys-AirSim/blob/main/HelloCar/main.cpp #include <iostream> #include \"vehicles/car/api/CarRpcLibClient.hpp\" int main() { msr::airlib::CarRpcLibClient client; client.enableApiControl(true); //this disables manual control CarControllerBase::CarControls controls; std::cout << \"Press enter to drive forward\" << std::endl; std::cin.get(); controls.throttle = 1; client.setCarControls(controls); std::cout << \"Press Enter to activate handbrake\" << std::endl; std::cin.get(); controls.handbrake = true; client.setCarControls(controls); std::cout << \"Press Enter to take turn and drive backward\" << std::endl; std::cin.get(); controls.handbrake = false; controls.throttle = -1; controls.steering = 1; client.setCarControls(controls); std::cout << \"Press Enter to stop\" << std::endl; std::cin.get(); client.setCarControls(CarControllerBase::CarControls()); return 0; }","title":"Hello Car"},{"location":"apis_cpp/#hello-drone","text":"Here's how to use AirSim APIs using C++ to control a simulated quadrotor (see also the Python example ): // ready to run example: https://github.com/Cosys-Lab/Cosys-AirSim/blob/main/HelloDrone/main.cpp #include <iostream> #include \"vehicles/multirotor/api/MultirotorRpcLibClient.hpp\" int main() { using namespace std; msr::airlib::MultirotorRpcLibClient client; cout << \"Press Enter to enable API control\" << endl; cin.get(); client.enableApiControl(true); cout << \"Press Enter to arm the drone\" << endl; cin.get(); client.armDisarm(true); cout << \"Press Enter to takeoff\" << endl; cin.get(); client.takeoffAsync(5)->waitOnLastTask(); cout << \"Press Enter to move 5 meters in x direction with 1 m/s velocity\" << endl; cin.get(); auto position = client.getMultirotorState().getPosition(); // from current location client.moveToPositionAsync(position.x() + 5, position.y(), position.z(), 1)->waitOnLastTask(); cout << \"Press Enter to land\" << endl; cin.get(); client.landAsync()->waitOnLastTask(); return 0; }","title":"Hello Drone"},{"location":"apis_cpp/#see-also","text":"Examples of how to use internal infrastructure in AirSim in your other projects DroneShell app shows how to make a simple interface using C++ APIs to control drones Python APIs","title":"See Also"},{"location":"camera_views/","text":"Camera Views The camera views that are shown on screen are the camera views you can fetch via the simGetImages API .
From left to right are the depth view, the segmentation view and the FPV view. See Image APIs for a description of the various available views. Turning ON/OFF Views Press the F1 key to see keyboard shortcuts for turning on/off any or all views. You can also select various view modes there, such as \"Fly with Me\" mode, FPV mode and \"Ground View\" mode. Controlling Manual Camera You can switch to manual camera control by pressing the M key. While manual camera control mode is selected, you can use the following keys to control the camera: |Key|Action| ---|--- |Arrow keys|move the camera forward/back and left/right| |Page up/down|move the camera up/down| |W/A/S/D|control pitch up/down and yaw left/right| |Left shift|increase movement speed| |Left control|decrease movement speed| Configuring Sub-Windows Now you can select what is shown by each of the above sub-windows. For instance, you can choose to show surface normals in the first window (instead of depth) and disparity in the second window (instead of segmentation). Below is the settings value you can use in settings.json : { \"SubWindows\": [ {\"WindowID\": 1, \"CameraName\": \"0\", \"ImageType\": 5, \"VehicleName\": \"\", \"Visible\": false}, {\"WindowID\": 2, \"CameraName\": \"0\", \"ImageType\": 3, \"VehicleName\": \"\", \"Visible\": false} ] } Performance Impact Note : This section is outdated and has not been updated for new performance enhancement changes. Rendering these views does impact the FPS performance of the game, since this is additional work for the GPU. The following shows the impact on FPS when you open these views. This is measured on an Intel Core i7 computer with 32 GB RAM and a GeForce GTX 1080 graphics card running the Modular Neighborhood map, using cooked debug bits, no debugger or GameEditor open. The normal state with no subviews open is measuring around 16 ms per frame, which means it is keeping a nice steady 60 FPS (which is the target FPS). As it climbs up to 35 ms the FPS drops to around 28 frames per second, and spiking to 40 ms means a few drops to 25 fps. The simulator can still function and fly correctly when all this is going on, even in the worst case, because the physics is decoupled from the rendering. However if the delay gets too high such that the communication with the PX4 hardware is interrupted due to an overly busy CPU, the flight can stall due to timeouts in the offboard control messages. On the computer where this was measured the drone could fly the path.py program without any problems with all views open, and with 3 Python scripts running to capture each view type. There was one stall during this flight, but it recovered gracefully and completed the path. So it was right on the limit. The following shows the impact on CPU; perhaps a bit surprisingly, the CPU impact is also non-trivial.","title":"Camera Views"},{"location":"camera_views/#camera-views","text":"The camera views that are shown on screen are the camera views you can fetch via the simGetImages API . From left to right are the depth view, the segmentation view and the FPV view. See Image APIs for a description of the various available views.","title":"Camera Views"},{"location":"camera_views/#turning-onoff-views","text":"Press the F1 key to see keyboard shortcuts for turning on/off any or all views. You can also select various view modes there, such as \"Fly with Me\" mode, FPV mode and \"Ground View\" mode.","title":"Turning ON/OFF Views"},{"location":"camera_views/#controlling-manual-camera","text":"You can switch to manual camera control by pressing the M key.
While manual camera control mode is selected, you can use the following keys to control the camera: |Key|Action| ---|--- |Arrow keys|move the camera forward/back and left/right| |Page up/down|move the camera up/down| |W/A/S/D|control pitch up/down and yaw left/right| |Left shift|increase movement speed| |Left control|decrease movement speed|","title":"Controlling Manual Camera"},{"location":"camera_views/#configuring-sub-windows","text":"Now you can select what is shown by each of the above sub-windows. For instance, you can choose to show surface normals in the first window (instead of depth) and disparity in the second window (instead of segmentation). Below is the settings value you can use in settings.json : { \"SubWindows\": [ {\"WindowID\": 1, \"CameraName\": \"0\", \"ImageType\": 5, \"VehicleName\": \"\", \"Visible\": false}, {\"WindowID\": 2, \"CameraName\": \"0\", \"ImageType\": 3, \"VehicleName\": \"\", \"Visible\": false} ] }","title":"Configuring Sub-Windows"},{"location":"camera_views/#performance-impact","text":"Note : This section is outdated and has not been updated for new performance enhancement changes. Rendering these views does impact the FPS performance of the game, since this is additional work for the GPU. The following shows the impact on FPS when you open these views. This was measured on an Intel Core i7 computer with 32 GB RAM and a GeForce GTX 1080 graphics card running the Modular Neighborhood map, using cooked debug bits, no debugger or GameEditor open. The normal state with no subviews open measures around 16 ms per frame, which means it is keeping a nice steady 60 FPS (which is the target FPS). As it climbs up to 35 ms, the FPS drops to around 28 frames per second; spiking to 40 ms means a few drops to 25 FPS. The simulator can still function and fly correctly while all this is going on, even in the worst case, because the physics is decoupled from the rendering. However, if the delay gets so high that the communication with the PX4 hardware is interrupted due to an overly busy CPU, the flight can stall due to timeouts in the offboard control messages. On the computer where this was measured, the drone could fly the path.py program without any problems with all views open, and with 3 Python scripts running to capture each view type. There was one stall during this flight, but it recovered gracefully and completed the path. So it was right on the limit.
The following shows the impact on the CPU; perhaps a bit surprisingly, the CPU impact is also non-trivial.","title":"Performance Impact"},{"location":"cmake_linux/","text":"Installing cmake on Linux If you don't have cmake version 3.10 (for example, 3.2.2 is the default on Ubuntu 14), you can run the following: mkdir ~/cmake-3.10.2 cd ~/cmake-3.10.2 wget https://cmake.org/files/v3.10/cmake-3.10.2-Linux-x86_64.sh Now you have to run this command by itself (it is interactive): sh cmake-3.10.2-Linux-x86_64.sh --prefix ~/cmake-3.10.2 Answer 'n' to the question about creating another cmake-3.10.2-Linux-x86_64 folder and then sudo update-alternatives --install /usr/bin/cmake cmake ~/cmake-3.10.2/bin/cmake 60 Now type cmake --version to make sure your cmake version is 3.10.2.","title":"Installing cmake on Linux"},{"location":"cmake_linux/#installing-cmake-on-linux","text":"If you don't have cmake version 3.10 (for example, 3.2.2 is the default on Ubuntu 14), you can run the following: mkdir ~/cmake-3.10.2 cd ~/cmake-3.10.2 wget https://cmake.org/files/v3.10/cmake-3.10.2-Linux-x86_64.sh Now you have to run this command by itself (it is interactive): sh cmake-3.10.2-Linux-x86_64.sh --prefix ~/cmake-3.10.2 Answer 'n' to the question about creating another cmake-3.10.2-Linux-x86_64 folder and then sudo update-alternatives --install /usr/bin/cmake cmake ~/cmake-3.10.2/bin/cmake 60 Now type cmake --version to make sure your cmake version is 3.10.2.","title":"Installing cmake on Linux"},{"location":"custom_drone/","text":"AirLib on a Real Drone The AirLib library can be compiled and deployed on the companion computer on a real drone. For our testing, we mounted a Gigabyte Brix BXi7-5500 ultra compact PC on the drone, connected to the Pixhawk flight controller over USB. The Gigabyte PC is running Ubuntu, so we are able to SSH into it over Wi-Fi: Once connected you can run MavLinkTest with this command line: MavLinkTest -serial:/dev/ttyACM0,115200 -logdir:. This will produce a log file of the flight, which can then be used for playback in the simulator . You can also add -proxy:192.168.1.100:14550 to connect MavLinkTest to a remote computer where you can run QGroundControl or our PX4 Log Viewer, which is another handy way to see what is going on with your drone. MavLinkTest then has some simple commands for testing your drone; here's a simple example of some commands: arm takeoff 5 orbit 10 2 This will arm the drone, take off to 5 meters, then fly an orbit pattern with a radius of 10 meters at 2 m/s. Type '?' to find all available commands. Note: Some commands (for example, orbit ) are named differently and have different syntax in MavLinkTest and DroneShell (for example, circlebypath -radius 10 -velocity 21 ). When you land the drone you can stop MavLinkTest and copy the *.mavlink log file that was generated. DroneServer and DroneShell Once you are happy that MavLinkTest is working, you can also run DroneServer and DroneShell as follows. First, run MavLinkTest with a local proxy to send everything to DroneServer: MavLinkTest -serial:/dev/ttyACM0,115200 -logdir:. -proxy:127.0.0.1:14560 Change ~/Documents/AirSim/settings.json to say \"serial\":false, because we want DroneServer to look for this UDP connection. DroneServer 0 Lastly, you can now connect DroneShell to this instance of DroneServer and use the DroneShell commands to fly your drone: DroneShell ==||=> Welcome to DroneShell 1.0. Type ? for help. Microsoft Research (c) 2016. Waiting for drone to report a valid GPS location...
==||=> requestcontrol ==||=> arm ==||=> takeoff ==||=> circlebypath -radius 10 -velocity 2 PX4 Specific Tools You can run the MavlinkCom library and MavLinkTest app to test the connection between your companion computer and flight controller. How Does This Work? AirSim uses the MavLinkCom component developed by @lovettchris. MavLinkCom has a proxy architecture where you can open a connection to PX4 using either serial or UDP, and other components then share this connection. When PX4 sends a MavLink message, all components receive that message. If any component sends a message, it is received by PX4 only. This allows you to connect any number of components to PX4. This code opens a connection for LogViewer and QGC. You can add something more if you like. If you want to use QGC and AirSim together, then you will need to let QGC own the serial port. QGC opens up a TCP connection that acts as a proxy, so any other component can connect to QGC and send a MavLink message to QGC, and QGC then forwards that message to PX4. So you tell AirSim to connect to QGC and let QGC own the serial port. For the companion board, the way we did it earlier was to have a Gigabyte Brix on the drone. This is a full-fledged x86 computer that connects to PX4 through USB. We had Ubuntu on the Brix and ran DroneServer . DroneServer created an API endpoint that we could talk to via C++ client code (or Python code), translating API calls to MavLink messages. That way you can write your code against the same API, test it in the simulator and then run the same code on an actual vehicle. So the companion computer has DroneServer running along with client code.","title":"AirSim on Real Drones"},{"location":"custom_drone/#airlib-on-a-real-drone","text":"The AirLib library can be compiled and deployed on the companion computer on a real drone. For our testing, we mounted a Gigabyte Brix BXi7-5500 ultra compact PC on the drone, connected to the Pixhawk flight controller over USB. The Gigabyte PC is running Ubuntu, so we are able to SSH into it over Wi-Fi: Once connected you can run MavLinkTest with this command line: MavLinkTest -serial:/dev/ttyACM0,115200 -logdir:. This will produce a log file of the flight, which can then be used for playback in the simulator . You can also add -proxy:192.168.1.100:14550 to connect MavLinkTest to a remote computer where you can run QGroundControl or our PX4 Log Viewer, which is another handy way to see what is going on with your drone. MavLinkTest then has some simple commands for testing your drone; here's a simple example of some commands: arm takeoff 5 orbit 10 2 This will arm the drone, take off to 5 meters, then fly an orbit pattern with a radius of 10 meters at 2 m/s. Type '?' to find all available commands. Note: Some commands (for example, orbit ) are named differently and have different syntax in MavLinkTest and DroneShell (for example, circlebypath -radius 10 -velocity 21 ). When you land the drone you can stop MavLinkTest and copy the *.mavlink log file that was generated.","title":"AirLib on a Real Drone"},{"location":"custom_drone/#droneserver-and-droneshell","text":"Once you are happy that MavLinkTest is working, you can also run DroneServer and DroneShell as follows. First, run MavLinkTest with a local proxy to send everything to DroneServer: MavLinkTest -serial:/dev/ttyACM0,115200 -logdir:. -proxy:127.0.0.1:14560 Change ~/Documents/AirSim/settings.json to say \"serial\":false, because we want DroneServer to look for this UDP connection.
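For reference, a sketch of how that could look in the settings file (only the \"serial\" flag itself comes from the step above; its exact placement among your other settings may differ): { \"serial\": false }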
DroneServer 0 Lastly, you can now connect DroneShell to this instance of DroneServer and use the DroneShell commands to fly your drone: DroneShell ==||=> Welcome to DroneShell 1.0. Type ? for help. Microsoft Research (c) 2016. Waiting for drone to report a valid GPS location... ==||=> requestcontrol ==||=> arm ==||=> takeoff ==||=> circlebypath -radius 10 -velocity 2","title":"DroneServer and DroneShell"},{"location":"custom_drone/#px4-specific-tools","text":"You can run the MavlinkCom library and MavLinkTest app to test the connection between your companion computer and flight controller.","title":"PX4 Specific Tools"},{"location":"custom_drone/#how-does-this-work","text":"AirSim uses the MavLinkCom component developed by @lovettchris. MavLinkCom has a proxy architecture where you can open a connection to PX4 using either serial or UDP, and other components then share this connection. When PX4 sends a MavLink message, all components receive that message. If any component sends a message, it is received by PX4 only. This allows you to connect any number of components to PX4. This code opens a connection for LogViewer and QGC. You can add something more if you like. If you want to use QGC and AirSim together, then you will need to let QGC own the serial port. QGC opens up a TCP connection that acts as a proxy, so any other component can connect to QGC and send a MavLink message to QGC, and QGC then forwards that message to PX4. So you tell AirSim to connect to QGC and let QGC own the serial port. For the companion board, the way we did it earlier was to have a Gigabyte Brix on the drone. This is a full-fledged x86 computer that connects to PX4 through USB. We had Ubuntu on the Brix and ran DroneServer . DroneServer created an API endpoint that we could talk to via C++ client code (or Python code), translating API calls to MavLink messages. That way you can write your code against the same API, test it in the simulator and then run the same code on an actual vehicle. So the companion computer has DroneServer running along with client code.","title":"How Does This Work?"},{"location":"distance_sensor/","text":"Distance Sensor By default, the Distance Sensor points to the front of the vehicle. It can be pointed in any direction by modifying the settings. Configurable Parameters - Parameter Description X Y Z Position of the sensor relative to the vehicle (in NED, in meters) (Default (0,0,0)-Multirotor, (0,0,-1)-Car) Yaw Pitch Roll Orientation of the sensor relative to the vehicle (degrees) (Default (0,0,0)) MinDistance Minimum distance measured by distance sensor (metres, only used to fill Mavlink message for PX4) (Default 0.2m) MaxDistance Maximum distance measured by distance sensor (metres) (Default 40.0m) ExternalController Whether data is to be sent to external controller such as ArduPilot or PX4 if being used (default true ) For example, to make the sensor point towards the ground (for altitude measurement similar to barometer), the orientation can be modified as follows - \"Distance\": { \"SensorType\": 5, \"Enabled\" : true, \"Yaw\": 0, \"Pitch\": -90, \"Roll\": 0 } Note: For Cars, the sensor is placed 1 meter above the vehicle center by default. This is required since otherwise the sensor gives strange data due to it being inside the vehicle. This doesn't affect the sensor values, say, when measuring the distance between 2 cars.
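To complement the settings above, a minimal Python sketch for reading this sensor through the client API (the cosysairsim module name and the sensor name 'Distance' are assumptions that must match your own settings.json):
import cosysairsim as airsim  # assumed module name; older installs use 'import airsim'

client = airsim.CarClient()
client.confirmConnection()

# 'Distance' must match the sensor name used under 'Sensors' in settings.json (an assumption here).
data = client.getDistanceSensorData(distance_sensor_name='Distance', vehicle_name='')
print('distance: %.2f m (min %.2f m, max %.2f m)' % (data.distance, data.min_distance, data.max_distance))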
See PythonClient/car/distance_sensor_multi.py for an example usage.","title":"Distance Sensor"},{"location":"distance_sensor/#distance-sensor","text":"By default, the Distance Sensor points to the front of the vehicle. It can be pointed in any direction by modifying the settings. Configurable Parameters - Parameter Description X Y Z Position of the sensor relative to the vehicle (in NED, in meters) (Default (0,0,0)-Multirotor, (0,0,-1)-Car) Yaw Pitch Roll Orientation of the sensor relative to the vehicle (degrees) (Default (0,0,0)) MinDistance Minimum distance measured by distance sensor (metres, only used to fill Mavlink message for PX4) (Default 0.2m) MaxDistance Maximum distance measured by distance sensor (metres) (Default 40.0m) ExternalController Whether data is to be sent to external controller such as ArduPilot or PX4 if being used (default true ) For example, to make the sensor point towards the ground (for altitude measurement similar to barometer), the orientation can be modified as follows - \"Distance\": { \"SensorType\": 5, \"Enabled\" : true, \"Yaw\": 0, \"Pitch\": -90, \"Roll\": 0 } Note: For Cars, the sensor is placed 1 meter above the vehicle center by default. This is required since otherwise the sensor gives strange data due to it being inside the vehicle. This doesn't affect the sensor values, say, when measuring the distance between 2 cars. See PythonClient/car/distance_sensor_multi.py for an example usage.","title":"Distance Sensor"},{"location":"dynamic_objects/","text":"Setup Dynamic Objects for Scenario Environments for Cosys-AirSim The available environments often feature some custom-made dynamic blueprints that can be used to create random but deterministic changes in your environment. Location While these can be found in the environments available, they are also separately saved in Unreal/Environments/DynamicObjects . Copy the C++ files to your environment's Source folder ( Environments/Source/levelname/ ) and copy the uassets to your Contents folder. Features Dynamic AI humans walking between waypoints Dynamic spawning of stacked goods (pallets etc.) Dynamic static objects spawning (either always the same or picked from a set of options) Small dynamic changes such as randomly opened doors. All randomization is controllable by a seed to make sure you can simulate the same setup again. Other animate objects such as modular conveyor belts and robotic arms are available as well. Some features can also be configured with a launchfile/launch parameters. Usage There are several object types and settings to make the environment dynamic. Here are some simple instructions to tweak and alter their behaviour using the created blueprints. Seed & World Dynamics Configuration To control the randomisation functionality used in the dynamic objects, a controllable seed number is used. In every level using the dynamic objects, an actor of the class Dynamic World Master has to be present. This object will visually show the chosen seed every time the simulation is started (it can be hidden as well with a toggle). There are a few other toggles available as well. The seed and these other settings can be controlled both in standalone (packaged builds) and in the editor: - Editor : - To control the seed in the editor, change the Editor Seed setting of the Dynamic World Master actor in the level. If it is set to 0, it will generate a random seed number. But if set to anything else, it will use that number as seed for all dynamic objects.
- To make the world static and turn off all dynamic changes throughout the simulation (conveyor belts, randomized changes to statics), set the Editor Is Static boolean to true. Default is false. - To toggle the AI in the world on and off, set the Editor Spawn AI . Defaults to true. - Standalone : - To control the seed when using the simulator as standalone, use the launch parameter -startSeed INT with the integer being the chosen seed value. If not set, it will choose a random one. - To make the world static and turn off all dynamic changes throughout the simulation (conveyor belts, randomized changes to statics), add a launch parameter -isStatic BOOL with the boolean set to true. If not set, it defaults to false. - To toggle the AI in the world on and off, use the launch parameter -spawnAI BOOL . If not set, it defaults to true. Start Point In order for your environment to have multiple starting points, the Dynamic World Master can be configured to teleport the AirSim vehicle after launch to one of several manually defined starting points. To define a new start point in your environment, place objects of the type Target Point in your environment. At launch these will be marked as potential starting points for the simulator. They are used in the order that they appear in the World Outliner. To configure which starting point is used, you can configure the number: - Editor : To control the starting point in the editor, change the Editor Start Point setting of the Dynamic World Master actor in the level. - Standalone : To control the starting point in a standalone build, use the launch parameter -startPoint INT with the integer being the chosen starting point. Dynamic Marked Objects Objects in your environment can be marked as 'dynamic'. Its main purpose is to offer a simple setup with a configurable number of dynamic objects that can move or be deleted. All dynamic objects (those that can be removed or moved) need to have their actor tag set to DynamicObject . To control the dynamic objects, a few parameters are available: - Remove percentage : percentage of marked objects to remove randomly - Move percentage : percentage of marked objects to move and rotate slightly - Move offset : maximum distance the object can move in cm - Rotation offset : maximum rotation in degrees the object can rotate These are available in the editor as well, in the DynamicWorldMaster actor's settings, or can be set up via the launch parameters. - Editor : Search for the settings Editor Remove Percentage , Editor Move Percentage , Editor Move Offset Value and Editor Rotation Offset Value to configure the dynamic marked objects system. - Standalone : Use the launch parameters -removePercentage INT , -movePercentage INT , -moveOffsetValue INT and -moveRotationValue INT to configure the dynamic marked objects system. Furthermore, you can mark an object as Guide with an actor tag. This will print out the horizontal distance of all marked dynamic objects to these Guide Objects for debugging or validation purposes. LaunchFile You can also use a file to define the previous dynamic setting configurations line per line. Then by pressing a button ( O ) you can switch to the next configuration. This file has the following structure: seed,removePercentage,movePercentage,moveOffsetValue,moveRotationValue For example you can create launchfile.ini , each line defining a new configuration: 0,50,25,50,50 450,10,10,50,50 450,10,10,50,50 500,10,10,50,50 450,10,10,50,50 Do note that this only configures those 5 settings.
The Starting point , Is Static and Spawn AI settings are not configured this way and are configured just as before. To make the environment load this file, you need to define it, which, similarly to before, differs between the editor and standalone: - Editor : - To control the launchfile in the editor, enable the Use Launch File toggle of the DynamicWorldMaster actor in the level and set the Editor Launch File field to the absolute filepath of the launchfile. - Standalone : - To control the launchfile when using the simulator as standalone, use the launch parameter -launchFile STRING and set it to the absolute filepath of the launchfile. Dynamic Static Spawners Some blueprints are also available to be used to spawn dynamic objects. Currently, there are 4 dynamic static object spawner blueprints available: - RandomStackSpawner : This can be used to create a dynamically formed stack of goods, like a pallet with boxes spawned on top. One can control it with the following settings: Setting Description Static The StaticMesh that needs to be stacked dynamically Min Width Count Minimum amount of statics to spawn in the Y direction Max Width Count Maximum amount of statics to spawn in the Y direction Min Length Count Minimum amount of statics to spawn in the X direction Max Length Count Maximum amount of statics to spawn in the X direction Min Height Count Minimum amount of statics to spawn in the Z direction Max Height Count Maximum amount of statics to spawn in the Z direction Random Rotation Boolean to toggle the application of a random rotation to the object (for barrels and other cylindrical objects) Random Position Offset Value in cm to apply a random offset in position in any direction Chance To Spawn Percentage of chance to spawn each object in the stack Chance to Change Percentage of chance to alter the stack configuration every so many seconds Average Time Between Changes Average time delta in seconds between changes Max Time Between Changes Offset Maximum time delta offset in seconds between changes (so as not to have all objects change at the same time, a small random offset is used) Nav Collision W/L/H This setting can be used to create an area around the object spawner that the Dynamic AI pathfinding will stay away from. RandomStackSpawnerSwitcher : This can be used to create a dynamically formed stack of goods, like a pallet with boxes spawned on top. The difference with the one above is that this one can randomly select a 'goods'/object type, and its stacking settings, from a Data Table object. One can control it with the following settings: Setting Description Data Table The Data Table object of the type RandomStackSpawnerSwitcherStruct to set the object types and their settings similar to the ones above for the normal RandomStackSpawner Chance To Spawn Percentage of chance to spawn each object in the stack Chance to Change Percentage of chance to alter the stack configuration every so many seconds Chance To Switch Percentage of chance to switch to a different object type from the Data Table Average Time Between Changes Average time delta in seconds between changes Max Time Between Changes Offset Maximum time delta offset in seconds between changes (so as not to have all objects change at the same time, a small random offset is used) RandomStaticModifier : This can be used to spawn a singular static and alter its spawn transform dynamically.
One can control it with the following settings: Setting Description Static The StaticMesh that needs to be spawned Chance To Spawn Percentage of chance to spawn the object Max Rotation Offset Maximum rotation in degrees (both positive and negative) to alter the transform Max XPosition Offset Maximum position offset in cm (both positive and negative) to alter the transform in the X axis Max YPosition Offset Maximum position offset in cm (both positive and negative) to alter the transform in the Y axis Chance to Change Percentage of chance to alter the stack configuration every so many seconds Average Time Between Changes Average time delta in seconds between changes Max Time Between Changes Offset Maximum time delta offset in seconds between changes (so as not to have all objects change at the same time, a small random offset is used) RandomStaticPicker : This can be used to spawn a singular randomly picked static out of a list of chosen statics and alter its spawn transform dynamically. One can control it with the following settings: Setting Description Statics The list of StaticMesh objects that can be picked from to spawn one Chance To Spawn Percentage of chance to spawn the object Max Rotation Offset Maximum rotation in degrees (both positive and negative) to alter the transform Max XPosition Offset Maximum position offset in cm (both positive and negative) to alter the transform in the X axis Max YPosition Offset Maximum position offset in cm (both positive and negative) to alter the transform in the Y axis Chance to Change Percentage of chance to alter the stack configuration every so many seconds Average Time Between Changes Average time delta in seconds between changes Max Time Between Changes Offset Maximum time delta offset in seconds between changes (so as not to have all objects change at the same time, a small random offset is used) There are some other simpler dynamic objects such as doors and conveyor belts that have self-explanatory settings very similar to those above. All are based on the seed randomisation. Grouped AI A human-looking AI is also available to walk dynamically between a set of self-chosen waypoints. They are based on the DetourAIController of Unreal, so they will avoid each other and the user pretty well. In order to have more control over them, some custom blueprints were created. Their main features are that the AI themselves, the waypoints and the spawners can be assigned a group ID number, so that all functionality is grouped. They also use the Seed randomisation so that they can be spawned at the same waypoints and target the same waypoints each time if the same seed value is chosen. The following blueprints are available: - GroupedTargetPoint : These are the TargetPoints (waypoints) that the AI will walk between. Their only setting is the group ID number to decide which AI group will be able to pick this waypoint to spawn AI and be the target waypoint for them. - GroupedAI : One can manually spawn an AI by placing these in the world. They need to be assigned the Group ID manually to choose which waypoints to target. - GroupedAISpawner : To automate the spawning of AI, one can use this blueprint. It will spawn AI at the waypoints of the same group. A setting is available to configure the fill percentage. This will set the percentage of waypoints to spawn AI upon. One also has to choose which Skeletal Meshes and their Animation Blueprints can be picked from. Spline Animations Making statics and skeletal meshes move along a spline path at a fixed speed.
See the video below for more information on how it works:","title":"Dynamic Objects"},{"location":"dynamic_objects/#setup-dynamic-objects-for-scenario-environments-for-cosys-airsim","text":"The available environments often feature some custom-made dynamic blueprints that can be used to create random but deterministic changes in your environment.","title":"Setup Dynamic Objects for Scenario Environments for Cosys-AirSim"},{"location":"dynamic_objects/#location","text":"While these can be found in the environments available, they are also separately saved in Unreal/Environments/DynamicObjects . Copy the C++ files to your environment's Source folder ( Environments/Source/levelname/ ) and copy the uassets to your Contents folder.","title":"Location"},{"location":"dynamic_objects/#features","text":"Dynamic AI humans walking between waypoints Dynamic spawning of stacked goods (pallets etc.) Dynamic static objects spawning (either always the same or picked from a set of options) Small dynamic changes such as randomly opened doors. All randomization is controllable by a seed to make sure you can simulate the same setup again. Other animate objects such as modular conveyor belts and robotic arms are available as well. Some features can also be configured with a launchfile/launch parameters.","title":"Features"},{"location":"dynamic_objects/#usage","text":"There are several object types and settings to make the environment dynamic. Here are some simple instructions to tweak and alter their behaviour using the created blueprints.","title":"Usage"},{"location":"dynamic_objects/#seed-world-dynamics-configuration","text":"To control the randomisation functionality used in the dynamic objects, a controllable seed number is used. In every level using the dynamic objects, an actor of the class Dynamic World Master has to be present. This object will visually show the chosen seed every time the simulation is started (it can be hidden as well with a toggle). There are a few other toggles available as well. The seed and these other settings can be controlled both in standalone (packaged builds) and in the editor: - Editor : - To control the seed in the editor, change the Editor Seed setting of the Dynamic World Master actor in the level. If it is set to 0, it will generate a random seed number. But if set to anything else, it will use that number as seed for all dynamic objects. - To make the world static and turn off all dynamic changes throughout the simulation (conveyor belts, randomized changes to statics), set the Editor Is Static boolean to true. Default is false. - To toggle the AI in the world on and off, set the Editor Spawn AI . Defaults to true. - Standalone : - To control the seed when using the simulator as standalone, use the launch parameter -startSeed INT with the integer being the chosen seed value. If not set, it will choose a random one. - To make the world static and turn off all dynamic changes throughout the simulation (conveyor belts, randomized changes to statics), add a launch parameter -isStatic BOOL with the boolean set to true. If not set, it defaults to false. - To toggle the AI in the world on and off, use the launch parameter -spawnAI BOOL . If not set, it defaults to true.","title":"Seed & World Dynamics Configuration"},{"location":"dynamic_objects/#start-point","text":"In order for your environment to have multiple starting points, the Dynamic World Master can be configured to teleport the AirSim vehicle after launch to one of several manually defined starting points.
To define a new start point in your environment, place objects of the type Target Point in your environment. At launch these will be marked as potential starting points for the simulator. They are used in the order that they appear in the World Outliner. To configure which starting point is used, you can configure the number: - Editor : To control the starting point in the editor, change the Editor Start Point setting of the Dynamic World Master actor in the level. - Standalone : To control the starting point in a standalone build, use the launch parameter -startPoint INT with the integer being the chosen starting point.","title":"Start Point"},{"location":"dynamic_objects/#dynamic-marked-objects","text":"Objects in your environment can be marked as 'dynamic'. Its main purpose is to offer a simple setup with a configurable number of dynamic objects that can move or be deleted. All dynamic objects (those that can be removed or moved) need to have their actor tag set to DynamicObject . To control the dynamic objects, a few parameters are available: - Remove percentage : percentage of marked objects to remove randomly - Move percentage : percentage of marked objects to move and rotate slightly - Move offset : maximum distance the object can move in cm - Rotation offset : maximum rotation in degrees the object can rotate These are available in the editor as well, in the DynamicWorldMaster actor's settings, or can be set up via the launch parameters. - Editor : Search for the settings Editor Remove Percentage , Editor Move Percentage , Editor Move Offset Value and Editor Rotation Offset Value to configure the dynamic marked objects system. - Standalone : Use the launch parameters -removePercentage INT , -movePercentage INT , -moveOffsetValue INT and -moveRotationValue INT to configure the dynamic marked objects system. Furthermore, you can mark an object as Guide with an actor tag. This will print out the horizontal distance of all marked dynamic objects to these Guide Objects for debugging or validation purposes.","title":"Dynamic Marked Objects"},{"location":"dynamic_objects/#launchfile","text":"You can also use a file to define the previous dynamic setting configurations line per line. Then by pressing a button ( O ) you can switch to the next configuration. This file has the following structure: seed,removePercentage,movePercentage,moveOffsetValue,moveRotationValue For example you can create launchfile.ini , each line defining a new configuration: 0,50,25,50,50 450,10,10,50,50 450,10,10,50,50 500,10,10,50,50 450,10,10,50,50 Do note that this only configures those 5 settings. The Starting point , Is Static and Spawn AI settings are not configured this way and are configured just as before. To make the environment load this file, you need to define it, which, similarly to before, differs between the editor and standalone: - Editor : - To control the launchfile in the editor, enable the Use Launch File toggle of the DynamicWorldMaster actor in the level and set the Editor Launch File field to the absolute filepath of the launchfile. - Standalone : - To control the launchfile when using the simulator as standalone, use the launch parameter -launchFile STRING and set it to the absolute filepath of the launchfile.","title":"LaunchFile"},{"location":"dynamic_objects/#dynamic-static-spawners","text":"Some blueprints are also available to be used to spawn dynamic objects.
Currently, there are 4 dynamic static object spawner blueprints available: - RandomStackSpawner : This can be used to create a dynamically formed stack of goods, like a pallet with boxes spawned on top. One can control it with the following settings: Setting Description Static The StaticMesh that needs to be stacked dynamically Min Width Count Minimum amount of statics to spawn in the Y direction Max Width Count Maximum amount of statics to spawn in the Y direction Min Length Count Minimum amount of statics to spawn in the X direction Max Length Count Maximum amount of statics to spawn in the X direction Min Height Count Minimum amount of statics to spawn in the Z direction Max Height Count Maximum amount of statics to spawn in the Z direction Random Rotation Boolean to toggle the application of a random rotation to the object (for barrels and other cylindrical objects) Random Position Offset Value in cm to apply a random offset in position in any direction Chance To Spawn Percentage of chance to spawn each object in the stack Chance to Change Percentage of chance to alter the stack configuration every so many seconds Average Time Between Changes Average time delta in seconds between changes Max Time Between Changes Offset Maximum time delta offset in seconds between changes (so as not to have all objects change at the same time, a small random offset is used) Nav Collision W/L/H This setting can be used to create an area around the object spawner that the Dynamic AI pathfinding will stay away from. RandomStackSpawnerSwitcher : This can be used to create a dynamically formed stack of goods, like a pallet with boxes spawned on top. The difference with the one above is that this one can randomly select a 'goods'/object type, and its stacking settings, from a Data Table object. One can control it with the following settings: Setting Description Data Table The Data Table object of the type RandomStackSpawnerSwitcherStruct to set the object types and their settings similar to the ones above for the normal RandomStackSpawner Chance To Spawn Percentage of chance to spawn each object in the stack Chance to Change Percentage of chance to alter the stack configuration every so many seconds Chance To Switch Percentage of chance to switch to a different object type from the Data Table Average Time Between Changes Average time delta in seconds between changes Max Time Between Changes Offset Maximum time delta offset in seconds between changes (so as not to have all objects change at the same time, a small random offset is used) RandomStaticModifier : This can be used to spawn a singular static and alter its spawn transform dynamically.
One can control it with the following settings: Setting Description Static The StaticMesh that needs to be spawned Chance To Spawn Percentage of chance to spawn the object Max Rotation Offset Maximum rotation in degrees (both positive and negative) to alter the transform Max XPosition Offset Maximum position offset in cm (both positive and negative) to alter the transform in the X axis Max YPosition Offset Maximum position offset in cm (both positive and negative) to alter the transform in the Y axis Chance to Change Percentage of chance to alter the stack configuration every so many seconds Average Time Between Changes Average time delta in seconds between changes Max Time Between Changes Offset Maximum time delta offset in seconds between changes (so as not to have all objects change at the same time, a small random offset is used) RandomStaticPicker : This can be used to spawn a singular randomly picked static out of a list of chosen statics and alter its spawn transform dynamically. One can control it with the following settings: Setting Description Statics The list of StaticMesh objects that can be picked from to spawn one Chance To Spawn Percentage of chance to spawn the object Max Rotation Offset Maximum rotation in degrees (both positive and negative) to alter the transform Max XPosition Offset Maximum position offset in cm (both positive and negative) to alter the transform in the X axis Max YPosition Offset Maximum position offset in cm (both positive and negative) to alter the transform in the Y axis Chance to Change Percentage of chance to alter the stack configuration every so many seconds Average Time Between Changes Average time delta in seconds between changes Max Time Between Changes Offset Maximum time delta offset in seconds between changes (so as not to have all objects change at the same time, a small random offset is used) There are some other simpler dynamic objects such as doors and conveyor belts that have self-explanatory settings very similar to those above. All are based on the seed randomisation.","title":"Dynamic Static Spawners"},{"location":"dynamic_objects/#grouped-ai","text":"A human-looking AI is also available to walk dynamically between a set of self-chosen waypoints. They are based on the DetourAIController of Unreal, so they will avoid each other and the user pretty well. In order to have more control over them, some custom blueprints were created. Their main features are that the AI themselves, the waypoints and the spawners can be assigned a group ID number, so that all functionality is grouped. They also use the Seed randomisation so that they can be spawned at the same waypoints and target the same waypoints each time if the same seed value is chosen. The following blueprints are available: - GroupedTargetPoint : These are the TargetPoints (waypoints) that the AI will walk between. Their only setting is the group ID number to decide which AI group will be able to pick this waypoint to spawn AI and be the target waypoint for them. - GroupedAI : One can manually spawn an AI by placing these in the world. They need to be assigned the Group ID manually to choose which waypoints to target. - GroupedAISpawner : To automate the spawning of AI, one can use this blueprint. It will spawn AI at the waypoints of the same group. A setting is available to configure the fill percentage. This will set the percentage of waypoints to spawn AI upon.
One also has to choose which Skeletal Meshes and their Animation Blueprints can be picked from.","title":"Grouped AI"},{"location":"dynamic_objects/#spline-animations","text":"Making statics and skeletal meshes move along a spline path at a fixed speed. See the video below for more information on how it works:","title":"Spline Animations"},{"location":"echo/","text":"How to Use Echo sensor modalities in Cosys-AirSim Cosys-AirSim supports Echo sensors for multirotors and cars. Echo sensors can be configured to behave like sonar, radar or other echo-based sensor types. Enabling an echo sensor and configuring its settings is done via the AirSimSettings json. Please see general sensors for information on configuration of general/shared sensor settings. Enabling echo sensor on a vehicle By default, echo sensors are not enabled. To enable one, set the SensorType and Enabled attributes in settings json. \"echo1\": { \"SensorType\": 7, \"Enabled\" : true, Multiple echo sensors can be enabled on a vehicle . Echo configuration The following parameters can be configured right now via settings json. Parameter Description X Y Z Position of the echo sensor relative to the vehicle (in NED, in meters) Roll Pitch Yaw Orientation of the echo sensor relative to the vehicle (in degrees, yaw-pitch-roll order to front vector +X) External Uncouple the sensor from the vehicle. If enabled, the position and orientation will be relative to Unreal world coordinates ExternalLocal When in external mode, if this is enabled the retrieved pose of the sensor will be in Local NED coordinates (from the starting position of the vehicle) and not in the converted Unreal NED coordinates, which is the default runParallel Uses CPU parallelisation for speeding up the ray casting for active sensing. This disables all debug drawing except for the final reflected points if enabled (DrawReflectedPoints) SenseActive Enable active sensing where the sensor will emit a signal and receive signals from the reflections SensePassive Enable passive sensing where the sensor will receive signals from other active sources in the world (Passive Echo Beacons, see details below) PassiveRadius The radius in meters in which the sensor will receive signals from passive sources if that mode is enabled NumberOfTraces Amount of traces (rays) being cast. If set to a negative value, it will only do 2D sensing in the horizontal plane! SensorLowerAzimuthLimit The lower azimuth angle limit in degrees for receiving signals on the sensor (default = -90) SensorUpperAzimuthLimit The upper azimuth angle limit in degrees for receiving signals on the sensor (default = 90) SensorLowerElevationLimit The lower elevation angle limit in degrees for receiving signals on the sensor (default = -90) SensorUpperElevationLimit The upper elevation angle limit in degrees for receiving signals on the sensor (default = 90) MeasurementFrequency The frequency of the sensor (measurements/s) SensorDiameter The diameter of the sensor plane used to capture the reflecting traces (meter) ReflectionOpeningAngle Opening angle of reflections (degrees) ReflectionLimit Maximum amount of reflections that can happen.
ReflectionDistanceLimit Maximum distance between two reflections (meters) AttenuationPerDistance Attenuation of signal wrt distance traveled (dB/m) AttenuationPerReflection Attenuation of signal wrt reflections (dB) AttenuationLimit Attenuation at which the signal is considered dissipated (dB) DistanceLimit Maximum distance a reflection can travel (meters) PauseAfterMeasurement Pause the simulation after each measurement. Useful for API interaction to be synced IgnoreMarked Remove objects with the Unreal Tag MarkedIgnore from the sensor data DrawReflectedPoints Draw debug points in world where reflected points are captured by the sensor DrawReflectedLines Draw debug lines in world from reflected points to the sensor DrawReflectedPaths Draw the full paths of the reflected points DrawInitialPoints Draw the points of the initial half sphere where the traces (rays) are cast DrawExternalPoints Draw a pointcloud coming through the API from an external source DrawBounceLines Draw lines of all bouncing reflections of the traces with their color depending on attenuation DrawPassiveSources Draw debug points and reflection lines for all detected passive echo sources (original sources and their reflection echos against objects) DrawPassiveLines Draw debug lines of the sensor to the passive echo sources that are detected with line of sight. DrawSensor Draw the physical sensor in the world on the vehicle with a 3D axes shown where the sensor is e.g., { \"SeeDocsAt\": \"https://cosys-lab.github.io/settings/\", \"SettingsVersion\": 2.0, \"SimMode\": \"SkidVehicle\", \"Vehicles\": { \"CPHusky\": { \"VehicleType\": \"CPHusky\", \"AutoCreate\": true, \"Sensors\": { \"SonarSensor1\": { \"SensorType\": 7, \"Enabled\": true, \"X\": 0, \"Y\": 0, \"Z\": -0.55, \"Roll\": 0, \"Pitch\": 0, \"Yaw\": 0, \"SenseActive\": true, \"SensePassive\": false, \"MeasurementFrequency\": 5, \"NumberOfTraces\": 10000, \"SensorDiameter\": 0.5, \"SensorLowerAzimuthLimit\": -90, \"SensorUpperAzimuthLimit\": 90, \"SensorLowerElevationLimit\": -90, \"SensorUpperElevationLimit\": 90, \"AttenuationPerDistance\": 0, \"AttenuationPerReflection\": 0, \"AttenuationLimit\": -100, \"DistanceLimit\": 10, \"ReflectionLimit\": 3, \"ReflectionDistanceLimit\": 0.4, \"ReflectionOpeningAngle\": 10 } } } } } Passive Echo Beacons While the default configuration of the echo sensor is to emit a signal and receive the reflections, it is also possible to have passive echo sources in the world. These are objects that emit a signal and the echo sensor will receive the reflections of these signals. This can be used to simulate other echo sources in the world that are not the echo sensor itself. One can define these from the Unreal Editor itself or through the AirSimSettings json file. In the Editor, use the search function to look for Passive Echo Beacon and add it to the world. You can alter the settings from the Details panel. In the AirSimSettings json file you can define new beacons under the PassiveEchoBeacons section. The beacons have the following settings: Parameter Description X Y Z Position of the beacon relative to the Unreal World origin, so not in the robot reference frame! (in NED, in meters) Roll Pitch Yaw Orientation of the beacon relative to the Unreal World origin, so not in the robot reference frame! (in degrees, yaw-pitch-roll order to front vector +X) Enable Toggle the beacon on or off. InitialDirections Amount of traces (rays) being cast. This defines the resolution of the resulting reflection point cloud.
SensorLowerAzimuthLimit The lower azimuth angle limit in degrees for sending out the initial rays of the source. (default = -90) SensorUpperAzimuthLimit The upper azimuth angle limit in degrees for sending out the initial rays of the source. (default = 90) SensorLowerElevationLimit The lower elevation angle limit in degrees for sending out the initial rays of the source. (default = -90) SensorUpperElevationLimit The upper elevation angle limit in degrees for sending out the initial rays of the source. (default = 90) ReflectionLimit Maximum amount of reflections that can happen. ReflectionDistanceLimit Maximum distance between two reflections (meters) ReflectionOnlyFinal Only save the final reflection along a trace. This will ignore all other reflections that happen along the trace in the data AttenuationPerDistance Attenuation of signal wrt distance traveled (dB/m) AttenuationPerReflection Attenuation of signal wrt reflections (dB) AttenuationLimit Attenuation at which the signal is considered dissipated (dB) DistanceLimit Maximum distance a reflection can travel (meters) DrawDebugAllPoints Draw debug points in world where reflected points are happening due to this source. It will also show the reflection direction with a line DrawDebugAllLines Draw all lines that are being cast from the source to the reflections, not only the ones that are reflected DrawDebugLocation Draw a 3D axes shown where the source is DrawDebugDuration Duration in seconds that the debug points and lines will be shown in the world. -1 is infinite. In the settings file this can look like this example : { \"SeeDocsAt\": \"https://cosys-lab.github.io/settings/\", \"SettingsVersion\": 2.0, \"SimMode\": \"SkidVehicle\", \"ViewMode\": \"\", \"Vehicles\": { \"airsimvehicle\": { \"VehicleType\": \"CPHusky\", \"AutoCreate\": true, \"Sensors\": { ... \"echo\": { \"SensorType\": 7, \"Enabled\": true, ... \"DrawPassiveSources\": false, \"DrawPassiveLines\": true, \"DrawSensor\": true, \"SenseActive\": false, \"SensePassive\": true, \"PassiveRadius\" : 10 } } } }, \"PassiveEchoBeacons\": { \"passiveEchoBeacon1\": { \"X\": 5, \"Y\": 5, \"Z\": -5, \"Roll\": 0, \"Pitch\": 0, \"Yaw\": 0, \"Enable\" : true, \"InitialDirections\": 1000, \"SensorLowerAzimuthLimit\": -90, \"SensorUpperAzimuthLimit\": 90, \"SensorLowerElevationLimit\": -90, \"SensorUpperElevationLimit\": 90, \"AttenuationPerDistance\": 0, \"AttenuationPerReflection\": 0, \"AttenuationLimit\": -100, \"DistanceLimit\": 10, \"ReflectionLimit\": 3, \"DrawDebugAllPoints\": true, \"DrawDebugAllLines\": false, \"DrawDebugLocation\": true, \"DrawDebugDuration\": -1 } } } Client API Use getEchoData(sensor name, vehicle name) API to retrieve the echo sensor data. The API returns Point-Cloud(s) as a flat array of floats, the final attenuation, total distance and reflection count (+ reflection normal for passive beacon reflections) along with the timestamp of the capture and sensor pose. Echo Pose: Default: Echo sensor pose in the vehicle frame / External: If set to External (see table) the coordinates will be in either Unreal NED when ExternalLocal is false or Local NED (from the starting position of the vehicle) when ExternalLocal is true . Active Point-Cloud: The floats represent [x, y, z, attenuation, total_distance, reflection_count] for each point hit within the range in the last scan in NED format. Active Groundtruth: For each point of the Active Point-Cloud a label string is kept that has the name of the object that the point belongs to.
Passive Point-Cloud: The floats represent [x, y, z, attenuation, total_distance, reflection_count, reflection angle x, reflection angle y, reflection angle z] for each point hit within the range in the last scan in NED format. Passive Groundtruth: For each point of the Passive Point-Cloud two strings are kept. The first is a label string representing the object of the reflection and the second is the name of the Passive Echo Beacon that was the source of this reflection. Use setEchoData(sensor name, vehicle name, echo data) API to render an external pointcloud back to the simulation. It expects the data to be [x,y,z] as a flat array of floats.","title":"Pulse Echo"},{"location":"echo/#how-to-use-echo-sensor-modalities-in-cosys-airsim","text":"Cosys-AirSim supports Echo sensors for multirotors and cars. Echo sensors can be configured to behave like sonar, radar or other echo-based sensor types. Enabling an echo sensor and configuring its settings is done via the AirSimSettings json. Please see general sensors for information on configuration of general/shared sensor settings.","title":"How to Use Echo sensor modalities in Cosys-AirSim"},{"location":"echo/#enabling-echo-sensor-on-a-vehicle","text":"By default, echo sensors are not enabled. To enable one, set the SensorType and Enabled attributes in settings json. \"echo1\": { \"SensorType\": 7, \"Enabled\" : true, Multiple echo sensors can be enabled on a vehicle .","title":"Enabling echo sensor on a vehicle"},{"location":"echo/#echo-configuration","text":"The following parameters can be configured right now via settings json. Parameter Description X Y Z Position of the echo sensor relative to the vehicle (in NED, in meters) Roll Pitch Yaw Orientation of the echo sensor relative to the vehicle (in degrees, yaw-pitch-roll order to front vector +X) External Uncouple the sensor from the vehicle. If enabled, the position and orientation will be relative to Unreal world coordinates ExternalLocal When in external mode, if this is enabled the retrieved pose of the sensor will be in Local NED coordinates (from the starting position of the vehicle) and not in the converted Unreal NED coordinates, which is the default runParallel Uses CPU parallelisation for speeding up the ray casting for active sensing. This disables all debug drawing except for the final reflected points if enabled (DrawReflectedPoints) SenseActive Enable active sensing where the sensor will emit a signal and receive signals from the reflections SensePassive Enable passive sensing where the sensor will receive signals from other active sources in the world (Passive Echo Beacons, see details below) PassiveRadius The radius in meters in which the sensor will receive signals from passive sources if that mode is enabled NumberOfTraces Amount of traces (rays) being cast. If set to a negative value, it will only do 2D sensing in the horizontal plane!
SensorLowerAzimuthLimit The lower azimuth angle limit in degrees for receiving signals on the sensor (default = -90) SensorUpperAzimuthLimit The upper azimuth angle limit in degrees for receiving signals on the sensor (default = 90) SensorLowerElevationLimit The lower elevation angle limit in degrees for receiving signals on the sensor (default = -90) SensorUpperElevationLimit The upper elevation angle limit in degrees for receiving signals on the sensor (default = 90) MeasurementFrequency The frequency of the sensor (measurements/s) SensorDiameter The diameter of the sensor plane used to capture the reflecting traces (meter) ReflectionOpeningAngle Opening angle of reflections (degrees) ReflectionLimit Maximum amount of reflections that can happen. ReflectionDistanceLimit Maximum distance between two reflections (meters) AttenuationPerDistance Attenuation of signal wrt distance traveled (dB/m) AttenuationPerReflection Attenuation of signal wrt reflections (dB) AttenuationLimit Attenuation at which the signal is considered dissipated (dB) DistanceLimit Maximum distance a reflection can travel (meters) PauseAfterMeasurement Pause the simulation after each measurement. Useful for API interaction to be synced IgnoreMarked Remove objects with the Unreal Tag MarkedIgnore from the sensor data DrawReflectedPoints Draw debug points in world where reflected points are captured by the sensor DrawReflectedLines Draw debug lines in world from reflected points to the sensor DrawReflectedPaths Draw the full paths of the reflected points DrawInitialPoints Draw the points of the initial half sphere where the traces (rays) are cast DrawExternalPoints Draw a pointcloud coming through the API from an external source DrawBounceLines Draw lines of all bouncing reflections of the traces with their color depending on attenuation DrawPassiveSources Draw debug points and reflection lines for all detected passive echo sources (original sources and their reflection echos against objects) DrawPassiveLines Draw debug lines of the sensor to the passive echo sources that are detected with line of sight. DrawSensor Draw the physical sensor in the world on the vehicle with a 3D axes shown where the sensor is e.g., { \"SeeDocsAt\": \"https://cosys-lab.github.io/settings/\", \"SettingsVersion\": 2.0, \"SimMode\": \"SkidVehicle\", \"Vehicles\": { \"CPHusky\": { \"VehicleType\": \"CPHusky\", \"AutoCreate\": true, \"Sensors\": { \"SonarSensor1\": { \"SensorType\": 7, \"Enabled\": true, \"X\": 0, \"Y\": 0, \"Z\": -0.55, \"Roll\": 0, \"Pitch\": 0, \"Yaw\": 0, \"SenseActive\": true, \"SensePassive\": false, \"MeasurementFrequency\": 5, \"NumberOfTraces\": 10000, \"SensorDiameter\": 0.5, \"SensorLowerAzimuthLimit\": -90, \"SensorUpperAzimuthLimit\": 90, \"SensorLowerElevationLimit\": -90, \"SensorUpperElevationLimit\": 90, \"AttenuationPerDistance\": 0, \"AttenuationPerReflection\": 0, \"AttenuationLimit\": -100, \"DistanceLimit\": 10, \"ReflectionLimit\": 3, \"ReflectionDistanceLimit\": 0.4, \"ReflectionOpeningAngle\": 10 } } } } }","title":"Echo configuration"},{"location":"echo/#passive-echo-beacons","text":"While the default configuration of the echo sensor is to emit a signal and receive the reflections, it is also possible to have passive echo sources in the world. These are objects that emit a signal and the echo sensor will receive the reflections of these signals. This can be used to simulate other echo sources in the world that are not the echo sensor itself. 
One can define these from the Unreal Editor itself or through the AirSimSettings json file. In the Editor, use the search function to look for Passive Echo Beacon and add it to the world. You can alter the settings from the Details panel. In the AirSimSettings json file you can define new beacons under the PassiveEchoBeacons section. The beacons have the following settings: Parameter Description X Y Z Position of the beacon relative to the Unreal World origin, so not in the robot reference frame! (in NED, in meters) Roll Pitch Yaw Orientation of the beacon relative to the Unreal World origin, so not in the robot reference frame! (in degrees, yaw-pitch-roll order to front vector +X) Enable Toggle the beacon on or off. InitialDirections Amount of traces (rays) being cast. This defines the resolution of the resulting reflection point cloud. SensorLowerAzimuthLimit The lower azimuth angle limit in degrees for sending out the initial rays of the source. (default = -90) SensorUpperAzimuthLimit The upper azimuth angle limit in degrees for sending out the initial rays of the source. (default = 90) SensorLowerElevationLimit The lower elevation angle limit in degrees for sending out the initial rays of the source. (default = -90) SensorUpperElevationLimit The upper elevation angle limit in degrees for sending out the initial rays of the source. (default = 90) ReflectionLimit Maximum amount of reflections that can happen. ReflectionDistanceLimit Maximum distance between two reflections (meters) ReflectionOnlyFinal Only save the final reflection along a trace. This will ignore all other reflections that happen along the trace in the data AttenuationPerDistance Attenuation of signal wrt distance traveled (dB/m) AttenuationPerReflection Attenuation of signal wrt reflections (dB) AttenuationLimit Attenuation at which the signal is considered dissipated (dB) DistanceLimit Maximum distance a reflection can travel (meters) DrawDebugAllPoints Draw debug points in world where reflected points are happening due to this source. It will also show the reflection direction with a line DrawDebugAllLines Draw all lines that are being cast from the source to the reflections, not only the ones that are reflected DrawDebugLocation Draw a 3D axes shown where the source is DrawDebugDuration Duration in seconds that the debug points and lines will be shown in the world. -1 is infinite. In the settings file this can look like this example : { \"SeeDocsAt\": \"https://cosys-lab.github.io/settings/\", \"SettingsVersion\": 2.0, \"SimMode\": \"SkidVehicle\", \"ViewMode\": \"\", \"Vehicles\": { \"airsimvehicle\": { \"VehicleType\": \"CPHusky\", \"AutoCreate\": true, \"Sensors\": { ... \"echo\": { \"SensorType\": 7, \"Enabled\": true, ...
\"DrawPassiveSources\": false, \"DrawPassiveLines\": true, \"DrawSensor\": true, \"SenseActive\": false, \"SensePassive\": true, \"PassiveRadius\" : 10 } } } }, \"PassiveEchoBeacons\": { \"passiveEchoBeacon1\": { \"X\": 5, \"Y\": 5, \"Z\": -5, \"Roll\": 0, \"Pitch\": 0, \"Yaw\": 0, \"Enable\" : true, \"InitialDirections\": 1000, \"SensorLowerAzimuthLimit\": -90, \"SensorUpperAzimuthLimit\": 90, \"SensorLowerElevationLimit\": -90, \"SensorUpperElevationLimit\": 90, \"AttenuationPerDistance\": 0, \"AttenuationPerReflection\": 0, \"AttenuationLimit\": -100, \"DistanceLimit\": 10, \"ReflectionLimit\": 3, \"DrawDebugAllPoints\": true, \"DrawDebugAllLines\": false, \"DrawDebugLocation\": true, \"DrawDebugDuration\": -1 } } }","title":"Passive Echo Beacons"},{"location":"echo/#client-api","text":"Use getEchoData(sensor name, vehicle name) API to retrieve the echo sensor data. The API returns Point-Cloud(s) as a flat array of floats, the final attenuation, total distance and reflection count (+ reflection normal for passive beacon reflections) along with the timestamp of the capture and sensor pose. Echo Pose: Default:Active Point-Cloud: Echo sensor pose in the vehicle frame / External: If set to External (see table) the coordinates will be in either Unreal NED when ExternalLocal is false or Local NED (from starting position from vehicle) when ExternalLocal is true . Active Point-Cloud The floats represent [x, y, z, attenuation, total_distance, reflection_count] for each point hit within the range in the last scan in NED format. Active Groundtruth: For each point of the Active Point-Cloud a label string is kept that has the name of the object that the point belongs to. Passive Point-Cloud: The floats represent [x, y, z, attenuation, total_distance, reflection_count, reflection angle x, reflection angle y, reflection angle z] for each point hit within the range in the last scan in NED format. Passive Groundtruth: For each point two strings are kept of the Passive Point-Cloud. The first a label string representing the object of the reflection and second the name of the Passive Echo Beacon that was the source of this reflection. Use setEchoData(sensor name, vehicle name, echo data) API to render an external pointcloud back to the simulation. It expects it to be [x,y,z] as a flat array of floats.","title":"Client API"},{"location":"event_sim/","text":"Cosys-AirSim provides a Python-based event camera simulator, aimed at performance and ability to run in real-time along with the sim. Event cameras An event camera is a special vision sensor that measures changes in logarithmic brightness and only reports 'events'. Each event is a set of four values that gets generated every time the absolute change in the logarithmic brightness exceeds a certain threshold. An event contains the timestamp of the measurement, pixel location (x and y coordinates) and the polarity: which is either +1/-1 based on whether the logarithmic brightness has increased or decreased. Most event cameras have a temporal resolution of the order of microseconds, making them significantly faster than RGB sensors, and also demonstrate a high dynamic range and low motion blur. More details about event cameras can be found in this tutorial from RPG-UZH Cosys-AirSim event simulator The Cosys-AirSim event simulator uses two consecutive RGB images (converted to grayscale), and computes \"past events\" that would have occurred during the transition based on the change in log luminance between the images. 
These events are reported as a stream of bytes, following this format: (x, y, timestamp, pol), where x and y are the pixel locations of the event firing, timestamp is the global timestamp in microseconds and pol is either +1/-1 depending on whether the brightness increased or decreased. Along with this bytestream, an accumulation of events over a 2D frame is also constructed, known as an 'event image', which visualizes +1 events as red and -1 as blue pixels. An example event image is shown below: Usage An example script to run the event simulator alongside Cosys-AirSim is located at https://github.com/Cosys-Lab/Cosys-AirSim/blob/main/PythonClient/eventcamera_sim/test_event_sim.py. The following optional command-line arguments can be passed to this script. args.width, args.height (float): Simulated event camera resolution args.save (bool): Whether or not to save the event data to a file. args.debug (bool): Whether or not to display the simulated events as an image The implementation of the actual event simulation, written in Python and numba, is at https://github.com/Cosys-Lab/Cosys-AirSim/blob/main/PythonClient/eventcamera_sim/event_simulator.py. The event simulator is initialized as follows, with the arguments controlling the resolution of the camera. from event_simulator import * ev_sim = EventSimulator(W, H) The actual computation of the events is triggered through an image_callback function, which is executed every time a new RGB image is obtained. The first time this function is called, due to the lack of a 'previous' image, it acts as an initialization of the event sim. event_img, events = ev_sim.image_callback(img, ts_delta) This function, which behaves similarly to a callback (called every time a new image is received), returns an event image as a one-dimensional array of +1/-1 values, thus indicating only whether events were seen at each pixel, but not the timing/number of events. This one-dimensional array can be converted into the red/blue event image as seen in the function convert_event_img_rgb . events is a numpy array of events, each of format (x, y, timestamp, pol). Through this function, the event sim computes the difference between the past and the current image, and computes a stream of events which is then returned as a numpy array. This can then be appended to a file. There are quite a few parameters that can be tuned to trade off visual fidelity against performance of the event simulation. The main factors to tune are the following: The resolution of the camera. The log luminance threshold TOL that determines whether or not a detected change counts as an event. Note: There is also currently a max limit on the number of events generated per pair of images, which can also be tuned. Algorithm The working of the event simulator loosely follows this set of operations: 1. Take the difference between the log intensities of the current and previous frames. 2. Iterating over all pixels, calculate the polarity for each pixel based on a threshold of change in log intensity. 3. Determine the number of events to be fired per pixel, based on the extent of intensity change over the threshold. Let $N_{max}$ be the maximum number of events that can occur at a single pixel; then the total number of firings to be simulated at pixel location $u$ would be $N_e(u) = \\min(N_{max}, \\frac{\\Delta L(u)}{TOL})$. 4. Determine the timestamps for each interpolated event by interpolating over the amount of time that has elapsed between the captures of the previous and current images: $t = t_{prev} + \\frac{\\Delta T}{N_e(u)}$ 5.
Generate the output bytestream by simulating events at every pixel and sort by timestamp.","title":"Event camera"},{"location":"event_sim/#event-cameras","text":"An event camera is a special vision sensor that measures changes in logarithmic brightness and only reports 'events'. Each event is a set of four values that gets generated every time the absolute change in the logarithmic brightness exceeds a certain threshold. An event contains the timestamp of the measurement, the pixel location (x and y coordinates) and the polarity: which is either +1/-1 based on whether the logarithmic brightness has increased or decreased. Most event cameras have a temporal resolution on the order of microseconds, making them significantly faster than RGB sensors, and also demonstrate a high dynamic range and low motion blur. More details about event cameras can be found in this tutorial from RPG-UZH","title":"Event cameras"},{"location":"event_sim/#cosys-airsim-event-simulator","text":"The Cosys-AirSim event simulator uses two consecutive RGB images (converted to grayscale), and computes \"past events\" that would have occurred during the transition based on the change in log luminance between the images. These events are reported as a stream of bytes, following this format: (x, y, timestamp, pol), where x and y are the pixel locations of the event firing, timestamp is the global timestamp in microseconds and pol is either +1/-1 depending on whether the brightness increased or decreased. Along with this bytestream, an accumulation of events over a 2D frame is also constructed, known as an 'event image', which visualizes +1 events as red and -1 as blue pixels. An example event image is shown below:","title":"Cosys-AirSim event simulator"},{"location":"event_sim/#usage","text":"An example script to run the event simulator alongside Cosys-AirSim is located at https://github.com/Cosys-Lab/Cosys-AirSim/blob/main/PythonClient/eventcamera_sim/test_event_sim.py. The following optional command-line arguments can be passed to this script. args.width, args.height (float): Simulated event camera resolution args.save (bool): Whether or not to save the event data to a file. args.debug (bool): Whether or not to display the simulated events as an image The implementation of the actual event simulation, written in Python and numba, is at https://github.com/Cosys-Lab/Cosys-AirSim/blob/main/PythonClient/eventcamera_sim/event_simulator.py. The event simulator is initialized as follows, with the arguments controlling the resolution of the camera. from event_simulator import * ev_sim = EventSimulator(W, H) The actual computation of the events is triggered through an image_callback function, which is executed every time a new RGB image is obtained. The first time this function is called, due to the lack of a 'previous' image, it acts as an initialization of the event sim. event_img, events = ev_sim.image_callback(img, ts_delta) This function, which behaves similarly to a callback (called every time a new image is received), returns an event image as a one-dimensional array of +1/-1 values, thus indicating only whether events were seen at each pixel, but not the timing/number of events. This one-dimensional array can be converted into the red/blue event image as seen in the function convert_event_img_rgb . events is a numpy array of events, each of format (x, y, timestamp, pol). Through this function, the event sim computes the difference between the past and the current image, and computes a stream of events which is then returned as a numpy array. This can then be appended to a file.
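As a usage sketch, the loop below feeds AirSim camera frames into the simulator and appends the returned events to a file. The camera name \"0\", the resolution, and the timestamp handling are illustrative assumptions; see test_event_sim.py for the actual wiring.

import time
import numpy as np
import cosysairsim as airsim
from event_simulator import EventSimulator

client = airsim.CarClient()
client.confirmConnection()
ev_sim = EventSimulator(640, 480)  # width, height (illustrative)

prev_ts = time.time_ns() // 1000  # microseconds
with open("events.bin", "ab") as f:
    for _ in range(100):
        response = client.simGetImages([airsim.ImageRequest("0", airsim.ImageType.Scene, False, False)])[0]
        img = np.frombuffer(response.image_data_uint8, dtype=np.uint8)
        img = img.reshape(response.height, response.width, 3)
        ts = time.time_ns() // 1000
        # image_callback returns the event image and the event array
        # (the first call only initializes the simulator)
        event_img, events = ev_sim.image_callback(img, ts - prev_ts)
        prev_ts = ts
        if events is not None and len(events) > 0:
            np.asarray(events).tofile(f)  # append raw event records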
There are quite a few parameters that can be tuned to trade off visual fidelity against performance of the event simulation. The main factors to tune are the following: The resolution of the camera. The log luminance threshold TOL that determines whether or not a detected change counts as an event. Note: There is also currently a max limit on the number of events generated per pair of images, which can also be tuned.","title":"Usage"},{"location":"event_sim/#algorithm","text":"The working of the event simulator loosely follows this set of operations: 1. Take the difference between the log intensities of the current and previous frames. 2. Iterating over all pixels, calculate the polarity for each pixel based on a threshold of change in log intensity. 3. Determine the number of events to be fired per pixel, based on the extent of intensity change over the threshold. Let $N_{max}$ be the maximum number of events that can occur at a single pixel; then the total number of firings to be simulated at pixel location $u$ would be $N_e(u) = \\min(N_{max}, \\frac{\\Delta L(u)}{TOL})$. 4. Determine the timestamps for each interpolated event by interpolating over the amount of time that has elapsed between the captures of the previous and current images: $t = t_{prev} + \\frac{\\Delta T}{N_e(u)}$ 5. Generate the output bytestream by simulating events at every pixel and sort by timestamp.","title":"Algorithm"},{"location":"flight_controller/","text":"Flight Controller What is a Flight Controller? \"Wait!\" you ask, \"Why do you need a flight controller for a simulator?\". The primary job of a flight controller is to take the desired state as input, estimate the actual state using sensor data and then drive the actuators in such a way that the actual state comes as close as possible to the desired state. For quadrotors, the desired state can be specified as roll, pitch and yaw, for example. The controller then estimates the actual roll, pitch and yaw using the gyroscope and accelerometer, and generates appropriate motor signals so the actual state approaches the desired state. How Does the Simulator Use the Flight Controller? The simulator consumes the motor signals generated by the flight controller to figure out the force and thrust generated by each actuator (i.e. the propellers in the case of a quadrotor). This is then used by the physics engine to compute the kinetic properties of the vehicle. This in turn generates simulated sensor data which is fed back to the flight controller. What is Hardware- and Software-in-Loop? Hardware-in-Loop (HITL or HIL) means the flight controller runs on actual hardware such as a Naze32 or Pixhawk chip. You then connect this hardware to the PC using a USB port. The simulator talks to the device to retrieve actuator signals and send it simulated sensor data. This is obviously as close as you can get to the real thing. However, it typically requires more steps to set up and is usually harder to debug. One big issue is that the simulator clock and the device clock run at their own speed and accuracy. Also, the USB connection (which is usually only USB 2.0) may not be enough for real-time communication. In \"software-in-loop\" simulation (SITL or SIL) mode the firmware runs on your computer as opposed to a separate board. This is generally fine except that now you are not touching any code paths that are specific to your device. Also, none of your code now runs with the real-time clock usually provided by a specialized hardware board. For well-designed flight controllers with a software clock, these are usually not concerning issues. What Flight Controllers are Supported?
AirSim has a built-in flight controller called simple_flight and it is used by default. You don't need to do anything to use or configure it. AirSim also supports PX4 & ArduPilot as external flight controllers for advanced users. Using AirSim Without Flight Controller Yes, it is now possible to use AirSim without a flight controller. Please see the instructions here for how to use the so-called \"Computer Vision\" mode. If you don't need vehicle dynamics, we highly recommend using this mode.","title":"Flight Controller"},{"location":"flight_controller/#flight-controller","text":"","title":"Flight Controller"},{"location":"flight_controller/#what-is-flight-controller","text":"\"Wait!\" you ask, \"Why do you need a flight controller for a simulator?\". The primary job of a flight controller is to take the desired state as input, estimate the actual state using sensor data and then drive the actuators in such a way that the actual state comes as close as possible to the desired state. For quadrotors, the desired state can be specified as roll, pitch and yaw, for example. The controller then estimates the actual roll, pitch and yaw using the gyroscope and accelerometer, and generates appropriate motor signals so the actual state approaches the desired state.","title":"What is a Flight Controller?"},{"location":"flight_controller/#how-simulator-uses-flight-controller","text":"The simulator consumes the motor signals generated by the flight controller to figure out the force and thrust generated by each actuator (i.e. the propellers in the case of a quadrotor). This is then used by the physics engine to compute the kinetic properties of the vehicle. This in turn generates simulated sensor data which is fed back to the flight controller.","title":"How Does the Simulator Use the Flight Controller?"},{"location":"flight_controller/#what-is-hardware-and-software-in-loop","text":"Hardware-in-Loop (HITL or HIL) means the flight controller runs on actual hardware such as a Naze32 or Pixhawk chip. You then connect this hardware to the PC using a USB port. The simulator talks to the device to retrieve actuator signals and send it simulated sensor data. This is obviously as close as you can get to the real thing. However, it typically requires more steps to set up and is usually harder to debug. One big issue is that the simulator clock and the device clock run at their own speed and accuracy. Also, the USB connection (which is usually only USB 2.0) may not be enough for real-time communication. In \"software-in-loop\" simulation (SITL or SIL) mode the firmware runs on your computer as opposed to a separate board. This is generally fine except that now you are not touching any code paths that are specific to your device. Also, none of your code now runs with the real-time clock usually provided by a specialized hardware board. For well-designed flight controllers with a software clock, these are usually not concerning issues.","title":"What is Hardware- and Software-in-Loop?"},{"location":"flight_controller/#what-flight-controllers-are-supported","text":"AirSim has a built-in flight controller called simple_flight and it is used by default. You don't need to do anything to use or configure it. AirSim also supports PX4 & ArduPilot as external flight controllers for advanced users.","title":"What Flight Controllers are Supported?"},{"location":"flight_controller/#using-airsim-without-flight-controller","text":"Yes, it is now possible to use AirSim without a flight controller. Please see the instructions here for how to use the so-called \"Computer Vision\" mode.
If you don't need vehicle dynamics, we highly recommend using this mode.","title":"Using AirSim Without Flight Controller"},{"location":"gazebo_drone/","text":"Welcome to GazeboDrone GazeboDrone allows connecting a Gazebo drone to the Cosys-AirSim drone, using the Gazebo drone as a flight dynamic model (FDM) and Cosys-AirSim to generate environmental sensor data. It can be used for Multicopters , Fixed-wings or any other vehicle. Dependencies Gazebo Make sure you have installed the Gazebo dependencies: sudo apt-get install libgazebo9-dev AirLib This project is built with GCC 8, so AirLib needs to be built with GCC 8 too. Run from your AirSim root folder: ./clean.sh ./setup.sh ./build.sh --gcc Cosys-AirSim simulator The Cosys-AirSim UE plugin needs to be built with clang, so you can't use the one compiled in the previous step. You can use our binaries or you can clone AirSim again in another folder and build it without the above option, then you can run Blocks or your own environment. Cosys-AirSim settings Inside your settings.json file you need to add this line: \"PhysicsEngineName\":\"ExternalPhysicsEngine\" . You may want to change the visual model of the Cosys-AirSim drone; to do so, you can follow this tutorial. Build Execute this from your Cosys-AirSim root folder: cd GazeboDrone mkdir build && cd build cmake -DCMAKE_C_COMPILER=gcc-8 -DCMAKE_CXX_COMPILER=g++-8 .. make Run First run the Cosys-AirSim simulator and your Gazebo model and then execute this from your Cosys-AirSim root folder: cd GazeboDrone/build ./GazeboDrone","title":"Import Gazebo models"},{"location":"gazebo_drone/#welcome-to-gazebodrone","text":"GazeboDrone allows connecting a Gazebo drone to the Cosys-AirSim drone, using the Gazebo drone as a flight dynamic model (FDM) and Cosys-AirSim to generate environmental sensor data. It can be used for Multicopters , Fixed-wings or any other vehicle.","title":"Welcome to GazeboDrone"},{"location":"gazebo_drone/#dependencies","text":"","title":"Dependencies"},{"location":"gazebo_drone/#gazebo","text":"Make sure you have installed the Gazebo dependencies: sudo apt-get install libgazebo9-dev","title":"Gazebo"},{"location":"gazebo_drone/#airlib","text":"This project is built with GCC 8, so AirLib needs to be built with GCC 8 too. Run from your AirSim root folder: ./clean.sh ./setup.sh ./build.sh --gcc","title":"AirLib"},{"location":"gazebo_drone/#cosys-airsim-simulator","text":"The Cosys-AirSim UE plugin needs to be built with clang, so you can't use the one compiled in the previous step. You can use our binaries or you can clone AirSim again in another folder and build it without the above option, then you can run Blocks or your own environment.","title":"Cosys-AirSim simulator"},{"location":"gazebo_drone/#cosys-airsim-settings","text":"Inside your settings.json file you need to add this line: \"PhysicsEngineName\":\"ExternalPhysicsEngine\" . You may want to change the visual model of the Cosys-AirSim drone; to do so, you can follow this tutorial.","title":"Cosys-AirSim settings"},{"location":"gazebo_drone/#build","text":"Execute this from your Cosys-AirSim root folder: cd GazeboDrone mkdir build && cd build cmake -DCMAKE_C_COMPILER=gcc-8 -DCMAKE_CXX_COMPILER=g++-8 ..
make","title":"Build"},{"location":"gazebo_drone/#run","text":"First run the Cosys-AirSim simulator and your Gazebo model and then execute this from your Cosys-AirSim root folder: cd GazeboDrone/build ./GazeboDrone","title":"Run"},{"location":"gpulidar/","text":"How to Use GPU Lidar in Cosys-AirSim Cosys-AirSim supports a GPU accelerated Lidar for multirotors and cars. It uses a depth camera that rotates around to simulate a Lidar while exploiting the GPU to do most of the work. Should allow for a large increase in amount of points that can be simulated. The enablement of a GPU lidar and the other lidar settings can be configured via AirSimSettings json. Please see general sensors for information on configuration of general/shared sensor settings. Note that this sensor type is currently not supported for Multirotor mode. It only works for Car and Computervision. Enabling GPU lidar on a vehicle By default, GPU lidars are not enabled. To enable the sensor, set the SensorType and Enabled attributes in settings json. \"GPULidar1\": { \"SensorType\": 8, \"Enabled\" : true, Multiple GPU lidars can be enabled on a vehicle. But one has to turn off DrawDebugPoints! Ignoring glass and other material types One can set an object that should be invisible to LIDAR sensors (such as glass) by giving them an Unreal Tag called LidarIgnore . GPU Lidar configuration The following parameters can be configured right now via settings json. For some more information check the publication on this topic here . Parameter Description NumberOfChannels Number of channels/lasers of the lidar. When set to 1 it will act as a 2D horizontal LiDAR and will use the VerticalFOVUpper value as the vertical angle to scan. Range Range, in meters MeasurementsPerCycle amount of measurements in one full cycle (horizontal resolution) RotationsPerSecond Rotations per second Resolution Defines the resolution of the depth camera image that generates the Lidar point cloud HorizontalFOVStart Horizontal FOV start for the lidar, in degrees HorizontalFOVEnd Horizontal FOV end for the lidar, in degrees VerticalFOVUpper Vertical FOV upper limit for the lidar, in degrees VerticalFOVLower Vertical FOV lower limit for the lidar, in degrees X Y Z Position of the lidar relative to the vehicle (in NED, in meters) Roll Pitch Yaw Orientation of the lidar relative to the vehicle (in degrees, yaw-pitch-roll order to front vector +X) IgnoreMarked Remove objects with the Unreal Tag MarkedIgnore from the sensor data GroundTruth Generate ground truth labeling color values InstanceSegmentation Enable to set the generated ground truth to the instance segmentation labeling. Set to false to choose a different annotation label Annotation If GroundTruth is enabled and InstanceSegmentation is disabled, you can set this value to the name of the annotation you want to use. This will be used for the ground truth color labels. DrawSensor Draw the physical sensor in the world on the vehicle with a 3D axes shown where the sensor is External Uncouple the sensor from the vehicle. If enabled, the position and orientation will be relative to Unreal world coordinates in NED format from the settings file. ExternalLocal When in external mode, if this is enabled the retrieved pose of the sensor will be in Local NED coordinates(from starting position from vehicle) and not converted Unreal NED coordinates which is default GenerateIntensity Toggle intensity calculation on or off. This requires a surface material map to be available. See below for more information. 
rangeMaxLambertianPercentage Lambertian reflectivity percentage to max out on. Below this value the response scales linearly down to 0%. rainMaxIntensity Rain intensity maximum to scale from in mm/hour. rainConstantA Constant A used to calculate the extinction coefficient in rain rainConstantB Constant B used to calculate the extinction coefficient in rain GenerateNoise Generate and add range-noise based on a normal distribution if set to true MinNoiseStandardDeviation The standard deviation to generate the noise normal distribution, in meters. This is the minimal noise (at 0 distance) NoiseDistanceScale To scale the noise with distance, set this parameter. This way the minimal noise is scaled depending on the distance compared to the total maximum range of the sensor { \"SeeDocsAt\": \"https://cosys-lab.github.io/settings/\", \"SettingsVersion\": 2.0, \"SimMode\": \"SkidVehicle\", \"Vehicles\": { \"airsimvehicle\": { \"VehicleType\": \"CPHusky\", \"AutoCreate\": true, \"Sensors\": { \"gpulidar1\": { \"SensorType\": 8, \"Enabled\" : true, \"External\": false, \"NumberOfChannels\": 32, \"Range\": 50, \"Resolution\": 1024, \"RotationsPerSecond\": 10, \"MeasurementsPerCycle\": 512, \"X\": 0, \"Y\": 0, \"Z\": -0.3, \"Roll\": 0, \"Pitch\": 0, \"Yaw\" : 0, \"VerticalFOVUpper\": 20, \"VerticalFOVLower\": -20, \"HorizontalFOVStart\": 0, \"HorizontalFOVEnd\": 360, \"DrawDebugPoints\": true, \"DrawMode\": 1, \"IgnoreMarked\": true, \"GroundTruth\": true, \"InstanceSegmentation\": true, \"Annotation\": \"\", \"GenerateIntensity\": false, \"rangeMaxLambertianPercentage\": 80, \"rainMaxIntensity\": 70, \"rainConstantA\": 0.01, \"rainConstantB\": 0.6, \"DrawSensor\": false } } } } } Intensity Surface Material map If 'GenerateIntensity' is enabled in the settings json, a surface material map is required. This map is used to calculate the intensity of the lidar points. e.g.: wood,0.9 alluminium,0.5 concrete,0.3 asphalt,0.1 This needs to be saved as 'materials.csv' in your documents folder where your settings json file also resides. Server side visualization for debugging By default, the lidar points are not drawn on the viewport. To enable the drawing of hit laser points on the viewport, please enable the setting 'DrawDebugPoints' via the settings json. This is only for testing purposes and will affect the data slightly. It also needs to be disabled when using multiple Lidar sensors to avoid artifacts! e.g.: \"Lidar1\": { ... \"DrawDebugPoints\": true }, You can also tweak the variation of debugging with the 'DrawMode' parameter: - 0 = no coloring - 1 = groundtruth color labels (instance segmentation or other annotation labels depending on settings) - 2 = material - 3 = impact angle - 4 = intensity e.g.: \"Lidar1\": { ... \"DrawDebugPoints\": true, \"DrawMode\": 4 }, Client API Use the getGPULidarData(sensor name, vehicle name) API to retrieve the GPU Lidar data. The API returns a Point-Cloud as a flat array of floats along with the timestamp of the capture and the lidar pose. Point-Cloud: The floats represent [x, y, z, rgb, intensity] values for each point hit within the range in the last scan in NED format. Lidar Pose: Default: sensor pose in the vehicle frame / External: If set to External (see table) the coordinates will be in either Unreal NED when ExternalLocal is false or Local NED (from the starting position of the vehicle) when ExternalLocal is true . Rgb represents a float32 representation of the RGB8 value that is linked to either the instance segmentation system or a different annotation label.
See the Image API documentation, Annotation documentation and the instance segmentation documentation. The float32 comes from binary concatenation of the RGB8 values: rgb = value_segmentation.R << 16 | value_segmentation.G << 8 | value_segmentation.B It can be retrieved from the API and converted back to RGB8 with for example the following Python code: lidar_data = client.getGPULidarData('lidar', 'vehicle') points = np.array(lidar_data.point_cloud, dtype=np.dtype('f4')) points = np.reshape(points, (int(points.shape[0] / 5), 5)) rgb_values = points[:, 3].astype(np.uint32) rgb = np.zeros((np.shape(points)[0], 3)) xyz = points[:, 0:3] for index, rgb_value in enumerate(rgb_values): rgb[index, 0] = (rgb_value >> 16) & 0xFF rgb[index, 1] = (rgb_value >> 8) & 0xFF rgb[index, 2] = rgb_value & 0xFF","title":"GPU LIDAR"},{"location":"gpulidar/#how-to-use-gpu-lidar-in-cosys-airsim","text":"Cosys-AirSim supports a GPU accelerated Lidar for multirotors and cars. It uses a depth camera that rotates around to simulate a Lidar while exploiting the GPU to do most of the work. This should allow for a large increase in the number of points that can be simulated. The GPU lidar and its settings can be enabled and configured via the AirSimSettings json. Please see general sensors for information on configuration of general/shared sensor settings. Note that this sensor type is currently not supported for Multirotor mode. It only works for Car and Computervision.","title":"How to Use GPU Lidar in Cosys-AirSim"},{"location":"gpulidar/#enabling-gpu-lidar-on-a-vehicle","text":"By default, GPU lidars are not enabled. To enable the sensor, set the SensorType and Enabled attributes in the settings json. \"GPULidar1\": { \"SensorType\": 8, \"Enabled\" : true, Multiple GPU lidars can be enabled on a vehicle. But one has to turn off DrawDebugPoints!","title":"Enabling GPU lidar on a vehicle"},{"location":"gpulidar/#ignoring-glass-and-other-material-types","text":"One can set an object that should be invisible to LIDAR sensors (such as glass) by giving it an Unreal Tag called LidarIgnore .","title":"Ignoring glass and other material types"},{"location":"gpulidar/#gpu-lidar-configuration","text":"The following parameters can be configured right now via the settings json. For some more information check the publication on this topic here . Parameter Description NumberOfChannels Number of channels/lasers of the lidar. When set to 1 it will act as a 2D horizontal LiDAR and will use the VerticalFOVUpper value as the vertical angle to scan. Range Range, in meters MeasurementsPerCycle Number of measurements in one full cycle (horizontal resolution) RotationsPerSecond Rotations per second Resolution Defines the resolution of the depth camera image that generates the Lidar point cloud HorizontalFOVStart Horizontal FOV start for the lidar, in degrees HorizontalFOVEnd Horizontal FOV end for the lidar, in degrees VerticalFOVUpper Vertical FOV upper limit for the lidar, in degrees VerticalFOVLower Vertical FOV lower limit for the lidar, in degrees X Y Z Position of the lidar relative to the vehicle (in NED, in meters) Roll Pitch Yaw Orientation of the lidar relative to the vehicle (in degrees, yaw-pitch-roll order to front vector +X) IgnoreMarked Remove objects with the Unreal Tag MarkedIgnore from the sensor data GroundTruth Generate ground truth labeling color values InstanceSegmentation Enable to set the generated ground truth to the instance segmentation labeling.
Set to false to choose a different annotation label Annotation If GroundTruth is enabled and InstanceSegmentation is disabled, you can set this value to the name of the annotation you want to use. This will be used for the ground truth color labels. DrawSensor Draw the physical sensor in the world on the vehicle with 3D axes shown where the sensor is External Uncouple the sensor from the vehicle. If enabled, the position and orientation will be relative to Unreal world coordinates in NED format from the settings file. ExternalLocal When in external mode, if this is enabled the retrieved pose of the sensor will be in Local NED coordinates (from the starting position of the vehicle) and not the converted Unreal NED coordinates which is the default GenerateIntensity Toggle intensity calculation on or off. This requires a surface material map to be available. See below for more information. rangeMaxLambertianPercentage Lambertian reflectivity percentage to max out on. Below this value the response scales linearly down to 0%. rainMaxIntensity Rain intensity maximum to scale from in mm/hour. rainConstantA Constant A used to calculate the extinction coefficient in rain rainConstantB Constant B used to calculate the extinction coefficient in rain GenerateNoise Generate and add range-noise based on a normal distribution if set to true MinNoiseStandardDeviation The standard deviation to generate the noise normal distribution, in meters. This is the minimal noise (at 0 distance) NoiseDistanceScale To scale the noise with distance, set this parameter. This way the minimal noise is scaled depending on the distance compared to the total maximum range of the sensor { \"SeeDocsAt\": \"https://cosys-lab.github.io/settings/\", \"SettingsVersion\": 2.0, \"SimMode\": \"SkidVehicle\", \"Vehicles\": { \"airsimvehicle\": { \"VehicleType\": \"CPHusky\", \"AutoCreate\": true, \"Sensors\": { \"gpulidar1\": { \"SensorType\": 8, \"Enabled\" : true, \"External\": false, \"NumberOfChannels\": 32, \"Range\": 50, \"Resolution\": 1024, \"RotationsPerSecond\": 10, \"MeasurementsPerCycle\": 512, \"X\": 0, \"Y\": 0, \"Z\": -0.3, \"Roll\": 0, \"Pitch\": 0, \"Yaw\" : 0, \"VerticalFOVUpper\": 20, \"VerticalFOVLower\": -20, \"HorizontalFOVStart\": 0, \"HorizontalFOVEnd\": 360, \"DrawDebugPoints\": true, \"DrawMode\": 1, \"IgnoreMarked\": true, \"GroundTruth\": true, \"InstanceSegmentation\": true, \"Annotation\": \"\", \"GenerateIntensity\": false, \"rangeMaxLambertianPercentage\": 80, \"rainMaxIntensity\": 70, \"rainConstantA\": 0.01, \"rainConstantB\": 0.6, \"DrawSensor\": false } } } } }","title":"GPU Lidar configuration"},{"location":"gpulidar/#intensity-surface-material-map","text":"If 'GenerateIntensity' is enabled in the settings json, a surface material map is required. This map is used to calculate the intensity of the lidar points. e.g.: wood,0.9 alluminium,0.5 concrete,0.3 asphalt,0.1 This needs to be saved as 'materials.csv' in your documents folder where your settings json file also resides.","title":"Intensity Surface Material map"},{"location":"gpulidar/#server-side-visualization-for-debugging","text":"By default, the lidar points are not drawn on the viewport. To enable the drawing of hit laser points on the viewport, please enable the setting 'DrawDebugPoints' via the settings json. This is only for testing purposes and will affect the data slightly. It also needs to be disabled when using multiple Lidar sensors to avoid artifacts! e.g.: \"Lidar1\": { ...
\"DrawDebugPoints\": true }, You can also tweak the variation of debugging with the 'DrawMode' parameter: - 0 = no coloring - 1 = groundtruth color labels (instance segmentation or other annotation labels depending on settings) - 2 = material - 3 = impact angle - 4 = intensity e.g.: \"Lidar1\": { ... \"DrawDebugPoints\": true, \"DrawMode\": 4 },","title":"Server side visualization for debugging"},{"location":"gpulidar/#client-api","text":"Use getGPULidarData(sensor name, vehicle name) API to retrieve the GPU Lidar data. The API returns a Point-Cloud as a flat array of floats along with the timestamp of the capture and lidar pose. Point-Cloud: The floats represent [x,y,z, rgb, intensity] coordinate for each point hit within the range in the last scan in NED format. Lidar Pose: Default: sensor pose in the vehicle frame / External: If set to External (see table) the coordinates will be in either Unreal NED when ExternalLocal is false or Local NED (from starting position from vehicle) when ExternalLocal is true . Rgb represents a float32 representation of the RGB8 value that is linked either the instance segmentation system or a different annotation label. See the Image API documentation , Annotation documentation and the instance segmentation documentation . The float32 comes from binary concatenation of the RGB8 values : rgb = value_segmentation.R << 16 | value_segmentation.G << 8 | value_segmentation.B \\ It can be retrieved from the API and converted back to RGB8 with for example the following Python code: lidar_data = client.getGPULidarData('lidar', 'vehicle') points = np.array(lidar_data.point_cloud, dtype=np.dtype('f4')) points = np.reshape(points, (int(points.shape[0] / 5), 5)) rgb_values = points[:, 3].astype(np.uint32) rgb = np.zeros((np.shape(points)[0], 3)) xyz = points[:, 0:3] for index, rgb_value in enumerate(rgb_values): rgb[index, 0] = (rgb_value >> 16) & 0xFF rgb[index, 1] = (rgb_value >> 8) & 0xFF rgb[index, 2] = rgb_value & 0xFF","title":"Client API"},{"location":"image_apis/","text":"Image APIs Please read general API doc first if you are not familiar with AirSim APIs. Getting a Single Image Here's a sample code to get a single image from camera named \"0\". The returned value is bytes of png format image. To get uncompressed and other format as well as available cameras please see next sections. Python import cosysairsim as airsim # for car use CarClient() client = airsim.MultirotorClient() png_image = client.simGetImage(\"0\", airsim.ImageType.Scene) # do something with image C++ #include \"vehicles/multirotor/api/MultirotorRpcLibClient.hpp\" int getOneImage() { using namespace msr::airlib; // for car use CarRpcLibClient MultirotorRpcLibClient client; std::vector png_image = client.simGetImage(\"0\", VehicleCameraBase::ImageType::Scene); // do something with images } Getting Images with More Flexibility The simGetImages API which is slightly more complex to use than simGetImage API, for example, you can get left camera view, right camera view and depth image from left camera in a single API call. The simGetImages API also allows you to get uncompressed images as well as floating point single channel images (instead of 3 channel (RGB), each 8 bit). 
Python import cosysairsim as airsim # for car use CarClient() client = airsim.MultirotorClient() responses = client.simGetImages([ # png format airsim.ImageRequest(0, airsim.ImageType.Scene), # uncompressed RGB array bytes airsim.ImageRequest(1, airsim.ImageType.Scene, False, False), # floating point uncompressed image airsim.ImageRequest(1, airsim.ImageType.DepthPlanar, True)]) # do something with response which contains image data, pose, timestamp etc Using AirSim Images with NumPy If you plan to use numpy for image manipulation, you should get the uncompressed RGB image and then convert to numpy like this: responses = client.simGetImages([airsim.ImageRequest(\"0\", airsim.ImageType.Scene, False, False)]) response = responses[0] # get numpy array img1d = np.frombuffer(response.image_data_uint8, dtype=np.uint8) # reshape array to 3 channel image array H X W X 3 img_rgb = img1d.reshape(response.height, response.width, 3) # original image is flipped vertically img_rgb = np.flipud(img_rgb) # write to png airsim.write_png(os.path.normpath(filename + '.png'), img_rgb) Quick Tips The API simGetImage returns a binary string literal which means you can simply dump it in a binary file to create a .png file. However, if you want to process it in any other way, you can use the handy function airsim.string_to_uint8_array . This converts the binary string literal to a NumPy uint8 array. The API simGetImages can accept requests for multiple image types from any cameras in a single call. You can specify if the image is png compressed, RGB uncompressed or a float array. For png compressed images, you get a binary string literal . For a float array you get a Python list of float64. You can convert this float array to a NumPy 2D array using airsim.list_to_2d_float_array(response.image_data_float, response.width, response.height) You can also save the float array to a .pfm file (Portable Float Map format) using the airsim.write_pfm() function. If you are looking to query position and orientation information in sync with a call to one of the image APIs, you can use client.simPause(True) and client.simPause(False) to pause the simulation while calling the image API and querying the desired physics state, ensuring that the physics state remains the same immediately after the image API call. C++ int getStereoAndDepthImages() { using namespace msr::airlib; typedef VehicleCameraBase::ImageRequest ImageRequest; typedef VehicleCameraBase::ImageResponse ImageResponse; typedef VehicleCameraBase::ImageType ImageType; // for car use // CarRpcLibClient client; MultirotorRpcLibClient client; // get right, left and depth images. First two as png, the third as a floating point image. std::vector<ImageRequest> request = { //png format ImageRequest(\"0\", ImageType::Scene), //uncompressed RGB array bytes ImageRequest(\"1\", ImageType::Scene, false, false), //floating point uncompressed image ImageRequest(\"1\", ImageType::DepthPlanar, true) }; const std::vector<ImageResponse>& response = client.simGetImages(request); // do something with response which contains image data, pose, timestamp etc } Ready to Run Complete Examples Python C++ For a more complete ready to run sample code please see sample code in HelloDrone project for multirotors or HelloCar project . See also other example code that generates a specified number of stereo images along with ground truth depth and disparity and saves them to pfm format . Available Cameras These are the default cameras already available in each vehicle.
Apart from these, you can add more cameras to the vehicles or add cameras that are not attached to any vehicle by setting them as external . Car The cameras on the car can be accessed by the following names in API calls: front_center , front_right , front_left , fpv and back_center . Here the FPV camera is the driver's head position in the car. Multirotor The cameras on the drone can be accessed by the following names in API calls: front_center , front_right , front_left , bottom_center and back_center . Computer Vision Mode Camera names are the same as in multirotor. Backward compatibility for camera names Before AirSim v1.2, cameras were accessed using ID numbers instead of names. For backward compatibility you can still use the following ID numbers for the above camera names in the same order as above: \"0\" , \"1\" , \"2\" , \"3\" , \"4\" . In addition, the camera name \"\" is also available to access the default camera, which is generally the camera \"0\" . \"Computer Vision\" Mode You can use AirSim in so-called \"Computer Vision\" mode. In this mode, the physics engine is disabled and there is no vehicle, just cameras (if you want to have the vehicle but without its kinematics, you can use the Multirotor mode with the Physics Engine ExternalPhysicsEngine ). It has a standard set of cameras and can have any sensor added, similar to other vehicles. You can move around using the keyboard (use F1 to see help on keys; additionally, use left shift to go faster and spacebar to hold in place, handy for when moving the camera manually). You can press the Record button to continuously generate images, or you can call APIs to move cameras around and take images. To activate this mode, edit settings.json that you can find in your Documents\AirSim folder (or ~/Documents/AirSim on Linux) and make sure the following values exist at root level: { \"SettingsVersion\": 2.0, \"SimMode\": \"ComputerVision\" } This mode was inspired from the UnrealCV project . Setting Pose in Computer Vision Mode To move around the environment using APIs you can use the simSetVehiclePose API. This API takes a position and orientation and sets that on the invisible vehicle where the front-center camera is located. All the rest of the cameras move along, keeping the relative position. If you don't want to change the position (or orientation) then just set the components of the position (or orientation) to floating point nan values. The simGetVehiclePose allows retrieving the current pose. You can also use simGetGroundTruthKinematics to get the kinematics quantities for the movement. Many other non-vehicle specific APIs are also available such as segmentation APIs, collision APIs and camera APIs. Camera APIs The simGetCameraInfo returns the FOV (in degrees), projection matrix of a camera as well as the pose which can be: Default: The pose of the camera in the vehicle frame. External: If set to External the coordinates will be in either Unreal NED when ExternalLocal is false or Local NED (from the starting position of the vehicle) when ExternalLocal is true . Note that if MoveWorldOrigin in the settings.json is set to true the Unreal coordinates will be moved to be the same origin as the player start location and as such this may affect where the sensor will spawn and which coordinates are returned when ExternalLocal is false .
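As a short usage sketch of the camera info API described above (the field names fov, pose and proj_mat follow the standard AirSim Python client; verify them against your client version):

import cosysairsim as airsim

client = airsim.MultirotorClient()
client.confirmConnection()

# Query the default front-center camera; the camera names follow
# the conventions listed in the Available Cameras section
info = client.simGetCameraInfo("front_center")
print("FOV (degrees):", info.fov)
print("Pose:", info.pose)
print("Projection matrix:", info.proj_mat)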
The simSetCameraPose sets the pose for the specified camera while taking an input pose as a combination of relative position and a quaternion in NED frame. The handy airsim.to_quaternion() function allows converting pitch, roll, yaw to a quaternion. For example, to set camera-0 to 15-degree pitch while maintaining the same position, you can use: camera_pose = airsim.Pose(airsim.Vector3r(0, 0, 0), airsim.to_quaternion(0.261799, 0, 0)) # PRY in radians client.simSetCameraPose(0, camera_pose) simSetCameraFov allows changing the Field-of-View of the camera at runtime. simSetDistortionParams , simGetDistortionParams allow setting and fetching the distortion parameters K1, K2, K3, P1, P2. All Camera APIs take in two common parameters apart from the API-specific ones: camera_name (str) and vehicle_name (str). The camera and vehicle name are used to get the specific camera on the specific vehicle. Gimbal You can set stabilization for pitch, roll or yaw for any camera using settings . Changing Resolution and Camera Parameters To change resolution, FOV etc., you can use settings.json . For example, the addition below in settings.json sets parameters for scene capture and uses the \"Computer Vision\" mode described above. If you omit any setting then the default values below will be used. For more information see the settings doc . If you are using a stereo camera, the distance between left and right is currently fixed at 25 cm. { \"SettingsVersion\": 2.0, \"CameraDefaults\": { \"CaptureSettings\": [ { \"ImageType\": 0, \"Width\": 256, \"Height\": 144, \"FOV_Degrees\": 90, \"AutoExposureBias\": 1.3, \"AutoExposureMaxBrightness\": 0.64, \"AutoExposureMinBrightness\": 0.03, \"MotionBlurAmount\": 1, \"MotionBlurMax\": 10, \"ChromaticAberrationScale\": 2, \"LumenGIEnable\": true, \"LumenReflectionEnable\": true, \"LumenFinalQuality\": 1, \"LumenSceneDetail\": 1, \"LumenSceneLightningDetail\": 1 } ] }, \"SimMode\": \"ComputerVision\" } What Do Pixel Values Mean in Different Image Types? Available ImageType Values Scene = 0, DepthPlanar = 1, DepthPerspective = 2, DepthVis = 3, DisparityNormalized = 4, Segmentation = 5, SurfaceNormals = 6, Infrared = 7, OpticalFlow = 8, OpticalFlowVis = 9, Annotation = 10 DepthPlanar and DepthPerspective You normally want to retrieve the depth image as float (i.e. set pixels_as_float = true ) and specify ImageType = DepthPlanar or ImageType = DepthPerspective in ImageRequest . For ImageType = DepthPlanar , you get depth in camera plane, i.e., all points that are plane-parallel to the camera have the same depth. For ImageType = DepthPerspective , you get depth from the camera using a projection ray that hits that pixel. Depending on your use case, planar depth or perspective depth may be the ground truth image that you want. For example, you may be able to feed perspective depth to a ROS package such as depth_image_proc to generate a point cloud. Or planar depth may be more compatible with estimated depth images generated by stereo algorithms such as SGM. DepthVis When you specify ImageType = DepthVis in ImageRequest , you get an image that helps depth visualization. In this case, each pixel value is interpolated from black to white depending on depth in camera plane in meters. Pixels with pure white mean a depth of 100m or more, while pure black means a depth of 0 meters. DisparityNormalized You normally want to retrieve the disparity image as float (i.e.
set pixels_as_float = true and specify ImageType = DisparityNormalized in ImageRequest ), in which case each pixel is (Xl - Xr)/Xmax , which is thereby normalized to values between 0 and 1. Segmentation When you specify ImageType = Segmentation in ImageRequest , you get an image that gives you the ground truth instance segmentation of the scene. At startup, AirSim assigns a random color index to each mesh available in the environment. The RGB values for each color index ID can be retrieved from the API. You can assign a specific value to a specific mesh using APIs. For example, the Python code below sets the object ID for the mesh called \"Ground\" to 20 in the Blocks environment and hence changes its color in the Segmentation view to the 20th color of the instance segmentation colormap. Note that this does not check whether this color is already assigned to a different object! success = client.simSetSegmentationObjectID(\"Ground\", 20) The return value is a boolean type that lets you know if the mesh was found. Notice that typical Unreal environments, like Blocks, usually have many other meshes that comprise the same object, for example, \"Ground_2\", \"Ground_3\" and so on. As it is tedious to set the object ID for all of these meshes, AirSim also supports regular expressions. For example, the code below sets all meshes which have names starting with \"ground\" (ignoring case) to 21 with just one line: success = client.simSetSegmentationObjectID(\"ground[\\w]*\", 21, True) The return value is true if at least one mesh was found using regular expression matching. When wanting to retrieve the segmentation image through the API, it is recommended that you request an uncompressed image using this API to ensure you get precise RGB values for the segmentation image: responses = client.simGetImages([airsim.ImageRequest( \"front_center\", airsim.ImageType.Segmentation, False, False)]) img_rgb_string = responses[0].image_data_uint8 rgbarray = np.frombuffer(img_rgb_string, np.uint8) rgbarray_shaped = rgbarray.reshape((540,960,3)) img = Image.fromarray(rgbarray_shaped, 'RGB') img.show() To retrieve the color map to know which color is assigned to each color index you can use: colorMap = client.simGetSegmentationColorMap() An example can be found in segmentation_test.py (Cosys-AirSim/PythonClient/segmentation/segmentation_test.py). For a script that generates a full list of objects and their associated color, please see the script segmentation_generate_list.py (Cosys-AirSim/PythonClient/segmentation/segmentation_generate_list.py). How to Find Mesh names? To get the desired ground truth segmentation you will need to know the names of the meshes in your Unreal environment. To do this, you can use the API: currentObjectList = client.simListInstanceSegmentationObjects() This will use an understandable naming depending on the hierarchy the object belongs to in the Unreal World (example box_2_fullpalletspawner_5_pallet_4 or door_window_door_38 ). Note that this provides a different result from simListSceneObjects() as this one will make a simple list of all Unreal Actors in the scene, without keeping the hierarchy in mind. An extension to simListInstanceSegmentationObjects() is simListInstanceSegmentationPoses(ned=True, only_visible=True) which will retrieve the 3D object pose of each element in the same order as the first mentioned function. only_visible allows you to only get the objects that are physically visible in the scene.
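As a short sketch combining the two listing calls above (the pairing relies on the order correspondence stated in the docs; the .position attribute on the returned poses is an assumption based on the standard Pose type):

import cosysairsim as airsim

client = airsim.CarClient()
client.confirmConnection()

objects = client.simListInstanceSegmentationObjects()
poses = client.simListInstanceSegmentationPoses(ned=True, only_visible=True)

# Both lists share the same ordering, so they can be zipped together
for name, pose in zip(objects, poses):
    print(name, pose.position)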
Once you decide on the meshes you are interested in, note down their names and use the above API to set their object IDs. Changing Colors for Object IDs At present the color for each object ID is fixed as in this palette . We will be adding the ability to change colors for object IDs to desired values shortly. In the meantime you can open the segmentation image in your favorite image editor and get the RGB values you are interested in. Startup Object IDs At the start, AirSim assigns color indexes to each object found in the environment of type UStaticMeshComponent or USkinnedMeshComponent . It then makes an understandable naming depending on the hierarchy the object belongs to in the Unreal World (example box_2_fullpalletspawner_5_pallet_4 or door_window_door_38 ). Getting Object ID for Mesh The simGetSegmentationObjectID API allows you to get the object ID for a given mesh name. More information Please see the instance segmentation documentation for some more information on the segmentation system created by Cosys-Lab. Infrared Currently, this is just a map from object ID to grey scale 0-255. So any mesh with object ID 42 shows up with color (42, 42, 42). Please see the segmentation section for more details on how to set object IDs. Typically the noise setting can be applied for this image type to get a slightly more realistic effect. We are still working on adding other infrared artifacts and any contributions are welcome. OpticalFlow and OpticalFlowVis These image types return information about motion perceived from the point of view of the camera. OpticalFlow returns a 2-channel image where the channels correspond to vx and vy respectively. OpticalFlowVis is similar to OpticalFlow but converts flow data to RGB for a more 'visual' output. Object Detection This feature lets you generate object detection using existing cameras in AirSim, find more info here . Annotation The annotation system allows you to choose different groundtruth labeling techniques to create more data from your simulation. Find more info here . When enabling annotation layers, one can choose to render images as well from these layers. If the image type is set to Annotation, you usually also need to supply the name of the annotation layer as defined in the settings. For example with Python, you can use the following examples for RGB and greyscale annotation layers. responses = client.simGetImages([airsim.ImageRequest( \"front_center\", airsim.ImageType.Annotation, False, False, \"RGBTest\")]) img_rgb_string = responses[0].image_data_uint8 rgbarray = np.frombuffer(img_rgb_string, np.uint8) rgbarray_shaped = rgbarray.reshape((540,960,3)) img = Image.fromarray(rgbarray_shaped, 'RGB') img.show() responses = client.simGetImages([airsim.ImageRequest( \"front_center\", airsim.ImageType.Annotation, False, False, \"GreyscaleTest\")]) img_rgb_string = responses[0].image_data_uint8 rgbarray = np.frombuffer(img_rgb_string, np.uint8) rgbarray_shaped = rgbarray.reshape((540,960,3)) greyscale_values = np.divide(rgbarray_shaped[:,:,0], 255) img = Image.fromarray(rgbarray_shaped[:,:,0]) img.show() Lumen Lighting for Scene camera Unreal 5 introduces Lumen lighting. Because the cameras use scene capture components, enabling Lumen for them can be costly in performance. Settings have been added specifically for the scene camera to customize the usage of Lumen for Global Illumination and Reflections. The LumenGIEnable and LumenReflectionEnable settings enable or disable Lumen for the camera. The LumenFinalQuality (0.25-2) setting determines the quality of the final image.
The LumenSceneDetail (0.25-4) setting determines the quality of the scene. The LumenSceneLightningDetail (0.25-2) setting determines the quality of the lighting in the scene.","title":"Image APIs"},{"location":"image_apis/#image-apis","text":"Please read the general API doc first if you are not familiar with AirSim APIs.","title":"Image APIs"},{"location":"image_apis/#getting-a-single-image","text":"Here's sample code to get a single image from the camera named \"0\". The returned value is the bytes of a png format image. To get uncompressed and other formats, as well as the available cameras, please see the next sections.","title":"Getting a Single Image"},{"location":"image_apis/#python","text":"import cosysairsim as airsim # for car use CarClient() client = airsim.MultirotorClient() png_image = client.simGetImage(\"0\", airsim.ImageType.Scene) # do something with image","title":"Python"},{"location":"image_apis/#c","text":"#include \"vehicles/multirotor/api/MultirotorRpcLibClient.hpp\" int getOneImage() { using namespace msr::airlib; // for car use CarRpcLibClient MultirotorRpcLibClient client; std::vector<uint8_t> png_image = client.simGetImage(\"0\", VehicleCameraBase::ImageType::Scene); // do something with images }","title":"C++"},{"location":"image_apis/#getting-images-with-more-flexibility","text":"The simGetImages API is slightly more complex to use than the simGetImage API; for example, you can get the left camera view, right camera view and depth image from the left camera in a single API call. The simGetImages API also allows you to get uncompressed images as well as floating point single channel images (instead of 3 channel (RGB), each 8 bit).","title":"Getting Images with More Flexibility"},{"location":"image_apis/#python_1","text":"import cosysairsim as airsim # for car use CarClient() client = airsim.MultirotorClient() responses = client.simGetImages([ # png format airsim.ImageRequest(0, airsim.ImageType.Scene), # uncompressed RGB array bytes airsim.ImageRequest(1, airsim.ImageType.Scene, False, False), # floating point uncompressed image airsim.ImageRequest(1, airsim.ImageType.DepthPlanar, True)]) # do something with response which contains image data, pose, timestamp etc","title":"Python"},{"location":"image_apis/#using-airsim-images-with-numpy","text":"If you plan to use numpy for image manipulation, you should get the uncompressed RGB image and then convert to numpy like this: responses = client.simGetImages([airsim.ImageRequest(\"0\", airsim.ImageType.Scene, False, False)]) response = responses[0] # get numpy array img1d = np.frombuffer(response.image_data_uint8, dtype=np.uint8) # reshape array to 3 channel image array H X W X 3 img_rgb = img1d.reshape(response.height, response.width, 3) # original image is flipped vertically img_rgb = np.flipud(img_rgb) # write to png airsim.write_png(os.path.normpath(filename + '.png'), img_rgb)","title":"Using AirSim Images with NumPy"},{"location":"image_apis/#quick-tips","text":"The API simGetImage returns a binary string literal which means you can simply dump it in a binary file to create a .png file. However, if you want to process it in any other way, you can use the handy function airsim.string_to_uint8_array . This converts the binary string literal to a NumPy uint8 array. The API simGetImages can accept requests for multiple image types from any cameras in a single call. You can specify if the image is png compressed, RGB uncompressed or a float array. For png compressed images, you get a binary string literal . For a float array you get a Python list of float64.
You can convert this float array to a NumPy 2D array using airsim.list_to_2d_float_array(response.image_data_float, response.width, response.height) You can also save the float array to a .pfm file (Portable Float Map format) using the airsim.write_pfm() function. If you are looking to query position and orientation information in sync with a call to one of the image APIs, you can use client.simPause(True) and client.simPause(False) to pause the simulation while calling the image API and querying the desired physics state, ensuring that the physics state remains the same immediately after the image API call.","title":"Quick Tips"},{"location":"image_apis/#c_1","text":"int getStereoAndDepthImages() { using namespace msr::airlib; typedef VehicleCameraBase::ImageRequest ImageRequest; typedef VehicleCameraBase::ImageResponse ImageResponse; typedef VehicleCameraBase::ImageType ImageType; // for car use // CarRpcLibClient client; MultirotorRpcLibClient client; // get right, left and depth images. First as png, second as uncompressed RGB, third as float. std::vector<ImageRequest> request = { //png format ImageRequest(\"0\", ImageType::Scene), //uncompressed RGB array bytes ImageRequest(\"1\", ImageType::Scene, false, false), //floating point uncompressed image ImageRequest(\"1\", ImageType::DepthPlanar, true) }; const std::vector<ImageResponse>& response = client.simGetImages(request); // do something with response which contains image data, pose, timestamp etc }","title":"C++"},{"location":"image_apis/#ready-to-run-complete-examples","text":"","title":"Ready to Run Complete Examples"},{"location":"image_apis/#python_2","text":"","title":"Python"},{"location":"image_apis/#c_2","text":"For more complete, ready-to-run sample code please see the HelloDrone project for multirotors or the HelloCar project . See also other example code that generates a specified number of stereo images along with ground truth depth and disparity, saving them to pfm format .","title":"C++"},{"location":"image_apis/#available-cameras","text":"These are the default cameras already available in each vehicle. Apart from these, you can add more cameras to the vehicles or make them not attached to any vehicle by setting them as external .","title":"Available Cameras"},{"location":"image_apis/#car","text":"The cameras on the car can be accessed by the following names in API calls: front_center , front_right , front_left , fpv and back_center . Here the FPV camera is at the driver's head position in the car.","title":"Car"},{"location":"image_apis/#multirotor","text":"The cameras on the drone can be accessed by the following names in API calls: front_center , front_right , front_left , bottom_center and back_center .","title":"Multirotor"},{"location":"image_apis/#computer-vision-mode","text":"Camera names are the same as for the multirotor.","title":"Computer Vision Mode"},{"location":"image_apis/#backward-compatibility-for-camera-names","text":"Before AirSim v1.2, cameras were accessed using ID numbers instead of names. For backward compatibility you can still use the following ID numbers for the above camera names, in the same order as above: \"0\" , \"1\" , \"2\" , \"3\" , \"4\" . In addition, the camera name \"\" is also available to access the default camera, which is generally the camera \"0\" .","title":"Backward compatibility for camera names"},{"location":"image_apis/#computer-vision-mode_1","text":"You can use AirSim in so-called \"Computer Vision\" mode.
In this mode, the physics engine is disabled and there is no vehicle, just cameras (if you want to have the vehicle but without its kinematics, you can use the Multirotor mode with the physics engine ExternalPhysicsEngine ). You can move around using the keyboard (use F1 to see help on keys). You can press the Record button to continuously generate images. Or you can call APIs to move cameras around and take images. You can use AirSim in so-called \"Computer Vision\" mode. In this mode, the physics engine is disabled. It has a standard set of cameras and can have any sensor added, similar to other vehicles. You can move around using the keyboard (use F1 to see help on keys; additionally, use left shift to go faster and spacebar to hold in place, which is handy when moving the camera manually). You can press the Record button to continuously generate images. Or you can call APIs to move cameras around and take images. To activate this mode, edit settings.json that you can find in your Documents\AirSim folder (or ~/Documents/AirSim on Linux) and make sure the following values exist at the root level: { \"SettingsVersion\": 2.0, \"SimMode\": \"ComputerVision\" } This mode was inspired by the UnrealCV project .","title":"\"Computer Vision\" Mode"},{"location":"image_apis/#setting-pose-in-computer-vision-mode","text":"To move around the environment using APIs you can use the simSetVehiclePose API. This API takes a position and orientation and sets them on the invisible vehicle where the front-center camera is located. All the other cameras move along with it, keeping their relative positions. If you don't want to change the position (or orientation) then just set the components of the position (or orientation) to floating point nan values. The simGetVehiclePose allows you to retrieve the current pose. You can also use simGetGroundTruthKinematics to get the kinematics quantities for the movement. Many other non-vehicle specific APIs are also available, such as segmentation APIs, collision APIs and camera APIs.","title":"Setting Pose in Computer Vision Mode"},{"location":"image_apis/#camera-apis","text":"The simGetCameraInfo returns the FOV (in degrees) and projection matrix of a camera, as well as the pose, which can be: Default: The pose of the camera in the vehicle frame. External: If set to External the coordinates will be in either Unreal NED when ExternalLocal is false or Local NED (from the starting position of the vehicle) when ExternalLocal is true . Note that if MoveWorldOrigin in the settings.json is set to true the Unreal coordinates will be moved to have the same origin as the player start location, and as such this may affect where the sensor will spawn and which coordinates are returned when ExternalLocal is false . The simSetCameraPose sets the pose for the specified camera while taking an input pose as a combination of a relative position and a quaternion in NED frame. The handy airsim.to_quaternion() function allows you to convert pitch, roll and yaw to a quaternion. For example, to set camera-0 to 15-degree pitch while maintaining the same position, you can use: camera_pose = airsim.Pose(airsim.Vector3r(0, 0, 0), airsim.to_quaternion(0.261799, 0, 0)) #PRY in radians client.simSetCameraPose(0, camera_pose) simSetCameraFov allows changing the Field-of-View of the camera at runtime. simSetDistortionParams , simGetDistortionParams allow setting and fetching the distortion parameters K1, K2, K3, P1, P2 All Camera APIs take in two common parameters apart from the API-specific ones: camera_name (str) and vehicle_name (str).
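As a short hedged sketch combining the camera APIs above (camera \"0\" on the default vehicle is assumed, and the 120-degree value is just an arbitrary example): camera_info = client.simGetCameraInfo(\"0\") # returns FOV, projection matrix and pose print(camera_info.fov) client.simSetCameraFov(\"0\", 120) # change the Field-of-View at runtime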
The camera and vehicle name are used to get the specific camera on the specific vehicle.","title":"Camera APIs"},{"location":"image_apis/#gimbal","text":"You can set stabilization for pitch, roll or yaw for any camera using settings .","title":"Gimbal"},{"location":"image_apis/#changing-resolution-and-camera-parameters","text":"To change resolution, FOV etc., you can use settings.json . For example, the addition below in settings.json sets parameters for scene capture and uses the \"Computer Vision\" mode described above. If you omit any setting then the default values below will be used. For more information see the settings doc . If you are using a stereo camera, the distance between left and right is currently fixed at 25 cm. { \"SettingsVersion\": 2.0, \"CameraDefaults\": { \"CaptureSettings\": [ { \"ImageType\": 0, \"Width\": 256, \"Height\": 144, \"FOV_Degrees\": 90, \"AutoExposureBias\": 1.3, \"AutoExposureMaxBrightness\": 0.64, \"AutoExposureMinBrightness\": 0.03, \"MotionBlurAmount\": 1, \"MotionBlurMax\": 10, \"ChromaticAberrationScale\": 2, \"LumenGIEnable\": true, \"LumenReflectionEnable\": true, \"LumenFinalQuality\": 1, \"LumenSceneDetail\": 1, \"LumenSceneLightningDetail\": 1 } ] }, \"SimMode\": \"ComputerVision\" }","title":"Changing Resolution and Camera Parameters"},{"location":"image_apis/#what-does-pixel-values-mean-in-different-image-types","text":"","title":"What Do Pixel Values Mean in Different Image Types?"},{"location":"image_apis/#available-imagetype-values","text":"Scene = 0, DepthPlanar = 1, DepthPerspective = 2, DepthVis = 3, DisparityNormalized = 4, Segmentation = 5, SurfaceNormals = 6, Infrared = 7, OpticalFlow = 8, OpticalFlowVis = 9, Annotation = 10","title":"Available ImageType Values"},{"location":"image_apis/#depthplanar-and-depthperspective","text":"You normally want to retrieve the depth image as float (i.e. set pixels_as_float = true ) and specify ImageType = DepthPlanar or ImageType = DepthPerspective in ImageRequest . For ImageType = DepthPlanar , you get depth in the camera plane, i.e., all points that are plane-parallel to the camera have the same depth. For ImageType = DepthPerspective , you get depth from the camera using a projection ray that hits that pixel. Depending on your use case, planar depth or perspective depth may be the ground truth image that you want. For example, you may be able to feed perspective depth to a ROS package such as depth_image_proc to generate a point cloud. Or planar depth may be more compatible with the estimated depth image generated by stereo algorithms such as SGM.","title":"DepthPlanar and DepthPerspective"},{"location":"image_apis/#depthvis","text":"When you specify ImageType = DepthVis in ImageRequest , you get an image that helps depth visualization. In this case, each pixel value is interpolated from black to white depending on the depth in the camera plane, in meters. Pure white pixels mean a depth of 100m or more, while pure black means a depth of 0 meters.","title":"DepthVis"},{"location":"image_apis/#disparitynormalized","text":"You normally want to retrieve the disparity image as float (i.e. set pixels_as_float = true and specify ImageType = DisparityNormalized in ImageRequest ) in which case each pixel is (Xl - Xr)/Xmax , which is thereby normalized to values between 0 and 1.","title":"DisparityNormalized"},{"location":"image_apis/#segmentation","text":"When you specify ImageType = Segmentation in ImageRequest , you get an image that gives you ground truth instance segmentation of the scene.
At startup, AirSim assigns a random color index to each mesh available in the environment. The RGB values for each color index ID can be retrieved from the API. You can assign a specific value to a specific mesh using APIs. For example, the Python code below sets the object ID for the mesh called \"Ground\" to 20 in the Blocks environment and hence changes its color in the Segmentation view to the 20th color of the instance segmentation colormap. Note that this will not check whether this color is already assigned to a different object! success = client.simSetSegmentationObjectID(\"Ground\", 20) The return value is a boolean type that lets you know if the mesh was found. Notice that typical Unreal environments, like Blocks, usually have many other meshes that comprise the same object, for example, \"Ground_2\", \"Ground_3\" and so on. As it is tedious to set the object ID for all of these meshes, AirSim also supports regular expressions. For example, the code below sets all meshes which have names starting with \"ground\" (ignoring case) to 21 with just one line: success = client.simSetSegmentationObjectID(\"ground[\\w]*\", 21, True) The return value is true if at least one mesh was found using regular expression matching. When retrieving the segmentation image through the API, it is recommended that you request an uncompressed image using this API to ensure you get precise RGB values for the segmentation image: responses = client.simGetImages([airsim.ImageRequest( \"front_center\", airsim.ImageType.Segmentation, False, False)]) img_rgb_string = responses[0].image_data_uint8 rgbarray = np.frombuffer(img_rgb_string, np.uint8) rgbarray_shaped = rgbarray.reshape((540,960,3)) img = Image.fromarray(rgbarray_shaped, 'RGB') img.show() To retrieve the color map to know which color is assigned to each color index you can use: colorMap = client.simGetSegmentationColorMap() An example can be found in segmentation_test.py (Cosys-AirSim/PythonClient/segmentation/segmentation_test.py). For a script that generates a full list of objects and their associated color, please see the script segmentation_generate_list.py (Cosys-AirSim/PythonClient/segmentation/segmentation_generate_list.py).","title":"Segmentation"},{"location":"image_apis/#how-to-find-mesh-names","text":"To get the desired ground truth segmentation you will need to know the names of the meshes in your Unreal environment. To do this, you can use the API: currentObjectList = client.simListInstanceSegmentationObjects() This will use an understandable naming depending on the hierarchy the object belongs to in the Unreal world (for example box_2_fullpalletspawner_5_pallet_4 or door_window_door_38 ). Note that this provides a different result from simListSceneObjects() as that one will make a simple list of all Unreal Actors in the scene, without keeping the hierarchy in mind. An extension to simListInstanceSegmentationObjects() is simListInstanceSegmentationPoses(ned=True, only_visible=True) which will retrieve the 3D object pose of each element in the same order as the first mentioned function. only_visible allows you to only get the objects that are physically visible in the scene. Once you decide on the meshes you are interested in, note down their names and use the above API to set their object IDs.","title":"How to Find Mesh names?"},{"location":"image_apis/#changing-colors-for-object-ids","text":"At present the color for each object ID is fixed as in this palette .
We will be adding the ability to change colors for object IDs to desired values shortly. In the meantime you can open the segmentation image in your favorite image editor and get the RGB values you are interested in.","title":"Changing Colors for Object IDs"},{"location":"image_apis/#startup-object-ids","text":"At the start, AirSim assigns color indexes to each object found in the environment of type UStaticMeshComponent or USkinnedMeshComponent . It then creates an understandable name depending on the hierarchy the object belongs to in the Unreal world (for example box_2_fullpalletspawner_5_pallet_4 or door_window_door_38 ).","title":"Startup Object IDs"},{"location":"image_apis/#getting-object-id-for-mesh","text":"The simGetSegmentationObjectID API allows you to get the object ID for a given mesh name.","title":"Getting Object ID for Mesh"},{"location":"image_apis/#more-information","text":"Please see the instance segmentation documentation for some more information on the segmentation system created by Cosys-Lab.","title":"More information"},{"location":"image_apis/#infrared","text":"Currently, this is just a map from object ID to greyscale 0-255. So any mesh with object ID 42 shows up with color (42, 42, 42). Please see the segmentation section for more details on how to set object IDs. Typically, the noise setting can be applied to this image type to get a slightly more realistic effect. We are still working on adding other infrared artifacts and any contributions are welcome.","title":"Infrared"},{"location":"image_apis/#opticalflow-and-opticalflowvis","text":"These image types return information about motion perceived from the point of view of the camera. OpticalFlow returns a 2-channel image where the channels correspond to vx and vy respectively. OpticalFlowVis is similar to OpticalFlow but converts flow data to RGB for a more 'visual' output.","title":"OpticalFlow and OpticalFlowVis"},{"location":"image_apis/#object-detection","text":"This feature lets you generate object detection using existing cameras in AirSim, find more info here .","title":"Object Detection"},{"location":"image_apis/#annotation","text":"The annotation system allows you to choose different groundtruth labeling techniques to create more data from your simulation. Find more info here . When enabling annotation layers, one can choose to render images from these layers as well. When set to Annotation, the image type usually requires you to also supply the name of the annotation layer as defined in the settings. For example with Python, you can use the following examples for RGB and greyscale annotation layers. responses = client.simGetImages([airsim.ImageRequest( \"front_center\", airsim.ImageType.Annotation, False, False, \"RGBTest\")]) img_rgb_string = responses[0].image_data_uint8 rgbarray = np.frombuffer(img_rgb_string, np.uint8) rgbarray_shaped = rgbarray.reshape((540,960,3)) img = Image.fromarray(rgbarray_shaped, 'RGB') img.show() responses = client.simGetImages([airsim.ImageRequest( \"front_center\", airsim.ImageType.Annotation, False, False, \"GreyscaleTest\")]) img_rgb_string = responses[0].image_data_uint8 rgbarray = np.frombuffer(img_rgb_string, np.uint8) rgbarray_shaped = rgbarray.reshape((540,960,3)) greyscale_values = np.divide(rgbarray_shaped[:,:,0], 255) img = Image.fromarray(rgbarray_shaped[:,:,0]) img.show()","title":"Annotation"},{"location":"image_apis/#lumen-lightning-for-scene-camera","text":"Unreal 5 introduces Lumen lighting.
Because the cameras use scene capture components, enabling Lumen for them can be costly in performance. Settings have been added specifically for the scene camera to customize the usage of Lumen for Global Illumination and Reflections. The LumenGIEnable and LumenReflectionEnable settings enable or disable Lumen for the camera. The LumenFinalQuality (0.25-2) setting determines the quality of the final image. The LumenSceneDetail (0.25-4) setting determines the quality of the scene. The LumenSceneLightningDetail (0.25-2) setting determines the quality of the lighting in the scene.","title":"Lumen Lighting for Scene camera"},{"location":"install_linux/","text":"Install or Build Cosys-AirSim on Linux The current recommended and tested environment is Ubuntu 22.04 LTS . Theoretically, you can build on other distros as well, but we haven't tested it. Install Unreal Engine Download the latest version of Unreal Engine 5.4 from the official download page (https://www.unrealengine.com/en-US/linux). This will require an Epic Games account. Once the zip archive is downloaded you can extract it to where you want to install the Unreal Engine. unzip Linux_Unreal_Engine_5.4.X.zip -d destination_folder If you chose a folder such as /opt/UnrealEngine , make sure to provide permissions and to set the owner, otherwise you might run into issues: sudo chmod -R 777 /opt/UnrealEngine sudo chown -R yourusername /opt/UnrealEngine From where you installed Unreal Engine, you can run Engine/Binaries/Linux/UnrealEditor from the terminal to launch Unreal Engine. For more information you can read the quick start guide . You can alternatively install Unreal Engine from source if you do not use an Ubuntu distribution, see the documentation linked above for more information. Build Cosys-AirSim Clone Cosys-AirSim and build it: # go to the folder where you clone GitHub projects git clone https://github.com/Cosys-Lab/Cosys-AirSim.git cd Cosys-AirSim ./setup.sh ./build.sh Build Unreal Environment Finally, you will need an Unreal project that hosts the environment for your vehicles. Cosys-AirSim comes with a built-in \"Blocks Environment\" which you can use, or you can create your own. Please see setting up Unreal Environment if you'd like to set up your own environment. How to Use Cosys-AirSim Once Cosys-AirSim is set up: - Navigate to the environment folder (for example, for Blocks it is Unreal\Environments\Blocks ), and run update_from_git.sh . - Go to the UnrealEngine installation folder and start Unreal by running ./Engine/Binaries/Linux/UnrealEditor . - When Unreal Engine prompts for opening or creating a project, select Browse and choose Cosys-AirSim/Unreal/Environments/Blocks (or your custom Unreal project). - Alternatively, the project file can be passed as a commandline argument. For Blocks: ./Engine/Binaries/Linux/UnrealEditor Cosys-AirSim/Unreal/Environments/Blocks/Blocks.uproject - If you get prompts to convert the project, look for the More Options or Convert-In-Place option. If you get prompted to build, choose Yes. If you get prompted to disable the Cosys-AirSim plugin, choose No. - After Unreal Editor loads, press the Play button. See Using APIs and settings.json for various options available for Cosys-AirSim usage. !!! tip Go to 'Edit->Editor Preferences', in the 'Search' box type 'CPU' and ensure that the 'Use Less CPU when in Background' is unchecked. [Optional] Setup Remote Control (Multirotor Only) A remote control is required if you want to fly manually. See the remote control setup for more details.
Alternatively, you can use APIs for programmatic control or use the so-called Computer Vision mode to move around using the keyboard.","title":"Install from Source on Linux"},{"location":"install_linux/#intall-or-build-cosys-airsim-on-linux","text":"The current recommended and tested environment is Ubuntu 22.04 LTS . Theoretically, you can build on other distros as well, but we haven't tested it.","title":"Install or Build Cosys-AirSim on Linux"},{"location":"install_linux/#install-unreal-engine","text":"Download the latest version of Unreal Engine 5.4 from the official download page (https://www.unrealengine.com/en-US/linux). This will require an Epic Games account. Once the zip archive is downloaded you can extract it to where you want to install the Unreal Engine. unzip Linux_Unreal_Engine_5.4.X.zip -d destination_folder If you chose a folder such as /opt/UnrealEngine , make sure to provide permissions and to set the owner, otherwise you might run into issues: sudo chmod -R 777 /opt/UnrealEngine sudo chown -R yourusername /opt/UnrealEngine From where you installed Unreal Engine, you can run Engine/Binaries/Linux/UnrealEditor from the terminal to launch Unreal Engine. For more information you can read the quick start guide . You can alternatively install Unreal Engine from source if you do not use an Ubuntu distribution, see the documentation linked above for more information.","title":"Install Unreal Engine"},{"location":"install_linux/#build-cosys-airsim","text":"Clone Cosys-AirSim and build it: # go to the folder where you clone GitHub projects git clone https://github.com/Cosys-Lab/Cosys-AirSim.git cd Cosys-AirSim ./setup.sh ./build.sh","title":"Build Cosys-AirSim"},{"location":"install_linux/#build-unreal-environment","text":"Finally, you will need an Unreal project that hosts the environment for your vehicles. Cosys-AirSim comes with a built-in \"Blocks Environment\" which you can use, or you can create your own. Please see setting up Unreal Environment if you'd like to set up your own environment.","title":"Build Unreal Environment"},{"location":"install_linux/#how-to-use-cosys-airsim","text":"Once Cosys-AirSim is set up: - Navigate to the environment folder (for example, for Blocks it is Unreal\Environments\Blocks ), and run update_from_git.sh . - Go to the UnrealEngine installation folder and start Unreal by running ./Engine/Binaries/Linux/UnrealEditor . - When Unreal Engine prompts for opening or creating a project, select Browse and choose Cosys-AirSim/Unreal/Environments/Blocks (or your custom Unreal project). - Alternatively, the project file can be passed as a commandline argument. For Blocks: ./Engine/Binaries/Linux/UnrealEditor Cosys-AirSim/Unreal/Environments/Blocks/Blocks.uproject - If you get prompts to convert the project, look for the More Options or Convert-In-Place option. If you get prompted to build, choose Yes. If you get prompted to disable the Cosys-AirSim plugin, choose No. - After Unreal Editor loads, press the Play button. See Using APIs and settings.json for various options available for Cosys-AirSim usage. !!! tip Go to 'Edit->Editor Preferences', in the 'Search' box type 'CPU' and ensure that the 'Use Less CPU when in Background' is unchecked.","title":"How to Use Cosys-AirSim"},{"location":"install_linux/#optional-setup-remote-control-multirotor-only","text":"A remote control is required if you want to fly manually. See the remote control setup for more details.
Alternatively, you can use APIs for programmatic control or use the so-called Computer Vision mode to move around using the keyboard.","title":"[Optional] Setup Remote Control (Multirotor Only)"},{"location":"install_precompiled/","text":"Download and install precompiled Plugin If you do not wish to build the plugin from source, you can download the precompiled plugin from the releases page for the version of Unreal you are using. It does not come with an environment so you will need to create your own Unreal project. Follow this step-by-step guide . The releases page also comes with additional downloads and links to the several API implementations for ROS(2) and the Python and Matlab API clients for that specific version of the Cosys-AirSim plugin.","title":"Download and install from precompiled plugin"},{"location":"install_precompiled/#download-and-install-precompiled-plugin","text":"If you do not wish to build the plugin from source, you can download the precompiled plugin from the releases page for the version of Unreal you are using. It does not come with an environment so you will need to create your own Unreal project. Follow this step-by-step guide . The releases page also comes with additional downloads and links to the several API implementations for ROS(2) and the Python and Matlab API clients for that specific version of the Cosys-AirSim plugin.","title":"Download and install precompiled Plugin"},{"location":"install_windows/","text":"Install or Build Cosys-AirSim on Windows Install Unreal Engine Download the Epic Games Launcher. While the Unreal Engine is open source and free to download, registration is still required. Run the Epic Games Launcher, open the Unreal Engine tab on the left pane. Click on the Install button on the top right, which should show the option to download Unreal Engine 5.4.X . Choose the install location to suit your needs, as shown in the images below. If you have multiple versions of Unreal installed then make sure the version you are using is set to current by clicking the down arrow next to the Launch button for the version. Build Cosys-AirSim Install Visual Studio 2022. Make sure to select Desktop Development with C++ and Windows 10/11 SDK 10.0.X (choose latest) and select the latest .NET Framework SDK under the 'Individual Components' tab while installing VS 2022. More info here . Start Developer Command Prompt for VS 2022 . Clone the repo: git clone https://github.com/Cosys-Lab/Cosys-AirSim.git , and go to the AirSim directory with cd Cosys-AirSim . Run build.cmd from the command line. This will create ready-to-use plugin bits in the Unreal\Plugins folder that can be dropped into any Unreal project. Build Unreal Project Finally, you will need an Unreal project that hosts the environment for your vehicles. Make sure to close and re-open the Unreal Engine and the Epic Games Launcher before building your first environment if you haven't done so already. After restarting the Epic Games Launcher it will ask you to associate project file extensions with Unreal Engine, click on 'fix now' to fix it. Cosys-AirSim comes with a built-in \"Blocks Environment\" which you can use, or you can create your own. Please see setting up Unreal Environment . Setup Remote Control (Multirotor only) A remote control is required if you want to fly manually. See the remote control setup for more details. Alternatively, you can use APIs for programmatic control or use the so-called Computer Vision mode to move around using the keyboard.
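With the simulator running (press Play in the editor), a minimal Python client is a quick way to sanity-check any of the above installs; a small sketch, assuming the default RPC port and the cosysairsim package from the PythonClient folder: import cosysairsim as airsim client = airsim.CarClient() # or airsim.MultirotorClient() for drones client.confirmConnection() # prints the status of the client-simulator RPC link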
How to Use Cosys-AirSim Once Cosys-AirSim is set up by following the above steps, you can: 1. Navigate to folder Unreal\Environments\Blocks and run update_from_git.bat . 2. Double click on the .sln file to load the Blocks project in Unreal\Environments\Blocks (or the .sln file in your own custom Unreal project). If you don't see the .sln file then you probably haven't completed the steps in the Build Unreal Project section above. 3. Select your Unreal project as Start Up project (for example, the Blocks project) and make sure Build config is set to \"Development Editor\" and x64. 4. After Unreal Editor loads, press the Play button. !!! tip Go to 'Edit->Editor Preferences', in the 'Search' box type 'CPU' and ensure that the 'Use Less CPU when in Background' is unchecked. See Using APIs and settings.json for various options available. FAQ I get an error Il \u2018P1\u2019, version \u2018X\u2019, does not match \u2018P2\u2019, version \u2018X\u2019 This is caused by multiple versions of Visual Studio installed on the machine. The build script of Cosys-AirSim will use the latest version it can find, so you need to make sure Unreal does the same. Open or create a file called BuildConfiguration.xml in C:\Users\USERNAME\AppData\Roaming\Unreal Engine\UnrealBuildTool and add the following: Latest I get error C100 : An internal error has occurred in the compiler when running build.cmd We have noticed this happening with VS version 15.9.0 and have checked-in a workaround in Cosys-AirSim code. If you have this VS version, please make sure to pull the latest Cosys-AirSim code. I get error \"'corecrt.h': No such file or directory\" or \"Windows SDK version 8.1 not found\" Very likely you don't have the Windows SDK installed with Visual Studio. How do I use PX4 firmware with Cosys-AirSim? By default, Cosys-AirSim uses its own built-in firmware called simple_flight . There is no additional setup if you just want to go with it. If you want to switch to using PX4 instead then please see this guide . I made changes in Visual Studio but there is no effect Sometimes the Unreal + VS build system doesn't recompile if you make changes to only header files. To ensure a recompile, make some Unreal based cpp file \"dirty\" like AirSimGameMode.cpp. Unreal still uses VS2015 or I'm getting some link error Running several versions of VS can lead to issues when compiling UE projects. One problem that may arise is that UE will try to compile with an older version of VS which may or may not work. There are two settings in Unreal, one for the engine and one for the project, to adjust the version of VS to be used. 1. Edit -> Editor preferences -> General -> Source code 2. Edit -> Project Settings -> Platforms -> Windows -> Toolchain ->CompilerVersion In some cases, these settings will still not lead to the desired result and errors such as the following might be produced: LINK : fatal error LNK1181: cannot open input file 'ws2_32.lib' To resolve such issues the following procedure can be applied: 1. Uninstall all old versions of VS using the VisualStudioUninstaller 2. Repair/Install VS2017 3. Restart the machine and install the Epic launcher and the desired version of the engine","title":"Install from Source on Windows"},{"location":"install_windows/#install-or-build-cosys-airsim-on-windows","text":"","title":"Install or Build Cosys-AirSim on Windows"},{"location":"install_windows/#install-unreal-engine","text":"Download the Epic Games Launcher. While the Unreal Engine is open source and free to download, registration is still required.
Run the Epic Games Launcher, open the Unreal Engine tab on the left pane. Click on the Install button on the top right, which should show the option to download Unreal Engine 5.4.X . Choose the install location to suit your needs, as shown in the images below. If you have multiple versions of Unreal installed then make sure the version you are using is set to current by clicking the down arrow next to the Launch button for the version.","title":"Install Unreal Engine"},{"location":"install_windows/#build-cosys-airsim","text":"Install Visual Studio 2022. Make sure to select Desktop Development with C++ and Windows 10/11 SDK 10.0.X (choose latest) and select the latest .NET Framework SDK under the 'Individual Components' tab while installing VS 2022. More info here . Start Developer Command Prompt for VS 2022 . Clone the repo: git clone https://github.com/Cosys-Lab/Cosys-AirSim.git , and go to the AirSim directory with cd Cosys-AirSim . Run build.cmd from the command line. This will create ready-to-use plugin bits in the Unreal\Plugins folder that can be dropped into any Unreal project.","title":"Build Cosys-AirSim"},{"location":"install_windows/#build-unreal-project","text":"Finally, you will need an Unreal project that hosts the environment for your vehicles. Make sure to close and re-open the Unreal Engine and the Epic Games Launcher before building your first environment if you haven't done so already. After restarting the Epic Games Launcher it will ask you to associate project file extensions with Unreal Engine, click on 'fix now' to fix it. Cosys-AirSim comes with a built-in \"Blocks Environment\" which you can use, or you can create your own. Please see setting up Unreal Environment .","title":"Build Unreal Project"},{"location":"install_windows/#setup-remote-control-multirotor-only","text":"A remote control is required if you want to fly manually. See the remote control setup for more details. Alternatively, you can use APIs for programmatic control or use the so-called Computer Vision mode to move around using the keyboard.","title":"Setup Remote Control (Multirotor only)"},{"location":"install_windows/#how-to-use-cosys-airsim","text":"Once Cosys-AirSim is set up by following the above steps, you can: 1. Navigate to folder Unreal\Environments\Blocks and run update_from_git.bat . 2. Double click on the .sln file to load the Blocks project in Unreal\Environments\Blocks (or the .sln file in your own custom Unreal project). If you don't see the .sln file then you probably haven't completed the steps in the Build Unreal Project section above. 3. Select your Unreal project as Start Up project (for example, the Blocks project) and make sure Build config is set to \"Development Editor\" and x64. 4. After Unreal Editor loads, press the Play button. !!! tip Go to 'Edit->Editor Preferences', in the 'Search' box type 'CPU' and ensure that the 'Use Less CPU when in Background' is unchecked. See Using APIs and settings.json for various options available.","title":"How to Use Cosys-AirSim"},{"location":"install_windows/#faq","text":"","title":"FAQ"},{"location":"install_windows/#i-get-an-error-il-p1-version-x-does-not-match-p2-version-x","text":"This is caused by multiple versions of Visual Studio installed on the machine. The build script of Cosys-AirSim will use the latest version it can find, so you need to make sure Unreal does the same.
Open or create a file called BuildConfiguration.xml in C:\Users\USERNAME\AppData\Roaming\Unreal Engine\UnrealBuildTool and add the following: Latest ","title":"I get an error Il \u2018P1\u2019, version \u2018X\u2019, does not match \u2018P2\u2019, version \u2018X\u2019"},{"location":"install_windows/#i-get-error-c100-an-internal-error-has-occurred-in-the-compiler-when-running-buildcmd","text":"We have noticed this happening with VS version 15.9.0 and have checked-in a workaround in Cosys-AirSim code. If you have this VS version, please make sure to pull the latest Cosys-AirSim code.","title":"I get error C100 : An internal error has occurred in the compiler when running build.cmd"},{"location":"install_windows/#i-get-error-corecrth-no-such-file-or-directory-or-windows-sdk-version-81-not-found","text":"Very likely you don't have the Windows SDK installed with Visual Studio.","title":"I get error \"'corecrt.h': No such file or directory\" or \"Windows SDK version 8.1 not found\""},{"location":"install_windows/#how-do-i-use-px4-firmware-with-cosys-airsim","text":"By default, Cosys-AirSim uses its own built-in firmware called simple_flight . There is no additional setup if you just want to go with it. If you want to switch to using PX4 instead then please see this guide .","title":"How do I use PX4 firmware with Cosys-AirSim?"},{"location":"install_windows/#i-made-changes-in-visual-studio-but-there-is-no-effect","text":"Sometimes the Unreal + VS build system doesn't recompile if you make changes to only header files. To ensure a recompile, make some Unreal based cpp file \"dirty\" like AirSimGameMode.cpp.","title":"I made changes in Visual Studio but there is no effect"},{"location":"install_windows/#unreal-still-uses-vs2015-or-im-getting-some-link-error","text":"Running several versions of VS can lead to issues when compiling UE projects. One problem that may arise is that UE will try to compile with an older version of VS which may or may not work. There are two settings in Unreal, one for the engine and one for the project, to adjust the version of VS to be used. 1. Edit -> Editor preferences -> General -> Source code 2. Edit -> Project Settings -> Platforms -> Windows -> Toolchain ->CompilerVersion In some cases, these settings will still not lead to the desired result and errors such as the following might be produced: LINK : fatal error LNK1181: cannot open input file 'ws2_32.lib' To resolve such issues the following procedure can be applied: 1. Uninstall all old versions of VS using the VisualStudioUninstaller 2. Repair/Install VS2017 3. Restart the machine and install the Epic launcher and the desired version of the engine","title":"Unreal still uses VS2015 or I'm getting some link error"},{"location":"instance_segmentation/","text":"Instance Segmentation in Cosys-AirSim An instance segmentation system is implemented in Cosys-AirSim. It uses proxy mesh rendering to allow each object in the world to get its own color. Limitations 2744000 different colors are currently available to be assigned to unique objects. If your environment during a run requires more colors, you will generate errors and new objects will be assigned the color [0,0,0]. Only static and skeletal meshes are supported. Landscape objects aren't supported. This is the special object type in Unreal to make terrain with. As a work-around, StaticMesh terrain must be used. Foliage objects aren't supported. This is the special object type in Unreal to place trees, grass and other plants that move with the wind.
As a work-around, StaticMesh objects must be used. Brush objects aren't supported. This is a special object type in Unreal to create your own meshes with. As a work-around, you can convert them to a StaticMesh. These and other less common unsupported object types either will not be rendered (decals, text, foliage, ...) or will by default be given the RGB color value [149,149,149] or [0,0,0] (brush objects, landscape, ...). Usage By default, at the start of the simulation, it will give a random color to each object. Please see the Image API documentation on how to manually set or get the color information. For an example of the Instance Segmentation API, please see the script segmentation_test.py (Cosys-Airsim/PythonClient/segmentation/segmentation_test.py). For a script that generates a full list of objects and their associated color, please see the script segmentation_generate_list.py (Cosys-Airsim/PythonClient/segmentation/segmentation_generate_list.py). When a new object is spawned in your environment by, for example, a C++ or Blueprint extension you made, and you want it to work with the instance segmentation system, you can use the extended function ASimModeBase::AddNewActorToSegmentation(AActor) which is also available in Blueprints. Make sure to provide human-readable names to the objects in your environment, as the ground truth tables that the AirSim API can provide will use your object naming to create the table. Credits The method of using proxy meshes to segment objects is a derivative of, and inspired by, the work of UnrealCV . Their work is licensed under the MIT License. It is made by students from Johns Hopkins University and Peking University under the supervision of Prof. Alan Yuille and Prof. Yizhou Wang. You can read the paper on their work here .","title":"Instance Segmentation"},{"location":"instance_segmentation/#instance-segmentation-in-cosys-airsim","text":"An instance segmentation system is implemented in Cosys-AirSim. It uses proxy mesh rendering to allow each object in the world to get its own color.","title":"Instance Segmentation in Cosys-AirSim"},{"location":"instance_segmentation/#limitations","text":"2744000 different colors are currently available to be assigned to unique objects. If your environment during a run requires more colors, you will generate errors and new objects will be assigned the color [0,0,0]. Only static and skeletal meshes are supported. Landscape objects aren't supported. This is the special object type in Unreal to make terrain with. As a work-around, StaticMesh terrain must be used. Foliage objects aren't supported. This is the special object type in Unreal to place trees, grass and other plants that move with the wind. As a work-around, StaticMesh objects must be used. Brush objects aren't supported. This is a special object type in Unreal to create your own meshes with. As a work-around, you can convert them to a StaticMesh. These and other less common unsupported object types either will not be rendered (decals, text, foliage, ...) or will by default be given the RGB color value [149,149,149] or [0,0,0] (brush objects, landscape, ...).","title":"Limitations"},{"location":"instance_segmentation/#usage","text":"By default, at the start of the simulation, it will give a random color to each object. Please see the Image API documentation on how to manually set or get the color information.
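As a brief sketch of that workflow from Python (the mesh name \"Cylinder3\" is only an assumed example and may not exist in your environment): import cosysairsim as airsim client = airsim.MultirotorClient() objects = client.simListInstanceSegmentationObjects() # names of all segmented meshes success = client.simSetSegmentationObjectID(\"Cylinder3\", 42) # False if the mesh was not found print(len(objects), success)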
For an example of the Instance Segmentation API, please see the script segmentation_test.py (Cosys-Airsim/PythonClient/segmentation/segmentation_test.py). For a script that generates a full list of objects and their associated color, please see the script segmentation_generate_list.py (Cosys-Airsim/PythonClient/segmentation/segmentation_generate_list.py). When a new object is spawned in your environment by, for example, a C++ or Blueprint extension you made, and you want it to work with the instance segmentation system, you can use the extended function ASimModeBase::AddNewActorToSegmentation(AActor) which is also available in Blueprints. Make sure to provide human-readable names to the objects in your environment, as the ground truth tables that the AirSim API can provide will use your object naming to create the table.","title":"Usage"},{"location":"instance_segmentation/#credits","text":"The method of using proxy meshes to segment objects is a derivative of, and inspired by, the work of UnrealCV . Their work is licensed under the MIT License. It is made by students from Johns Hopkins University and Peking University under the supervision of Prof. Alan Yuille and Prof. Yizhou Wang. You can read the paper on their work here .","title":"Credits"},{"location":"lidar/","text":"How to Use Lidar in AirSim AirSim supports lidar for multirotors and cars. Lidar can be enabled and its settings configured via the AirSimSettings json. Please see general sensors for information on configuration of general/shared sensor settings. Enabling lidar on a vehicle By default, lidars are not enabled. To enable lidar, set the SensorType and Enabled attributes in settings json. \"Lidar1\": { \"SensorType\": 6, \"Enabled\" : true, } Multiple lidars can be enabled on a vehicle. Ignoring glass and other material types One can set an object that should be invisible to LIDAR sensors (such as glass) to have no collision for Unreal Traces in order to have it be 'invisible' to lidar sensors. Lidar configuration The following parameters can currently be configured via the settings json. Parameter Description NumberOfChannels Number of channels/lasers of the lidar. When set to 1 it will act as a 2D horizontal LiDAR and will use the VerticalFOVUpper value as the vertical angle to scan. Range Range, in meters MeasurementsPerCycle Horizontal resolution. Number of points in one cycle. RotationsPerSecond Rotations per second HorizontalFOVStart Horizontal FOV start for the lidar, in degrees HorizontalFOVEnd Horizontal FOV end for the lidar, in degrees VerticalFOVUpper Vertical FOV upper limit for the lidar, in degrees VerticalFOVLower Vertical FOV lower limit for the lidar, in degrees X Y Z Position of the lidar relative to the vehicle (in NED, in meters) Roll Pitch Yaw Orientation of the lidar relative to the vehicle (in degrees, yaw-pitch-roll order to front vector +X) GenerateNoise Generate and add range-noise based on a normal distribution if set to true MinNoiseStandardDeviation The standard deviation to generate the noise normal distribution, in meters. This is the minimal noise (at 0 distance) NoiseDistanceScale To scale the noise with distance, set this parameter.
This way the minimal noise is scaled depending on the distance compared to the total maximum range of the sensor UpdateFrequency Number of times per second that the sensor should update and calculate the next set of points DrawSensor Draw the physical sensor in the world on the vehicle with a 3D axes shown where the sensor is LimitPoints Limit the amount of points that can be calculated in one measurement (to work around freezes due to bad performance). Will result in incomplete pointclouds External Uncouple the sensor from the vehicle. If enabled, the position and orientation will be relative to Unreal world coordinates ExternalLocal When in external mode, if this is enabled the retrieved pose of the sensor will be in Local NED coordinates (from the starting position of the vehicle) and not the converted Unreal NED coordinates which is the default { \"SeeDocsAt\": \"https://cosys-lab.github.io/settings/\", \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"Vehicles\": { \"Drone1\": { \"VehicleType\": \"simpleflight\", \"AutoCreate\": true, \"Sensors\": { \"LidarSensor1\": { \"SensorType\": 6, \"Enabled\" : true, \"NumberOfChannels\": 16, \"RotationsPerSecond\": 10, \"MeasurementsPerCycle\": 512, \"X\": 0, \"Y\": 0, \"Z\": -1, \"Roll\": 0, \"Pitch\": 0, \"Yaw\" : 0, \"VerticalFOVUpper\": -15, \"VerticalFOVLower\": -25, \"HorizontalFOVStart\": -20, \"HorizontalFOVEnd\": 20, \"DrawDebugPoints\": true }, \"LidarSensor2\": { \"SensorType\": 6, \"Enabled\" : true, \"NumberOfChannels\": 4, \"RotationsPerSecond\": 10, \"MeasurementsPerCycle\": 64, \"X\": 0, \"Y\": 0, \"Z\": -1, \"Roll\": 0, \"Pitch\": 0, \"Yaw\" : 0, \"VerticalFOVUpper\": -15, \"VerticalFOVLower\": -25, \"DrawDebugPoints\": true } } } } } Server side visualization for debugging By default, the lidar points are not drawn on the viewport. To enable the drawing of hit laser points on the viewport, please enable the setting DrawDebugPoints via settings json. \"Lidar1\": { ... \"DrawDebugPoints\": true }, Client API Use the getLidarData(sensor name, vehicle name) API to retrieve the lidar data. The API returns a full-scan Point-Cloud as a flat array of floats along with the timestamp of the capture and the lidar pose. Point-Cloud: The floats represent the [x,y,z] coordinate of each point hit within the range in the last scan, in NED format. It will be [0,0,0] for a laser that didn't get any reflection (out of range). Pose: Default: Sensor pose in the vehicle frame / External: If set to External (see table) the coordinates will be in either Unreal NED when ExternalLocal is false or Local NED (from the starting position of the vehicle) when ExternalLocal is true . Groundtruth: For each point of the Point-Cloud a label string is kept that has the name of the object that the point belongs to; a laser that didn't reflect anything will have the label out_of_range .","title":"LIDAR"},{"location":"lidar/#how-to-use-lidar-in-airsim","text":"AirSim supports lidar for multirotors and cars. Lidar can be enabled and its settings configured via the AirSimSettings json. Please see general sensors for information on configuration of general/shared sensor settings.","title":"How to Use Lidar in AirSim"},{"location":"lidar/#enabling-lidar-on-a-vehicle","text":"By default, lidars are not enabled. To enable lidar, set the SensorType and Enabled attributes in settings json.
\"Lidar1\": { \"SensorType\": 6, \"Enabled\" : true, } Multiple lidars can be enabled on a vehicle.","title":"Enabling lidar on a vehicle"},{"location":"lidar/#ignoring-glass-and-other-material-types","text":"One can set an object that should be invisible to LIDAR sensors (such as glass) to have no collision for Unreal Traces in order to have it be 'invisible' for lidar sensors.","title":"Ignoring glass and other material types"},{"location":"lidar/#lidar-configuration","text":"The following parameters can be configured right now via settings json. Parameter Description NumberOfChannels Number of channels/lasers of the lidar. When set to 1 it will act as a 2D horizontal LiDAR and will use the VerticalFOVUpper value as the vertical angle to scan. Range Range, in meters MeasurementsPerCycle Horizontal resolution. Amount of points in one cycle. RotationsPerSecond Rotations per second HorizontalFOVStart Horizontal FOV start for the lidar, in degrees HorizontalFOVEnd Horizontal FOV end for the lidar, in degrees VerticalFOVUpper Vertical FOV upper limit for the lidar, in degrees VerticalFOVLower Vertical FOV lower limit for the lidar, in degrees X Y Z Position of the lidar relative to the vehicle (in NED, in meters) Roll Pitch Yaw Orientation of the lidar relative to the vehicle (in degrees, yaw-pitch-roll order to front vector +X) GenerateNoise Generate and add range-noise based on normal distribution if set to true MinNoiseStandardDeviation The standard deviation to generate the noise normal distribution, in meters. This is the minimal noise (at 0 distance) NoiseDistanceScale To scale the noise with distance, set this parameter. This way the minimal noise is scaled depending on the distance compared to total maximum range of the sensor UpdateFrequency Amount of times per second that the sensor should update and calculate the next set of poins DrawSensor Draw the physical sensor in the world on the vehicle with a 3D axes shown where the sensor is LimitPoints Limit the amount of points that can be calculated in one measurement (to work around freezes due to bad performance). Will result in incomplete pointclouds External Uncouple the sensor from the vehicle. If enabled, the position and orientation will be relative to Unreal world coordinates ExternalLocal When in external mode, if this is enabled the retrieved pose of the sensor will be in Local NED coordinates(from starting position from vehicle) and not converted Unreal NED coordinates which is default { \"SeeDocsAt\": \"https://cosys-lab.github.io/settings/\", \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"Vehicles\": { \"Drone1\": { \"VehicleType\": \"simpleflight\", \"AutoCreate\": true, \"Sensors\": { \"LidarSensor1\": { \"SensorType\": 6, \"Enabled\" : true, \"NumberOfChannels\": 16, \"RotationsPerSecond\": 10, \"MeasurementsPerCycle\": 512, \"X\": 0, \"Y\": 0, \"Z\": -1, \"Roll\": 0, \"Pitch\": 0, \"Yaw\" : 0, \"VerticalFOVUpper\": -15, \"VerticalFOVLower\": -25, \"HorizontalFOVStart\": -20, \"HorizontalFOVEnd\": 20, \"DrawDebugPoints\": true }, \"LidarSensor2\": { \"SensorType\": 6, \"Enabled\" : true, \"NumberOfChannels\": 4, \"RotationsPerSecond\": 10, \"MeasurementsPerCycle\": 64, \"X\": 0, \"Y\": 0, \"Z\": -1, \"Roll\": 0, \"Pitch\": 0, \"Yaw\" : 0, \"VerticalFOVUpper\": -15, \"VerticalFOVLower\": -25, \"DrawDebugPoints\": true } } } } }","title":"Lidar configuration"},{"location":"lidar/#server-side-visualization-for-debugging","text":"By default, the lidar points are not drawn on the viewport. 
To enable the drawing of hit laser points on the viewport, please enable the setting DrawDebugPoints via settings json. \"Lidar1\": { ... \"DrawDebugPoints\": true },","title":"Server side visualization for debugging"},{"location":"lidar/#client-api","text":"Use the getLidarData(sensor name, vehicle name) API to retrieve the lidar data. The API returns a full-scan Point-Cloud as a flat array of floats along with the timestamp of the capture and the lidar pose. Point-Cloud: The floats represent the [x,y,z] coordinate of each point hit within the range in the last scan, in NED format. It will be [0,0,0] for a laser that didn't get any reflection (out of range). Pose: Default: Sensor pose in the vehicle frame / External: If set to External (see table) the coordinates will be in either Unreal NED when ExternalLocal is false or Local NED (from the starting position of the vehicle) when ExternalLocal is true . Groundtruth: For each point of the Point-Cloud a label string is kept that has the name of the object that the point belongs to; a laser that didn't reflect anything will have the label out_of_range .","title":"Client API"},{"location":"log_viewer/","text":"Log Viewer The LogViewer is a Windows WPF app that presents the MavLink streams that it is getting from the Unreal simulator. You can use this to monitor what is happening on the drone while it is flying. For example, the picture below shows a real time graph of the x, y and z gyro sensor information being generated by the simulator. Usage You can open a log file (it supports .mavlink and PX4 *.ulg files), then you will see the contents of the log in a tree view on the left; whatever metrics you select will be added to the right side. You can close each individual chart with the little close box in the top right of each chart and you can group charts so they share the same vertical axis using the group charts button on the top toolbar. There is also a map option which will plot the GPS path the drone took. You can also load multiple log files so you can compare the data from each. Realtime You can also get a realtime view if you connect the LogViewer before you run the simulation. For this to work you need to configure the settings.json with the following settings: { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"Vehicles\": { \"PX4\": { ..., \"LogViewerHostIp\": \"127.0.0.1\", \"LogViewerPort\": 14388, } } } Note: do not use the \"Logs\" setting when you want realtime LogViewer logging. Logging to a file using \"Logs\" is mutually exclusive with LogViewer logging. Simply press the blue connector button on the top right corner of the window, select the Socket tab , enter the port number 14388 , and your localhost network. If you are using WSL 2 on Windows then select vEthernet (WSL) . If you do choose vEthernet (WSL) then make sure you also set LocalHostIp and LogViewerHostIp to the matching WSL ethernet address, something like 172.31.64.1 . Then press the record button (triangle on the right hand side of the toolbar). Now start the simulator, and the data will start streaming into LogViewer. The drone view in Log Viewer shows the actual estimated position coming from the PX4, so that is a great way to check whether the PX4 is in sync with the simulator. Sometimes you can see some drift here as the attitude estimation catches up with reality; this can become more visible after a bad crash. Installation If you can't build the LogViewer.sln, there is also a ClickOnce installer .
Configuration The magic port number 14388 can be configured in the simulator by editing the settings.json file . If you change the port number in the LogViewer connection dialog then be sure to make the matching changes in your settings.json file. Debugging See PX4 Logging for more information on how to use the LogViewer to debug situations you are seeing.","title":"MavLink LogViewer"},{"location":"log_viewer/#log-viewer","text":"The LogViewer is a Windows WPF app that presents the MavLink streams that it is getting from the Unreal simulator. You can use this to monitor what is happening on the drone while it is flying. For example, the picture below shows a real time graph of the x, y and z gyro sensor information being generated by the simulator.","title":"Log Viewer"},{"location":"log_viewer/#usage","text":"You can open a log file (it supports .mavlink and PX4 *.ulg files), then you will see the contents of the log in a tree view on the left; whatever metrics you select will be added to the right side. You can close each individual chart with the little close box in the top right of each chart and you can group charts so they share the same vertical axis using the group charts button on the top toolbar. There is also a map option which will plot the GPS path the drone took. You can also load multiple log files so you can compare the data from each.","title":"Usage"},{"location":"log_viewer/#realtime","text":"You can also get a realtime view if you connect the LogViewer before you run the simulation. For this to work you need to configure the settings.json with the following settings: { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"Vehicles\": { \"PX4\": { ..., \"LogViewerHostIp\": \"127.0.0.1\", \"LogViewerPort\": 14388, } } } Note: do not use the \"Logs\" setting when you want realtime LogViewer logging. Logging to a file using \"Logs\" is mutually exclusive with LogViewer logging. Simply press the blue connector button on the top right corner of the window, select the Socket tab , enter the port number 14388 , and your localhost network. If you are using WSL 2 on Windows then select vEthernet (WSL) . If you do choose vEthernet (WSL) then make sure you also set LocalHostIp and LogViewerHostIp to the matching WSL ethernet address, something like 172.31.64.1 . Then press the record button (triangle on the right hand side of the toolbar). Now start the simulator, and the data will start streaming into LogViewer. The drone view in Log Viewer shows the actual estimated position coming from the PX4, so that is a great way to check whether the PX4 is in sync with the simulator. Sometimes you can see some drift here as the attitude estimation catches up with reality; this can become more visible after a bad crash.","title":"Realtime"},{"location":"log_viewer/#installation","text":"If you can't build the LogViewer.sln, there is also a ClickOnce installer .
An example Matlab client is provided demonstrating how to interact with AirSim from Matlab. This can be used from source or installed as a toolbox (install from File Exchange , or from source by double-clicking or dragging into Matlab the file Cosys-AirSim Matlab API Client.mltbx ) Prerequisites These instructions are for Matlab 2024a (with toolboxes for the client: Computer Vision, Aerospace, Signal Processing Toolbox), UE 5.3 and the latest AirSim release. It also requires the AirSim python package to be installed. To do this, go into the PythonClient folder and use pip to install it into the Python environment that is also used in Matlab with pip install . You can find out in Matlab what Python version is used with pe = pyenv; pe.Version You should have these components installed and working before proceeding. Usage This is a client implementation of the RPC API for Matlab for the Cosys-AirSim simulation framework. A main class AirSimClient is available which implements all API calls. Do note that at this point not all functions have been tested and most function documentation was auto-generated. This is still a WIP client. Initial setup When starting with this wrapper, first try to make a connection to the Cosys-AirSim simulation. vehicle_name = \"airsimvehicle\"; airSimClient = AirSimClient(IsDrone=false, ApiControl=false, IP=\"127.0.0.1\", port=41451, vehicleName=vehicle_name); Now the client object can be used to run API methods. All functions have some help text written for more information on them. Example This example will: Connect to AirSim Get/set vehicle pose Get instance segmentation groundtruth table Get object pose(s) Get sensor data (imu, echo (active/passive), (gpu)LiDAR, camera (info, rgb, depth, segmentation, annotation)) Do note that the AirSim matlab client has almost all API functions available but not all are listed in this test script. For a full list see the source code of the AirSimClient class. Do note that, next to the toolboxes listed above in the Prerequisites, the test script requires the following Matlab toolboxes: Lidar Toolbox Navigation Toolbox Robotics System Toolbox ROS Toolbox UAV Toolbox Setup connection %Define client vehicle_name = \"airsimvehicle\"; airSimClient = AirSimClient(IsDrone=false, ApiControl=false, IP=\"127.0.0.1\", port=41451, vehicleName=vehicle_name); Groundtruth labels % Get groundtruth look-up-table of all objects and their instance % segmentation colors for the cameras and GPU LiDAR groundtruthLUT = airSimClient.getInstanceSegmentationLUT(); Get some poses % All poses are in a right handed coordinate system X Y Z and % orientations are defined as quaternions W X Y Z. 
% Get poses of all objects in the scene, this takes a while for large % scenes so it is commented out by default poses = airSimClient.getAllObjectPoses(false, false); % Get vehicle pose vehiclePoseLocal = airSimClient.getVehiclePose(); vehiclePoseWorld = airSimClient.getObjectPose(vehicle_name, false); % Get a random object pose or choose one if you know its name useChosenObject = false; chosenObject = \"Cylinder3\"; if useChosenObject finalName = chosenObject; else randomIndex = randi(size(groundtruthLUT, 1), 1); randomName = groundtruthLUT.name(randomIndex); finalName = randomName; end objectPoseLocal = airSimClient.getObjectPose(finalName, true); objectPoseWorld = airSimClient.getObjectPose(finalName, false); figure; subplot(1, 2, 1); plotTransforms([vehiclePoseLocal.position; objectPoseLocal.position], [vehiclePoseLocal.orientation; objectPoseLocal.orientation], FrameLabel=[\"Vehicle\"; finalName], AxisLabels=\"on\") axis equal; grid on; xlabel(\"X (m)\") ylabel(\"Y (m)\") zlabel(\"Z (m)\") title(\"Local Plot\") subplot(1, 2, 2); plotTransforms([vehiclePoseWorld.position; objectPoseWorld.position], [vehiclePoseWorld.orientation; objectPoseWorld.orientation], FrameLabel=[\"Vehicle\"; finalName], AxisLabels=\"on\") axis equal; grid on; xlabel(\"X (m)\") ylabel(\"Y (m)\") zlabel(\"Z (m)\") title(\"World Plot\") drawnow %% Set vehicle pose airSimClient.setVehiclePose(airSimClient.getVehiclePose().position + [1 1 0], airSimClient.getVehiclePose().orientation) IMU sensor Data imuSensorName = \"imu\"; [imuData, imuTimestamp] = airSimClient.getIMUData(imuSensorName); Echo sensor data % Example plots passive echo pointcloud % and its reflection directions as 3D quivers echoSensorName = \"echo\"; enablePassive = true; [activePointCloud, activeData, passivePointCloud, passiveData , echoTimestamp, echoSensorPose] = airSimClient.getEchoData(echoSensorName, enablePassive); figure; subplot(1, 2, 1); if ~isempty(activePointCloud) pcshow(activePointCloud, color=\"X\", MarkerSize=50); else pcshow(pointCloud([0, 0, 0])); end title('Active Echo Sensor Pointcloud') xlabel(\"X (m)\") ylabel(\"Y (m)\") zlabel(\"Z (m)\") xlim([0 10]) ylim([-10 10]) zlim([-10 10]) subplot(1, 2, 2); if ~isempty(passivePointCloud) pcshow(passivePointCloud, color=\"X\", MarkerSize=50); hold on; quiver3(passivePointCloud.Location(:, 1), passivePointCloud.Location(:, 2), passivePointCloud.Location(:, 3),... 
passivePointCloud.Normal(:, 1), passivePointCloud.Normal(:, 2), passivePointCloud.Normal(:, 3), 2); hold off else pcshow(pointCloud([0, 0, 0])); end title('Passive Echo Sensor Pointcloud') xlabel(\"X (m)\") ylabel(\"Y (m)\") zlabel(\"Z (m)\") xlim([0 10]) ylim([-10 10]) zlim([-10 10]) drawnow LiDAR sensor data % Example plots lidar pointcloud and getting the groundtruth labels lidarSensorName = \"lidar\"; enableLabels = true; [lidarPointCloud, lidarLabels, LidarTimestamp, LidarSensorPose] = airSimClient.getLidarData(lidarSensorName, enableLabels); figure; if ~isempty(lidarPointCloud) pcshow(lidarPointCloud, MarkerSize=50); else pcshow(pointCloud([0, 0, 0])); end title('LiDAR Pointcloud') xlabel(\"X (m)\") ylabel(\"Y (m)\") zlabel(\"Z (m)\") xlim([0 10]) ylim([-10 10]) zlim([-10 10]) drawnow GPU LiDAR sensor data % Example plots GPU lidar pointcloud with its RGB segmentation colors gpuLidarSensorName = \"gpulidar\"; enableLabels = true; [gpuLidarPointCloud, gpuLidarTimestamp, gpuLidarSensorPose] = airSimClient.getGPULidarData(gpuLidarSensorName); figure; if ~isempty(gpuLidarPointCloud) pcshow(gpuLidarPointCloud, MarkerSize=50); else pcshow(pointCloud([0, 0, 0])); end title('GPU-Accelerated LiDAR Pointcloud') xlabel(\"X (m)\") ylabel(\"Y (m)\") zlabel(\"Z (m)\") xlim([0 10]) ylim([-10 10]) zlim([-10 10]) drawnow Cameras %% Get camera info cameraSensorName = \"frontcamera\"; [intrinsics, cameraSensorPose] = airSimClient.getCameraInfo(cameraSensorName); %% Get single camera images % Get images sequentially cameraSensorName = \"front_center\"; [rgbImage, rgbCameraTimestamp] = airSimClient.getCameraImage(cameraSensorName, AirSimCameraTypes.Scene); [segmentationImage, segmentationCameraTimestamp] = airSimClient.getCameraImage(cameraSensorName, AirSimCameraTypes.Segmentation); [depthImage, depthCameraTimestamp] = airSimClient.getCameraImage(cameraSensorName, AirSimCameraTypes.DepthPlanar); [annotationImage, annotationCameraTimestamp] = airSimClient.getCameraImage(cameraSensorName, AirSimCameraTypes.Annotation, \"TextureTestDirect\"); figure; subplot(4, 1, 1); imshow(rgbImage) title(\"RGB Camera Image\") subplot(4, 1, 2); imshow(segmentationImage) title(\"Segmentation Camera Image\") subplot(4, 1, 3); imshow(depthImage ./ max(max(depthImage)).* 255, gray) title(\"Depth Camera Image\") subplot(4, 1, 4); imshow(annotationImage) title(\"Annotation Camera Image\") %% Get synced camera images % By combining the image requests they will be synced % and taken in the same frame cameraSensorName = \"front_center\"; [images, cameraTimestamp] = airSimClient.getCameraImages(cameraSensorName, ... [AirSimCameraTypes.Scene, AirSimCameraTypes.Segmentation, AirSimCameraTypes.DepthPlanar, AirSimCameraTypes.Annotation], ... [\"\", \"\", \"\", \"GreyscaleTest\"]); figure; subplot(4, 1, 1); imshow(images{1}) title(\"Synced RGB Camera Image\") subplot(4, 1, 2); imshow(images{2}) title(\"Synced Segmentation Camera Image\") subplot(4, 1, 3); imshow(images{3} ./ max(max(images{3})).* 255, gray) title(\"Synced Depth Camera Image\") subplot(4, 1, 4); imshow(images{4}) title(\"Synced Annotation Camera Image\")","title":"Matlab"},{"location":"matlab/#how-to-use-airsim-with-matlab","text":"AirSim and Matlab can be integrated using Python. An example Matlab client is provided demonstrating how to interact with AirSim from Matlab. 
This can be used from source or installed as a toolbox (install from File Exchange , or from source by double-clicking or dragging into Matlab the file Cosys-AirSim Matlab API Client.mltbx )","title":"How to use AirSim with Matlab"},{"location":"matlab/#prerequisites","text":"These instructions are for Matlab 2024a (with toolboxes for the client: Computer Vision, Aerospace, Signal Processing Toolbox), UE 5.3 and the latest AirSim release. It also requires the AirSim python package to be installed. To do this, go into the PythonClient folder and use pip to install it into the Python environment that is also used in Matlab with pip install . You can find out in Matlab what Python version is used with pe = pyenv; pe.Version You should have these components installed and working before proceeding.","title":"Prerequisites"},{"location":"matlab/#usage","text":"This is a client implementation of the RPC API for Matlab for the Cosys-AirSim simulation framework. A main class AirSimClient is available which implements all API calls. Do note that at this point not all functions have been tested and most function documentation was auto-generated. This is still a WIP client.","title":"Usage"},{"location":"matlab/#initial-setup","text":"When starting with this wrapper, first try to make a connection to the Cosys-AirSim simulation. vehicle_name = \"airsimvehicle\"; airSimClient = AirSimClient(IsDrone=false, ApiControl=false, IP=\"127.0.0.1\", port=41451, vehicleName=vehicle_name); Now the client object can be used to run API methods. All functions have some help text written for more information on them.","title":"Initial setup"},{"location":"matlab/#example","text":"This example will: Connect to AirSim Get/set vehicle pose Get instance segmentation groundtruth table Get object pose(s) Get sensor data (imu, echo (active/passive), (gpu)LiDAR, camera (info, rgb, depth, segmentation, annotation)) Do note that the AirSim matlab client has almost all API functions available but not all are listed in this test script. For a full list see the source code of the AirSimClient class. Do note that, next to the toolboxes listed above in the Prerequisites, the test script requires the following Matlab toolboxes: Lidar Toolbox Navigation Toolbox Robotics System Toolbox ROS Toolbox UAV Toolbox","title":"Example"},{"location":"matlab/#setup-connection","text":"%Define client vehicle_name = \"airsimvehicle\"; airSimClient = AirSimClient(IsDrone=false, ApiControl=false, IP=\"127.0.0.1\", port=41451, vehicleName=vehicle_name);","title":"Setup connection"},{"location":"matlab/#groundtruth-labels","text":"% Get groundtruth look-up-table of all objects and their instance % segmentation colors for the cameras and GPU LiDAR groundtruthLUT = airSimClient.getInstanceSegmentationLUT();","title":"Groundtruth labels"},{"location":"matlab/#get-some-poses","text":"% All poses are in a right handed coordinate system X Y Z and % orientations are defined as quaternions W X Y Z. 
% Get poses of all objects in the scene, this takes a while for large % scenes so it is commented out by default poses = airSimClient.getAllObjectPoses(false, false); % Get vehicle pose vehiclePoseLocal = airSimClient.getVehiclePose(); vehiclePoseWorld = airSimClient.getObjectPose(vehicle_name, false); % Get a random object pose or choose one if you know its name useChosenObject = false; chosenObject = \"Cylinder3\"; if useChosenObject finalName = chosenObject; else randomIndex = randi(size(groundtruthLUT, 1), 1); randomName = groundtruthLUT.name(randomIndex); finalName = randomName; end objectPoseLocal = airSimClient.getObjectPose(finalName, true); objectPoseWorld = airSimClient.getObjectPose(finalName, false); figure; subplot(1, 2, 1); plotTransforms([vehiclePoseLocal.position; objectPoseLocal.position], [vehiclePoseLocal.orientation; objectPoseLocal.orientation], FrameLabel=[\"Vehicle\"; finalName], AxisLabels=\"on\") axis equal; grid on; xlabel(\"X (m)\") ylabel(\"Y (m)\") zlabel(\"Z (m)\") title(\"Local Plot\") subplot(1, 2, 2); plotTransforms([vehiclePoseWorld.position; objectPoseWorld.position], [vehiclePoseWorld.orientation; objectPoseWorld.orientation], FrameLabel=[\"Vehicle\"; finalName], AxisLabels=\"on\") axis equal; grid on; xlabel(\"X (m)\") ylabel(\"Y (m)\") zlabel(\"Z (m)\") title(\"World Plot\") drawnow %% Set vehicle pose airSimClient.setVehiclePose(airSimClient.getVehiclePose().position + [1 1 0], airSimClient.getVehiclePose().orientation)","title":"Get some poses"},{"location":"matlab/#imu-sensor-data","text":"imuSensorName = \"imu\"; [imuData, imuTimestamp] = airSimClient.getIMUData(imuSensorName);","title":"IMU sensor Data"},{"location":"matlab/#echo-sensor-data","text":"% Example plots passive echo pointcloud % and its reflection directions as 3D quivers echoSensorName = \"echo\"; enablePassive = true; [activePointCloud, activeData, passivePointCloud, passiveData , echoTimestamp, echoSensorPose] = airSimClient.getEchoData(echoSensorName, enablePassive); figure; subplot(1, 2, 1); if ~isempty(activePointCloud) pcshow(activePointCloud, color=\"X\", MarkerSize=50); else pcshow(pointCloud([0, 0, 0])); end title('Active Echo Sensor Pointcloud') xlabel(\"X (m)\") ylabel(\"Y (m)\") zlabel(\"Z (m)\") xlim([0 10]) ylim([-10 10]) zlim([-10 10]) subplot(1, 2, 2); if ~isempty(passivePointCloud) pcshow(passivePointCloud, color=\"X\", MarkerSize=50); hold on; quiver3(passivePointCloud.Location(:, 1), passivePointCloud.Location(:, 2), passivePointCloud.Location(:, 3),... 
passivePointCloud.Normal(:, 1), passivePointCloud.Normal(:, 2), passivePointCloud.Normal(:, 3), 2); hold off else pcshow(pointCloud([0, 0, 0])); end title('Passive Echo Sensor Pointcloud') xlabel(\"X (m)\") ylabel(\"Y (m)\") zlabel(\"Z (m)\") xlim([0 10]) ylim([-10 10]) zlim([-10 10]) drawnow","title":"Echo sensor data"},{"location":"matlab/#lidar-sensor-data","text":"% Example plots lidar pointcloud and getting the groundtruth labels lidarSensorName = \"lidar\"; enableLabels = true; [lidarPointCloud, lidarLabels, LidarTimestamp, LidarSensorPose] = airSimClient.getLidarData(lidarSensorName, enableLabels); figure; if ~isempty(lidarPointCloud) pcshow(lidarPointCloud, MarkerSize=50); else pcshow(pointCloud([0, 0, 0])); end title('LiDAR Pointcloud') xlabel(\"X (m)\") ylabel(\"Y (m)\") zlabel(\"Z (m)\") xlim([0 10]) ylim([-10 10]) zlim([-10 10]) drawnow","title":"LiDAR sensor data"},{"location":"matlab/#gpu-lidar-sensor-data","text":"% Example plots GPU lidar pointcloud with its RGB segmentation colors gpuLidarSensorName = \"gpulidar\"; enableLabels = true; [gpuLidarPointCloud, gpuLidarTimestamp, gpuLidarSensorPose] = airSimClient.getGPULidarData(gpuLidarSensorName); figure; if ~isempty(gpuLidarPointCloud) pcshow(gpuLidarPointCloud, MarkerSize=50); else pcshow(pointCloud([0, 0, 0])); end title('GPU-Accelerated LiDAR Pointcloud') xlabel(\"X (m)\") ylabel(\"Y (m)\") zlabel(\"Z (m)\") xlim([0 10]) ylim([-10 10]) zlim([-10 10]) drawnow","title":"GPU LiDAR sensor data"},{"location":"matlab/#cameras","text":"%% Get camera info cameraSensorName = \"frontcamera\"; [intrinsics, cameraSensorPose] = airSimClient.getCameraInfo(cameraSensorName); %% Get single camera images % Get images sequentially cameraSensorName = \"front_center\"; [rgbImage, rgbCameraTimestamp] = airSimClient.getCameraImage(cameraSensorName, AirSimCameraTypes.Scene); [segmentationImage, segmentationCameraTimestamp] = airSimClient.getCameraImage(cameraSensorName, AirSimCameraTypes.Segmentation); [depthImage, depthCameraTimestamp] = airSimClient.getCameraImage(cameraSensorName, AirSimCameraTypes.DepthPlanar); [annotationImage, annotationCameraTimestamp] = airSimClient.getCameraImage(cameraSensorName, AirSimCameraTypes.Annotation, \"TextureTestDirect\"); figure; subplot(4, 1, 1); imshow(rgbImage) title(\"RGB Camera Image\") subplot(4, 1, 2); imshow(segmentationImage) title(\"Segmentation Camera Image\") subplot(4, 1, 3); imshow(depthImage ./ max(max(depthImage)).* 255, gray) title(\"Depth Camera Image\") subplot(4, 1, 4); imshow(annotationImage) title(\"Annotation Camera Image\") %% Get synced camera images % By combining the image requests they will be synced % and taken in the same frame cameraSensorName = \"front_center\"; [images, cameraTimestamp] = airSimClient.getCameraImages(cameraSensorName, ... [AirSimCameraTypes.Scene, AirSimCameraTypes.Segmentation, AirSimCameraTypes.DepthPlanar, AirSimCameraTypes.Annotation], ... [\"\", \"\", \"\", \"GreyscaleTest\"]); figure; subplot(4, 1, 1); imshow(images{1}) title(\"Synced RGB Camera Image\") subplot(4, 1, 2); imshow(images{2}) title(\"Synced Segmentation Camera Image\") subplot(4, 1, 3); imshow(images{3} ./ max(max(images{3})).* 255, gray) title(\"Synced Depth Camera Image\") subplot(4, 1, 4); imshow(images{4}) title(\"Synced Annotation Camera Image\")","title":"Cameras"},{"location":"mavlinkcom/","text":"Welcome to MavLinkCom MavLinkCom is a cross-platform C++ library that helps connect to and communicate with MavLink based vehicles. 
Specifically, this library is designed to work well with PX4 based drones. Design You can view and edit the Design.dgml diagram in Visual Studio. The following are the most important classes in this library. MavLinkNode This is the base class for all MavLinkNodes (subclasses include MavLinkVehicle, MavLinkVideoClient and MavLinkVideoServer). The node connects to your mavlink enabled vehicle via a MavLinkConnection and provides methods for sending MavLinkMessages and MavLinkCommands and for subscribing to receive messages. This base class also stores the local system id and component id your app wants to use to identify itself to your remote vehicle. You can also call startHeartbeat to send regular heartbeat messages to keep the connection alive. MavLinkMessage This is the encoded MavLinkMessage. For those who have used the mavlink.h C API, this is similar to mavlink_message_t. You do not create these manually; they are encoded from a strongly typed MavLinkMessageBase subclass. Strongly typed message and command classes The MavLinkComGenerator parses the mavlink common.xml message definitions and generates all the MavLink* MavLinkMessageBase subclasses as well as a bunch of handy mavlink enums and a bunch of strongly typed MavLinkCommand subclasses. MavLinkMessageBase This is the base class for a set of strongly typed message classes that are code generated by the MavLinkComGenerator project. This replaces the C messages defined in the mavlink C API and provides a slightly more object oriented way to send and receive messages via sendMessage on MavLinkNode. These classes have encode/decode methods that convert to and from the MavLinkMessage class. MavLinkCommand This is the base class for a set of strongly typed command classes that are code generated by the MavLinkComGenerator project. This replaces the C definitions defined in the mavlink C API and provides a more object oriented way to send commands via the sendCommand method on MavLinkNode. The MavLinkNode takes care of turning these into the underlying mavlink COMMAND_LONG message. MavLinkConnection This class provides static helper methods for creating connections to remote MavLink nodes, over serial ports, as well as UDP, or TCP sockets. This class provides a way to subscribe to receive messages from that node in a pub/sub way so you can have multiple subscribers on the same connection. MavLinkVehicle uses this to track various messages that define the overall vehicle state. MavLinkVehicle MavLinkVehicle is a MavLinkNode that tracks various messages that define the overall vehicle state and provides a VehicleState struct containing a snapshot of that state, including home position, current orientation, local position, global position, and so on. This class also provides a bunch of helper methods that wrap commonly used commands providing simple method calls to do things like arm, disarm, takeoff, land, go to a local coordinate, and fly under offboard control either by position or velocity control. MavLinkTcpServer This helper class provides a way to set up a \"server\" that accepts MavLinkConnections from remote nodes. You can use this class to get a connection that you can then give to MavLinkVideoServer to serve images over MavLink. MavLinkFtpClient This helper class takes a given MavLinkConnection and provides FTP client support for the MAVLINK_MSG_ID_FILE_TRANSFER_PROTOCOL for vehicles that support the FTP capability. This class provides simple methods to list directory contents and to get and put files. 
MavLinkVideoClient This helper class takes a given MavLinkConnection and provides helper methods for requesting video from a remote node and packaging up the MAVLINK_MSG_ID_DATA_TRANSMISSION_HANDSHAKE and MAVLINK_MSG_ID_ENCAPSULATED_DATA messages into simple-to-use MavLinkVideoFrames. MavLinkVideoServer This helper class takes a given MavLinkConnection and provides the server side of the MavLinkVideoClient protocol, including helper methods for notifying when there is a video request to process (hasVideoRequest) and a method to send video frames (sendFrame) which will generate the right MAVLINK_MSG_ID_DATA_TRANSMISSION_HANDSHAKE and MAVLINK_MSG_ID_ENCAPSULATED_DATA sequence. Examples The following code from the UnitTest project shows how to connect to a Pixhawk flight controller over USB serial port, then wait for the first heartbeat message to be received: auto connection = MavLinkConnection::connectSerial(\"drone\", \"/dev/ttyACM0\", 115200, \"sh /etc/init.d/rc.usb\\n\"); MavLinkHeartbeat heartbeat; if (!waitForHeartbeat(10000, heartbeat)) { throw std::runtime_error(\"Received no heartbeat from PX4 after 10 seconds\"); } The following code connects to the serial port, and then forwards all messages to and from QGroundControl to that drone using another connection that is joined to the drone stream. auto droneConnection = MavLinkConnection::connectSerial(\"drone\", \"/dev/ttyACM0\", 115200, \"sh /etc/init.d/rc.usb\\n\"); auto proxyConnection = MavLinkConnection::connectRemoteUdp(\"qgc\", \"127.0.0.1\", \"127.0.0.1\", 14550); droneConnection->join(proxyConnection); The following code then takes that connection and turns on heartbeats and starts tracking vehicle information using local system id 166 and component id 1. auto vehicle = std::make_shared<MavLinkVehicle>(166, 1); vehicle->connect(connection); vehicle->startHeartbeat(); std::this_thread::sleep_for(std::chrono::seconds(5)); VehicleState state = vehicle->getVehicleState(); printf(\"Home position is %s, %f,%f,%f\\n\", state.home.is_set ? \"set\" : \"not set\", state.home.global_pos.lat, state.home.global_pos.lon, state.home.global_pos.alt); The following code uses the vehicle object to arm the drone and take off and wait for the takeoff altitude to be reached: bool rc = false; if (!vehicle->armDisarm(true).wait(3000, &rc) || !rc) { printf(\"arm command failed\\n\"); return; } if (!vehicle->takeoff(targetAlt).wait(3000, &rc) || !rc) { printf(\"takeoff command failed\\n\"); return; } int version = vehicle->getVehicleStateVersion(); while (true) { int newVersion = vehicle->getVehicleStateVersion(); if (version != newVersion) { VehicleState state = vehicle->getVehicleState(); float alt = state.local_est.pos.z; if (alt >= targetAlt - delta && alt <= targetAlt + delta) { reached = true; printf(\"Target altitude reached\\n\"); break; } } else { std::this_thread::sleep_for(std::chrono::milliseconds(10)); } } The following code uses offboard control to make the drone fly in a circle with the camera pointed at the center. Here we use the subscribe method to check each new local position message so that we can compute the new velocity vector as soon as that new position is received. We request a high rate for those messages using setMessageInterval to ensure a smooth circular orbit. 
vehicle->setMessageInterval((int)MavLinkMessageIds::MAVLINK_MSG_ID_LOCAL_POSITION_NED, 30); vehicle->requestControl(); int subscription = vehicle->getConnection()->subscribe( [&](std::shared_ptr<MavLinkConnection> connection, const MavLinkMessage& m) { if (m.msgid == (int)MavLinkMessageIds::MAVLINK_MSG_ID_LOCAL_POSITION_NED) { // convert generic msg to strongly typed message. MavLinkLocalPositionNed localPos; localPos.decode(m); float x = localPos.x; float y = localPos.y; float dx = x - cx; float dy = y - cy; float angle = atan2(dy, dx); if (angle < 0) angle += M_PI * 2; float tangent = angle + M_PI_2; double newvx = orbitSpeed * cos(tangent); double newvy = orbitSpeed * sin(tangent); float heading = angle + M_PI; vehicle->moveByLocalVelocityWithAltHold(newvx, newvy, altitude, true, heading); } }); The following code stops flying the drone in offboard mode and tells the drone to loiter at its current location. This version of the code shows how to use the AsyncResult without blocking on a wait call. vehicle->releaseControl(); vehicle->loiter().then([=](bool rc) { printf(\"loiter command %s\\n\", rc ? \"succeeded\" : \"failed\"); }); The following code gets all configurable parameters from the drone and prints them: auto list = vehicle->getParamList(); auto end = list.end(); int count = 0; for (auto iter = list.begin(); iter < end; iter++) { count++; MavLinkParameter p = *iter; if (p.type == MAV_PARAM_TYPE_REAL32 || p.type == MAV_PARAM_TYPE_REAL64) { printf(\"%s=%f\\n\", p.name.c_str(), p.value); } else { printf(\"%s=%d\\n\", p.name.c_str(), static_cast<int>(p.value)); } } The following code sets a parameter on the Pixhawk to disable the USB safety check (this is handy if you are controlling the Pixhawk over USB using another onboard computer that is part of the drone itself). You should NOT do this if you are connecting your PC or laptop to the drone over USB. MavLinkParameter p; p.name = \"CBRK_USB_CHK\"; p.value = 197848; if (!vehicle->setParameter(p).wait(3000,&rc) || !rc) { printf(\"Setting the CBRK_USB_CHK failed\"); } MavLinkVehicle actually has a helper method for this called allowFlightControlOverUsb, so now you know how it is implemented :-) Advanced Connections You can wire up different configurations of mavlink pipelines using the MavLinkConnection class \"join\" method as shown below. Example 1: we connect to PX4 over serial, and proxy those messages through to QGroundControl and the LogViewer who are listening on remote ports. Example 2: simulation can talk to jMavSim and jMavSim connects to PX4. jMavSim can also manage multiple connections, so it can talk to the Unreal simulator. Another MavLinkConnection can be joined to proxy connections that jMavSim doesn't support, like the LogViewer or a remote camera node. Example 3: we use MavLinkConnection to connect to PX4 over serial, then join additional connections for all our remote nodes including jMavSim. Example 4: We can also do distributed systems to control the drone remotely:","title":"MavLinkCom"},{"location":"mavlinkcom/#welcome-to-mavlinkcom","text":"MavLinkCom is a cross-platform C++ library that helps connect to and communicate with MavLink based vehicles. Specifically, this library is designed to work well with PX4 based drones.","title":"Welcome to MavLinkCom"},{"location":"mavlinkcom/#design","text":"You can view and edit the Design.dgml diagram in Visual Studio. 
The following are the most important classes in this library.","title":"Design"},{"location":"mavlinkcom/#mavlinknode","text":"This is the base class for all MavLinkNodes (subclasses include MavLinkVehicle, MavLinkVideoClient and MavLinkVideoServer). The node connects to your mavlink enabled vehicle via a MavLinkConnection and provides methods for sending MavLinkMessages and MavLinkCommands and for subscribing to receive messages. This base class also stores the local system id and component id your app wants to use to identify itself to your remote vehicle. You can also call startHeartbeat to send regular heartbeat messages to keep the connection alive.","title":"MavLinkNode"},{"location":"mavlinkcom/#mavlinkmessage","text":"This is the encoded MavLinkMessage. For those who have used the mavlink.h C API, this is similar to mavlink_message_t. You do not create these manually; they are encoded from a strongly typed MavLinkMessageBase subclass.","title":"MavLinkMessage"},{"location":"mavlinkcom/#strongly-typed-message-and-command-classes","text":"The MavLinkComGenerator parses the mavlink common.xml message definitions and generates all the MavLink* MavLinkMessageBase subclasses as well as a bunch of handy mavlink enums and a bunch of strongly typed MavLinkCommand subclasses.","title":"Strongly typed message and command classes"},{"location":"mavlinkcom/#mavlinkmessagebase","text":"This is the base class for a set of strongly typed message classes that are code generated by the MavLinkComGenerator project. This replaces the C messages defined in the mavlink C API and provides a slightly more object oriented way to send and receive messages via sendMessage on MavLinkNode. These classes have encode/decode methods that convert to and from the MavLinkMessage class.","title":"MavLinkMessageBase"},{"location":"mavlinkcom/#mavlinkcommand","text":"This is the base class for a set of strongly typed command classes that are code generated by the MavLinkComGenerator project. This replaces the C definitions defined in the mavlink C API and provides a more object oriented way to send commands via the sendCommand method on MavLinkNode. The MavLinkNode takes care of turning these into the underlying mavlink COMMAND_LONG message.","title":"MavLinkCommand"},{"location":"mavlinkcom/#mavlinkconnection","text":"This class provides static helper methods for creating connections to remote MavLink nodes, over serial ports, as well as UDP, or TCP sockets. This class provides a way to subscribe to receive messages from that node in a pub/sub way so you can have multiple subscribers on the same connection. MavLinkVehicle uses this to track various messages that define the overall vehicle state.","title":"MavLinkConnection"},{"location":"mavlinkcom/#mavlinkvehicle","text":"MavLinkVehicle is a MavLinkNode that tracks various messages that define the overall vehicle state and provides a VehicleState struct containing a snapshot of that state, including home position, current orientation, local position, global position, and so on. This class also provides a bunch of helper methods that wrap commonly used commands providing simple method calls to do things like arm, disarm, takeoff, land, go to a local coordinate, and fly under offboard control either by position or velocity control.","title":"MavLinkVehicle"},{"location":"mavlinkcom/#mavlinktcpserver","text":"This helper class provides a way to set up a \"server\" that accepts MavLinkConnections from remote nodes. 
You can use this class to get a connection that you can then give to MavLinkVideoServer to serve images over MavLink.","title":"MavLinkTcpServer"},{"location":"mavlinkcom/#mavlinkftpclient","text":"This helper class takes a given MavLinkConnection and provides FTP client support for the MAVLINK_MSG_ID_FILE_TRANSFER_PROTOCOL for vehicles that support the FTP capability. This class provides simple methods to list directory contents and to get and put files.","title":"MavLinkFtpClient"},{"location":"mavlinkcom/#mavlinkvideoclient","text":"This helper class takes a given MavLinkConnection and provides helper methods for requesting video from a remote node and packaging up the MAVLINK_MSG_ID_DATA_TRANSMISSION_HANDSHAKE and MAVLINK_MSG_ID_ENCAPSULATED_DATA messages into simple-to-use MavLinkVideoFrames.","title":"MavLinkVideoClient"},{"location":"mavlinkcom/#mavlinkvideoserver","text":"This helper class takes a given MavLinkConnection and provides the server side of the MavLinkVideoClient protocol, including helper methods for notifying when there is a video request to process (hasVideoRequest) and a method to send video frames (sendFrame) which will generate the right MAVLINK_MSG_ID_DATA_TRANSMISSION_HANDSHAKE and MAVLINK_MSG_ID_ENCAPSULATED_DATA sequence.","title":"MavLinkVideoServer"},{"location":"mavlinkcom/#examples","text":"The following code from the UnitTest project shows how to connect to a Pixhawk flight controller over USB serial port, then wait for the first heartbeat message to be received: auto connection = MavLinkConnection::connectSerial(\"drone\", \"/dev/ttyACM0\", 115200, \"sh /etc/init.d/rc.usb\\n\"); MavLinkHeartbeat heartbeat; if (!waitForHeartbeat(10000, heartbeat)) { throw std::runtime_error(\"Received no heartbeat from PX4 after 10 seconds\"); } The following code connects to the serial port, and then forwards all messages to and from QGroundControl to that drone using another connection that is joined to the drone stream. auto droneConnection = MavLinkConnection::connectSerial(\"drone\", \"/dev/ttyACM0\", 115200, \"sh /etc/init.d/rc.usb\\n\"); auto proxyConnection = MavLinkConnection::connectRemoteUdp(\"qgc\", \"127.0.0.1\", \"127.0.0.1\", 14550); droneConnection->join(proxyConnection); The following code then takes that connection and turns on heartbeats and starts tracking vehicle information using local system id 166 and component id 1. auto vehicle = std::make_shared<MavLinkVehicle>(166, 1); vehicle->connect(connection); vehicle->startHeartbeat(); std::this_thread::sleep_for(std::chrono::seconds(5)); VehicleState state = vehicle->getVehicleState(); printf(\"Home position is %s, %f,%f,%f\\n\", state.home.is_set ? 
\"set\" : \"not set\", state.home.global_pos.lat, state.home.global_pos.lon, state.home.global_pos.alt); The following code uses the vehicle object to arm the drone and take off and wait for the takeoff altitude to be reached: bool rc = false; if (!vehicle->armDisarm(true).wait(3000, &rc) || !rc) { printf(\"arm command failed\\n\"); return; } if (!vehicle->takeoff(targetAlt).wait(3000, &rc) || !rc) { printf(\"takeoff command failed\\n\"); return; } int version = vehicle->getVehicleStateVersion(); while (true) { int newVersion = vehicle->getVehicleStateVersion(); if (version != newVersion) { VehicleState state = vehicle->getVehicleState(); float alt = state.local_est.pos.z; if (alt >= targetAlt - delta && alt <= targetAlt + delta) { reached = true; printf(\"Target altitude reached\\n\"); break; } } else { std::this_thread::sleep_for(std::chrono::milliseconds(10)); } } The following code uses offboard control to make the drone fly in a circle with camera pointed at the center. Here we use the subscribe method to check each new local position message to indicate so we can compute the new velocity vector as soon as that new position is received. We request a high rate for those messages using setMessageInterval to ensure smooth circular orbit. vehicle->setMessageInterval((int)MavLinkMessageIds::MAVLINK_MSG_ID_LOCAL_POSITION_NED, 30); vehicle->requestControl(); int subscription = vehicle->getConnection()->subscribe( [&](std::shared_ptr connection, const MavLinkMessage& m) { if (m.msgid == (int)MavLinkMessageIds::MAVLINK_MSG_ID_LOCAL_POSITION_NED) { // convert generic msg to strongly typed message. MavLinkLocalPositionNed localPos; localPos.decode(msg); float x = localPos.x; float y = localPos.y; float dx = x - cx; float dy = y - cy; float angle = atan2(dy, dx); if (angle < 0) angle += M_PI * 2; float tangent = angle + M_PI_2; double newvx = orbitSpeed * cos(tangent); double newvy = orbitSpeed * sin(tangent); float heading = angle + M_PI; vehicle->moveByLocalVelocityWithAltHold(newvx, newvy, altitude, true, heading); } }); The following code stops flying the drone in offboard mode and tells the drone to loiter at its current location. This version of the code shows how to use the AsyncResult without blocking on a wait call. vehicle->releaseControl(); if (vehicle->loiter().then([=](bool rc) { printf(\"loiter command %s\\n\", rc ? \"succeeded\" : \"failed\"); } The following code gets all configurable parameters from the drone and prints them: auto list = vehicle->getParamList(); auto end = list.end(); int count = 0; for (auto iter = list.begin(); iter < end; iter++) { count++; MavLinkParameter p = *iter; if (p.type == MAV_PARAM_TYPE_REAL32 || p.type == MAV_PARAM_TYPE_REAL64) { printf(\"%s=%f\\n\", p.name.c_str(), p.value); } else { printf(\"%s=%d\\n\", p.name.c_str(), static_cast(p.value)); } } The following code sets a parameter on the Pixhawk to disable the USB safety check (this is handy if you are controlling the Pixhawk over USB using another onboard computer that is part of the drone itself). You should NOT do this if you are connecting your PC or laptop to the drone over USB. 
MavLinkParameter p; p.name = \"CBRK_USB_CHK\"; p.value = 197848; if (!vehicle->setParameter(p).wait(3000,&rc) || !rc) { printf(\"Setting the CBRK_USB_CHK failed\"); } MavLinkVehicle actually has a helper method for this called allowFlightControlOverUsb, so now you know how it is implemented :-)","title":"Examples"},{"location":"mavlinkcom/#advanced-connections","text":"You can wire up different configurations of mavlink pipelines using the MavLinkConnection class \"join\" method as shown below. Example 1, we connect to PX4 over serial, and proxy those messages through to QGroundControl and the LogViewer who are listening on remote ports. Example 2: simulation can talk to jMavSim and jMavSim connects to PX4. jMavSim can also manage multiple connections, so it can talk to unreal simulator. Another MavLinkConnection can be joined to proxy connections that jMavSim doesn't support, like the LogViewer or a remote camera node. Example 3: we use MavLinkConnection to connect to PX4 over serial, then join additional connections for all our remote nodes including jMavSim. Example 4: We can also do distributed systems to control the drone remotely:","title":"Advanced Connections"},{"location":"mavlinkcom_mocap/","text":"Welcome to MavLinkMoCap This folder contains the MavLinkMoCap library which connects to a OptiTrack camera system for accurate indoor location. Dependencies: OptiTrack Motive . MavLinkCom . Setup RigidBody First you need to define a RigidBody named 'Quadrocopter' using Motive. See Rigid_Body_Tracking . MavLinkTest Use MavLinkTest to talk to your PX4 drone, with \"-server:addr:port\", for example, when connected to drone wifi use: MavLinkMoCap -server:10.42.0.228:14590 \"-project:D:\\OptiTrack\\Motive Project 2016-12-19 04.09.42 PM.ttp\" This publishes the ATT_POS_MOCAP messages and you can proxy those through to the PX4 by running MavLinkTest on the dronebrain using: MavLinkTest -serial:/dev/ttyACM0,115200 -proxy:10.42.0.228:14590 Now the drone will get the ATT_POS_MOCAP and you should see the light turn green meaning it is now has a home position and is ready to fly.","title":"MavLink MoCap"},{"location":"mavlinkcom_mocap/#welcome-to-mavlinkmocap","text":"This folder contains the MavLinkMoCap library which connects to a OptiTrack camera system for accurate indoor location.","title":"Welcome to MavLinkMoCap"},{"location":"mavlinkcom_mocap/#dependencies","text":"OptiTrack Motive . MavLinkCom .","title":"Dependencies:"},{"location":"mavlinkcom_mocap/#setup-rigidbody","text":"First you need to define a RigidBody named 'Quadrocopter' using Motive. See Rigid_Body_Tracking .","title":"Setup RigidBody"},{"location":"mavlinkcom_mocap/#mavlinktest","text":"Use MavLinkTest to talk to your PX4 drone, with \"-server:addr:port\", for example, when connected to drone wifi use: MavLinkMoCap -server:10.42.0.228:14590 \"-project:D:\\OptiTrack\\Motive Project 2016-12-19 04.09.42 PM.ttp\" This publishes the ATT_POS_MOCAP messages and you can proxy those through to the PX4 by running MavLinkTest on the dronebrain using: MavLinkTest -serial:/dev/ttyACM0,115200 -proxy:10.42.0.228:14590 Now the drone will get the ATT_POS_MOCAP and you should see the light turn green meaning it is now has a home position and is ready to fly.","title":"MavLinkTest"},{"location":"meshes/","text":"How to Access Meshes in AIRSIM Cosys-AirSim supports the ability to access the static meshes that make up the scene. Mesh structure Each mesh is represented with the below struct. 
struct MeshPositionVertexBuffersResponse { Vector3r position; Quaternionr orientation; std::vector<float> vertices; std::vector<uint32_t> indices; std::string name; }; The position and orientation are in the Unreal coordinate system. The mesh itself is a triangular mesh represented by the vertices and the indices. The triangular mesh type is typically called a Face-Vertex Mesh. This means every triplet of indices holds the indexes of the vertices that make up the triangle/face. The x,y,z coordinates of the vertices are all stored in a single vector. This means the vertices vector is Nx3 where N is the number of vertices. The positions of the vertices are the global positions in the Unreal coordinate system. This means they have already been transformed by the position and orientation. How to use The API to get the meshes in the scene is quite simple. However, one should note that the function call is very expensive and should very rarely be called. In general this is ok because this function only accesses the static meshes which for most applications are not changing during the duration of your program. Note that you will have to use a 3rdparty library or your own custom code to actually interact with the received meshes. Below I utilize the Python bindings of libigl to visualize the received meshes. import cosysairsim as airsim AIRSIM_HOST_IP='127.0.0.1' client = airsim.VehicleClient(ip=AIRSIM_HOST_IP) client.confirmConnection() # List of returned meshes are received via this function meshes=client.simGetMeshPositionVertexBuffers() index=0 for m in meshes: # Finds one of the cube meshes in the Blocks environment if 'cube' in m.name: # Code from here on relies on libigl. Libigl uses pybind11 to wrap C++ code. So here the built pyigl.so # library is in the same directory as this example code. # This is here as code for your own mesh library should require something similar from pyigl import * from iglhelpers import * # Convert the lists to numpy arrays vertex_list=np.array(m.vertices,dtype=np.float32) indices=np.array(m.indices,dtype=np.uint32) num_vertices=int(len(vertex_list)/3) num_indices=len(indices) # Libigl requires the shape to be Nx3 where N is number of vertices or indices # It also requires the actual type to be double(float64) for vertices and int64 for the triangles/indices vertices_reshaped=vertex_list.reshape((num_vertices,3)) indices_reshaped=indices.reshape((int(num_indices/3),3)) vertices_reshaped=vertices_reshaped.astype(np.float64) indices_reshaped=indices_reshaped.astype(np.int64) # Libigl function to convert to internal Eigen format v_eig=p2e(vertices_reshaped) i_eig=p2e(indices_reshaped) # View the mesh viewer = igl.glfw.Viewer() viewer.data().set_mesh(v_eig,i_eig) viewer.launch() break","title":"Mesh Vertex Buffers"},{"location":"meshes/#how-to-access-meshes-in-airsim","text":"Cosys-AirSim supports the ability to access the static meshes that make up the scene.","title":"How to Access Meshes in AIRSIM"},{"location":"meshes/#mesh-structure","text":"Each mesh is represented with the below struct. struct MeshPositionVertexBuffersResponse { Vector3r position; Quaternionr orientation; std::vector<float> vertices; std::vector<uint32_t> indices; std::string name; }; The position and orientation are in the Unreal coordinate system. The mesh itself is a triangular mesh represented by the vertices and the indices. The triangular mesh type is typically called a Face-Vertex Mesh. This means every triplet of indices holds the indexes of the vertices that make up the triangle/face. 
The x,y,z coordinates of the vertices are all stored in a single vector. This means the vertices vector is Nx3 where N is the number of vertices. The positions of the vertices are the global positions in the Unreal coordinate system. This means they have already been transformed by the position and orientation.","title":"Mesh structure"},{"location":"meshes/#how-to-use","text":"The API to get the meshes in the scene is quite simple. However, one should note that the function call is very expensive and should very rarely be called. In general this is ok because this function only accesses the static meshes which for most applications are not changing during the duration of your program. Note that you will have to use a 3rdparty library or your own custom code to actually interact with the received meshes. Below I utilize the Python bindings of libigl to visualize the received meshes. import cosysairsim as airsim AIRSIM_HOST_IP='127.0.0.1' client = airsim.VehicleClient(ip=AIRSIM_HOST_IP) client.confirmConnection() # List of returned meshes are received via this function meshes=client.simGetMeshPositionVertexBuffers() index=0 for m in meshes: # Finds one of the cube meshes in the Blocks environment if 'cube' in m.name: # Code from here on relies on libigl. Libigl uses pybind11 to wrap C++ code. So here the built pyigl.so # library is in the same directory as this example code. # This is here as code for your own mesh library should require something similar from pyigl import * from iglhelpers import * # Convert the lists to numpy arrays vertex_list=np.array(m.vertices,dtype=np.float32) indices=np.array(m.indices,dtype=np.uint32) num_vertices=int(len(vertex_list)/3) num_indices=len(indices) # Libigl requires the shape to be Nx3 where N is number of vertices or indices # It also requires the actual type to be double(float64) for vertices and int64 for the triangles/indices vertices_reshaped=vertex_list.reshape((num_vertices,3)) indices_reshaped=indices.reshape((int(num_indices/3),3)) vertices_reshaped=vertices_reshaped.astype(np.float64) indices_reshaped=indices_reshaped.astype(np.int64) # Libigl function to convert to internal Eigen format v_eig=p2e(vertices_reshaped) i_eig=p2e(indices_reshaped) # View the mesh viewer = igl.glfw.Viewer() viewer.data().set_mesh(v_eig,i_eig) viewer.launch() break","title":"How to use"},{"location":"modify_recording_data/","text":"Modifying Recording Data Cosys-AirSim has a Recording feature to easily collect data and images. The Recording APIs also allow starting and stopping the recording using the API. However, the data recorded by default might not be sufficient for your use cases, and it might be preferable to record additional data such as IMU, GPS sensors, Rotor speed for copters, etc. You can use the existing Python and C++ APIs to get the information and store it as required, especially for Lidar. Another option, for adding small fields such as GPS or internal data such as the Unreal position, is to modify the recording methods inside Cosys-AirSim. This page describes the specific methods which you might need to change. The recorded data is written to an airsim_rec.txt file in a tab-separated format, with images in an images/ folder. The entire folder is by default present in the Documents folder (or specified in settings) with the timestamp of when the recording started in %Y-%M-%D-%H-%M-%S format. 
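Because airsim_rec.txt is a plain tab-separated file with a header row, it can be loaded directly with standard tools. A minimal sketch, assuming pandas is installed (the timestamped folder name below is a hypothetical example of the format described above):
import os
import pandas as pd
# Hypothetical recording folder; substitute your own timestamped folder
record_dir = os.path.expanduser(\"~/Documents/AirSim/2024-01-01-12-00-00\")
df = pd.read_csv(os.path.join(record_dir, \"airsim_rec.txt\"), sep=\"\\t\")
print(df.columns.tolist())  # the per-vehicle fields listed below
print(df[[\"TimeStamp\", \"POS_X\", \"POS_Y\", \"POS_Z\"]].head())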
Car vehicle records the following fields - VehicleName TimeStamp POS_X POS_Y POS_Z Q_W Q_X Q_Y Q_Z Throttle Steering Brake Gear Handbrake RPM Speed ImageFile For Multirotor - VehicleName TimeStamp POS_X POS_Y POS_Z Q_W Q_X Q_Y Q_Z ImageFile Code Changes Note that this requires building and using Cosys-AirSim from source. You can compile a binary yourself after modifying if needed. The primary method which fills the data to be stored is PawnSimApi::getRecordFileLine ; it's the base method for all the vehicles, and Car overrides it to log additional data, as can be seen in CarPawnSimApi::getRecordFileLine . To record additional data for multirotor, you can add a similar method in MultirotorPawnSimApi.cpp/h files which overrides the base class implementation and appends other data. The currently logged data can also be modified and removed as needed. E.g. recording GPS, IMU and Barometer data also for multirotor - // MultirotorPawnSimApi.cpp std::string MultirotorPawnSimApi::getRecordFileLine(bool is_header_line) const { std::string common_line = PawnSimApi::getRecordFileLine(is_header_line); if (is_header_line) { return common_line + \"Latitude\\tLongitude\\tAltitude\\tPressure\\tAccX\\tAccY\\tAccZ\\t\"; } const auto& state = vehicle_api_->getMultirotorState(); const auto& bar_data = vehicle_api_->getBarometerData(\"\"); const auto& imu_data = vehicle_api_->getImuData(\"\"); std::ostringstream ss; ss << common_line; ss << state.gps_location.latitude << \"\\t\" << state.gps_location.longitude << \"\\t\" << state.gps_location.altitude << \"\\t\"; ss << bar_data.pressure << \"\\t\"; ss << imu_data.linear_acceleration.x() << \"\\t\" << imu_data.linear_acceleration.y() << \"\\t\" << imu_data.linear_acceleration.z() << \"\\t\"; return ss.str(); } // MultirotorPawnSimApi.h virtual std::string getRecordFileLine(bool is_header_line) const override;","title":"Modifying Recording Data"},{"location":"modify_recording_data/#modifying-recording-data","text":"Cosys-AirSim has a Recording feature to easily collect data and images. The Recording APIs also allow starting and stopping the recording using the API. However, the data recorded by default might not be sufficient for your use cases, and it might be preferable to record additional data such as IMU, GPS sensors, Rotor speed for copters, etc. You can use the existing Python and C++ APIs to get the information and store it as required, especially for Lidar. Another option, for adding small fields such as GPS or internal data such as the Unreal position, is to modify the recording methods inside Cosys-AirSim. This page describes the specific methods which you might need to change. The recorded data is written to an airsim_rec.txt file in a tab-separated format, with images in an images/ folder. The entire folder is by default present in the Documents folder (or specified in settings) with the timestamp of when the recording started in %Y-%M-%D-%H-%M-%S format. Car vehicle records the following fields - VehicleName TimeStamp POS_X POS_Y POS_Z Q_W Q_X Q_Y Q_Z Throttle Steering Brake Gear Handbrake RPM Speed ImageFile For Multirotor - VehicleName TimeStamp POS_X POS_Y POS_Z Q_W Q_X Q_Y Q_Z ImageFile","title":"Modifying Recording Data"},{"location":"modify_recording_data/#code-changes","text":"Note that this requires building and using Cosys-AirSim from source. You can compile a binary yourself after modifying if needed. 
The primary method which fills the data to be stored is PawnSimApi::getRecordFileLine ; it's the base method for all the vehicles, and Car overrides it to log additional data, as can be seen in CarPawnSimApi::getRecordFileLine . To record additional data for multirotor, you can add a similar method in MultirotorPawnSimApi.cpp/h files which overrides the base class implementation and appends other data. The currently logged data can also be modified and removed as needed. E.g. recording GPS, IMU and Barometer data also for multirotor - // MultirotorPawnSimApi.cpp std::string MultirotorPawnSimApi::getRecordFileLine(bool is_header_line) const { std::string common_line = PawnSimApi::getRecordFileLine(is_header_line); if (is_header_line) { return common_line + \"Latitude\\tLongitude\\tAltitude\\tPressure\\tAccX\\tAccY\\tAccZ\\t\"; } const auto& state = vehicle_api_->getMultirotorState(); const auto& bar_data = vehicle_api_->getBarometerData(\"\"); const auto& imu_data = vehicle_api_->getImuData(\"\"); std::ostringstream ss; ss << common_line; ss << state.gps_location.latitude << \"\\t\" << state.gps_location.longitude << \"\\t\" << state.gps_location.altitude << \"\\t\"; ss << bar_data.pressure << \"\\t\"; ss << imu_data.linear_acceleration.x() << \"\\t\" << imu_data.linear_acceleration.y() << \"\\t\" << imu_data.linear_acceleration.z() << \"\\t\"; return ss.str(); } // MultirotorPawnSimApi.h virtual std::string getRecordFileLine(bool is_header_line) const override;","title":"Code Changes"},{"location":"multi_vehicle/","text":"Multiple Vehicles in AirSim Since release 1.2, AirSim is fully enabled for multiple vehicles. This capability allows you to create multiple vehicles easily and use APIs to control them. Creating Multiple Vehicles It's as easy as specifying them in settings.json . The Vehicles element allows you to specify a list of vehicles you want to create along with their initial positions and orientations. The positions are specified in NED coordinates in SI units with the origin set at the Player Start component in the Unreal environment. The orientation is specified as Yaw, Pitch and Roll in degrees. Creating Multiple Cars { \"SettingsVersion\": 2.0, \"SimMode\": \"Car\", \"Vehicles\": { \"Car1\": { \"VehicleType\": \"PhysXCar\", \"X\": 4, \"Y\": 0, \"Z\": -2 }, \"Car2\": { \"VehicleType\": \"PhysXCar\", \"X\": -4, \"Y\": 0, \"Z\": -2, \"Yaw\": 90 } } } Creating Multiple Drones { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"Vehicles\": { \"Drone1\": { \"VehicleType\": \"SimpleFlight\", \"X\": 4, \"Y\": 0, \"Z\": -2, \"Yaw\": -180 }, \"Drone2\": { \"VehicleType\": \"SimpleFlight\", \"X\": 8, \"Y\": 0, \"Z\": -2 } } } Using APIs for Multiple Vehicles The new APIs since AirSim 1.2 allow you to specify vehicle_name . This name corresponds to keys in json settings (for example, Car1 or Drone2 above). Example code for cars Example code for multirotors Using APIs for multi-vehicles requires specifying the vehicle_name , which needs to be hardcoded in the script or requires parsing of the settings file. There's also a simple API listVehicles() which returns a list (vector in C++) of strings containing names of the current vehicles. For example, with the above settings for 2 Cars - >>> client.listVehicles() ['Car1', 'Car2'] Demo Creating vehicles at runtime through API In the latest main branch of AirSim, the simAddVehicle API can be used to create vehicles at runtime. This is useful to create many such vehicles without needing to specify them in the settings. 
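As a quick illustration, here is a minimal sketch of spawning an extra drone at runtime and listing the vehicles (the vehicle name and pose are hypothetical; the full argument list is described below):
import cosysairsim as airsim
client = airsim.MultirotorClient()
client.confirmConnection()
# Hypothetical unique name and pose for the new vehicle
pose = airsim.Pose(airsim.Vector3r(0, 2, -2), airsim.Quaternionr())
if client.simAddVehicle(\"Drone3\", \"simpleflight\", pose, pawn_path=\"\"):
    print(client.listVehicles())  # the spawned vehicle shows up here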
There are some limitations of this currently, described below - simAddVehicle takes in the following arguments: vehicle_name : Name of the vehicle to be created; this should be unique for each vehicle including any existing ones defined in the settings.json vehicle_type : Type of vehicle, e.g. \"simpleflight\". Currently only SimpleFlight, PhysXCar, ComputerVision are supported, in their respective SimModes. Other vehicle types including PX4 and ArduPilot-related aren't supported pose : Initial pose of the vehicle pawn_path : Vehicle blueprint path, default empty which uses the default blueprint for the vehicle type Returns: bool Whether vehicle was created The usual APIs can be used to control and interact with the vehicle once created, with the vehicle_name parameter. Specifying other settings such as additional cameras, etc. isn't possible currently; a future enhancement could be passing a JSON string of settings for the vehicle. It also works with the listVehicles() API described above, so the vehicles spawned would be included in the list. For some examples, check out HelloSpawnedDrones.cpp - And runtime_car.py -","title":"Multiple Vehicles"},{"location":"multi_vehicle/#multiple-vehicles-in-airsim","text":"Since release 1.2, AirSim is fully enabled for multiple vehicles. This capability allows you to create multiple vehicles easily and use APIs to control them.","title":"Multiple Vehicles in AirSim"},{"location":"multi_vehicle/#creating-multiple-vehicles","text":"It's as easy as specifying them in settings.json . The Vehicles element allows you to specify a list of vehicles you want to create along with their initial positions and orientations. The positions are specified in NED coordinates in SI units with the origin set at the Player Start component in the Unreal environment. The orientation is specified as Yaw, Pitch and Roll in degrees.","title":"Creating Multiple Vehicles"},{"location":"multi_vehicle/#creating-multiple-cars","text":"{ \"SettingsVersion\": 2.0, \"SimMode\": \"Car\", \"Vehicles\": { \"Car1\": { \"VehicleType\": \"PhysXCar\", \"X\": 4, \"Y\": 0, \"Z\": -2 }, \"Car2\": { \"VehicleType\": \"PhysXCar\", \"X\": -4, \"Y\": 0, \"Z\": -2, \"Yaw\": 90 } } }","title":"Creating Multiple Cars"},{"location":"multi_vehicle/#creating-multiple-drones","text":"{ \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"Vehicles\": { \"Drone1\": { \"VehicleType\": \"SimpleFlight\", \"X\": 4, \"Y\": 0, \"Z\": -2, \"Yaw\": -180 }, \"Drone2\": { \"VehicleType\": \"SimpleFlight\", \"X\": 8, \"Y\": 0, \"Z\": -2 } } }","title":"Creating Multiple Drones"},{"location":"multi_vehicle/#using-apis-for-multiple-vehicles","text":"The new APIs since AirSim 1.2 allow you to specify vehicle_name . This name corresponds to keys in json settings (for example, Car1 or Drone2 above). Example code for cars Example code for multirotors Using APIs for multi-vehicles requires specifying the vehicle_name , which needs to be hardcoded in the script or requires parsing of the settings file. There's also a simple API listVehicles() which returns a list (vector in C++) of strings containing names of the current vehicles. For example, with the above settings for 2 Cars - >>> client.listVehicles() ['Car1', 'Car2']","title":"Using APIs for Multiple Vehicles"},{"location":"multi_vehicle/#demo","text":"","title":"Demo"},{"location":"multi_vehicle/#creating-vehicles-at-runtime-through-api","text":"In the latest main branch of AirSim, the simAddVehicle API can be used to create vehicles at runtime. 
This is useful to create many such vehicles without needing to specify them in the settings. There are some limitations of this currently, described below - simAddVehicle takes in the following arguments: vehicle_name : Name of the vehicle to be created; this should be unique for each vehicle including any existing ones defined in the settings.json vehicle_type : Type of vehicle, e.g. \"simpleflight\". Currently only SimpleFlight, PhysXCar, ComputerVision are supported, in their respective SimModes. Other vehicle types including PX4 and ArduPilot-related aren't supported pose : Initial pose of the vehicle pawn_path : Vehicle blueprint path, default empty which uses the default blueprint for the vehicle type Returns: bool Whether vehicle was created The usual APIs can be used to control and interact with the vehicle once created, with the vehicle_name parameter. Specifying other settings such as additional cameras, etc. isn't possible currently; a future enhancement could be passing a JSON string of settings for the vehicle. It also works with the listVehicles() API described above, so the vehicles spawned would be included in the list. For some examples, check out HelloSpawnedDrones.cpp - And runtime_car.py -","title":"Creating vehicles at runtime through API"},{"location":"object_detection/","text":"Object Detection About This feature lets you generate object detection using existing cameras in Cosys-AirSim, similar to a detection DNN. Using the API you can control which object to detect by name and radius from the camera. One can control these settings for each camera, image type and vehicle combination separately. API Set mesh name to detect in wildcard format simAddDetectionFilterMeshName(camera_name, image_type, mesh_name, vehicle_name = '') Clear all mesh names previously added simClearDetectionMeshNames(camera_name, image_type, vehicle_name = '') Set detection radius in cm simSetDetectionFilterRadius(camera_name, image_type, radius_cm, vehicle_name = '') Get detections simGetDetections(camera_name, image_type, vehicle_name = '') Note that if using an Annotation camera one has to also give the annotation_name argument to choose the right annotation camera. For example: simGetDetections(camera_name, image_type, vehicle_name = '', annotation_name=\"mygreyscaleannotation\") The return value of simGetDetections is a DetectionInfo array: DetectionInfo name = '' geo_point = GeoPoint() box2D = Box2D() box3D = Box3D() relative_pose = Pose() Usage example Python script detection.py shows how to set detection parameters and shows the result in an OpenCV capture.
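The full detection.py is not reproduced here, but its core loop looks roughly like the sketch below (a sketch only, not the script verbatim: the camera name \"0\", the Cylinder filter and the window handling are illustrative assumptions).

```
import cv2
import numpy as np
import airsim

client = airsim.MultirotorClient()
client.confirmConnection()

# Detect Cylinder meshes within 80 m of camera "0" (radius is given in cm)
client.simSetDetectionFilterRadius("0", airsim.ImageType.Scene, 80 * 100)
client.simAddDetectionFilterMeshName("0", airsim.ImageType.Scene, "Cylinder_*")

while True:
    # Grab a compressed Scene image and decode it for display
    raw = client.simGetImage("0", airsim.ImageType.Scene)
    frame = cv2.imdecode(np.frombuffer(raw, np.uint8), cv2.IMREAD_UNCHANGED)
    # Draw the 2D box and name of every current detection
    for det in client.simGetDetections("0", airsim.ImageType.Scene):
        p1 = (int(det.box2D.min.x_val), int(det.box2D.min.y_val))
        p2 = (int(det.box2D.max.x_val), int(det.box2D.max.y_val))
        cv2.rectangle(frame, p1, p2, (0, 255, 0), 2)
        cv2.putText(frame, det.name, p1, cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0))
    cv2.imshow("detections", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
```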
A minimal example using the API with the Blocks environment to detect Cylinder objects: camera_name = \"0\" image_type = airsim.ImageType.Scene client = airsim.MultirotorClient() client.confirmConnection() client.simSetDetectionFilterRadius(camera_name, image_type, 80 * 100) # in [cm] client.simAddDetectionFilterMeshName(camera_name, image_type, \"Cylinder_*\") detections = client.simGetDetections(camera_name, image_type) client.simClearDetectionMeshNames(camera_name, image_type) Output result: Cylinder: { 'box2D': { 'max': { 'x_val': 617.025634765625, 'y_val': 583.5487060546875}, 'min': { 'x_val': 485.74359130859375, 'y_val': 438.33465576171875}}, 'box3D': { 'max': { 'x_val': 4.900000095367432, 'y_val': 0.7999999523162842, 'z_val': 0.5199999809265137}, 'min': { 'x_val': 3.8999998569488525, 'y_val': -0.19999998807907104, 'z_val': 1.5199999809265137}}, 'geo_point': { 'altitude': 16.979999542236328, 'latitude': 32.28772183970703, 'longitude': 34.864785008379876}, 'name': 'Cylinder9_2', 'relative_pose': { 'orientation': { 'w_val': 0.9929741621017456, 'x_val': 0.0038591264747083187, 'y_val': -0.11333247274160385, 'z_val': 0.03381215035915375}, 'position': { 'x_val': 4.400000095367432, 'y_val': 0.29999998211860657, 'z_val': 1.0199999809265137}}}","title":"Object Detection"},{"location":"object_detection/#object-detection","text":"","title":"Object Detection"},{"location":"object_detection/#about","text":"This feature lets you generate object detection using existing cameras in Cosys-AirSim, similar to a detection DNN. Using the API you can control which object to detect by name and radius from the camera. One can control these settings for each camera, image type and vehicle combination separately.","title":"About"},{"location":"object_detection/#api","text":"Set mesh name to detect in wildcard format simAddDetectionFilterMeshName(camera_name, image_type, mesh_name, vehicle_name = '') Clear all mesh names previously added simClearDetectionMeshNames(camera_name, image_type, vehicle_name = '') Set detection radius in cm simSetDetectionFilterRadius(camera_name, image_type, radius_cm, vehicle_name = '') Get detections simGetDetections(camera_name, image_type, vehicle_name = '') Note that if using an Annotation camera one has to also give the annotation_name argument to choose the right annotation camera. For example: simGetDetections(camera_name, image_type, vehicle_name = '', annotation_name=\"mygreyscaleannotation\") The return value of simGetDetections is a DetectionInfo array: DetectionInfo name = '' geo_point = GeoPoint() box2D = Box2D() box3D = Box3D() relative_pose = Pose()","title":"API"},{"location":"object_detection/#usage-example","text":"Python script detection.py shows how to set detection parameters and shows the result in an OpenCV capture.
A minimal example using the API with the Blocks environment to detect Cylinder objects: camera_name = \"0\" image_type = airsim.ImageType.Scene client = airsim.MultirotorClient() client.confirmConnection() client.simSetDetectionFilterRadius(camera_name, image_type, 80 * 100) # in [cm] client.simAddDetectionFilterMeshName(camera_name, image_type, \"Cylinder_*\") detections = client.simGetDetections(camera_name, image_type) client.simClearDetectionMeshNames(camera_name, image_type) Output result: Cylinder: { 'box2D': { 'max': { 'x_val': 617.025634765625, 'y_val': 583.5487060546875}, 'min': { 'x_val': 485.74359130859375, 'y_val': 438.33465576171875}}, 'box3D': { 'max': { 'x_val': 4.900000095367432, 'y_val': 0.7999999523162842, 'z_val': 0.5199999809265137}, 'min': { 'x_val': 3.8999998569488525, 'y_val': -0.19999998807907104, 'z_val': 1.5199999809265137}}, 'geo_point': { 'altitude': 16.979999542236328, 'latitude': 32.28772183970703, 'longitude': 34.864785008379876}, 'name': 'Cylinder9_2', 'relative_pose': { 'orientation': { 'w_val': 0.9929741621017456, 'x_val': 0.0038591264747083187, 'y_val': -0.11333247274160385, 'z_val': 0.03381215035915375}, 'position': { 'x_val': 4.400000095367432, 'y_val': 0.29999998211860657, 'z_val': 1.0199999809265137}}}","title":"Usage example"},{"location":"pfm/","text":"pfm Format The pfm (or Portable FloatMap) image format stores images as floating-point pixels and hence is not restricted to the usual 0-255 pixel value range. This is useful for HDR images or images that describe something other than color, such as depth. A good viewer for this file format is PfmPad . We don't recommend the Maverick photo viewer because it doesn't seem to show depth images properly. AirSim has code to write pfm files for C++ and to read as well as write them for Python .","title":"pfm format"},{"location":"pfm/#pfm-format","text":"The pfm (or Portable FloatMap) image format stores images as floating-point pixels and hence is not restricted to the usual 0-255 pixel value range. This is useful for HDR images or images that describe something other than color, such as depth. A good viewer for this file format is PfmPad . We don't recommend the Maverick photo viewer because it doesn't seem to show depth images properly. AirSim has code to write pfm files for C++ and to read as well as write them for Python .","title":"pfm Format"},{"location":"playback/","text":"Playback AirSim supports playing back the high level commands in a *.mavlink log file that were recorded using the MavLinkTest app for the purpose of comparing real and simulated flight. Example command line: MavLinkTest -serial:/dev/ttyACM0,115200 -logdir:. The log file then contains the commands performed, which included several \"orbit\" commands; the resulting GPS map of the flight looks like this: Side-by-side comparison Now we can copy the *.mavlink log file recorded by MavLinkTest to the PC running the Unreal simulator with the AirSim plugin. When the Simulator is running and the drone is parked in a place in a map that has room to do the same maneuvers, we can run this MavLinkTest command line: MavLinkTest -server:127.0.0.1:14550 This should connect to the simulator. Now you can enter this command: PlayLog recording.mavlink The same commands you performed on the real drone will now play again in the simulator. You can then press 't' to see the trace, and it will show you the trace of the real drone and the simulated drone.
Every time you press 't' again you can reset the lines so they are synced to the current position; this way I was able to capture a side-by-side trace of the \"orbit\" command performed in this recording, which generates the picture below. The pink line is the simulated flight and the red line is the real flight: Note: I'm using the ';' key in the simulator to take control of the camera position using the keyboard to get this shot. Parameters It may help to set the simulator up with some of the same flight parameters that your real drone is using. For example, in my case I was using a lower-than-normal cruise speed and a slow takeoff speed, and it helps to tell the simulator to wait a long time before disarming (COM_DISARM_LAND) and to turn off the safety switches NAV_RCL_ACT and NAV_DLL_ACT (don't do that on a real drone). param MPC_XY_CRUISE 2 param MPC_XY_VEL_MAX 2 param MPC_TKO_SPEED 1 param COM_DISARM_LAND 60 param NAV_RCL_ACT 0 param NAV_DLL_ACT 0","title":"Playing Logs"},{"location":"playback/#playback","text":"AirSim supports playing back the high level commands in a *.mavlink log file that were recorded using the MavLinkTest app for the purpose of comparing real and simulated flight. Example command line: MavLinkTest -serial:/dev/ttyACM0,115200 -logdir:. The log file then contains the commands performed, which included several \"orbit\" commands; the resulting GPS map of the flight looks like this:","title":"Playback"},{"location":"playback/#side-by-side-comparison","text":"Now we can copy the *.mavlink log file recorded by MavLinkTest to the PC running the Unreal simulator with the AirSim plugin. When the Simulator is running and the drone is parked in a place in a map that has room to do the same maneuvers, we can run this MavLinkTest command line: MavLinkTest -server:127.0.0.1:14550 This should connect to the simulator. Now you can enter this command: PlayLog recording.mavlink The same commands you performed on the real drone will now play again in the simulator. You can then press 't' to see the trace, and it will show you the trace of the real drone and the simulated drone. Every time you press 't' again you can reset the lines so they are synced to the current position; this way I was able to capture a side-by-side trace of the \"orbit\" command performed in this recording, which generates the picture below. The pink line is the simulated flight and the red line is the real flight: Note: I'm using the ';' key in the simulator to take control of the camera position using the keyboard to get this shot.","title":"Side-by-side comparison"},{"location":"playback/#parameters","text":"It may help to set the simulator up with some of the same flight parameters that your real drone is using. For example, in my case I was using a lower-than-normal cruise speed and a slow takeoff speed, and it helps to tell the simulator to wait a long time before disarming (COM_DISARM_LAND) and to turn off the safety switches NAV_RCL_ACT and NAV_DLL_ACT (don't do that on a real drone). param MPC_XY_CRUISE 2 param MPC_XY_VEL_MAX 2 param MPC_TKO_SPEED 1 param COM_DISARM_LAND 60 param NAV_RCL_ACT 0 param NAV_DLL_ACT 0","title":"Parameters"},{"location":"px4_build/","text":"Building PX4 Source code Getting the PX4 source code is easy: sudo apt-get install git git clone https://github.com/PX4/PX4-Autopilot.git --recursive bash ./PX4-Autopilot/Tools/setup/ubuntu.sh --no-sim-tools cd PX4-Autopilot Now to build it you will need the right tools.
PX4 Build tools The full instructions are available on the dev.px4.io website, but we've copied the relevant subset of those instructions here for your convenience. (Note that BashOnWindows can be used to build the PX4 firmware; just follow the BashOnWindows instructions at the bottom of this page, then proceed with the Ubuntu setup for PX4.) Build SITL version Now you can make the SITL version that runs in POSIX, from the Firmware folder you created above: make px4_sitl_default none_iris Note: this build system is quite special; it knows how to update git submodules (and there are a lot of them), then it runs cmake (if necessary), then it runs the build itself. So in a way the root Makefile is a meta-meta makefile :-) You might see prompts like this: ******************************************************************************* * IF YOU DID NOT CHANGE THIS FILE (OR YOU DON'T KNOW WHAT A SUBMODULE IS): * * Hit 'u' and ENTER to update ALL submodules and resolve this. * * (performs git submodule sync --recursive * * and git submodule update --init --recursive ) * ******************************************************************************* Every time you see this prompt type 'u' on your keyboard. It shouldn't take long, about 2 minutes. If all succeeds, the last line will link the px4 app, which you can then run using the following: make px4_sitl_default none_iris And you should see output that looks like this: creating new parameters file creating new dataman file ______ __ __ ___ | ___ \ \ \ / / / | | |_/ / \ V / / /| | | __/ / \ / /_| | | | / /^\ \ \___ | \_| \/ \/ |_/ px4 starting. 18446744073709551615 WARNING: setRealtimeSched failed (not run as root?) ERROR [param] importing from 'rootfs/eeprom/parameters' failed (-1) Command 'param' failed, returned 1 SYS_AUTOSTART: curr: 0 -> new: 4010 SYS_MC_EST_GROUP: curr: 2 -> new: 1 INFO [dataman] Unkown restart, data manager file 'rootfs/fs/microsd/dataman' size is 11797680 bytes BAT_N_CELLS: curr: 0 -> new: 3 CAL_GYRO0_ID: curr: 0 -> new: 2293768 CAL_ACC0_ID: curr: 0 -> new: 1376264 CAL_ACC1_ID: curr: 0 -> new: 1310728 CAL_MAG0_ID: curr: 0 -> new: 196616 So this is good: the first run sets up the PX4 parameters for SITL mode. The second run has less output. This app is also an interactive console where you can type commands. Type 'help' to see what they are and just type ctrl-C to kill it. You can do that and restart it any time; that's a great way to reset any wonky state if you need to (it's equivalent to a Pixhawk hardware reboot). ARM embedded tools If you plan to build the PX4 firmware for real Pixhawk hardware then you will need the gcc cross-compiler for the ARM Cortex-M4 chipset. You can get this compiler via the PX4 DevGuide; specifically, it is in their ubuntu_sim_nuttx.sh setup script. After following those setup instructions you can verify the install by entering the command arm-none-eabi-gcc --version . You should see the following output: arm-none-eabi-gcc (GNU Tools for Arm Embedded Processors 7-2017-q4-major) 7.2.1 20170904 (release) [ARM/embedded-7-branch revision 255204] Copyright (C) 2017 Free Software Foundation, Inc. This is free software; see the source for copying conditions. There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
Build PX4 for ARM hardware Now you can build the PX4 firmware for running on real Pixhawk hardware: make px4_fmu-v4 This build will take a little longer because it is building a lot more, including the NuttX real-time OS, all the drivers for the sensors in the Pixhawk flight controller, and more. It is also running the compiler in super size-squeezing mode so it can fit all that in a 1 megabyte ROM! One nice tidbit is that you can plug in your Pixhawk over USB and type make px4fmu-v2_default upload to flash the hardware with these brand-new bits, so you don't need to use QGroundControl for that. Some Useful Parameters PX4 has many customizable parameters (over 700 of them, in fact) and to get the best results with Cosys-AirSim we have found the following parameters are handy: // be sure to enable the new position estimator module: param set SYS_MC_EST_GROUP 2 // increase default limits on cruise speed so you can move around a large map more quickly. param MPC_XY_CRUISE 10 param MPC_XY_VEL_MAX 10 param MPC_Z_VEL_MAX_DN 2 // increase timeout for auto-disarm on landing so that any long running app doesn't have to worry about it param COM_DISARM_LAND 60 // make it possible to fly without radio control attached (do NOT do this one on a real drone) param NAV_RCL_ACT 0 // enable new syslogger to get more information from PX4 logs param set SYS_LOGGER 1 Using BashOnWindows See Bash on Windows Toolchain .","title":"Building PX4"},{"location":"px4_build/#building-px4","text":"","title":"Building PX4"},{"location":"px4_build/#source-code","text":"Getting the PX4 source code is easy: sudo apt-get install git git clone https://github.com/PX4/PX4-Autopilot.git --recursive bash ./PX4-Autopilot/Tools/setup/ubuntu.sh --no-sim-tools cd PX4-Autopilot Now to build it you will need the right tools.","title":"Source code"},{"location":"px4_build/#px4-build-tools","text":"The full instructions are available on the dev.px4.io website, but we've copied the relevant subset of those instructions here for your convenience. (Note that BashOnWindows can be used to build the PX4 firmware; just follow the BashOnWindows instructions at the bottom of this page, then proceed with the Ubuntu setup for PX4.)","title":"PX4 Build tools"},{"location":"px4_build/#build-sitl-version","text":"Now you can make the SITL version that runs in POSIX, from the Firmware folder you created above: make px4_sitl_default none_iris Note: this build system is quite special; it knows how to update git submodules (and there are a lot of them), then it runs cmake (if necessary), then it runs the build itself. So in a way the root Makefile is a meta-meta makefile :-) You might see prompts like this: ******************************************************************************* * IF YOU DID NOT CHANGE THIS FILE (OR YOU DON'T KNOW WHAT A SUBMODULE IS): * * Hit 'u' and ENTER to update ALL submodules and resolve this. * * (performs git submodule sync --recursive * * and git submodule update --init --recursive ) * ******************************************************************************* Every time you see this prompt type 'u' on your keyboard. It shouldn't take long, about 2 minutes. If all succeeds, the last line will link the px4 app, which you can then run using the following: make px4_sitl_default none_iris And you should see output that looks like this: creating new parameters file creating new dataman file ______ __ __ ___ | ___ \ \ \ / / / | | |_/ / \ V / / /| | | __/ / \ / /_| | | | / /^\ \ \___ | \_| \/ \/ |_/ px4 starting.
18446744073709551615 WARNING: setRealtimeSched failed (not run as root?) ERROR [param] importing from 'rootfs/eeprom/parameters' failed (-1) Command 'param' failed, returned 1 SYS_AUTOSTART: curr: 0 -> new: 4010 SYS_MC_EST_GROUP: curr: 2 -> new: 1 INFO [dataman] Unkown restart, data manager file 'rootfs/fs/microsd/dataman' size is 11797680 bytes BAT_N_CELLS: curr: 0 -> new: 3 CAL_GYRO0_ID: curr: 0 -> new: 2293768 CAL_ACC0_ID: curr: 0 -> new: 1376264 CAL_ACC1_ID: curr: 0 -> new: 1310728 CAL_MAG0_ID: curr: 0 -> new: 196616 So this is good: the first run sets up the PX4 parameters for SITL mode. The second run has less output. This app is also an interactive console where you can type commands. Type 'help' to see what they are and just type ctrl-C to kill it. You can do that and restart it any time; that's a great way to reset any wonky state if you need to (it's equivalent to a Pixhawk hardware reboot).","title":"Build SITL version"},{"location":"px4_build/#arm-embedded-tools","text":"If you plan to build the PX4 firmware for real Pixhawk hardware then you will need the gcc cross-compiler for the ARM Cortex-M4 chipset. You can get this compiler via the PX4 DevGuide; specifically, it is in their ubuntu_sim_nuttx.sh setup script. After following those setup instructions you can verify the install by entering the command arm-none-eabi-gcc --version . You should see the following output: arm-none-eabi-gcc (GNU Tools for Arm Embedded Processors 7-2017-q4-major) 7.2.1 20170904 (release) [ARM/embedded-7-branch revision 255204] Copyright (C) 2017 Free Software Foundation, Inc. This is free software; see the source for copying conditions. There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.","title":"ARM embedded tools"},{"location":"px4_build/#build-px4-for-arm-hardware","text":"Now you can build the PX4 firmware for running on real Pixhawk hardware: make px4_fmu-v4 This build will take a little longer because it is building a lot more, including the NuttX real-time OS, all the drivers for the sensors in the Pixhawk flight controller, and more. It is also running the compiler in super size-squeezing mode so it can fit all that in a 1 megabyte ROM! One nice tidbit is that you can plug in your Pixhawk over USB and type make px4fmu-v2_default upload to flash the hardware with these brand-new bits, so you don't need to use QGroundControl for that.","title":"Build PX4 for ARM hardware"},{"location":"px4_build/#some-useful-parameters","text":"PX4 has many customizable parameters (over 700 of them, in fact) and to get the best results with Cosys-AirSim we have found the following parameters are handy: // be sure to enable the new position estimator module: param set SYS_MC_EST_GROUP 2 // increase default limits on cruise speed so you can move around a large map more quickly. param MPC_XY_CRUISE 10 param MPC_XY_VEL_MAX 10 param MPC_Z_VEL_MAX_DN 2 // increase timeout for auto-disarm on landing so that any long running app doesn't have to worry about it param COM_DISARM_LAND 60 // make it possible to fly without radio control attached (do NOT do this one on a real drone) param NAV_RCL_ACT 0 // enable new syslogger to get more information from PX4 logs param set SYS_LOGGER 1","title":"Some Useful Parameters"},{"location":"px4_build/#using-bashonwindows","text":"See Bash on Windows Toolchain .","title":"Using BashOnWindows"},{"location":"px4_lockstep/","text":"LockStep The latest version of PX4 supports a new lockstep feature when communicating with the simulator over TCP.
Lockstep is an important feature because it synchronizes PX4 and the simulator so they essentially use the same clock time. This makes PX4 behave normally even during unusually long delays in simulator performance. It is recommended that when you are running a lockstep-enabled version of PX4 in SITL mode, you tell AirSim to use a SteppableClock , and set UseTcp to true and LockStep to true . { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"ClockType\": \"SteppableClock\", \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\", \"UseTcp\": true, \"LockStep\": true, ... This causes AirSim to not use a \"realtime\" clock, but instead it advances the clock in step with each sensor update sent to PX4. This way PX4 thinks time is progressing smoothly no matter how long it takes AirSim to really process that update loop. This has the following advantages: AirSim can be used on slow machines that cannot process updates quickly. You can debug AirSim and hit a breakpoint, and when you resume PX4 will behave normally.
You can enable very slow sensors like the Lidar with a large number of simulated points, and PX4 will still behave normally. There will be some side effects to lockstep , namely, slower update loops caused by running AirSim on an underpowered machine or from expensive sensors (like Lidar) will create some visible jerkiness in the simulated flight if you look at the updates on screen in realtime. Disabling LockStep If you are running PX4 in cygwin, there is an open issue with lockstep . PX4 is configured to use lockstep by default. To disable this feature, first disable it in PX4 : Navigate to boards/px4/sitl/ in your local PX4 repository Edit default.cmake and find the following line: set(ENABLE_LOCKSTEP_SCHEDULER yes) Change this line to: set(ENABLE_LOCKSTEP_SCHEDULER no) Disable it in AirSim by setting LockStep to false and either removing any \"ClockType\": \"SteppableClock\" setting or resetting ClockType back to default: { ... \"ClockType\": \"\", \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\", \"LockStep\": false, ... Now you can run PX4 SITL as you normally would ( make px4_sitl_default none_iris ) and it will use the host system time without waiting on AirSim.","title":"PX4 Lockstep"},{"location":"px4_lockstep/#lockstep","text":"The latest version of PX4 supports a new lockstep feature when communicating with the simulator over TCP. Lockstep is an important feature because it synchronizes PX4 and the simulator so they essentially use the same clock time. This makes PX4 behave normally even during unusually long delays in simulator performance. It is recommended that when you are running a lockstep-enabled version of PX4 in SITL mode, you tell AirSim to use a SteppableClock , and set UseTcp to true and LockStep to true . { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"ClockType\": \"SteppableClock\", \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\", \"UseTcp\": true, \"LockStep\": true, ... This causes AirSim to not use a \"realtime\" clock, but instead it advances the clock in step with each sensor update sent to PX4. This way PX4 thinks time is progressing smoothly no matter how long it takes AirSim to really process that update loop. This has the following advantages: AirSim can be used on slow machines that cannot process updates quickly. You can debug AirSim and hit a breakpoint, and when you resume PX4 will behave normally. You can enable very slow sensors like the Lidar with a large number of simulated points, and PX4 will still behave normally. There will be some side effects to lockstep , namely, slower update loops caused by running AirSim on an underpowered machine or from expensive sensors (like Lidar) will create some visible jerkiness in the simulated flight if you look at the updates on screen in realtime.","title":"LockStep"},{"location":"px4_lockstep/#disabling-lockstep","text":"If you are running PX4 in cygwin, there is an open issue with lockstep . PX4 is configured to use lockstep by default. To disable this feature, first disable it in PX4 : Navigate to boards/px4/sitl/ in your local PX4 repository Edit default.cmake and find the following line: set(ENABLE_LOCKSTEP_SCHEDULER yes) Change this line to: set(ENABLE_LOCKSTEP_SCHEDULER no) Disable it in AirSim by setting LockStep to false and either removing any \"ClockType\": \"SteppableClock\" setting or resetting ClockType back to default: { ... \"ClockType\": \"\", \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\", \"LockStep\": false, ... Now you can run PX4 SITL as you normally would ( make px4_sitl_default none_iris ) and it will use the host system time without waiting on AirSim.","title":"Disabling LockStep"},{"location":"px4_logging/","text":"PX4/MavLink Logging Thanks to Chris Lovett for developing various tools for PX4/MavLink logging mentioned on this page! Logging MavLink Messages AirSim can capture mavlink log files if you add the following to the PX4 section of your settings.json file: { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"Vehicles\": { \"PX4\": { ..., \"Logs\": \"c:/temp/mavlink\" } } } AirSim will create a timestamped log file in this folder for each \"armed/disarmed\" flight session. You will then see log files organized by date in c:\temp\mavlink, specifically input.mavlink and output.mavlink files. MavLink LogViewer For MavLink-enabled drones, you can also use our Log Viewer to visualize the streams of data. If you enable this form of realtime logging you should not use the \"Logs\" setting above; these two forms of logging are mutually exclusive. PX4 Log in SITL Mode In SITL mode, a log file is produced when the drone is armed. The SITL terminal will contain the path to the log file; it should look something like this INFO [logger] Opened log file: rootfs/fs/microsd/log/2017-03-27/20_02_49.ulg PX4 Log in HITL Mode If you are using Pixhawk hardware in HIL mode, then set the parameter SYS_LOGGER=1 using QGroundControl. PX4 will write the log file on the device, which you can download at a later date using QGroundControl. Debugging a bad flight You can use these *.mavlink log files to debug a bad flight using the LogViewer . For example, AirSim/PX4 flight may misbehave if you run it on an underpowered computer. The following shows what might happen in that situation. In this flight we ran a simple commander takeoff test as performed by PythonClient/multirotor/stability_test.py and the flight started off fine, but then went crazy at the end and the drone crashed. So why is that? What can the log file show? Here we've plotted the following 5 metrics: - hil_gps.alt - the simulated altitude sent from AirSim to PX4 - telemetry.update_rate - the rate at which AirSim is performing the critical drone update loop, in updates per second. - telemetry.update_time - the average time taken inside AirSim performing the critical drone update loop.
- telemetry.actuation_delay - this is a very interesting metric measuring how long it takes PX4 to send back the updated actuator controls message (motor power) - actuator_controls.0 - the actuator controls signal from PX4 for the first rotor. What we see then with these metrics is that things started off nicely, with a nice flat altitude, a high update rate in the 275 to 300 fps range, a nice low update time inside AirSim of around 113 microseconds, and a nice low actuation delay in the round trip from PX4. The actuator controls also stabilize quickly to a nice flat line. But then the update_time starts to climb; at the same time the actuation_delay climbs and we see a little tip in actuator_controls. This dip should not happen; the PX4 is panicking over the loss of update rate, but it recovers. But then we see the actuator controls go crazy and a huge spike in actuation delay, and around this time we see a message from AirSim saying lockstep disabled . A delay over 100 milliseconds triggers AirSim into jumping out of lockstep mode, and the PX4 goes nuts and the drone crashes. The bottom line is that if a simple takeoff cannot maintain steady smooth flight and you see these kinds of spikes and uneven update rates, then it means you are running AirSim on a computer that does not have enough horsepower. This is what a simple takeoff and hover and land should look like: Here you see the update_rate sticking to the target of 333 updates per second. You also see the update_time a nice flat 39 microseconds and the actuator_delay somewhere between 1.1 and 1.7 milliseconds, and the resulting actuator_controls a lovely flat line.","title":"PX4/MavLink Logging"},{"location":"px4_logging/#px4mavlink-logging","text":"Thanks to Chris Lovett for developing various tools for PX4/MavLink logging mentioned on this page!","title":"PX4/MavLink Logging"},{"location":"px4_logging/#logging-mavlink-messages","text":"AirSim can capture mavlink log files if you add the following to the PX4 section of your settings.json file: { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"Vehicles\": { \"PX4\": { ..., \"Logs\": \"c:/temp/mavlink\" } } } AirSim will create a timestamped log file in this folder for each \"armed/disarmed\" flight session. You will then see log files organized by date in c:\temp\mavlink, specifically input.mavlink and output.mavlink files.","title":"Logging MavLink Messages"},{"location":"px4_logging/#mavlink-logviewer","text":"For MavLink-enabled drones, you can also use our Log Viewer to visualize the streams of data. If you enable this form of realtime logging you should not use the \"Logs\" setting above; these two forms of logging are mutually exclusive.","title":"MavLink LogViewer"},{"location":"px4_logging/#px4-log-in-sitl-mode","text":"In SITL mode, a log file is produced when the drone is armed. The SITL terminal will contain the path to the log file; it should look something like this INFO [logger] Opened log file: rootfs/fs/microsd/log/2017-03-27/20_02_49.ulg","title":"PX4 Log in SITL Mode"},{"location":"px4_logging/#px4-log-in-hitl-mode","text":"If you are using Pixhawk hardware in HIL mode, then set the parameter SYS_LOGGER=1 using QGroundControl. PX4 will write the log file on the device, which you can download at a later date using QGroundControl.","title":"PX4 Log in HITL Mode"},{"location":"px4_logging/#debugging-a-bad-flight","text":"You can use these *.mavlink log files to debug a bad flight using the LogViewer . For example, AirSim/PX4 flight may misbehave if you run it on an underpowered computer.
The following shows what might happen in that situation. In this flight we ran a simple commander takeoff test as performed by PythonClient/multirotor/stability_test.py and the flight started off fine, but then went crazy at the end and the drone crashed. So why is that? What can the log file show? Here we've plotted the following 5 metrics: - hil_gps.alt - the simulated altitude sent from AirSim to PX4 - telemetry.update_rate - the rate at which AirSim is performing the critical drone update loop, in updates per second. - telemetry.update_time - the average time taken inside AirSim performing the critical drone update loop. - telemetry.actuation_delay - this is a very interesting metric measuring how long it takes PX4 to send back the updated actuator controls message (motor power) - actuator_controls.0 - the actuator controls signal from PX4 for the first rotor. What we see then with these metrics is that things started off nicely, with a nice flat altitude, a high update rate in the 275 to 300 fps range, a nice low update time inside AirSim of around 113 microseconds, and a nice low actuation delay in the round trip from PX4. The actuator controls also stabilize quickly to a nice flat line. But then the update_time starts to climb; at the same time the actuation_delay climbs and we see a little tip in actuator_controls. This dip should not happen; the PX4 is panicking over the loss of update rate, but it recovers. But then we see the actuator controls go crazy and a huge spike in actuation delay, and around this time we see a message from AirSim saying lockstep disabled . A delay over 100 milliseconds triggers AirSim into jumping out of lockstep mode, and the PX4 goes nuts and the drone crashes. The bottom line is that if a simple takeoff cannot maintain steady smooth flight and you see these kinds of spikes and uneven update rates, then it means you are running AirSim on a computer that does not have enough horsepower. This is what a simple takeoff and hover and land should look like: Here you see the update_rate sticking to the target of 333 updates per second. You also see the update_time a nice flat 39 microseconds and the actuator_delay somewhere between 1.1 and 1.7 milliseconds, and the resulting actuator_controls a lovely flat line.","title":"Debugging a bad flight"},{"location":"px4_multi_vehicle/","text":"Setting up multi-vehicle PX4 simulation The PX4 SITL stack comes with a sitl_multiple_run.sh shell script that runs multiple instances of the PX4 binary. This would allow the SITL stack to listen to connections from multiple Cosys-AirSim vehicles on multiple TCP ports starting from 4560. However, the provided script does not let us view the PX4 console. If you want to run the instances manually while being able to view each instance's console ( Recommended ), see this section . Setting up multiple instances of PX4 Software-in-Loop Note: you have to build PX4 with make px4_sitl_default none_iris as shown here before trying to run multiple PX4 instances.
From your bash (or Cygwin) terminal go to the PX4 Firmware directory and run the sitl_multiple_run.sh script while specifying the number of vehicles you need cd PX4-Autopilot ./Tools/sitl_multiple_run.sh 2 # 2 here is the number of vehicles/instances This starts multiple instances that listen to TCP ports 4560 to 4560+i where 'i' is the number of vehicles/instances specified. You should get a confirmation message that says that old instances have been stopped and new instances have been started killing running instances starting instance 0 in /cygdrive/c/PX4/home/PX4/Firmware/build/px4_sitl_default/instance_0 starting instance 1 in /cygdrive/c/PX4/home/PX4/Firmware/build/px4_sitl_default/instance_1 Now edit the AirSim settings file to make sure you have matching TCP port settings for the set number of vehicles and to make sure that both vehicles do not spawn on the same point. For example, these settings would spawn two PX4Multirotors where one of them would try to connect to PX4 SITL at port 4560 and the other at port 4561 . It also makes sure the vehicles spawn at 0,1,0 and 0,-1,0 to avoid collision: json { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"Vehicles\": { \"Drone1\": { \"VehicleType\": \"PX4Multirotor\", \"UseSerial\": false, \"UseTcp\": true, \"TcpPort\": 4560, \"ControlPortLocal\": 14540, \"ControlPortRemote\": 14580, \"X\": 0, \"Y\": 1, \"Z\": 0 }, \"Drone2\": { \"VehicleType\": \"PX4Multirotor\", \"UseSerial\": false, \"UseTcp\": true, \"TcpPort\": 4561, \"ControlPortLocal\": 14541, \"ControlPortRemote\": 14581, \"X\": 0, \"Y\": -1, \"Z\": 0 } } } You can add more than two vehicles but you will need to make sure you adjust the TCP port for each (i.e. vehicle 3's port would be 4562, and so on) and adjust the spawn point. Now run your Unreal Cosys-AirSim environment and it should connect to SITL PX4 via TCP. If you are running the instances with the PX4 console visible , you should see a bunch of messages from each SITL PX4 window. Specifically, the following messages tell you that Cosys-AirSim is connected properly and GPS fusion is stable: INFO [simulator] Simulator connected on UDP port 14560 INFO [mavlink] partner IP: 127.0.0.1 INFO [ecl/EKF] EKF GPS checks passed (WGS-84 origin set) INFO [ecl/EKF] EKF commencing GPS fusion If you do not see these messages then check your port settings. You should also be able to use QGroundControl with SITL mode. Make sure there is no Pixhawk hardware plugged in, otherwise QGroundControl will choose to use that instead. Note that as we don't have a physical board, an RC cannot be connected directly to it. So the alternatives are to either use an XBox 360 Controller or to connect your RC using USB (for example, in the case of the FrSky Taranis X9D Plus) or using a trainer USB cable to your PC. This makes your RC look like a joystick. You will need to do extra setup in QGroundControl to use the virtual joystick for RC control. You do not need to do this unless you plan to fly a drone manually in Cosys-AirSim. Autonomous flight using the Python API does not require RC; see No Remote Control . Starting SITL instances with PX4 console If you want to start your SITL instances while being able to view the PX4 console, you will need to run the shell scripts found here rather than sitl_multiple_run.sh . Here is how you would do so: Note: this script also assumes PX4 is built with make px4_sitl_default none_iris as shown here before trying to run multiple PX4 instances.
From your bash (or Cygwin) terminal go to the PX4 directory and get the scripts (place them in a subdirectory called Scripts within the PX4 directory as shown) cd PX4 mkdir -p Scripts cd Scripts wget https://github.com/Cosys-Lab/Cosys-AirSim/raw/main/PX4Scripts/sitl_kill.sh wget https://github.com/Cosys-Lab/Cosys-AirSim/raw/main/PX4Scripts/run_airsim_sitl.sh Note the shell scripts expect the Scripts and Firmware directories to be within the same parent directory. Also, you may need to make the scripts executable by running chmod +x sitl_kill.sh and chmod +x run_airsim_sitl.sh . Run the sitl_kill.sh script to kill all active PX4 SITL instances ./sitl_kill.sh Run the run_airsim_sitl.sh script while specifying which instance you would like to run in the current terminal window (the first instance would be numbered 0) ./run_airsim_sitl.sh 0 # first instance = 0 You should see the PX4 instance starting and waiting for Cosys-AirSim's connection as it would with a single instance. ``` _ _ ___ | ___ \ \ \ / / / | | |_/ / \ V / / /| | | / / \ / / | | | | / /^\ \ ___ | _| \/ \/ | / px4 starting. INFO [px4] Calling startup script: /bin/sh /cygdrive/c/PX4/home/PX4/Firmware/etc/init.d-posix/rcS 0 INFO [dataman] Unknown restart, data manager file './dataman' size is 11798680 bytes INFO [simulator] Waiting for simulator to connect on TCP port 4560 ``` 4. Open a new terminal and go to the Scripts directory and start the next instance cd PX4 cd Scripts ./run_airsim_sitl.sh 1 # ,2,3,4,..,etc Repeat step 4 for as many instances as you would like to start. Run your Unreal Cosys-AirSim environment and it should connect to SITL PX4 via TCP (assuming your settings.json file has the right ports).","title":"PX4 Multi-vehicle in SITL"},{"location":"px4_multi_vehicle/#setting-up-multi-vehicle-px4-simulation","text":"The PX4 SITL stack comes with a sitl_multiple_run.sh shell script that runs multiple instances of the PX4 binary. This would allow the SITL stack to listen to connections from multiple Cosys-AirSim vehicles on multiple TCP ports starting from 4560. However, the provided script does not let us view the PX4 console. If you want to run the instances manually while being able to view each instance's console ( Recommended ), see this section .","title":"Setting up multi-vehicle PX4 simulation"},{"location":"px4_multi_vehicle/#setting-up-multiple-instances-of-px4-software-in-loop","text":"Note: you have to build PX4 with make px4_sitl_default none_iris as shown here before trying to run multiple PX4 instances. From your bash (or Cygwin) terminal go to the PX4 Firmware directory and run the sitl_multiple_run.sh script while specifying the number of vehicles you need cd PX4-Autopilot ./Tools/sitl_multiple_run.sh 2 # 2 here is the number of vehicles/instances This starts multiple instances that listen to TCP ports 4560 to 4560+i where 'i' is the number of vehicles/instances specified. You should get a confirmation message that says that old instances have been stopped and new instances have been started killing running instances starting instance 0 in /cygdrive/c/PX4/home/PX4/Firmware/build/px4_sitl_default/instance_0 starting instance 1 in /cygdrive/c/PX4/home/PX4/Firmware/build/px4_sitl_default/instance_1 Now edit the AirSim settings file to make sure you have matching TCP port settings for the set number of vehicles and to make sure that both vehicles do not spawn on the same point.
For example, these settings would spawn two PX4Multirotors where one of them would try to connect to PX4 SITL at port 4560 and the other at port 4561 . It also makes sure the vehicles spawn at 0,1,0 and 0,-1,0 to avoid collision: json { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"Vehicles\": { \"Drone1\": { \"VehicleType\": \"PX4Multirotor\", \"UseSerial\": false, \"UseTcp\": true, \"TcpPort\": 4560, \"ControlPortLocal\": 14540, \"ControlPortRemote\": 14580, \"X\": 0, \"Y\": 1, \"Z\": 0 }, \"Drone2\": { \"VehicleType\": \"PX4Multirotor\", \"UseSerial\": false, \"UseTcp\": true, \"TcpPort\": 4561, \"ControlPortLocal\": 14541, \"ControlPortRemote\": 14581, \"X\": 0, \"Y\": -1, \"Z\": 0 } } } You can add more than two vehicles but you will need to make sure you adjust the TCP port for each (i.e. vehicle 3's port would be 4562, and so on) and adjust the spawn point. Now run your Unreal Cosys-AirSim environment and it should connect to SITL PX4 via TCP. If you are running the instances with the PX4 console visible , you should see a bunch of messages from each SITL PX4 window. Specifically, the following messages tell you that Cosys-AirSim is connected properly and GPS fusion is stable: INFO [simulator] Simulator connected on UDP port 14560 INFO [mavlink] partner IP: 127.0.0.1 INFO [ecl/EKF] EKF GPS checks passed (WGS-84 origin set) INFO [ecl/EKF] EKF commencing GPS fusion If you do not see these messages then check your port settings. You should also be able to use QGroundControl with SITL mode. Make sure there is no Pixhawk hardware plugged in, otherwise QGroundControl will choose to use that instead. Note that as we don't have a physical board, an RC cannot be connected directly to it. So the alternatives are to either use an XBox 360 Controller or to connect your RC using USB (for example, in the case of the FrSky Taranis X9D Plus) or using a trainer USB cable to your PC. This makes your RC look like a joystick. You will need to do extra setup in QGroundControl to use the virtual joystick for RC control. You do not need to do this unless you plan to fly a drone manually in Cosys-AirSim. Autonomous flight using the Python API does not require RC; see No Remote Control .","title":"Setting up multiple instances of PX4 Software-in-Loop"},{"location":"px4_multi_vehicle/#starting-sitl-instances-with-px4-console","text":"If you want to start your SITL instances while being able to view the PX4 console, you will need to run the shell scripts found here rather than sitl_multiple_run.sh . Here is how you would do so: Note: this script also assumes PX4 is built with make px4_sitl_default none_iris as shown here before trying to run multiple PX4 instances. From your bash (or Cygwin) terminal go to the PX4 directory and get the scripts (place them in a subdirectory called Scripts within the PX4 directory as shown) cd PX4 mkdir -p Scripts cd Scripts wget https://github.com/Cosys-Lab/Cosys-AirSim/raw/main/PX4Scripts/sitl_kill.sh wget https://github.com/Cosys-Lab/Cosys-AirSim/raw/main/PX4Scripts/run_airsim_sitl.sh Note the shell scripts expect the Scripts and Firmware directories to be within the same parent directory. Also, you may need to make the scripts executable by running chmod +x sitl_kill.sh and chmod +x run_airsim_sitl.sh .
Run the sitl_kill.sh script to kill all active PX4 SITL instances ./sitl_kill.sh Run the run_airsim_sitl.sh script while specifying which instance you would like to run in the current terminal window (the first instance would be numbered 0) ./run_airsim_sitl.sh 0 # first instance = 0 You should see the PX4 instance starting and waiting for Cosys-AirSim's connection as it would with a single instance. ``` _ _ ___ | ___ \ \ \ / / / | | |_/ / \ V / / /| | | / / \ / / | | | | / /^\ \ ___ | _| \/ \/ | / px4 starting. INFO [px4] Calling startup script: /bin/sh /cygdrive/c/PX4/home/PX4/Firmware/etc/init.d-posix/rcS 0 INFO [dataman] Unknown restart, data manager file './dataman' size is 11798680 bytes INFO [simulator] Waiting for simulator to connect on TCP port 4560 ``` 4. Open a new terminal and go to the Scripts directory and start the next instance cd PX4 cd Scripts ./run_airsim_sitl.sh 1 # ,2,3,4,..,etc Repeat step 4 for as many instances as you would like to start. Run your Unreal Cosys-AirSim environment and it should connect to SITL PX4 via TCP (assuming your settings.json file has the right ports).","title":"Starting SITL instances with PX4 console"},{"location":"px4_setup/","text":"PX4 Setup for AirSim The PX4 software stack is a very popular open source flight controller with support for a wide variety of boards and sensors as well as built-in capability for higher-level tasks such as mission planning. Please visit px4.io for more information. Warning : While all releases of AirSim are always tested with PX4 to ensure support, setting up PX4 is not a trivial task. Unless you have at least an intermediate level of experience with the PX4 stack, we recommend you use simple_flight , which is now the default in AirSim. Supported Hardware The following Pixhawk hardware has been tested with AirSim: Pixhawk PX4 2.4.8 PixFalcon PixRacer Pixhawk 2.1 Pixhawk 4 mini from Holybro Pixhawk 4 from Holybro Version 1.11.2 of the PX4 firmware also works on the Pixhawk 4 devices. Setting up PX4 Hardware-in-Loop For this you will need one of the supported devices listed above. For manual flight you will also need RC + transmitter. Make sure your RC receiver is bound with its RC transmitter. Connect the RC transmitter to the flight controller's RC port. Refer to your RC manual and PX4 docs for more information. Download QGroundControl , launch it and connect your flight controller to the USB port. Use QGroundControl to flash the latest PX4 Flight Stack. See also initial firmware setup video . In QGroundControl, configure your Pixhawk for HIL simulation by selecting the HIL Quadrocopter X airframe. After PX4 reboots, check that \"HIL Quadrocopter X\" is indeed selected. In QGroundControl, go to the Radio tab and calibrate (make sure the remote control is on and the receiver is showing the indicator for the binding). Go to the Flight Mode tab and choose one of the remote control switches as \"Mode Channel\". Then set (for example) Stabilized and Attitude flight modes for two positions of the switch. Go to the Tuning section of QGroundControl and set appropriate values. For example, for Fly Sky's FS-TH9X remote control, the following settings give a more realistic feel: Hover Throttle = mid+1 mark, Roll and pitch sensitivity = mid-3 mark, Altitude and position control sensitivity = mid-2 mark.
In the AirSim settings file, specify PX4 for your vehicle config like this: { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"ClockType\": \"SteppableClock\", \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\", \"UseSerial\": true, \"LockStep\": true, \"Sensors\":{ \"Barometer\":{ \"SensorType\": 1, \"Enabled\": true, \"PressureFactorSigma\": 0.0001825 } }, \"Parameters\": { \"NAV_RCL_ACT\": 0, \"NAV_DLL_ACT\": 0, \"COM_OBL_ACT\": 1, \"LPE_LAT\": 47.641468, \"LPE_LON\": -122.140165 } } } } Notice we are enabling LockStep ; see PX4 LockStep for more information. The Barometer setting keeps PX4 happy because the default AirSim barometer has a bit too much noise generation. This setting clamps that down a bit, which allows PX4 to achieve GPS lock more quickly. After the above setup you should be able to use a remote control (RC) to fly with AirSim. You can usually arm the vehicle by lowering and bringing the two sticks of the RC together down and inwards. You don't need QGroundControl after the initial setup. Typically the Stabilized (instead of Manual) mode gives a better experience for beginners. See PX4 Basic Flying Guide . You can also control the drone from Python APIs . See Walkthrough Demo Video and Unreal AirSim Setup Video that shows you all the setup steps in this document. Setting up PX4 Software-in-Loop The PX4 SITL mode doesn't require you to have a separate device such as a Pixhawk or Pixracer. This is in fact the recommended way to use PX4 with simulators by the PX4 team. However, this is indeed harder to set up. Please see this dedicated page for setting up PX4 in SITL mode. FAQ Drone doesn't fly properly, it just goes \"crazy\". There are a few reasons that can cause this. First, make sure your drone doesn't fall a large distance when starting the simulator. This might happen if you have created a custom Unreal environment and Player Start is placed too high above the ground. It seems that when this happens the internal calibration in PX4 gets confused. You should also use QGroundControl and make sure you can arm and take off in QGroundControl properly. Can I use Arducopter or other MavLink implementations? Our code is tested with the PX4 firmware . We have not tested Arducopter or other mavlink implementations. Some of the flight APIs do use the PX4 custom modes in the MAV_CMD_DO_SET_MODE messages (like PX4_CUSTOM_MAIN_MODE_AUTO) It is not finding my Pixhawk hardware Check your settings.json file for this line \"SerialPort\":\"*,115200\". The asterisk here means \"find any serial port that looks like a Pixhawk device\", but this doesn't always work for all types of Pixhawk hardware. So on Windows you can find the actual COM port using Device Manager, look under \"Ports (COM & LPT)\", plug the device in and see what new COM port shows up. Let's say you see a new port named \"USB Serial Port (COM5)\". Well, then change the SerialPort setting to this: \"SerialPort\":\"COM5,115200\". On Linux, the device can be found by running \"ls /dev/serial/by-id\"; if you see a device name listed that looks like this: usb-3D_Robotics_PX4_FMU_v2.x_0-if00 then you can use that name to connect, like this: \"SerialPort\":\"/dev/serial/by-id/usb-3D_Robotics_PX4_FMU_v2.x_0-if00\". Note this long name is actually a symbolic link to the real name; if you use \"ls -l ...\" you can find that symbolic link, it is usually something like \"/dev/ttyACM0\", so this will also work: \"SerialPort\":\"/dev/ttyACM0,115200\".
But that mapping is similar to Windows: it is automatically assigned and can change, whereas the long name will work even if the actual TTY serial device mapping changes. WARN [commander] Takeoff denied, disarm and re-try This happens if you try to take off when PX4 still has not computed the home position. PX4 will report the home position once it is happy with the GPS signal, and you will see these messages: INFO [commander] home: 47.6414680, -122.1401672, 119.99 INFO [tone_alarm] home_set Up until this point in time, however, the PX4 will reject takeoff commands. When I tell the drone to do something it always lands For example, you use DroneShell moveToPosition -z -20 -x 50 -y 0 which it does, but when it gets to the target location the drone starts to land. This is the default behavior of PX4 when offboard mode completes. To set the drone to hover instead, set this PX4 parameter: param set COM_OBL_ACT 1 I get message length mismatches errors You might need to set the MAV_PROTO_VER parameter in QGC to \"Always use version 1\". Please see this issue for more details.","title":"PX4 Setup for AirSim"},{"location":"px4_setup/#px4-setup-for-airsim","text":"The PX4 software stack is a very popular open source flight controller with support for a wide variety of boards and sensors as well as built-in capability for higher-level tasks such as mission planning. Please visit px4.io for more information. Warning : While all releases of AirSim are always tested with PX4 to ensure support, setting up PX4 is not a trivial task. Unless you have at least an intermediate level of experience with the PX4 stack, we recommend you use simple_flight , which is now the default in AirSim.","title":"PX4 Setup for AirSim"},{"location":"px4_setup/#supported-hardware","text":"The following Pixhawk hardware has been tested with AirSim: Pixhawk PX4 2.4.8 PixFalcon PixRacer Pixhawk 2.1 Pixhawk 4 mini from Holybro Pixhawk 4 from Holybro Version 1.11.2 of the PX4 firmware also works on the Pixhawk 4 devices.","title":"Supported Hardware"},{"location":"px4_setup/#setting-up-px4-hardware-in-loop","text":"For this you will need one of the supported devices listed above. For manual flight you will also need RC + transmitter. Make sure your RC receiver is bound with its RC transmitter. Connect the RC transmitter to the flight controller's RC port. Refer to your RC manual and PX4 docs for more information. Download QGroundControl , launch it and connect your flight controller to the USB port. Use QGroundControl to flash the latest PX4 Flight Stack. See also initial firmware setup video . In QGroundControl, configure your Pixhawk for HIL simulation by selecting the HIL Quadrocopter X airframe. After PX4 reboots, check that \"HIL Quadrocopter X\" is indeed selected. In QGroundControl, go to the Radio tab and calibrate (make sure the remote control is on and the receiver is showing the indicator for the binding). Go to the Flight Mode tab and choose one of the remote control switches as \"Mode Channel\". Then set (for example) Stabilized and Attitude flight modes for two positions of the switch. Go to the Tuning section of QGroundControl and set appropriate values. For example, for Fly Sky's FS-TH9X remote control, the following settings give a more realistic feel: Hover Throttle = mid+1 mark, Roll and pitch sensitivity = mid-3 mark, Altitude and position control sensitivity = mid-2 mark.
In the AirSim settings file, specify PX4 for your vehicle config like this: { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"ClockType\": \"SteppableClock\", \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\", \"UseSerial\": true, \"LockStep\": true, \"Sensors\":{ \"Barometer\":{ \"SensorType\": 1, \"Enabled\": true, \"PressureFactorSigma\": 0.0001825 } }, \"Parameters\": { \"NAV_RCL_ACT\": 0, \"NAV_DLL_ACT\": 0, \"COM_OBL_ACT\": 1, \"LPE_LAT\": 47.641468, \"LPE_LON\": -122.140165 } } } } Notice we are enabling LockStep ; see PX4 LockStep for more information. The Barometer setting keeps PX4 happy because the default AirSim barometer has a bit too much noise generation. This setting clamps that down a bit, which allows PX4 to achieve GPS lock more quickly. After the above setup you should be able to use a remote control (RC) to fly with AirSim. You can usually arm the vehicle by lowering and bringing the two sticks of the RC together down and inwards. You don't need QGroundControl after the initial setup. Typically the Stabilized (instead of Manual) mode gives a better experience for beginners. See PX4 Basic Flying Guide . You can also control the drone from Python APIs . See Walkthrough Demo Video and Unreal AirSim Setup Video that shows you all the setup steps in this document.","title":"Setting up PX4 Hardware-in-Loop"},{"location":"px4_setup/#setting-up-px4-software-in-loop","text":"The PX4 SITL mode doesn't require you to have a separate device such as a Pixhawk or Pixracer. This is in fact the recommended way to use PX4 with simulators by the PX4 team. However, this is indeed harder to set up. Please see this dedicated page for setting up PX4 in SITL mode.","title":"Setting up PX4 Software-in-Loop"},{"location":"px4_setup/#faq","text":"","title":"FAQ"},{"location":"px4_setup/#drone-doesnt-fly-properly-it-just-goes-crazy","text":"There are a few reasons that can cause this. First, make sure your drone doesn't fall a large distance when starting the simulator. This might happen if you have created a custom Unreal environment and Player Start is placed too high above the ground. It seems that when this happens the internal calibration in PX4 gets confused. You should also use QGroundControl and make sure you can arm and take off in QGroundControl properly.","title":"Drone doesn't fly properly, it just goes \"crazy\"."},{"location":"px4_setup/#can-i-use-arducopter-or-other-mavlink-implementations","text":"Our code is tested with the PX4 firmware . We have not tested Arducopter or other mavlink implementations. Some of the flight APIs do use the PX4 custom modes in the MAV_CMD_DO_SET_MODE messages (like PX4_CUSTOM_MAIN_MODE_AUTO)","title":"Can I use Arducopter or other MavLink implementations?"},{"location":"px4_setup/#it-is-not-finding-my-pixhawk-hardware","text":"Check your settings.json file for this line \"SerialPort\":\"*,115200\". The asterisk here means \"find any serial port that looks like a Pixhawk device\", but this doesn't always work for all types of Pixhawk hardware. So on Windows you can find the actual COM port using Device Manager, look under \"Ports (COM & LPT)\", plug the device in and see what new COM port shows up. Let's say you see a new port named \"USB Serial Port (COM5)\". Well, then change the SerialPort setting to this: \"SerialPort\":\"COM5,115200\".
On Linux, the device can be found by running \"ls /dev/serial/by-id\". If you see a device name listed that looks like usb-3D_Robotics_PX4_FMU_v2.x_0-if00 then you can use that name to connect, like this: \"SerialPort\":\"/dev/serial/by-id/usb-3D_Robotics_PX4_FMU_v2.x_0-if00\". Note that this long name is actually a symbolic link to the real device name; if you use \"ls -l ...\" you can resolve that symbolic link. It is usually something like \"/dev/ttyACM0\", so this will also work: \"SerialPort\":\"/dev/ttyACM0,115200\". But that mapping is similar to Windows: it is automatically assigned and can change, whereas the long name will work even if the actual TTY serial device mapping changes.","title":"It is not finding my Pixhawk hardware"},{"location":"px4_setup/#warn-commander-takeoff-denied-disarm-and-re-try","text":"This happens if you try to take off when PX4 has not yet computed the home position. PX4 will report the home position once it is happy with the GPS signal, and you will see these messages: INFO [commander] home: 47.6414680, -122.1401672, 119.99 INFO [tone_alarm] home_set Up until that point, however, PX4 will reject takeoff commands.","title":"WARN [commander] Takeoff denied, disarm and re-try"},{"location":"px4_setup/#when-i-tell-the-drone-to-do-something-it-always-lands","text":"For example, you use DroneShell moveToPosition -z -20 -x 50 -y 0 which it does, but when it gets to the target location the drone starts to land. This is the default behavior of PX4 when offboard mode completes. To set the drone to hover instead, set this PX4 parameter: param set COM_OBL_ACT 1","title":"When I tell the drone to do something it always lands"},{"location":"px4_setup/#i-get-message-length-mismatches-errors","text":"You might need to set the MAV_PROTO_VER parameter in QGC to \"Always use version 1\". Please see this issue for more details.","title":"I get message length mismatches errors"},{"location":"px4_sitl/","text":"Setting up PX4 Software-in-Loop The PX4 software provides a \"software-in-loop\" simulation (SITL) version of their stack that runs in Linux. If you are on Windows then you can use the Cygwin Toolchain or you can use the Windows Subsystem for Linux and follow the PX4 Linux toolchain setup. If you are using WSL2 please read these additional instructions . Note that every time you stop the Unreal app you have to restart the PX4 app. From your bash terminal follow these steps for Linux and follow all the instructions under NuttX based hardware to install prerequisites. We've also included our own copy of the PX4 build instructions , which is a bit more concise about exactly what we need. Get the PX4 source code and build the posix SITL version of PX4: mkdir -p PX4 cd PX4 git clone https://github.com/PX4/PX4-Autopilot.git --recursive bash ./PX4-Autopilot/Tools/setup/ubuntu.sh --no-nuttx --no-sim-tools cd PX4-Autopilot Then find the latest stable release from https://github.com/PX4/PX4-Autopilot/releases and check out the source code matching that release, for example: git checkout v1.11.3 Use the following command to build and start the PX4 firmware in SITL mode: make px4_sitl_default none_iris If you are using an older version (v1.8.*) use this command instead: make posix_sitl_ekf2 none_iris . You should see a message saying the SITL PX4 app is waiting for the simulator (Cosys-AirSim) to connect. You will also see information about which ports are configured for the MavLink connection to the PX4 app. (A quick Python sketch for checking that the SITL TCP port is open follows below.)
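Before launching the Unreal environment, you can sanity-check that the PX4 SITL TCP server is actually listening. This is a minimal sketch, assuming the default port 4560 shown in the console output; note that PX4 may briefly log a simulator connect/disconnect when the probe runs:

```python
# Probe the PX4 SITL TCP port (default 4560) to confirm the app is waiting
# for a simulator connection. Adjust host/port if your PX4 output differs.
import socket

def px4_sitl_listening(host="127.0.0.1", port=4560, timeout=1.0):
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0  # 0 means the TCP connect succeeded

if __name__ == "__main__":
    print("PX4 SITL reachable:", px4_sitl_listening())
```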
The default ports have changed recently, so check them closely to make sure the Cosys-AirSim settings are correct. INFO [simulator] Waiting for simulator to connect on TCP port 4560 INFO [init] Mixer: etc/mixers/quad_w.main.mix on /dev/pwm_output0 INFO [mavlink] mode: Normal, data rate: 4000000 B/s on udp port 14570 remote port 14550 INFO [mavlink] mode: Onboard, data rate: 4000000 B/s on udp port 14580 remote port 14540 Note: this is also an interactive PX4 console; type help to see the list of commands you can enter here. They are mostly low-level PX4 commands, but some of them can be useful for debugging. Now edit the Cosys-AirSim settings file to make sure you have matching UDP and TCP port settings: { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"ClockType\": \"SteppableClock\", \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\", \"UseSerial\": false, \"LockStep\": true, \"UseTcp\": true, \"TcpPort\": 4560, \"ControlPortLocal\": 14540, \"ControlPortRemote\": 14580, \"Sensors\":{ \"Barometer\":{ \"SensorType\": 1, \"Enabled\": true, \"PressureFactorSigma\": 0.0001825 } }, \"Parameters\": { \"NAV_RCL_ACT\": 0, \"NAV_DLL_ACT\": 0, \"COM_OBL_ACT\": 1, \"LPE_LAT\": 47.641468, \"LPE_LON\": -122.140165 } } } } Notice the PX4 [simulator] is using TCP, which is why we need to add \"UseTcp\": true . Notice we are also enabling LockStep , see PX4 LockStep for more information. The Barometer setting keeps PX4 happy because the default Cosys-AirSim barometer generates a bit too much noise. This setting clamps that down a bit, which allows PX4 to achieve GPS lock more quickly. Open incoming TCP port 4560 and incoming UDP port 14540 using your firewall configuration. Now run your Unreal Cosys-AirSim environment and it should connect to SITL PX4 via TCP. You should see a bunch of messages from the SITL PX4 window. Specifically, the following messages tell you that Cosys-AirSim is connected properly and GPS fusion is stable: INFO [simulator] Simulator connected on UDP port 14560 INFO [mavlink] partner IP: 127.0.0.1 INFO [ecl/EKF] EKF GPS checks passed (WGS-84 origin set) INFO [ecl/EKF] EKF commencing GPS fusion If you do not see these messages then check your port settings. You should also be able to use QGroundControl with SITL mode. Make sure there is no Pixhawk hardware plugged in, otherwise QGroundControl will choose to use that instead. Note that as we don't have a physical board, an RC cannot be connected directly to it.
param show LPE_LAT param show LPE_LON Smooth Offboard Transitions Notice the above setting is provided in the params section of the settings.json file: \"COM_OBL_ACT\": 1 This tells the drone to automatically hover after each offboard control command finishes (the default setting is to land). Hovering is a smoother transition between multiple offboard commands. You can check this setting by running the following PX4 console command: param show COM_OBL_ACT Check the Home Position If you are using DroneShell to execute commands (arm, takeoff, etc.) then you should wait until the home position is set. You will see the PX4 SITL console output this message: INFO [commander] home: 47.6414680, -122.1401672, 119.99 INFO [tone_alarm] home_set Now the DroneShell 'pos' command should report this position and the commands should be accepted by PX4. If you attempt to take off without a home position you will see the message: WARN [commander] Takeoff denied, disarm and re-try After the home position is set, check the local position reported by the 'pos' command: Local position: x=-0.0326988, y=0.00656854, z=5.48506 If the z coordinate is large like this then takeoff might not work as expected. Resetting the SITL and simulation should fix that problem. (A Python sketch for waiting on the home position is included below.) WSL 2 Windows Subsystem for Linux version 2 operates in a Virtual Machine. This requires additional setup - see additional instructions . No Remote Control Notice the above setting is provided in the params section of the settings.json file: \"NAV_RCL_ACT\": 0, \"NAV_DLL_ACT\": 0, This is required if you plan to fly the SITL mode PX4 with no remote control, just using Python scripts, for example. These parameters stop PX4 from triggering \"failsafe mode on\" every time a move command is finished. You can use the following PX4 command to check these values are set correctly: param show NAV_RCL_ACT param show NAV_DLL_ACT NOTE: Do NOT do this on a real drone as it is too dangerous to fly without these failsafe measures. Manually set parameters You can also run the following in the PX4 console to set all these parameters manually: param set NAV_RCL_ACT 0 param set NAV_DLL_ACT 0 Setting up multi-vehicle simulation You can simulate multiple drones in SITL mode using Cosys-AirSim. However, this requires setting up multiple instances of the PX4 firmware simulator to be able to listen for each vehicle's connection on a separate TCP port (4560, 4561, etc.). Please see this dedicated page for instructions on setting up multiple instances of PX4 in SITL mode. Using VirtualBox Ubuntu If you want to run the above posix_sitl in a VirtualBox Ubuntu machine then it will have a different IP address from localhost. So in this case you need to edit the settings file, change the UdpIp and SitlIp to the IP address of your virtual machine, and set the LocalIpAddress to the address of your host machine running the Unreal engine. Remote Controller There are several options for flying the simulated drone using a remote control or a joystick such as an Xbox gamepad. See remote controllers .","title":"PX4 in SITL"},{"location":"px4_sitl/#setting-up-px4-software-in-loop","text":"The PX4 software provides a \"software-in-loop\" simulation (SITL) version of their stack that runs in Linux. If you are on Windows then you can use the Cygwin Toolchain or you can use the Windows Subsystem for Linux and follow the PX4 Linux toolchain setup. If you are using WSL2 please read these additional instructions . Note that every time you stop the Unreal app you have to restart the PX4 app.
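As a complement to the console checks above, here is a minimal sketch using the cosysairsim Python client that waits for PX4 to set the home position before issuing any commands; it relies on getHomeGeoPoint() returning NaN coordinates until the home position is known:

```python
# Wait for PX4 to set the home position before issuing takeoff commands.
import math
import time

import cosysairsim as airsim

client = airsim.MultirotorClient()
client.confirmConnection()

home = client.getHomeGeoPoint()
while math.isnan(home.latitude):
    print("Waiting for PX4 to set the home position...")
    time.sleep(1)
    home = client.getHomeGeoPoint()

print("Home set:", home.latitude, home.longitude, home.altitude)
```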
From your bash terminal follow these steps for Linux and follow all the instructions under NuttX based hardware to install prerequisites. We've also included our own copy of the PX4 build instructions , which is a bit more concise about exactly what we need. Get the PX4 source code and build the posix SITL version of PX4: mkdir -p PX4 cd PX4 git clone https://github.com/PX4/PX4-Autopilot.git --recursive bash ./PX4-Autopilot/Tools/setup/ubuntu.sh --no-nuttx --no-sim-tools cd PX4-Autopilot Then find the latest stable release from https://github.com/PX4/PX4-Autopilot/releases and check out the source code matching that release, for example: git checkout v1.11.3 Use the following command to build and start the PX4 firmware in SITL mode: make px4_sitl_default none_iris If you are using an older version (v1.8.*) use this command instead: make posix_sitl_ekf2 none_iris . You should see a message saying the SITL PX4 app is waiting for the simulator (Cosys-AirSim) to connect. You will also see information about which ports are configured for the MavLink connection to the PX4 app. The default ports have changed recently, so check them closely to make sure the Cosys-AirSim settings are correct. INFO [simulator] Waiting for simulator to connect on TCP port 4560 INFO [init] Mixer: etc/mixers/quad_w.main.mix on /dev/pwm_output0 INFO [mavlink] mode: Normal, data rate: 4000000 B/s on udp port 14570 remote port 14550 INFO [mavlink] mode: Onboard, data rate: 4000000 B/s on udp port 14580 remote port 14540 Note: this is also an interactive PX4 console; type help to see the list of commands you can enter here. They are mostly low-level PX4 commands, but some of them can be useful for debugging. Now edit the Cosys-AirSim settings file to make sure you have matching UDP and TCP port settings: { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"ClockType\": \"SteppableClock\", \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\", \"UseSerial\": false, \"LockStep\": true, \"UseTcp\": true, \"TcpPort\": 4560, \"ControlPortLocal\": 14540, \"ControlPortRemote\": 14580, \"Sensors\":{ \"Barometer\":{ \"SensorType\": 1, \"Enabled\": true, \"PressureFactorSigma\": 0.0001825 } }, \"Parameters\": { \"NAV_RCL_ACT\": 0, \"NAV_DLL_ACT\": 0, \"COM_OBL_ACT\": 1, \"LPE_LAT\": 47.641468, \"LPE_LON\": -122.140165 } } } } Notice the PX4 [simulator] is using TCP, which is why we need to add \"UseTcp\": true . Notice we are also enabling LockStep , see PX4 LockStep for more information. The Barometer setting keeps PX4 happy because the default Cosys-AirSim barometer generates a bit too much noise. This setting clamps that down a bit, which allows PX4 to achieve GPS lock more quickly. Open incoming TCP port 4560 and incoming UDP port 14540 using your firewall configuration. Now run your Unreal Cosys-AirSim environment and it should connect to SITL PX4 via TCP. You should see a bunch of messages from the SITL PX4 window. Specifically, the following messages tell you that Cosys-AirSim is connected properly and GPS fusion is stable: INFO [simulator] Simulator connected on UDP port 14560 INFO [mavlink] partner IP: 127.0.0.1 INFO [ecl/EKF] EKF GPS checks passed (WGS-84 origin set) INFO [ecl/EKF] EKF commencing GPS fusion If you do not see these messages then check your port settings. You should also be able to use QGroundControl with SITL mode. Make sure there is no Pixhawk hardware plugged in, otherwise QGroundControl will choose to use that instead. Note that as we don't have a physical board, an RC cannot be connected directly to it.
So the alternatives are to either use an XBox 360 Controller, or to connect your RC via USB (for example, in the case of the FrSky Taranis X9D Plus) or via a trainer USB cable to your PC. This makes your RC look like a joystick. You will need to do extra setup in QGroundControl to use a virtual joystick for RC control. You do not need to do this unless you plan to fly a drone manually in Cosys-AirSim. Autonomous flight using the Python API does not require an RC, see No Remote Control below.","title":"Setting up PX4 Software-in-Loop"},{"location":"px4_sitl/#setting-gps-origin","text":"Notice the above settings are provided in the params section of the settings.json file: \"LPE_LAT\": 47.641468, \"LPE_LON\": -122.140165, PX4 SITL mode needs to be configured to get the home location correct. The home location needs to be set to the same coordinates defined in OriginGeopoint . You can also run the following in the SITL PX4 console window to check that these values are set correctly. param show LPE_LAT param show LPE_LON","title":"Setting GPS origin"},{"location":"px4_sitl/#smooth-offboard-transitions","text":"Notice the above setting is provided in the params section of the settings.json file: \"COM_OBL_ACT\": 1 This tells the drone to automatically hover after each offboard control command finishes (the default setting is to land). Hovering is a smoother transition between multiple offboard commands. You can check this setting by running the following PX4 console command: param show COM_OBL_ACT","title":"Smooth Offboard Transitions"},{"location":"px4_sitl/#check-the-home-position","text":"If you are using DroneShell to execute commands (arm, takeoff, etc.) then you should wait until the home position is set. You will see the PX4 SITL console output this message: INFO [commander] home: 47.6414680, -122.1401672, 119.99 INFO [tone_alarm] home_set Now the DroneShell 'pos' command should report this position and the commands should be accepted by PX4. If you attempt to take off without a home position you will see the message: WARN [commander] Takeoff denied, disarm and re-try After the home position is set, check the local position reported by the 'pos' command: Local position: x=-0.0326988, y=0.00656854, z=5.48506 If the z coordinate is large like this then takeoff might not work as expected. Resetting the SITL and simulation should fix that problem.","title":"Check the Home Position"},{"location":"px4_sitl/#wsl-2","text":"Windows Subsystem for Linux version 2 operates in a Virtual Machine. This requires additional setup - see additional instructions .","title":"WSL 2"},{"location":"px4_sitl/#no-remote-control","text":"Notice the above setting is provided in the params section of the settings.json file: \"NAV_RCL_ACT\": 0, \"NAV_DLL_ACT\": 0, This is required if you plan to fly the SITL mode PX4 with no remote control, just using Python scripts, for example. These parameters stop PX4 from triggering \"failsafe mode on\" every time a move command is finished. (A minimal no-RC flight sketch using the Python API follows below.)
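To make the no-RC workflow concrete, here is a minimal sketch, assuming the cosysairsim Python client and the NAV_RCL_ACT/NAV_DLL_ACT parameters described above, that arms the drone and flies a short leg purely through the API:

```python
# Fly SITL PX4 with no RC: arm, take off, move, hover, then release control.
import cosysairsim as airsim

client = airsim.MultirotorClient()
client.confirmConnection()
client.enableApiControl(True)
client.armDisarm(True)

client.takeoffAsync().join()                     # block until takeoff completes
client.moveToPositionAsync(10, 0, -5, 3).join()  # NED frame, so z=-5 is 5 m up; 3 m/s
client.hoverAsync().join()                       # COM_OBL_ACT=1 keeps it hovering after moves

client.armDisarm(False)
client.enableApiControl(False)
```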
You can use the following PX4 command to check these values are set correctly: param show NAV_RCL_ACT param show NAV_DLL_ACT NOTE: Do NOT do this on a real drone as it is too dangerous to fly without these failsafe measures.","title":"No Remote Control"},{"location":"px4_sitl/#manually-set-parameters","text":"You can also run the following in the PX4 console to set all these parameters manually: param set NAV_RCL_ACT 0 param set NAV_DLL_ACT 0","title":"Manually set parameters"},{"location":"px4_sitl/#setting-up-multi-vehicle-simulation","text":"You can simulate multiple drones in SITL mode using Cosys-AirSim. However, this requires setting up multiple instances of the PX4 firmware simulator to be able to listen for each vehicle's connection on a separate TCP port (4560, 4561, etc.). Please see this dedicated page for instructions on setting up multiple instances of PX4 in SITL mode.","title":"Setting up multi-vehicle simulation"},{"location":"px4_sitl/#using-virtualbox-ubuntu","text":"If you want to run the above posix_sitl in a VirtualBox Ubuntu machine then it will have a different IP address from localhost. So in this case you need to edit the settings file, change the UdpIp and SitlIp to the IP address of your virtual machine, and set the LocalIpAddress to the address of your host machine running the Unreal engine.","title":"Using VirtualBox Ubuntu"},{"location":"px4_sitl/#remote-controller","text":"There are several options for flying the simulated drone using a remote control or a joystick such as an Xbox gamepad. See remote controllers .","title":"Remote Controller"},{"location":"px4_sitl_wsl2/","text":"PX4 Software-in-Loop with WSL 2 The Windows Subsystem for Linux version 2 uses a Virtual Machine which has a separate IP address from your Windows host machine. This means PX4 cannot find AirSim using \"localhost\", which is the default behavior for PX4. You will notice that on Windows ipconfig returns a new ethernet adapter for WSL like this (notice the vEthernet has (WSL) in the name): Ethernet adapter vEthernet (WSL): Connection-specific DNS Suffix . : Link-local IPv6 Address . . . . . : fe80::1192:f9a5:df88:53ba%44 IPv4 Address. . . . . . . . . . . : 172.31.64.1 Subnet Mask . . . . . . . . . . . : 255.255.240.0 Default Gateway . . . . . . . . . : This address 172.31.64.1 is the address that WSL 2 can use to reach your Windows host machine. Starting with this PX4 Change Request (which correlates to version v1.12.0-beta1 or newer) PX4 in SITL mode can now connect to AirSim on a different (remote) IP address. To enable this, make sure you have a version of PX4 containing this fix and set the following environment variable in Linux: export PX4_SIM_HOST_ADDR=172.31.64.1 Note: Be sure to update the above address 172.31.64.1 to match what you see from your ipconfig command. Open incoming TCP port 4560 and incoming UDP port 14540 using your firewall configuration. Now on the Linux side run ip address show and copy the eth0 inet address; it should be something like 172.31.66.156 . This is the address Windows needs to know in order to find PX4. Edit your AirSim settings file and add LocalHostIp to tell AirSim to use the WSL ethernet adapter address instead of the default localhost . This will cause AirSim to open the TCP port on that adapter, which is the address that the PX4 app will be looking for. Also tell AirSim to connect the ControlIp UDP channel by setting ControlIp to the magic string remote . This resolves to the WSL 2 remote IP address found in the TCP socket.
{ \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"ClockType\": \"SteppableClock\", \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\", \"UseSerial\": false, \"LockStep\": true, \"UseTcp\": true, \"TcpPort\": 4560, \"ControlIp\": \"remote\", \"ControlPortLocal\": 14540, \"ControlPortRemote\": 14580, \"LocalHostIp\": \"172.31.64.1\", \"Sensors\":{ \"Barometer\":{ \"SensorType\": 1, \"Enabled\": true, \"PressureFactorSigma\": 0.0001825 } }, \"Parameters\": { \"NAV_RCL_ACT\": 0, \"NAV_DLL_ACT\": 0, \"COM_OBL_ACT\": 1, \"LPE_LAT\": 47.641468, \"LPE_LON\": -122.140165 } } } } See PX4 LockStep for more information. The \"Barometer\" setting keeps PX4 happy because the default AirSim barometer has a bit too much noise generation. This setting clamps that down a bit. If your local repo does not include this PX4 commit , please edit the Linux file in ROMFS/px4fmu_common/init.d-posix/rcS and make sure it is looking for the PX4_SIM_HOST_ADDR environment variable and is passing that through to the PX4 simulator like this: # If PX4_SIM_HOST_ADDR environment variable is empty use localhost. if [ -z \"${PX4_SIM_HOST_ADDR}\" ]; then echo \"PX4 SIM HOST: localhost\" simulator start -c $simulator_tcp_port else echo \"PX4 SIM HOST: $PX4_SIM_HOST_ADDR\" simulator start -t $PX4_SIM_HOST_ADDR $simulator_tcp_port fi Note: this code might already be there depending on the version of PX4 you are using. Note: please be patient when waiting for the message: INFO [simulator] Simulator connected on TCP port 4560. It can take a little longer to establish the remote connection than it does with localhost . Now you can proceed with the steps shown in Setting up PX4 Software-in-Loop .","title":"PX4 SITL with WSL 2"},{"location":"px4_sitl_wsl2/#px4-software-in-loop-with-wsl-2","text":"The Windows subsystem for Linux version 2 uses a Virtual Machine which has a separate IP address from your Windows host machine. This means PX4 cannot find AirSim using \"localhost\" which is the default behavior for PX4. You will notice that on Windows ipconfig returns a new ethernet adapter for WSL like this (notice the vEthernet has (WSL) in the name: Ethernet adapter vEthernet (WSL): Connection-specific DNS Suffix . : Link-local IPv6 Address . . . . . : fe80::1192:f9a5:df88:53ba%44 IPv4 Address. . . . . . . . . . . : 172.31.64.1 Subnet Mask . . . . . . . . . . . : 255.255.240.0 Default Gateway . . . . . . . . . : This address 172.31.64.1 is the address that WSL 2 can use to reach your Windows host machine. Starting with this PX4 Change Request (which correlates to version v1.12.0-beta1 or newer) PX4 in SITL mode can now connect to AirSim on a different (remote) IP address. To enable this make sure you have a version of PX4 containing this fix and set the following environment variable in linux: export PX4_SIM_HOST_ADDR=172.31.64.1 Note: Be sure to update the above address 172.31.64.1 to match what you see from your ipconfig command. Open incoming TCP port 4560 and incoming UDP port 14540 using your firewall configuration. Now on the linux side run ip address show and copy the eth0 inet address, it should be something like 172.31.66.156 . This is the address Windows needs to know in order to find PX4. Edit your AirSim settings file and add LocalHostIp to tell AirSim to use the WSL ethernet adapter address instead of the default localhost . This will cause AirSim to open the TCP port on that adapter which is the address that the PX4 app will be looking for. 
Also tell AirSim to connect the ControlIp UDP channel by setting ControlIp to the magic string remote . This resolves to the WSL 2 remote IP address found in the TCP socket. { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"ClockType\": \"SteppableClock\", \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\", \"UseSerial\": false, \"LockStep\": true, \"UseTcp\": true, \"TcpPort\": 4560, \"ControlIp\": \"remote\", \"ControlPortLocal\": 14540, \"ControlPortRemote\": 14580, \"LocalHostIp\": \"172.31.64.1\", \"Sensors\":{ \"Barometer\":{ \"SensorType\": 1, \"Enabled\": true, \"PressureFactorSigma\": 0.0001825 } }, \"Parameters\": { \"NAV_RCL_ACT\": 0, \"NAV_DLL_ACT\": 0, \"COM_OBL_ACT\": 1, \"LPE_LAT\": 47.641468, \"LPE_LON\": -122.140165 } } } } See PX4 LockStep for more information. The \"Barometer\" setting keeps PX4 happy because the default AirSim barometer generates a bit too much noise. This setting clamps that down a bit. If your local repo does not include this PX4 commit , please edit the Linux file in ROMFS/px4fmu_common/init.d-posix/rcS and make sure it is looking for the PX4_SIM_HOST_ADDR environment variable and is passing that through to the PX4 simulator like this: # If PX4_SIM_HOST_ADDR environment variable is empty use localhost. if [ -z \"${PX4_SIM_HOST_ADDR}\" ]; then echo \"PX4 SIM HOST: localhost\" simulator start -c $simulator_tcp_port else echo \"PX4 SIM HOST: $PX4_SIM_HOST_ADDR\" simulator start -t $PX4_SIM_HOST_ADDR $simulator_tcp_port fi Note: this code might already be there depending on the version of PX4 you are using. Note: please be patient when waiting for the message: INFO [simulator] Simulator connected on TCP port 4560. It can take a little longer to establish the remote connection than it does with localhost . Now you can proceed with the steps shown in Setting up PX4 Software-in-Loop .","title":"PX4 Software-in-Loop with WSL 2"},{"location":"remote_control/","text":"Remote Control To fly manually, you need a remote control, or RC. If you don't have one then you can use the APIs to fly programmatically, or use the so-called Computer Vision mode to move around using the keyboard. RC Setup for Default Config By default AirSim uses simple_flight as its flight controller, which connects to the RC via a USB port on your computer. You can either use an XBox controller or a FrSky Taranis X9D Plus . Note that the XBox 360 controller is not precise enough and is not recommended if you want a more real-world experience. See the FAQ below if things are not working. Other Devices AirSim can detect a large variety of devices; however, devices other than the above might need extra configuration. In the future we will add the ability to set this config through settings.json. For now, if things are not working then you might want to try workarounds such as x360ce or change code in the SimJoystick.cpp file . Note on FrSky Taranis X9D Plus The FrSky Taranis X9D Plus is a real UAV remote control with the advantage that it has a USB port, so it can be directly connected to a PC. You can download the AirSim config file and follow this tutorial to import it into your RC. You should then see the \"sim\" model in the RC with all channels configured properly. Note on Linux Currently the default config on Linux is for using an Xbox controller. This means other devices might not work properly. In the future we will add the ability to configure the RC in settings.json, but for now you might have to change code in the SimJoystick.cpp file to use other devices. RC Setup for PX4 AirSim supports the PX4 flight controller; however, it requires a different setup.
There are many remote control options that you can use with quadrotors. We have successfully used the FrSky Taranis X9D Plus, the FlySky FS-TH9X and the Futaba 14SG with AirSim. Following are the high-level steps to configure your RC: If you are going to use Hardware-in-Loop mode, you need a transmitter for your specific brand of RC and need to bind it. You can find this information in the RC's user guide. For Hardware-in-Loop mode, you connect the transmitter to the Pixhawk. Usually you can find an online doc or YouTube video tutorial on how to do that. Calibrate your RC in QGroundControl . See the PX4 RC configuration and this guide for more information. Using XBox 360 USB Gamepad You can also use an Xbox controller in SITL mode; it just won't be as precise as a real RC controller. See xbox controller for details on how to set that up. Using Playstation 3 controller A Playstation 3 controller is confirmed to work as an AirSim controller. On Windows, however, an emulator to make it look like an Xbox 360 controller is required. Many different solutions are available online, for example x360ce Xbox 360 Controller Emulator . DJI Controller Nils Tijtgat wrote an excellent blog on how to get the DJI controller working with AirSim . FAQ I'm using default config and AirSim says my RC is not detected on USB. This typically happens if you have multiple RCs and/or XBox/Playstation gamepads etc. connected. In Windows, hit the Windows+S key and search for \"Set up USB Game controllers\" (in older versions of Windows try \"joystick\"). This will show you all game controllers connected to your PC. If you don't see yours then Windows hasn't detected it, and so you need to first solve that issue. If you do see yours but not at the top of the list (i.e. index 0) then you need to tell AirSim, because AirSim by default tries to use the RC at index 0. To do this, navigate to your ~/Documents/AirSim folder, open up settings.json and add/modify the following setting. The example below tells AirSim to use the RC at index 2. { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"Vehicles\": { \"SimpleFlight\": { \"VehicleType\": \"SimpleFlight\", \"RC\": { \"RemoteControlID\": 2 } } } } Vehicle seems unstable when using XBox/PS3 controller Regular gamepads are not very precise and have a lot of random noise. Most of the time you may see significant offsets as well (i.e. output is not zero when sticks are at zero). So this behavior is expected. Where is RC calibration in AirSim? We haven't implemented it yet. This means your RC firmware will need to have the capability to do calibration for now. My RC is not working with PX4 setup. First you want to make sure your RC is working in QGroundControl . If it doesn't work there then it surely will not work in AirSim. The PX4 mode is suitable for folks who have at least an intermediate level of experience dealing with various issues related to PX4, and we would generally refer you to get help from the PX4 forums.","title":"Remote Control"},{"location":"remote_control/#remote-control","text":"To fly manually, you need a remote control, or RC. If you don't have one then you can use the APIs to fly programmatically, or use the so-called Computer Vision mode to move around using the keyboard.","title":"Remote Control"},{"location":"remote_control/#rc-setup-for-default-config","text":"By default AirSim uses simple_flight as its flight controller, which connects to the RC via a USB port on your computer. You can either use an XBox controller or a FrSky Taranis X9D Plus .
Note that the XBox 360 controller is not precise enough and is not recommended if you want a more real-world experience. See the FAQ below if things are not working.","title":"RC Setup for Default Config"},{"location":"remote_control/#other-devices","text":"AirSim can detect a large variety of devices; however, devices other than the above might need extra configuration. In the future we will add the ability to set this config through settings.json. For now, if things are not working then you might want to try workarounds such as x360ce or change code in the SimJoystick.cpp file .","title":"Other Devices"},{"location":"remote_control/#note-on-frsky-taranis-x9d-plus","text":"The FrSky Taranis X9D Plus is a real UAV remote control with the advantage that it has a USB port, so it can be directly connected to a PC. You can download the AirSim config file and follow this tutorial to import it into your RC. You should then see the \"sim\" model in the RC with all channels configured properly.","title":"Note on FrSky Taranis X9D Plus"},{"location":"remote_control/#note-on-linux","text":"Currently the default config on Linux is for using an Xbox controller. This means other devices might not work properly. In the future we will add the ability to configure the RC in settings.json, but for now you might have to change code in the SimJoystick.cpp file to use other devices.","title":"Note on Linux"},{"location":"remote_control/#rc-setup-for-px4","text":"AirSim supports the PX4 flight controller; however, it requires a different setup. There are many remote control options that you can use with quadrotors. We have successfully used the FrSky Taranis X9D Plus, the FlySky FS-TH9X and the Futaba 14SG with AirSim. Following are the high-level steps to configure your RC: If you are going to use Hardware-in-Loop mode, you need a transmitter for your specific brand of RC and need to bind it. You can find this information in the RC's user guide. For Hardware-in-Loop mode, you connect the transmitter to the Pixhawk. Usually you can find an online doc or YouTube video tutorial on how to do that. Calibrate your RC in QGroundControl . See the PX4 RC configuration and this guide for more information.","title":"RC Setup for PX4"},{"location":"remote_control/#using-xbox-360-usb-gamepad","text":"You can also use an Xbox controller in SITL mode; it just won't be as precise as a real RC controller. See xbox controller for details on how to set that up.","title":"Using XBox 360 USB Gamepad"},{"location":"remote_control/#using-playstation-3-controller","text":"A Playstation 3 controller is confirmed to work as an AirSim controller. On Windows, however, an emulator to make it look like an Xbox 360 controller is required. Many different solutions are available online, for example x360ce Xbox 360 Controller Emulator .","title":"Using Playstation 3 controller"},{"location":"remote_control/#dji-controller","text":"Nils Tijtgat wrote an excellent blog on how to get the DJI controller working with AirSim .","title":"DJI Controller"},{"location":"remote_control/#faq","text":"","title":"FAQ"},{"location":"remote_control/#im-using-default-config-and-airsim-says-my-rc-is-not-detected-on-usb","text":"This typically happens if you have multiple RCs and/or XBox/Playstation gamepads etc. connected. In Windows, hit the Windows+S key and search for \"Set up USB Game controllers\" (in older versions of Windows try \"joystick\"). This will show you all game controllers connected to your PC. If you don't see yours then Windows hasn't detected it, and so you need to first solve that issue. If you do see yours but not at the top of the list (i.e.
index 0) then you need to tell AirSim, because AirSim by default tries to use the RC at index 0. To do this, navigate to your ~/Documents/AirSim folder, open up settings.json and add/modify the following setting. The example below tells AirSim to use the RC at index 2. { \"SettingsVersion\": 2.0, \"SimMode\": \"Multirotor\", \"Vehicles\": { \"SimpleFlight\": { \"VehicleType\": \"SimpleFlight\", \"RC\": { \"RemoteControlID\": 2 } } } }","title":"I'm using default config and AirSim says my RC is not detected on USB."},{"location":"remote_control/#vehicle-seems-unstable-when-using-xboxps3-controller","text":"Regular gamepads are not very precise and have a lot of random noise. Most of the time you may see significant offsets as well (i.e. output is not zero when sticks are at zero). So this behavior is expected.","title":"Vehicle seems unstable when using XBox/PS3 controller"},{"location":"remote_control/#where-is-rc-calibration-in-airsim","text":"We haven't implemented it yet. This means your RC firmware will need to have the capability to do calibration for now.","title":"Where is RC calibration in AirSim?"},{"location":"remote_control/#my-rc-is-not-working-with-px4-setup","text":"First you want to make sure your RC is working in QGroundControl . If it doesn't work there then it surely will not work in AirSim. The PX4 mode is suitable for folks who have at least an intermediate level of experience dealing with various issues related to PX4, and we would generally refer you to get help from the PX4 forums.","title":"My RC is not working with PX4 setup."},{"location":"retexturing/","text":"Runtime Texture Swapping How to Make An Actor Retexturable To be made texture-swappable, an actor must derive from the parent class TextureShuffleActor. The parent class can be set via the settings tab in the actor's blueprint. After setting the parent class to TextureShuffleActor, the object gains the member DynamicMaterial. DynamicMaterial needs to be set--on all actor instances in the scene--to TextureSwappableMaterial. Warning: Statically setting the Dynamic Material in the blueprint class may cause rendering errors. It seems to work better to set it on all the actor instances in the scene, using the details panel. How to Define the Set(s) of Textures to Choose From Typically, certain subsets of actors will share a set of texture options with each other (e.g. walls that are part of the same building). It's easy to set up these groupings by using Unreal Engine's group editing functionality. Select all the instances that should have the same texture selection, and add the textures to all of them simultaneously via the Details panel. Use the same technique to add descriptive tags to groups of actors, which will be used to address them in the API. It's ideal to work from larger groupings to smaller groupings, simply deselecting actors to narrow down the grouping as you go, and applying any individual actor properties last. How to Swap Textures from the API The following API is available in C++ and Python. (C++ shown) std::vector<std::string> simSwapTextures(const std::string& tags, int tex_id); The string of \",\" or \", \" delimited tags identifies on which actors to perform the swap. The tex_id indexes the array of textures assigned to each actor undergoing a swap. The function will return the list of objects which matched the provided tags and had the texture swap performed. If tex_id is out-of-bounds for some object's texture set, it will be taken modulo the number of textures that were available.
Demo (Python): import cosysairsim as airsim import time c = airsim.client.MultirotorClient() print(c.simSwapTextures(\"furniture\", 0)) time.sleep(2) print(c.simSwapTextures(\"chair\", 1)) time.sleep(2) print(c.simSwapTextures(\"table\", 1)) time.sleep(2) print(c.simSwapTextures(\"chair, right\", 0)) Results: ['RetexturableChair', 'RetexturableChair2', 'RetexturableTable'] ['RetexturableChair', 'RetexturableChair2'] ['RetexturableTable'] ['RetexturableChair2'] Note that in this example, different textures were chosen on each actor for the same index value. You can also use the simSetObjectMaterial and simSetObjectMaterialFromTexture APIs to set an object's material to any material asset or filepath of a texture. For more information on using these APIs, see Texture APIs .","title":"Domain Randomization"},{"location":"retexturing/#runtime-texture-swapping","text":"","title":"Runtime Texture Swapping"},{"location":"retexturing/#how-to-make-an-actor-retexturable","text":"To be made texture-swappable, an actor must derive from the parent class TextureShuffleActor. The parent class can be set via the settings tab in the actor's blueprint. After setting the parent class to TextureShuffleActor, the object gains the member DynamicMaterial. DynamicMaterial needs to be set--on all actor instances in the scene--to TextureSwappableMaterial. Warning: Statically setting the Dynamic Material in the blueprint class may cause rendering errors. It seems to work better to set it on all the actor instances in the scene, using the details panel.","title":"How to Make An Actor Retexturable"},{"location":"retexturing/#how-to-define-the-sets-of-textures-to-choose-from","text":"Typically, certain subsets of actors will share a set of texture options with each other (e.g. walls that are part of the same building). It's easy to set up these groupings by using Unreal Engine's group editing functionality. Select all the instances that should have the same texture selection, and add the textures to all of them simultaneously via the Details panel. Use the same technique to add descriptive tags to groups of actors, which will be used to address them in the API. It's ideal to work from larger groupings to smaller groupings, simply deselecting actors to narrow down the grouping as you go, and applying any individual actor properties last.","title":"How to Define the Set(s) of Textures to Choose From"},{"location":"retexturing/#how-to-swap-textures-from-the-api","text":"The following API is available in C++ and Python. (C++ shown) std::vector<std::string> simSwapTextures(const std::string& tags, int tex_id); The string of \",\" or \", \" delimited tags identifies on which actors to perform the swap. The tex_id indexes the array of textures assigned to each actor undergoing a swap. The function will return the list of objects which matched the provided tags and had the texture swap performed. If tex_id is out-of-bounds for some object's texture set, it will be taken modulo the number of textures that were available. (A short sketch of the material APIs mentioned above follows below.)
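For completeness, here is a hedged Python sketch of the simSetObjectMaterial and simSetObjectMaterialFromTexture APIs mentioned above; the object names and paths are placeholders for your own scene assets, so adjust them accordingly:

```python
# Assign materials directly instead of cycling through a texture set.
import cosysairsim as airsim

c = airsim.client.MultirotorClient()
c.confirmConnection()

# Point an object's material at a texture file on disk (placeholder path)
c.simSetObjectMaterialFromTexture("RetexturableChair", "/path/to/texture.png")

# Or assign an existing Unreal material asset by name (placeholder asset path)
c.simSetObjectMaterial("RetexturableTable", "/Game/Materials/MyMaterial")
```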
Demo (Python): import cosysairsim as airsim import time c = airsim.client.MultirotorClient() print(c.simSwapTextures(\"furniture\", 0)) time.sleep(2) print(c.simSwapTextures(\"chair\", 1)) time.sleep(2) print(c.simSwapTextures(\"table\", 1)) time.sleep(2) print(c.simSwapTextures(\"chair, right\", 0)) Results: ['RetexturableChair', 'RetexturableChair2', 'RetexturableTable'] ['RetexturableChair', 'RetexturableChair2'] ['RetexturableTable'] ['RetexturableChair2'] Note that in this example, different textures were chosen on each actor for the same index value. You can also use the simSetObjectMaterial and simSetObjectMaterialFromTexture APIs to set an object's material to any material asset or filepath of a texture. For more information on using these APIs, see Texture APIs .","title":"How to Swap Textures from the API"},{"location":"ros_cplusplus/","text":"airsim_ros_pkgs A ROS2 wrapper over the Cosys-AirSim C++ client library. All coordinates and data are in the right-handed coordinate frame of the ROS standard and not in NED, except for geo points. The following was tested on Ubuntu 22.04 with ROS2 Iron. Build Build Cosys-AirSim as per the instructions. Make sure that you have set up the environment variables for ROS. Add the source command to your .bashrc for convenience (replace iron with the specific version name) - echo \"source /opt/ros/iron/setup.bash\" >> ~/.bashrc source ~/.bashrc -- Install dependencies with rosdep, if not already installed - apt-get install python3-rosdep sudo rosdep init rosdep update rosdep install --from-paths src -y --ignore-src --skip-keys pcl --skip-keys message_runtime --skip-keys message_generation Build ROS package colcon build --cmake-args -DCMAKE_BUILD_TYPE=Release Running source install/setup.bash ros2 launch airsim_ros_pkgs airsim_node.launch.py Using Cosys-Airsim ROS wrapper The ROS wrapper is composed of two ROS nodes - the first is a wrapper over Cosys-AirSim's multirotor C++ client library, and the second is a simple PD position controller. Let's look at the ROS API for both nodes: Cosys-Airsim ROS Wrapper Node Publishers: The publishers will be automatically created based on the settings in the settings.json file for all vehicles and the sensors. /airsim_node/VEHICLE-NAME/car_state airsim_interfaces::CarState The state of the car if the vehicle is of this sim-mode type. /airsim_node/VEHICLE-NAME/computervision_state airsim_interfaces::ComputerVisionState The state of the computer vision actor if the vehicle is of this sim-mode type. /airsim_node/origin_geo_point airsim_interfaces::GPSYaw GPS coordinates corresponding to the global frame. This is set in the AirSim settings.json file under the OriginGeopoint key. /airsim_node/VEHICLE-NAME/global_gps sensor_msgs::NavSatFix This is the current GPS coordinates of the drone in AirSim. /airsim_node/VEHICLE-NAME/environment airsim_interfaces::Environment /airsim_node/VEHICLE-NAME/odom_local nav_msgs::Odometry Odometry frame (default name: odom_local, launch name and frame type are configurable) wrt the take-off point. /airsim_node/VEHICLE-NAME/CAMERA-NAME_IMAGE-TYPE/camera_info sensor_msgs::CameraInfo Optionally, if the image type is annotation the annotation layer name is also included in the topic name. /airsim_node/VEHICLE-NAME/CAMERA-NAME_IMAGE-TYPE/image sensor_msgs::Image RGB or float image depending on the image type requested in settings.json. Optionally, if the image type is annotation the annotation layer name is also included in the topic name.
/tf tf2_msgs::TFMessage /airsim_node/VEHICLE-NAME/altimeter/SENSOR_NAME airsim_interfaces::Altimeter This is the current altimeter reading for altitude, pressure, and QNH. /airsim_node/VEHICLE-NAME/imu/SENSOR_NAME sensor_msgs::Imu IMU sensor data. /airsim_node/VEHICLE-NAME/magnetometer/SENSOR_NAME sensor_msgs::MagneticField Measurement of magnetic field vector/compass. /airsim_node/VEHICLE-NAME/distance/SENSOR_NAME sensor_msgs::Range Measurement of distance from an active ranger, such as infrared (IR). /airsim_node/VEHICLE-NAME/lidar/points/SENSOR_NAME/ sensor_msgs::PointCloud2 LIDAR pointcloud /airsim_node/VEHICLE-NAME/lidar/labels/SENSOR_NAME/ airsim_interfaces::StringArray Custom message type with an array of strings that are the labels for each point in the pointcloud of the lidar sensor /airsim_node/VEHICLE-NAME/gpulidar/points/SENSOR_NAME/ sensor_msgs::PointCloud2 GPU LIDAR pointcloud. The instance segmentation/annotation color data is stored in the rgb field of the pointcloud. The intensity data is also stored in the intensity field /airsim_node/VEHICLE-NAME/echo/active/points/SENSOR_NAME/ sensor_msgs::PointCloud2 Echo sensor pointcloud for active sensing /airsim_node/VEHICLE-NAME/echo/passive/points/SENSOR_NAME/ sensor_msgs::PointCloud2 Echo sensor pointcloud for passive sensing /airsim_node/VEHICLE-NAME/echo/active/labels/SENSOR_NAME/ airsim_interfaces::StringArray Custom message type with an array of strings that are the labels for each point in the pointcloud for the active echo pointcloud /airsim_node/VEHICLE-NAME/echo/passive/labels/SENSOR_NAME/ airsim_interfaces::StringArray Custom message type with an array of strings that are the labels for each point in the pointcloud for the passive echo pointcloud /airsim_node/instance_segmentation_labels airsim_interfaces::InstanceSegmentationList Custom message type with an array of custom messages that are the names, color and index of the instance segmentation system for each object in the world. /airsim_node/object_transforms airsim_interfaces::ObjectTransformsList Custom message type with an array of geometry_msgs::TransformStamped that are the transforms of all objects in the world; each child frame ID is the object name. Subscribers: /airsim_node/VEHICLE-NAME/vel_cmd_body_frame airsim_interfaces::VelCmd /airsim_node/VEHICLE-NAME/vel_cmd_world_frame airsim_interfaces::VelCmd /airsim_node/all_robots/vel_cmd_body_frame airsim_interfaces::VelCmd Set velocity command for all drones. /airsim_node/all_robots/vel_cmd_world_frame airsim_interfaces::VelCmd /airsim_node/group_of_robots/vel_cmd_body_frame airsim_interfaces::VelCmdGroup Set velocity command for a specific set of drones. /airsim_node/group_of_robots/vel_cmd_world_frame airsim_interfaces::VelCmdGroup Set velocity command for a specific set of drones. /gimbal_angle_euler_cmd airsim_interfaces::GimbalAngleEulerCmd Gimbal set point in euler angles. /gimbal_angle_quat_cmd airsim_interfaces::GimbalAngleQuatCmd Gimbal set point in quaternion. /airsim_node/VEHICLE-NAME/car_cmd airsim_interfaces::CarControls Throttle, brake, steering and gear selections for control. Both automatic and manual transmission control possible, see the car_joy.py script for use. (A minimal velocity-command publisher sketch follows below.)
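As a usage illustration for the velocity-command subscribers above, here is a minimal rclpy sketch; it assumes the airsim_interfaces package is built in your workspace, that the vehicle in settings.json is named PX4, and that VelCmd wraps a geometry_msgs/Twist in a twist field (check the message definition in your build):

```python
# Publish one body-frame velocity command to the Cosys-AirSim ROS2 wrapper.
import rclpy
from rclpy.node import Node
from airsim_interfaces.msg import VelCmd  # assumed: contains `geometry_msgs/Twist twist`

rclpy.init()
node = Node("vel_cmd_demo")
pub = node.create_publisher(VelCmd, "/airsim_node/PX4/vel_cmd_body_frame", 1)

msg = VelCmd()
msg.twist.linear.x = 1.0   # 1 m/s forward in the body frame
msg.twist.angular.z = 0.0  # no yaw rate
pub.publish(msg)

rclpy.spin_once(node, timeout_sec=0.1)  # let the publisher flush
node.destroy_node()
rclpy.shutdown()
```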
Services: /airsim_node/VEHICLE-NAME/land airsim_interfaces::Land /airsim_node/VEHICLE-NAME/takeoff airsim_interfaces::Takeoff /airsim_node/all_robots/land airsim_interfaces::Land land all drones /airsim_node/all_robots/takeoff airsim_interfaces::Takeoff take-off all drones /airsim_node/group_of_robots/land airsim_interfaces::LandGroup land a specific set of drones /airsim_node/group_of_robots/takeoff airsim_interfaces::TakeoffGroup take-off a specific set of drones /airsim_node/reset airsim_interfaces::Reset Resets all vehicles /airsim_node/instance_segmentation_refresh airsim_interfaces::RefreshInstanceSegmentation Refresh the instance segmentation list /airsim_node/object_transforms_refresh airsim_interfaces::RefreshObjectTransforms Refresh the object transforms list Parameters: /airsim_node/host_ip [string] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: localhost The IP of the machine running the airsim RPC API server. /airsim_node/host_port [string] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: 41451 The port of the machine running the airsim RPC API server. /airsim_node/enable_api_control [string] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: false Set the API control and arm the drones on startup. If not set to true no control is available. /airsim_node/enable_object_transforms_list [string] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: true Retrieve the object transforms list from the airsim API at the start or with the service to refresh. If disabled this is not available but can save time on startup. /airsim_node/is_vulkan [string] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: True If using Vulkan, the image encoding is switched from rgb8 to bgr8. /airsim_node/world_frame_id [string] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: world /airsim_node/odom_frame_id [string] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: odom_local /airsim_node/update_airsim_control_every_n_sec [double] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: 0.01 seconds. Timer callback frequency for updating drone odom and state from airsim, and sending in control commands. The current RPClib interface to unreal engine maxes out at 50 Hz. Timer callbacks in ROS run at the maximum rate possible, so it's best not to touch this parameter. /airsim_node/update_airsim_img_response_every_n_sec [double] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: 0.01 seconds. Timer callback frequency for receiving images from all cameras in airsim. The speed will depend on the number of images requested and their resolution. Timer callbacks in ROS run at the maximum rate possible, so it's best not to touch this parameter. /airsim_node/update_lidar_every_n_sec [double] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: 0.01 seconds. Timer callback frequency for receiving data from all lidar sensors in airsim. Timer callbacks in ROS run at the maximum rate possible, so it's best not to touch this parameter. /airsim_node/update_gpulidar_every_n_sec [double] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: 0.01 seconds. Timer callback frequency for receiving data from all GPU-lidar sensors in airsim. Timer callbacks in ROS run at the maximum rate possible, so it's best not to touch this parameter.
/airsim_node/update_echo_every_n_sec [double] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: 0.01 seconds. Timer callback frequency for receiving data from all echo sensors in airsim. Timer callbacks in ROS run at the maximum rate possible, so it's best not to touch this parameter. /airsim_node/publish_clock [bool] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: false Will publish the ros /clock topic if set to true. Simple PID Position Controller Node Parameters: PD controller parameters: /pd_position_node/kp_x [double], /pd_position_node/kp_y [double], /pd_position_node/kp_z [double], /pd_position_node/kp_yaw [double] Proportional gains /pd_position_node/kd_x [double], /pd_position_node/kd_y [double], /pd_position_node/kd_z [double], /pd_position_node/kd_yaw [double] Derivative gains /pd_position_node/reached_thresh_xyz [double] Threshold Euclidean distance (meters) from the current position to the setpoint position /pd_position_node/reached_yaw_degrees [double] Threshold yaw distance (degrees) from the current position to the setpoint position /pd_position_node/update_control_every_n_sec [double] Default: 0.01 seconds Services: /airsim_node/VEHICLE-NAME/gps_goal [Request: airsim_interfaces::SetGPSPosition ] Target GPS position + yaw, in absolute altitude. /airsim_node/VEHICLE-NAME/local_position_goal [Request: airsim_interfaces::SetLocalPosition ] Target local position + yaw in the global frame (see the service-call sketch below). Subscribers: /airsim_node/origin_geo_point airsim_interfaces::GPSYaw Listens to home geo coordinates published by airsim_node . /airsim_node/VEHICLE-NAME/odom_local nav_msgs::Odometry Listens to odometry published by airsim_node Publishers: /vel_cmd_world_frame airsim_interfaces::VelCmd Sends velocity command to airsim_node /vel_cmd_body_frame airsim_interfaces::VelCmd Sends velocity command to airsim_node Global params Dynamic constraints. These can be changed in dynamic_constraints.launch : /max_vel_horz_abs [double] Maximum horizontal velocity of the drone (meters/second) /max_vel_vert_abs [double] Maximum vertical velocity of the drone (meters/second) /max_yaw_rate_degree [double] Maximum yaw rate (degrees/second)","title":"ROS2: AirSim ROS C++ Wrapper"},{"location":"ros_cplusplus/#airsim_ros_pkgs","text":"A ROS2 wrapper over the Cosys-AirSim C++ client library. All coordinates and data are in the right-handed coordinate frame of the ROS standard and not in NED, except for geo points. The following was tested on Ubuntu 22.04 with ROS2 Iron.","title":"airsim_ros_pkgs"},{"location":"ros_cplusplus/#build","text":"Build Cosys-AirSim as per the instructions. Make sure that you have set up the environment variables for ROS.
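The following hedged rclpy sketch shows one way to call the PD controller's local position goal service mentioned above; the request field names (x, y, z, yaw, vehicle_name) are assumptions based on the service name, so check airsim_interfaces/srv/SetLocalPosition.srv in your build for the exact layout:

```python
# Send a local position goal to the simple PD position controller node.
import rclpy
from rclpy.node import Node
from airsim_interfaces.srv import SetLocalPosition  # field names assumed; verify in your build

rclpy.init()
node = Node("local_goal_demo")
cli = node.create_client(SetLocalPosition, "/airsim_node/PX4/local_position_goal")
cli.wait_for_service()

req = SetLocalPosition.Request()
req.x, req.y, req.z, req.yaw = 10.0, 0.0, 5.0, 0.0  # goal pose in the global frame
req.vehicle_name = "PX4"

future = cli.call_async(req)
rclpy.spin_until_future_complete(node, future)
print(future.result())
rclpy.shutdown()
```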
Add the source command to your .bashrc for convenience (replace iron with the specific version name) - echo \"source /opt/ros/iron/setup.bash\" >> ~/.bashrc source ~/.bashrc -- Install dependencies with rosdep, if not already installed - apt-get install python3-rosdep sudo rosdep init rosdep update rosdep install --from-paths src -y --ignore-src --skip-keys pcl --skip-keys message_runtime --skip-keys message_generation Build ROS package colcon build --cmake-args -DCMAKE_BUILD_TYPE=Release","title":"Build"},{"location":"ros_cplusplus/#running","text":"source install/setup.bash ros2 launch airsim_ros_pkgs airsim_node.launch.py","title":"Running"},{"location":"ros_cplusplus/#using-cosys-airsim-ros-wrapper","text":"The ROS wrapper is composed of two ROS nodes - the first is a wrapper over Cosys-AirSim's multirotor C++ client library, and the second is a simple PD position controller. Let's look at the ROS API for both nodes:","title":"Using Cosys-Airsim ROS wrapper"},{"location":"ros_cplusplus/#cosys-airsim-ros-wrapper-node","text":"","title":"Cosys-Airsim ROS Wrapper Node"},{"location":"ros_cplusplus/#publishers","text":"The publishers will be automatically created based on the settings in the settings.json file for all vehicles and the sensors. /airsim_node/VEHICLE-NAME/car_state airsim_interfaces::CarState The state of the car if the vehicle is of this sim-mode type. /airsim_node/VEHICLE-NAME/computervision_state airsim_interfaces::ComputerVisionState The state of the computer vision actor if the vehicle is of this sim-mode type. /airsim_node/origin_geo_point airsim_interfaces::GPSYaw GPS coordinates corresponding to the global frame. This is set in the AirSim settings.json file under the OriginGeopoint key. /airsim_node/VEHICLE-NAME/global_gps sensor_msgs::NavSatFix This is the current GPS coordinates of the drone in AirSim. /airsim_node/VEHICLE-NAME/environment airsim_interfaces::Environment /airsim_node/VEHICLE-NAME/odom_local nav_msgs::Odometry Odometry frame (default name: odom_local, launch name and frame type are configurable) wrt the take-off point. /airsim_node/VEHICLE-NAME/CAMERA-NAME_IMAGE-TYPE/camera_info sensor_msgs::CameraInfo Optionally, if the image type is annotation the annotation layer name is also included in the topic name. /airsim_node/VEHICLE-NAME/CAMERA-NAME_IMAGE-TYPE/image sensor_msgs::Image RGB or float image depending on the image type requested in settings.json. Optionally, if the image type is annotation the annotation layer name is also included in the topic name. /tf tf2_msgs::TFMessage /airsim_node/VEHICLE-NAME/altimeter/SENSOR_NAME airsim_interfaces::Altimeter This is the current altimeter reading for altitude, pressure, and QNH. /airsim_node/VEHICLE-NAME/imu/SENSOR_NAME sensor_msgs::Imu IMU sensor data. /airsim_node/VEHICLE-NAME/magnetometer/SENSOR_NAME sensor_msgs::MagneticField Measurement of magnetic field vector/compass. /airsim_node/VEHICLE-NAME/distance/SENSOR_NAME sensor_msgs::Range Measurement of distance from an active ranger, such as infrared (IR). /airsim_node/VEHICLE-NAME/lidar/points/SENSOR_NAME/ sensor_msgs::PointCloud2 LIDAR pointcloud /airsim_node/VEHICLE-NAME/lidar/labels/SENSOR_NAME/ airsim_interfaces::StringArray Custom message type with an array of strings that are the labels for each point in the pointcloud of the lidar sensor /airsim_node/VEHICLE-NAME/gpulidar/points/SENSOR_NAME/ sensor_msgs::PointCloud2 GPU LIDAR pointcloud. The instance segmentation/annotation color data is stored in the rgb field of the pointcloud.
The intensity data is stored in the intensity field as well. /airsim_node/VEHICLE-NAME/echo/active/points/SENSOR_NAME/ sensor_msgs::PointCloud2 Echo sensor pointcloud for active sensing /airsim_node/VEHICLE-NAME/echo/passive/points/SENSOR_NAME/ sensor_msgs::PointCloud2 Echo sensor pointcloud for passive sensing /airsim_node/VEHICLE-NAME/echo/active/labels/SENSOR_NAME/ airsim_interfaces::StringArray Custom message type with an array of strings that are the labels for each point in the active echo pointcloud /airsim_node/VEHICLE-NAME/echo/passive/labels/SENSOR_NAME/ airsim_interfaces::StringArray Custom message type with an array of strings that are the labels for each point in the passive echo pointcloud /airsim_node/instance_segmentation_labels airsim_interfaces::InstanceSegmentationList Custom message type with an array of custom messages that hold the name, color and index of the instance segmentation system for each object in the world. /airsim_node/object_transforms airsim_interfaces::ObjectTransformsList Custom message type with an array of geometry_msgs::TransformStamped that are the transforms of all objects in the world; each child frame ID is the object name.","title":"Publishers:"},{"location":"ros_cplusplus/#subscribers","text":"/airsim_node/VEHICLE-NAME/vel_cmd_body_frame airsim_interfaces::VelCmd /airsim_node/VEHICLE-NAME/vel_cmd_world_frame airsim_interfaces::VelCmd /airsim_node/all_robots/vel_cmd_body_frame airsim_interfaces::VelCmd Set velocity command for all drones. /airsim_node/all_robots/vel_cmd_world_frame airsim_interfaces::VelCmd /airsim_node/group_of_robots/vel_cmd_body_frame airsim_interfaces::VelCmdGroup Set velocity command for a specific set of drones. /airsim_node/group_of_robots/vel_cmd_world_frame airsim_interfaces::VelCmdGroup Set velocity command for a specific set of drones. /gimbal_angle_euler_cmd airsim_interfaces::GimbalAngleEulerCmd Gimbal set point in Euler angles. /gimbal_angle_quat_cmd airsim_interfaces::GimbalAngleQuatCmd Gimbal set point in quaternion. /airsim_node/VEHICLE-NAME/car_cmd airsim_interfaces::CarControls Throttle, brake, steering and gear selections for control. Both automatic and manual transmission control are possible; see the car_joy.py script for usage.","title":"Subscribers:"},{"location":"ros_cplusplus/#services","text":"/airsim_node/VEHICLE-NAME/land airsim_interfaces::Land /airsim_node/VEHICLE-NAME/takeoff airsim_interfaces::Takeoff /airsim_node/all_robots/land airsim_interfaces::Land Land all drones /airsim_node/all_robots/takeoff airsim_interfaces::Takeoff Take off all drones /airsim_node/group_of_robots/land airsim_interfaces::LandGroup Land a specific set of drones /airsim_node/group_of_robots/takeoff airsim_interfaces::TakeoffGroup Take off a specific set of drones /airsim_node/reset airsim_interfaces::Reset Resets all vehicles /airsim_node/instance_segmentation_refresh airsim_interfaces::RefreshInstanceSegmentation Refresh the instance segmentation list /airsim_node/object_transforms_refresh airsim_interfaces::RefreshObjectTransforms Refresh the object transforms list","title":"Services:"},{"location":"ros_cplusplus/#parameters","text":"/airsim_node/host_ip [string] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: localhost The IP of the machine running the airsim RPC API server. /airsim_node/host_port [string] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: 41451 The port of the machine running the airsim RPC API server.
/airsim_node/enable_api_control [string] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: false Enables API control and arms the drones on startup. If not set to true, no control is available. /airsim_node/enable_object_transforms_list [string] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: true Retrieve the object transforms list from the airsim API at the start or with the refresh service. If disabled, this list is not available, but this can save time on startup. /airsim_node/is_vulkan [string] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: True If using Vulkan, the image encoding is switched from rgb8 to bgr8. /airsim_node/world_frame_id [string] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: world /airsim_node/odom_frame_id [string] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: odom_local /airsim_node/update_airsim_control_every_n_sec [double] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: 0.01 seconds. Timer callback frequency for updating drone odom and state from airsim, and sending in control commands. The current RPClib interface to unreal engine maxes out at 50 Hz. Timer callbacks in ROS run at the maximum rate possible, so it's best to not touch this parameter. /airsim_node/update_airsim_img_response_every_n_sec [double] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: 0.01 seconds. Timer callback frequency for receiving images from all cameras in airsim. The speed will depend on the number of images requested and their resolution. Timer callbacks in ROS run at the maximum rate possible, so it's best to not touch this parameter. /airsim_node/update_lidar_every_n_sec [double] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: 0.01 seconds. Timer callback frequency for receiving data from all lidar sensors in airsim. Timer callbacks in ROS run at the maximum rate possible, so it's best to not touch this parameter. /airsim_node/update_gpulidar_every_n_sec [double] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: 0.01 seconds. Timer callback frequency for receiving data from all GPU lidar sensors in airsim. Timer callbacks in ROS run at the maximum rate possible, so it's best to not touch this parameter. /airsim_node/update_echo_every_n_sec [double] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: 0.01 seconds. Timer callback frequency for receiving data from all echo sensors in airsim. Timer callbacks in ROS run at the maximum rate possible, so it's best to not touch this parameter.
/airsim_node/publish_clock [bool] Set in: $(airsim_ros_pkgs)/launch/airsim_node.launch Default: false Publishes the ROS /clock topic if set to true.","title":"Parameters:"},{"location":"ros_cplusplus/#simple-pid-position-controller-node","text":"","title":"Simple PID Position Controller Node"},{"location":"ros_cplusplus/#parameters_1","text":"PD controller parameters: /pd_position_node/kp_x [double], /pd_position_node/kp_y [double], /pd_position_node/kp_z [double], /pd_position_node/kp_yaw [double] Proportional gains /pd_position_node/kd_x [double], /pd_position_node/kd_y [double], /pd_position_node/kd_z [double], /pd_position_node/kd_yaw [double] Derivative gains /pd_position_node/reached_thresh_xyz [double] Threshold euclidean distance (meters) from current position to setpoint position /pd_position_node/reached_yaw_degrees [double] Threshold yaw distance (degrees) from current position to setpoint position /pd_position_node/update_control_every_n_sec [double] Default: 0.01 seconds","title":"Parameters:"},{"location":"ros_cplusplus/#services_1","text":"/airsim_node/VEHICLE-NAME/gps_goal [Request: airsim_interfaces::SetGPSPosition ] Target GPS position + yaw, in absolute altitude. /airsim_node/VEHICLE-NAME/local_position_goal [Request: airsim_interfaces::SetLocalPosition ] Target local position + yaw in global frame.","title":"Services:"},{"location":"ros_cplusplus/#subscribers_1","text":"/airsim_node/origin_geo_point airsim_interfaces::GPSYaw Listens to home geo coordinates published by airsim_node . /airsim_node/VEHICLE-NAME/odom_local nav_msgs::Odometry Listens to odometry published by airsim_node","title":"Subscribers:"},{"location":"ros_cplusplus/#publishers_1","text":"/vel_cmd_world_frame airsim_interfaces::VelCmd Sends velocity commands to airsim_node /vel_cmd_body_frame airsim_interfaces::VelCmd Sends velocity commands to airsim_node","title":"Publishers:"},{"location":"ros_cplusplus/#global-params","text":"Dynamic constraints. These can be changed in dynamic_constraints.launch : /max_vel_horz_abs [double] Maximum horizontal velocity of the drone (meters/second) /max_vel_vert_abs [double] Maximum vertical velocity of the drone (meters/second) /max_yaw_rate_degree [double] Maximum yaw rate (degrees/second). A minimal rclpy sketch exercising the wrapper's services and velocity topics is shown below.","title":"Global params"},{"location":"ros_python/","text":"How to use AirSim with Robot Operating System (ROS) AirSim and ROS can be integrated using Python. Some example ROS nodes are provided demonstrating how to publish data from AirSim as ROS topics. Prerequisites These instructions are for Ubuntu 20.04, ROS Noetic, UE 5.4 and the latest Cosys-AirSim release. You should have these components installed and working before proceeding. Note that you need to install the Python module first for this to work. More information can be found here in the section 'Installing AirSim Package'. Publish node There is a single Python script airsim_publish.py that can be used as a ROS Node. It can be used in two ways: - Get and publish the entire TF tree of map, vehicle and sensors; vehicle movement groundtruth ; all sensor data as well as the poses of world objects. - Replay a route rosbag that holds an existing trajectory of a vehicle. The script will then replay this trajectory while recording all sensor data for each pose of the trajectory. It generates a new rosbag holding both the route and sensor data as well as all TF information. This allows for better performance and deterministic datasets over the same route.
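Returning briefly to the ROS2 C++ wrapper above, here is a minimal rclpy sketch that calls the per-vehicle takeoff service and then publishes a body-frame velocity command. It is a sketch under assumptions, not an official example: the vehicle name Drone1 and the request/message field names (wait_on_last_task, twist) are assumptions taken from typical airsim_interfaces definitions, so check the .srv/.msg files installed with your build.

```python
# Minimal rclpy sketch for the ROS2 wrapper above (not an official example).
# Assumptions: the vehicle is named "Drone1" in settings.json, and the
# airsim_interfaces Takeoff/VelCmd definitions expose the fields used below.
import rclpy
from rclpy.node import Node
from airsim_interfaces.srv import Takeoff
from airsim_interfaces.msg import VelCmd


def main():
    rclpy.init()
    node = Node('airsim_demo')

    # Call the per-vehicle takeoff service exposed by airsim_node.
    takeoff = node.create_client(Takeoff, '/airsim_node/Drone1/takeoff')
    takeoff.wait_for_service()
    future = takeoff.call_async(Takeoff.Request(wait_on_last_task=True))
    rclpy.spin_until_future_complete(node, future)

    # Stream a 1 m/s forward velocity command in the body frame.
    pub = node.create_publisher(VelCmd, '/airsim_node/Drone1/vel_cmd_body_frame', 1)
    cmd = VelCmd()
    cmd.twist.linear.x = 1.0  # assumed geometry_msgs/Twist field inside VelCmd
    for _ in range(50):  # publish repeatedly; a single message may be dropped
        pub.publish(cmd)
        rclpy.spin_once(node, timeout_sec=0.1)

    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```

The same pattern applies to the group and all-robots variants; only the service/topic names and the VelCmdGroup message type change.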
Example launch files Some basic launch files are available for the ROS node in the two configurations mentioned above. - airsim_publish.launch : This shows all available parameters for the node. It also shows how to use the node in the first configuration. - record_route.launch : This is a variant of the one above, but only exposing and enabling the parameters needed to create a route rosbag for the second configuration. It will automatically record a rosbag as well. - replay_route_record_sensors.launch : This launch file takes a route rosbag created with the previous launch file, replays it while recording all sensor and TF data, and creates a single merged rosbag. Setup Setup workspace and Airsim package Option A: Create a new ROS package in your catkin workspace following these instructions. Create a new ROS package called AirSim or whatever you like. If you don't already have a catkin workspace, you should first work through the ROS beginner tutorials. In the ROS package directory you made, copy the ROS node scripts from the AirSim/ros/python_ws/src/airsimros directory to your ROS package. Change the code below to match your AirSim and catkin workspace paths. cp AirSim/ros/python_ws/src/airsim/scripts ../catkin_ws/src/airsimros Option B: Use provided workspace Airsim/ros/python_ws itself is already a workspace which can be used out of the box after building. For building see below. Build ROS AirSim package Change directory to your top level catkin workspace folder i.e. cd ~/catkin_ws and run catkin_make This will build the AirSim package. Next, run source devel/setup.bash so ROS can find the new package. You can add this command to your ~/.bashrc to load your catkin workspace automatically. How to run ROS AirSim nodes First make sure you are running an AirSim project and that the simulation is playing. The implemented AirSim node can be run using rosrun airsimros airsim_publish.py . Alternatively, you can use launch files such as the example ones that can be found in AirSim/ros/python_ws/src/airsim/launch with roslaunch airsimros airsim_publish.launch .","title":"ROS: AirSim ROS Python Wrapper"},{"location":"ros_python/#how-to-use-airsim-with-robot-operating-system-ros","text":"AirSim and ROS can be integrated using Python. Some example ROS nodes are provided demonstrating how to publish data from AirSim as ROS topics.","title":"How to use AirSim with Robot Operating System (ROS)"},{"location":"ros_python/#prerequisites","text":"These instructions are for Ubuntu 20.04, ROS Noetic, UE 5.4 and the latest Cosys-AirSim release. You should have these components installed and working before proceeding. Note that you need to install the Python module first for this to work. More information can be found here in the section 'Installing AirSim Package'.","title":"Prerequisites"},{"location":"ros_python/#publish-node","text":"There is a single Python script airsim_publish.py that can be used as a ROS Node. It can be used in two ways: - Get and publish the entire TF tree of map, vehicle and sensors; vehicle movement groundtruth ; all sensor data as well as the poses of world objects. - Replay a route rosbag that holds an existing trajectory of a vehicle. The script will then replay this trajectory while recording all sensor data for each pose of the trajectory. It generates a new rosbag holding both the route and sensor data as well as all TF information.
This allows for better performance and deterministic datasets over the same route.","title":"Publish node"},{"location":"ros_python/#example-launch-files","text":"Some basic launch files are available for the ROS node in the two configurations mentioned above. - airsim_publish.launch : This shows all available parameters for the node. It also shows how to use the node in the first configuration. - record_route.launch : This is a variant of the one above, but only exposing and enabling the parameters needed to create a route rosbag for the second configuration. It will automatically record a rosbag as well. - replay_route_record_sensors.launch : This launch file takes a route rosbag created with the previous launch file, replays it while recording all sensor and TF data, and creates a single merged rosbag.","title":"Example launch files"},{"location":"ros_python/#setup","text":"","title":"Setup"},{"location":"ros_python/#setup-workspace-and-airsim-package","text":"","title":"Setup workspace and Airsim package"},{"location":"ros_python/#option-a-create-a-new-ros-package-in-your-catkin-workspace-following-these-instructions","text":"Create a new ROS package called AirSim or whatever you like. If you don't already have a catkin workspace, you should first work through the ROS beginner tutorials. In the ROS package directory you made, copy the ROS node scripts from the AirSim/ros/python_ws/src/airsimros directory to your ROS package. Change the code below to match your AirSim and catkin workspace paths. cp AirSim/ros/python_ws/src/airsim/scripts ../catkin_ws/src/airsimros","title":"Option A: Create a new ROS package in your catkin workspace following these instructions."},{"location":"ros_python/#option-b-use-provided-workspace","text":"Airsim/ros/python_ws itself is already a workspace which can be used out of the box after building. For building see below.","title":"Option B: Use provided workspace"},{"location":"ros_python/#build-ros-airsim-package","text":"Change directory to your top level catkin workspace folder i.e. cd ~/catkin_ws and run catkin_make This will build the AirSim package. Next, run source devel/setup.bash so ROS can find the new package. You can add this command to your ~/.bashrc to load your catkin workspace automatically. How to run ROS AirSim nodes First make sure you are running an AirSim project and that the simulation is playing. The implemented AirSim node can be run using rosrun airsimros airsim_publish.py . Alternatively, you can use launch files such as the example ones that can be found in AirSim/ros/python_ws/src/airsim/launch with roslaunch airsimros airsim_publish.launch .","title":"Build ROS AirSim package"},{"location":"sensors/","text":"Sensors in Cosys-AirSim Cosys-AirSim currently supports the following sensors. Each sensor is associated with an integer enum specifying its sensor type. Camera Barometer = 1 Imu = 2 Gps = 3 Magnetometer = 4 Distance Sensor = 5 Lidar = 6 Echo = 7 GPULidar = 8 Uwb = 10 Wi-Fi = 11 Note : Cameras are configured differently than the other sensors and do not have an enum associated with them. Look at general settings and image API for camera config and API. Default sensors If no sensors are specified in the settings.json , then the following sensors are enabled by default based on the sim mode.
Multirotor Imu Magnetometer Gps Barometer Car Gps ComputerVision None Behind the scenes, the 'createDefaultSensorSettings' method in AirSimSettings.hpp sets up the above sensors with their default parameters, depending on the sim mode specified in the settings.json file. Configuring the default sensor list The default sensor list can be configured in settings json: \"DefaultSensors\": { \"Barometer\": { \"SensorType\": 1, \"Enabled\" : true, \"PressureFactorSigma\": 0.001825, \"PressureFactorTau\": 3600, \"UncorrelatedNoiseSigma\": 2.7, \"UpdateLatency\": 0, \"UpdateFrequency\": 50, \"StartupDelay\": 0 }, \"Imu\": { \"SensorType\": 2, \"Enabled\" : true, \"GenerateNoise\": false, \"AngularRandomWalk\": 0.3, \"GyroBiasStabilityTau\": 500, \"GyroBiasStability\": 4.6, \"VelocityRandomWalk\": 0.24, \"AccelBiasStabilityTau\": 800, \"AccelBiasStability\": 36 }, \"Gps\": { \"SensorType\": 3, \"Enabled\" : true, \"EphTimeConstant\": 0.9, \"EpvTimeConstant\": 0.9, \"EphInitial\": 25, \"EpvInitial\": 25, \"EphFinal\": 0.1, \"EpvFinal\": 0.1, \"EphMin3d\": 3, \"EphMin2d\": 4, \"UpdateLatency\": 0.2, \"UpdateFrequency\": 50, \"StartupDelay\": 1 }, \"Magnetometer\": { \"SensorType\": 4, \"Enabled\" : true, \"NoiseSigma\": 0.005, \"ScaleFactor\": 1, \"NoiseBias\": 0, \"UpdateLatency\": 0, \"UpdateFrequency\": 50, \"StartupDelay\": 0 }, \"Distance\": { \"SensorType\": 5, \"Enabled\" : true, \"MinDistance\": 0.2, \"MaxDistance\": 40, \"X\": 0, \"Y\": 0, \"Z\": -1, \"Yaw\": 0, \"Pitch\": 0, \"Roll\": 0, \"DrawDebugPoints\": false } }, Configuring vehicle-specific sensor list A vehicle can override a subset of the default sensors listed above. A Lidar and Distance sensor are not added to a vehicle by default, so you need to add those this way. Each sensor must have a valid \"SensorType\" . A subset of the properties can be defined to override the default values shown above, and you can set Enabled to false to disable a specific type of sensor. \"Vehicles\": { \"Drone1\": { \"VehicleType\": \"SimpleFlight\", \"AutoCreate\": true, ... \"Sensors\": { \"MyLidar1\": { \"SensorType\": 6, \"Enabled\" : true, \"NumberOfChannels\": 16, \"PointsPerSecond\": 10000, \"X\": 0, \"Y\": 0, \"Z\": -1, \"DrawDebugPoints\": true }, \"MyLidar2\": { \"SensorType\": 6, \"Enabled\" : true, \"NumberOfChannels\": 4, \"PointsPerSecond\": 10000, \"X\": 0, \"Y\": 0, \"Z\": -1, \"DrawDebugPoints\": true } } } } Sensor specific settings Each sensor-type has its own set of settings as well. Please see lidar for an example of Lidar specific settings. Please see echo for an example of Echo specific settings. Please see GPU lidar for an example of GPU Lidar specific settings.
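Since these sensor blocks are plain JSON, a short Python sketch can generate a settings.json with a vehicle-specific lidar, mirroring the \"MyLidar1\" block above. The output path assumes the default Documents/AirSim location described on the settings page; adjust as needed.

```python
# Sketch: generate a settings.json that adds a 16-channel lidar to a drone,
# mirroring the vehicle-specific sensor block shown above.
import json
import pathlib

settings = {
    "SettingsVersion": 2.0,
    "SimMode": "Multirotor",
    "Vehicles": {
        "Drone1": {
            "VehicleType": "SimpleFlight",
            "AutoCreate": True,
            "Sensors": {
                "MyLidar1": {
                    "SensorType": 6,  # 6 = Lidar in the sensor enum above
                    "Enabled": True,
                    "NumberOfChannels": 16,
                    "PointsPerSecond": 10000,
                    "X": 0, "Y": 0, "Z": -1,
                    "DrawDebugPoints": True,
                }
            },
        }
    },
}

# Default location searched by Cosys-AirSim (see the settings page).
path = pathlib.Path.home() / "Documents" / "AirSim" / "settings.json"
path.parent.mkdir(parents=True, exist_ok=True)
path.write_text(json.dumps(settings, indent=2))
```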
Sensor APIs Barometer msr::airlib::BarometerBase::Output getBarometerData(const std::string& barometer_name, const std::string& vehicle_name); barometer_data = client.getBarometerData(barometer_name = \"\", vehicle_name = \"\") IMU msr::airlib::ImuBase::Output getImuData(const std::string& imu_name = \"\", const std::string& vehicle_name = \"\"); imu_data = client.getImuData(imu_name = \"\", vehicle_name = \"\") GPS msr::airlib::GpsBase::Output getGpsData(const std::string& gps_name = \"\", const std::string& vehicle_name = \"\"); gps_data = client.getGpsData(gps_name = \"\", vehicle_name = \"\") Magnetometer msr::airlib::MagnetometerBase::Output getMagnetometerData(const std::string& magnetometer_name = \"\", const std::string& vehicle_name = \"\"); magnetometer_data = client.getMagnetometerData(magnetometer_name = \"\", vehicle_name = \"\") Distance sensor msr::airlib::DistanceSensorData getDistanceSensorData(const std::string& distance_sensor_name = \"\", const std::string& vehicle_name = \"\"); distance_sensor_data = client.getDistanceSensorData(distance_sensor_name = \"\", vehicle_name = \"\") Lidar See lidar for Lidar API. Echo See echo for Echo API. GPU Lidar See GPU Lidar for GPU Lidar API. UWB/Wi-Fi These sensors are still experimental and are currently not documented. Please refer to the source code for more information.","title":"Sensors"},{"location":"sensors/#sensors-in-cosys-airsim","text":"Cosys-AirSim currently supports the following sensors. Each sensor is associated with a integer enum specifying its sensor type. Camera Barometer = 1 Imu = 2 Gps = 3 Magnetometer = 4 Distance Sensor = 5 Lidar = 6 Echo = 7 GPULidar = 8 Uwb = 10 Wi-Fi = 11 Note : Cameras are configured differently than the other sensors and do not have an enum associated with them. 
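As a quick illustration of the Python sensor calls listed earlier in this section, the loop below polls the default IMU, GPS and barometer. It assumes the Cosys-AirSim Python client is importable as cosysairsim (older installs may expose it as airsim) and that the returned field names follow the standard AirSim client structures.

```python
# Poll the default sensors via the Python API calls documented above.
# Assumption: the Cosys-AirSim Python client module is named `cosysairsim`;
# substitute `import airsim` on older installs.
import time
import cosysairsim as airsim

client = airsim.MultirotorClient()
client.confirmConnection()

for _ in range(10):
    imu = client.getImuData(imu_name="", vehicle_name="")
    gps = client.getGpsData(gps_name="", vehicle_name="")
    baro = client.getBarometerData(barometer_name="", vehicle_name="")
    # Field names follow the standard AirSim Python client structures.
    print(imu.angular_velocity, gps.gnss.geo_point, baro.altitude)
    time.sleep(0.5)
```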
Look at general settings and image API for camera config and API.","title":"Sensors in Cosys-AirSim"},{"location":"sensors/#default-sensors","text":"If no sensors are specified in the settings.json , then the following sensors are enabled by default based on the sim mode.","title":"Default sensors"},{"location":"sensors/#multirotor","text":"Imu Magnetometer Gps Barometer","title":"Multirotor"},{"location":"sensors/#car","text":"Gps","title":"Car"},{"location":"sensors/#computervision","text":"None Behind the scenes, 'createDefaultSensorSettings' method in AirSimSettings.hpp which sets up the above sensors with their default parameters, depending on the sim mode specified in the settings.json file.","title":"ComputerVision"},{"location":"sensors/#configuring-the-default-sensor-list","text":"The default sensor list can be configured in settings json: \"DefaultSensors\": { \"Barometer\": { \"SensorType\": 1, \"Enabled\" : true, \"PressureFactorSigma\": 0.001825, \"PressureFactorTau\": 3600, \"UncorrelatedNoiseSigma\": 2.7, \"UpdateLatency\": 0, \"UpdateFrequency\": 50, \"StartupDelay\": 0 }, \"Imu\": { \"SensorType\": 2, \"Enabled\" : true, \"GenerateNoise\": false, \"AngularRandomWalk\": 0.3, \"GyroBiasStabilityTau\": 500, \"GyroBiasStability\": 4.6, \"VelocityRandomWalk\": 0.24, \"AccelBiasStabilityTau\": 800, \"AccelBiasStability\": 36 }, \"Gps\": { \"SensorType\": 3, \"Enabled\" : true, \"EphTimeConstant\": 0.9, \"EpvTimeConstant\": 0.9, \"EphInitial\": 25, \"EpvInitial\": 25, \"EphFinal\": 0.1, \"EpvFinal\": 0.1, \"EphMin3d\": 3, \"EphMin2d\": 4, \"UpdateLatency\": 0.2, \"UpdateFrequency\": 50, \"StartupDelay\": 1 }, \"Magnetometer\": { \"SensorType\": 4, \"Enabled\" : true, \"NoiseSigma\": 0.005, \"ScaleFactor\": 1, \"NoiseBias\": 0, \"UpdateLatency\": 0, \"UpdateFrequency\": 50, \"StartupDelay\": 0 }, \"Distance\": { \"SensorType\": 5, \"Enabled\" : true, \"MinDistance\": 0.2, \"MaxDistance\": 40, \"X\": 0, \"Y\": 0, \"Z\": -1, \"Yaw\": 0, \"Pitch\": 0, \"Roll\": 0, \"DrawDebugPoints\": false } },","title":"Configuring the default sensor list"},{"location":"sensors/#configuring-vehicle-specific-sensor-list","text":"A vehicle can override a subset of the default sensors listed above. A Lidar and Distance sensor are not added to a vehicle by default, so those you need to add this way. Each sensor must have a valid \"SensorType\" and a subset of the properties can be defined that override the default values shown above and you can set Enabled to false to disable a specific type of sensor. \"Vehicles\": { \"Drone1\": { \"VehicleType\": \"SimpleFlight\", \"AutoCreate\": true, ... \"Sensors\": { \"MyLidar1\": { \"SensorType\": 6, \"Enabled\" : true, \"NumberOfChannels\": 16, \"PointsPerSecond\": 10000, \"X\": 0, \"Y\": 0, \"Z\": -1, \"DrawDebugPoints\": true }, \"MyLidar2\": { \"SensorType\": 6, \"Enabled\" : true, \"NumberOfChannels\": 4, \"PointsPerSecond\": 10000, \"X\": 0, \"Y\": 0, \"Z\": -1, \"DrawDebugPoints\": true } } } }","title":"Configuring vehicle-specific sensor list"},{"location":"sensors/#sensor-specific-settings","text":"Each sensor-type has its own set of settings as well. Please see lidar for example of Lidar specific settings. Please see echo for example of Echo specific settings. 
Please see GPU lidar for example of GPU Lidar specific settings.","title":"Sensor specific settings"},{"location":"sensors/#sensor-apis","text":"","title":"Sensor APIs"},{"location":"sensors/#barometer","text":"msr::airlib::BarometerBase::Output getBarometerData(const std::string& barometer_name, const std::string& vehicle_name); barometer_data = client.getBarometerData(barometer_name = \"\", vehicle_name = \"\")","title":"Barometer"},{"location":"sensors/#imu","text":"msr::airlib::ImuBase::Output getImuData(const std::string& imu_name = \"\", const std::string& vehicle_name = \"\"); imu_data = client.getImuData(imu_name = \"\", vehicle_name = \"\")","title":"IMU"},{"location":"sensors/#gps","text":"msr::airlib::GpsBase::Output getGpsData(const std::string& gps_name = \"\", const std::string& vehicle_name = \"\"); gps_data = client.getGpsData(gps_name = \"\", vehicle_name = \"\")","title":"GPS"},{"location":"sensors/#magnetometer","text":"msr::airlib::MagnetometerBase::Output getMagnetometerData(const std::string& magnetometer_name = \"\", const std::string& vehicle_name = \"\"); magnetometer_data = client.getMagnetometerData(magnetometer_name = \"\", vehicle_name = \"\")","title":"Magnetometer"},{"location":"sensors/#distance-sensor","text":"msr::airlib::DistanceSensorData getDistanceSensorData(const std::string& distance_sensor_name = \"\", const std::string& vehicle_name = \"\"); distance_sensor_data = client.getDistanceSensorData(distance_sensor_name = \"\", vehicle_name = \"\") Lidar See lidar for Lidar API. Echo See echo for Echo API. GPU Lidar See GPU Lidar for GPU Lidar API. UWB/Wi-Fi These sensors are still experimental and are currently not documented. Please refer to the source code for more information.","title":"Distance sensor"},{"location":"settings/","text":"Cosys-AirSim Settings A good basic settings file that works with many of the examples can be found here as settings_example.json . It shows many of the custom sensors and vehicles that were added by Cosys-Lab. Where are Settings Stored? Cosys-AirSim is searching for the settings definition in the following order. The first match will be used: Looking at the (absolute) path specified by the -settings command line argument. For example, in Windows: AirSim.exe -settings=\"C:\\path\\to\\settings.json\" In Linux ./Blocks.sh -settings=\"/home/$USER/path/to/settings.json\" Looking for a json document passed as a command line argument by the -settings argument. For example, in Windows: AirSim.exe -settings={\"foo\":\"bar\"} In Linux ./Blocks.sh -settings={\"foo\":\"bar\"} Looking in the folder of the executable for a file called settings.json . This will be a deep location where the actual executable of the Editor or binary is stored. For e.g. with the Blocks binary, the location searched is /LinuxNoEditor/Blocks/Binaries/Linux/settings.json . Searching for settings.json in the folder from where the executable is launched This is a top-level directory containing the launch script or executable. For e.g. Linux: /LinuxNoEditor/settings.json , Windows: /WindowsNoEditor/settings.json Note that this path changes depending on where its invoked from. On Linux, if executing the Blocks.sh script from inside LinuxNoEditor folder like ./Blocks.sh , then the previous mentioned path is used. However, if launched from outside LinuxNoEditor folder such as ./LinuxNoEditor/Blocks.sh , then /settings.json will be used. Looking in the AirSim subfolder for a file called settings.json . 
The AirSim subfolder is located at Documents\\AirSim on Windows and ~/Documents/AirSim on Linux systems. The file is in usual json format . On first startup Cosys-AirSim would create settings.json file with no settings at the users home folder. To avoid problems, always use ASCII format to save json file. How to Chose Between Car/SkidVehicle/Multirotor? The default is to use multirotor. To use car simple set \"SimMode\": \"Car\" like this: { \"SettingsVersion\": 2.0, \"SimMode\": \"Car\" } To choose multirotor or skid vehicle, set \"SimMode\": \"Multirotor\" or \"SimMode\": \"SkidVehicle\" respectively. If you want to prompt user to select vehicle type then use \"SimMode\": \"\" . Available Settings and Their Defaults Below are complete list of settings available along with their default values. If any of the settings is missing from json file, then default value is used. Some default values are simply specified as \"\" which means actual value may be chosen based on the vehicle you are using. For example, ViewMode setting has default value \"\" which translates to \"FlyWithMe\" for drones and \"SpringArmChase\" for cars. Note this does not include most sensor types. WARNING: Do not copy paste all of below in your settings.json. We strongly recommend adding only those settings that you don't want default values. Only required element is \"SettingsVersion\" . { \"SimMode\": \"\", \"ClockType\": \"\", \"ClockSpeed\": 1, \"LocalHostIp\": \"127.0.0.1\", \"ApiServerPort\": 41451, \"RecordUIVisible\": true, \"MoveWorldOrigin\": false, \"LogMessagesVisible\": true, \"ShowLosDebugLines\": false, \"ViewMode\": \"\", \"RpcEnabled\": true, \"EngineSound\": true, \"PhysicsEngineName\": \"\", \"SpeedUnitFactor\": 1.0, \"SpeedUnitLabel\": \"m/s\", \"Wind\": { \"X\": 0, \"Y\": 0, \"Z\": 0 }, \"CameraDirector\": { \"FollowDistance\": -3, \"X\": NaN, \"Y\": NaN, \"Z\": NaN, \"Pitch\": NaN, \"Roll\": NaN, \"Yaw\": NaN }, \"Recording\": { \"RecordOnMove\": false, \"RecordInterval\": 0.05, \"Folder\": \"\", \"Enabled\": false, \"Cameras\": [ { \"CameraName\": \"0\", \"ImageType\": 0, \"PixelsAsFloat\": false, \"VehicleName\": \"\", \"Compress\": true } ] }, \"CameraDefaults\": { \"CaptureSettings\": [ { \"ImageType\": 0, \"Width\": 256, \"Height\": 144, \"FOV_Degrees\": 90, \"AutoExposureSpeed\": 100, \"AutoExposureBias\": 0, \"AutoExposureMaxBrightness\": 0.64, \"AutoExposureMinBrightness\": 0.03, \"MotionBlurAmount\": 0, \"TargetGamma\": 1.0, \"ProjectionMode\": \"\", \"OrthoWidth\": 5.12, \"MotionBlurAmount\": 1, \"MotionBlurMax\": 10, \"ChromaticAberrationScale\": 2, \"IgnoreMarked\": false, \"LumenGIEnable\": true, \"LumenReflectionEnable\": true, \"LumenFinalQuality\": 1, \"LumenSceneDetail\": 1, \"LumenSceneLightningDetail\": 1 } ], \"NoiseSettings\": [ { \"Enabled\": false, \"ImageType\": 0, \"RandContrib\": 0.2, \"RandSpeed\": 100000.0, \"RandSize\": 500.0, \"RandDensity\": 2, \"HorzWaveContrib\":0.03, \"HorzWaveStrength\": 0.08, \"HorzWaveVertSize\": 1.0, \"HorzWaveScreenSize\": 1.0, \"HorzNoiseLinesContrib\": 1.0, \"HorzNoiseLinesDensityY\": 0.01, \"HorzNoiseLinesDensityXY\": 0.5, \"HorzDistortionContrib\": 1.0, \"HorzDistortionStrength\": 0.002, \"LensDistortionEnable\": true, \"LensDistortionAreaFalloff\": 2, \"LensDistortionAreaRadius\": 1, \"LensDistortionInvert\": false } ], \"Gimbal\": { \"Stabilization\": 0, \"Pitch\": NaN, \"Roll\": NaN, \"Yaw\": NaN }, \"X\": NaN, \"Y\": NaN, \"Z\": NaN, \"Pitch\": NaN, \"Roll\": NaN, \"Yaw\": NaN, \"UnrealEngine\": { \"PixelFormatOverride\": [ { 
\"ImageType\": 0, \"PixelFormat\": 0 } ] } }, \"OriginGeopoint\": { \"Latitude\": 47.641468, \"Longitude\": -122.140165, \"Altitude\": 122 }, \"TimeOfDay\": { \"Enabled\": false, \"StartDateTime\": \"\", \"CelestialClockSpeed\": 1, \"StartDateTimeDst\": false, \"UpdateIntervalSecs\": 60 }, \"SubWindows\": [ {\"WindowID\": 0, \"CameraName\": \"0\", \"ImageType\": 3, \"VehicleName\": \"\", \"Visible\": false}, {\"WindowID\": 1, \"CameraName\": \"0\", \"ImageType\": 5, \"VehicleName\": \"\", \"Visible\": false}, {\"WindowID\": 2, \"CameraName\": \"0\", \"ImageType\": 0, \"VehicleName\": \"\", \"Visible\": false} ], \"PawnPaths\": { \"BareboneCar\": {\"PawnBP\": \"Class'/AirSim/VehicleAdv/Vehicle/VehicleAdvPawn.VehicleAdvPawn_C'\"}, \"DefaultCar\": {\"PawnBP\": \"Class'/AirSim/VehicleAdv/SUV/SuvCarPawn.SuvCarPawn_C'\"}, \"DefaultQuadrotor\": {\"PawnBP\": \"Class'/AirSim/Blueprints/BP_FlyingPawn.BP_FlyingPawn_C'\"}, \"DefaultComputerVision\": {\"PawnBP\": \"Class'/AirSim/Blueprints/BP_ComputerVisionPawn.BP_ComputerVisionPawn_C'\"} }, \"Vehicles\": { \"SimpleFlight\": { \"VehicleType\": \"SimpleFlight\", \"DefaultVehicleState\": \"Armed\", \"AutoCreate\": true, \"PawnPath\": \"\", \"EnableCollisionPassthrough\": false, \"EnableCollisions\": true, \"AllowAPIAlways\": true, \"EnableTrace\": false, \"RC\": { \"RemoteControlID\": 0, \"AllowAPIWhenDisconnected\": false }, \"Cameras\": { //same elements as CameraDefaults above, key as name }, \"X\": NaN, \"Y\": NaN, \"Z\": NaN, \"Pitch\": NaN, \"Roll\": NaN, \"Yaw\": NaN }, \"PhysXCar\": { \"VehicleType\": \"PhysXCar\", \"DefaultVehicleState\": \"\", \"AutoCreate\": true, \"PawnPath\": \"\", \"EnableCollisionPassthrough\": false, \"EnableCollisions\": true, \"RC\": { \"RemoteControlID\": -1 }, \"Cameras\": { \"MyCamera1\": { //same elements as elements inside CameraDefaults above }, \"MyCamera2\": { //same elements as elements inside CameraDefaults above }, }, \"X\": NaN, \"Y\": NaN, \"Z\": NaN, \"Pitch\": NaN, \"Roll\": NaN, \"Yaw\": NaN } } } SimMode SimMode determines which simulation mode will be used. Below are currently supported values: - \"\" : prompt user to select vehicle type multirotor or car - \"Multirotor\" : Use multirotor simulation - \"Car\" : Use car simulation - \"ComputerVision\" : Use only camera, no vehicle or physics - \"SkidVehicle\" : use skid-steering vehicle simulation ViewMode The ViewMode determines which camera to use as default and how camera will follow the vehicle. For multirotors, the default ViewMode is \"FlyWithMe\" while for cars the default ViewMode is \"SpringArmChase\" . FlyWithMe : Chase the vehicle from behind with 6 degrees of freedom GroundObserver : Chase the vehicle from 6' above the ground but with full freedom in XY plane. Fpv : View the scene from front camera of vehicle Manual : Don't move camera automatically. Use arrow keys and ASWD keys for move camera manually. SpringArmChase : Chase the vehicle with camera mounted on (invisible) arm that is attached to the vehicle via spring (so it has some latency in movement). NoDisplay : This will freeze rendering for main screen however rendering for subwindows, recording and APIs remain active. This mode is useful to save resources in \"headless\" mode where you are only interested in getting images and don't care about what gets rendered on main screen. This may also improve FPS for recording images. Annotation The annotation system allows you to choose different groundtruth labeling techniques to create more data from your simulation. 
Find more info here for defining the settings. TimeOfDay This setting controls the position of Sun in the environment. By default Enabled is false which means Sun's position is left at whatever was the default in the environment and it doesn't change over the time. If Enabled is true then Sun position is computed using longitude, latitude and altitude specified in OriginGeopoint section for the date specified in StartDateTime in the string format as %Y-%m-%d %H:%M:%S , for example, 2018-02-12 15:20:00 . If this string is empty then current date and time is used. If StartDateTimeDst is true then we adjust for day light savings time. The Sun's position is then continuously updated at the interval specified in UpdateIntervalSecs . In some cases, it might be desirable to have celestial clock run faster or slower than simulation clock. This can be specified using CelestialClockSpeed , for example, value 100 means for every 1 second of simulation clock, Sun's position is advanced by 100 seconds so Sun will move in sky much faster. Also see Time of Day API . OriginGeopoint This setting specifies the latitude, longitude and altitude of the Player Start component placed in the Unreal environment. The vehicle's home point is computed using this transformation. Note that all coordinates exposed via APIs are using NED system in SI units which means each vehicle starts at (0, 0, 0) in NED system. Time of Day settings are computed for geographical coordinates specified in OriginGeopoint . SubWindows This setting determines what is shown in each of 3 subwindows which are visible when you press 1,2,3 keys. WindowID : Can be 0 to 2 CameraName : is any available camera on the vehicle ImageType : integer value determines what kind of image gets shown according to ImageType enum . VehicleName : string allows you to specify the vehicle to use the camera from, used when multiple vehicles are specified in the settings. First vehicle's camera will be used if there are any mistakes such as incorrect vehicle name, or only a single vehicle. Annotation : string allows you to specify the annotation layer to use for the camera. This is only if using the Annotation camera type for ImageType (value is 10). For example, for a single car vehicle, below shows driver view, front bumper view and rear view as scene, depth and surface normals respectively. \"SubWindows\": [ {\"WindowID\": 0, \"ImageType\": 0, \"CameraName\": \"3\", \"Visible\": true}, {\"WindowID\": 1, \"ImageType\": 3, \"CameraName\": \"0\", \"Visible\": true}, {\"WindowID\": 2, \"ImageType\": 6, \"CameraName\": \"4\", \"Visible\": true} ] In case of multiple vehicles, different vehicles can be specified as follows- \"SubWindows\": [ {\"WindowID\": 0, \"CameraName\": \"0\", \"ImageType\": 3, \"VehicleName\": \"Car1\", \"Visible\": false}, {\"WindowID\": 1, \"CameraName\": \"0\", \"ImageType\": 5, \"VehicleName\": \"Car2\", \"Visible\": false}, {\"WindowID\": 2, \"CameraName\": \"0\", \"ImageType\": 0, \"VehicleName\": \"Car1\", \"Visible\": false} ] Recording The recording feature allows you to record data such as position, orientation, velocity along with the captured image at specified intervals. You can start recording by pressing red Record button on lower right or the R key. The data is stored in the Documents\\AirSim folder (or the folder specified using Folder ), in a time stamped subfolder for each recording session, as tab separated file. RecordInterval : specifies minimal interval in seconds between capturing two images. 
RecordOnMove : specifies that a frame is not recorded if the vehicle's position or orientation has not changed. Folder : Parent folder where the timestamped subfolders with recordings are created. The absolute path of the directory must be specified. If not used, then the Documents/AirSim folder will be used. E.g. \"Folder\": \"/home//Documents\" Enabled : Whether recording should start from the beginning itself; setting this to true will start recording automatically when the simulation starts. By default, it's set to false Cameras : this element controls which cameras are used to capture images. By default the scene image from camera 0 is recorded as compressed png format. This setting is a json array so you can specify multiple cameras to capture images, each with potentially different image types . When PixelsAsFloat is true, the image is saved as a pfm file instead of a png file. The VehicleName option allows you to specify separate cameras for individual vehicles. If the Cameras element isn't present, the Scene image from the default camera of each vehicle will be recorded. If you don't want to record any images and just the vehicle's physics data, then specify the Cameras element but leave it empty, like this: \"Cameras\": [] You can add the field Annotation , a string allowing you to specify the annotation layer to use for the camera. This is only used with the Annotation camera type for ImageType . For example, the Cameras element below records scene & segmentation images for Car1 & scene for Car2 - \"Cameras\": [ { \"CameraName\": \"0\", \"ImageType\": 0, \"PixelsAsFloat\": false, \"VehicleName\": \"Car1\", \"Compress\": true }, { \"CameraName\": \"0\", \"ImageType\": 5, \"PixelsAsFloat\": false, \"VehicleName\": \"Car1\", \"Compress\": true }, { \"CameraName\": \"0\", \"ImageType\": 0, \"PixelsAsFloat\": false, \"VehicleName\": \"Car2\", \"Compress\": true } ] Check out Modifying Recording Data for details on how to modify the kinematics data being recorded. ClockSpeed This setting allows you to set the speed of the simulation clock with respect to the wall clock. For example, a value of 5.0 means the simulation clock has 5 seconds elapsed when the wall clock has 1 second elapsed (i.e. the simulation is running faster). A value of 0.1 means that the simulation clock is 10X slower than the wall clock. A value of 1 means the simulation is running in real time. It is important to realize that the quality of simulation may decrease as the simulation clock runs faster. You might see artifacts like objects moving past obstacles because collisions are not detected. However, slowing down the simulation clock (i.e. values < 1.0) generally improves the quality of simulation. Wind Settings This setting specifies the wind speed in World frame, in NED direction. Values are in m/s. By default, speed is 0, i.e. no wind. Camera Director Settings This element specifies the settings used for the camera following the vehicle in the ViewPort. FollowDistance : Distance at which the camera follows the vehicle, default is -8 (8 meters) for Car, -3 for others. X, Y, Z, Yaw, Roll, Pitch : These elements allow you to specify the position and orientation of the camera relative to the vehicle. Position is in NED coordinates in SI units with origin set to the Player Start location in the Unreal environment. The orientation is specified in degrees. The CameraDefaults element at root level specifies defaults used for all cameras. These defaults can be overridden for an individual camera in the Cameras element inside Vehicles as described later.
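Recording and wind can also be driven from the Python API instead of (or in addition to) the settings above. A hedged sketch, assuming the standard startRecording/stopRecording/simSetWind client calls and the cosysairsim module name:

```python
# Hedged sketch: toggle the recording described above from code, with a
# 5 m/s wind along the world NED X axis. Assumes the standard client calls.
import time
import cosysairsim as airsim  # or `import airsim` on older installs

client = airsim.CarClient()
client.confirmConnection()

client.simSetWind(airsim.Vector3r(5, 0, 0))  # NED frame, m/s (Wind setting analog)
client.startRecording()   # same effect as pressing the R key
time.sleep(10)            # capture ~10 s using the Recording settings above
client.stopRecording()
```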
Main settings Like other sensors, the pose of the sensor in the vehicle frame can be defined by the X, Y, Z, Roll, Pitch, Yaw parameters. Furthermore, there are some other settings available: * DrawSensor : Draw the physical sensor in the world on the vehicle, with a 3D axes shown where the sensor is. * External : Uncouple the sensor from the vehicle. If enabled, the position and orientation will be relative to Unreal world coordinates. Note that if MoveWorldOrigin in the settings.json is set to true , the Unreal coordinates will be moved to have the same origin as the player start location, and as such this may affect where the sensor will spawn. * ExternalLocal : When in external mode, if this is enabled the retrieved pose of the sensor will be in local NED coordinates (from the starting position of the vehicle) and not converted Unreal NED coordinates, which is the default. Note that if MoveWorldOrigin in the settings.json is set to true , the Unreal coordinates will be moved to have the same origin as the player start location, and as such this may affect what coordinates are returned if set to false . Note on ImageType element The ImageType element in a JSON array determines which image type the settings apply to. The valid values are described in the ImageType section . For example, the CaptureSettings element is a json array so you can add settings for multiple image types easily. CaptureSettings The CaptureSettings determines how different image types such as scene, depth, disparity, surface normals and segmentation views are rendered. The Width, Height and FOV settings should be self-explanatory. The AutoExposureSpeed decides how fast eye adaptation works. We set it to a generally high value such as 100 to avoid artifacts in image capture. Similarly we set MotionBlurAmount to 0 by default to avoid artifacts in ground truth images. The ProjectionMode decides the projection used by the capture camera and can take the value \"perspective\" (default) or \"orthographic\". If the projection mode is \"orthographic\" then OrthoWidth determines the width of the projected area captured in meters. To disable the rendering of certain objects on specific cameras or all of them, use the IgnoreMarked boolean setting. This requires marking the individual objects that have to be ignored using an Unreal Tag called MarkedIgnore . You can also tweak the motion blur and chromatic aberration here. Unreal 5 introduces Lumen lighting. Because the cameras use scene capture components, enabling Lumen for them can be costly on performance. Settings have been added specifically for the scene camera to customize the usage of Lumen for Global Illumination and Reflections. The LumenGIEnable and LumenReflectionEnable settings enable or disable Lumen for the camera. The LumenFinalQuality (0.25-2) setting determines the quality of the final image. The LumenSceneDetail (0.25-4) setting determines the quality of the scene. The LumenSceneLightningDetail (0.25-2) setting determines the quality of the lighting in the scene. For an explanation of the other settings, please see this article . NoiseSettings The NoiseSettings allows you to add noise to the specified image type with the goal of simulating camera sensor noise, interference and other artifacts. By default no noise is added, i.e., Enabled: false . If you set Enabled: true then the following different types of noise and interference artifacts are enabled, and each can be further tuned using its settings. The noise effects are implemented as a shader created as a post processing material in Unreal Engine called CameraSensorNoise .
Demo of camera noise and interference simulation: Random noise This adds random noise blobs with the following parameters. * RandContrib : This determines the blend ratio of noise pixel with image pixel, 0 means no noise and 1 means only noise. * RandSpeed : This determines how fast the noise fluctuates, 1 means no fluctuation and higher values like 1E6 mean full fluctuation. * RandSize : This determines how coarse the noise is, 1 means every pixel has its own noise while a higher value means more than 1 pixel shares the same noise value. * RandDensity : This determines how many pixels out of the total will have noise, 1 means all pixels while a higher value means a lesser number of pixels (exponentially). Horizontal bump distortion This adds horizontal bump / flickering / ghosting effects. * HorzWaveContrib : This determines the blend ratio of noise pixel with image pixel, 0 means no noise and 1 means only noise. * HorzWaveStrength : This determines the overall strength of the effect. * HorzWaveVertSize : This determines how many vertical pixels would be affected by the effect. * HorzWaveScreenSize : This determines how much of the screen is affected by the effect. Horizontal noise lines This adds regions of noise on horizontal lines. * HorzNoiseLinesContrib : This determines the blend ratio of noise pixel with image pixel, 0 means no noise and 1 means only noise. * HorzNoiseLinesDensityY : This determines how many pixels in a horizontal line get affected. * HorzNoiseLinesDensityXY : This determines how many lines on the screen get affected. Horizontal line distortion This adds fluctuations on horizontal lines. * HorzDistortionContrib : This determines the blend ratio of noise pixel with image pixel, 0 means no noise and 1 means only noise. * HorzDistortionStrength : This determines how large the distortion is. Radial Lens Distortion This adds radial lens distortion to the camera sensor. * LensDistortionEnable : Enable or disable this feature * LensDistortionAreaFalloff : The size of the area to distort * LensDistortionAreaRadius : The distortion radius * LensDistortionInvert : Set to true to invert and create 'pincushion distortion' or false for 'barrel distortion' Gimbal The Gimbal element allows you to freeze the camera orientation for pitch, roll and/or yaw. This setting is ignored unless ImageType is -1. The Stabilization is defaulted to 0, meaning no gimbal, i.e. the camera orientation changes with body orientation on all axes. A value of 1 means full stabilization. A value between 0 and 1 acts as a weight between the fixed angles specified (in degrees, in world-frame) in the Pitch , Roll and Yaw elements and the orientation of the vehicle body. When any of the angles is omitted from the json or set to NaN, that angle is not stabilized (i.e. it moves along with the vehicle body). UnrealEngine This element contains settings specific to the Unreal Engine. These will be ignored in the Unity project. * PixelFormatOverride : This contains a list of elements that have both an ImageType and a PixelFormat setting. Each element allows you to override the default pixel format of the UTextureRenderTarget2D object instantiated for the capture specified by the ImageType setting. Specifying this element allows you to prevent crashes caused by unexpected pixel formats (see #4120 and #4339 for examples of these crashes). A full list of pixel formats can be viewed here . Vehicles Settings Each simulation mode will go through the list of vehicles specified in this setting and create the ones that have \"AutoCreate\": true .
Each vehicle specified in this setting has a key which becomes the name of the vehicle. If the \"Vehicles\" element is missing then this list is populated with a default car named \"PhysXCar\" and a default multirotor named \"SimpleFlight\". Common Vehicle Setting VehicleType : This could be either PhysXCar , ArduRover or BoxCar for the Car SimMode, SimpleFlight , ArduCopter or PX4Multirotor for the MultiRotor SimMode, ComputerVision for the ComputerVision SimMode and CPHusky or Pioneer for the SkidVehicle SimMode. There is no default value, therefore this element must be specified. PawnPath : This allows you to override the pawn blueprint to use for the vehicle. For example, you may create a new pawn blueprint derived from ACarPawn for a warehouse robot in your own project outside the Cosys-AirSim code and then specify its path here. See also PawnPaths . Note that you have to specify your custom pawn blueprint class path inside the global PawnPaths object using your proprietarily defined object name, and quote that name inside the Vehicles setting. For example, { ... \"PawnPaths\": { \"CustomPawn\": {\"PawnBP\": \"Class'/Game/Assets/Blueprints/MyPawn.MyPawn_C'\"} }, \"Vehicles\": { \"MyVehicle\": { \"VehicleType\": ..., \"PawnPath\": \"CustomPawn\", ... } } } DefaultVehicleState : Possible values for multirotors are Armed or Disarmed . AutoCreate : If true then this vehicle would be spawned (if supported by the selected sim mode). RC : This sub-element allows you to specify which remote controller to use for the vehicle using RemoteControlID . The value of -1 means use keyboard (not supported yet for multirotors). A value >= 0 specifies one of many remote controllers connected to the system. The list of available RCs can be seen in the Game Controllers panel in Windows, for example. X, Y, Z, Yaw, Roll, Pitch : These elements allow you to specify the initial position and orientation of the vehicle. Position is in NED coordinates in SI units with origin set to the Player Start location in the Unreal environment. The orientation is specified in degrees. Sensors : This element specifies the sensors associated with the vehicle, see the Sensors page for details. IsFpvVehicle : This setting allows you to specify which vehicle the camera will follow and the view that will be shown when ViewMode is set to Fpv. By default, Cosys-AirSim selects the first vehicle in settings as the FPV vehicle. Cameras : This element specifies camera settings for the vehicle. The key in this element is the name of the available camera and the value is the same as CameraDefaults as described above. For example, to change the FOV for the front center camera to 120 degrees, you can use this for the Vehicles setting: \"Vehicles\": { \"FishEyeDrone\": { \"VehicleType\": \"SimpleFlight\", \"Cameras\": { \"front-center\": { \"CaptureSettings\": [ { \"ImageType\": 0, \"FOV_Degrees\": 120 } ] } } } } Using PX4 By default we use simple_flight so you don't have to do separate HITL or SITL setups. We also support \"PX4\" for advanced users. To use PX4 with Cosys-AirSim, you can use the following for the Vehicles setting: \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\" } } Additional PX4 Settings The default for PX4 is to enable the hardware-in-loop setup.
There are various other settings available for PX4 as follows with their default values: \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\", \"Lockstep\": true, \"ControlIp\": \"127.0.0.1\", \"ControlPortLocal\": 14540, \"ControlPortRemote\": 14580, \"LogViewerHostIp\": \"127.0.0.1\", \"LogViewerPort\": 14388, \"OffboardCompID\": 1, \"OffboardSysID\": 134, \"QgcHostIp\": \"127.0.0.1\", \"QgcPort\": 14550, \"SerialBaudRate\": 115200, \"SerialPort\": \"*\", \"SimCompID\": 42, \"SimSysID\": 142, \"TcpPort\": 4560, \"UdpIp\": \"127.0.0.1\", \"UdpPort\": 14560, \"UseSerial\": true, \"UseTcp\": false, \"VehicleCompID\": 1, \"VehicleSysID\": 135, \"Model\": \"Generic\", \"LocalHostIp\": \"127.0.0.1\", \"Logs\": \"d:\\\\temp\\\\mavlink\", \"Sensors\": { ... }, \"Parameters\": { ... } } } These settings define the MavLink SystemId and ComponentId for the Simulator (SimSysID, SimCompID), for the vehicle (VehicleSysID, VehicleCompID), and for the node that allows remote control of the drone from another app, which is called the offboard node (OffboardSysID, OffboardCompID). If you want the simulator to also forward mavlink messages to your ground control app (like QGroundControl), you can also set the UDP address for that in case you want to run it on a different machine (QgcHostIp, QgcPort). The default is localhost so QGroundControl should \"just work\" if it is running on the same machine. You can connect the simulator to the LogViewer app, provided in this repo, by setting the UDP address for that (LogViewerHostIp, LogViewerPort). And for each flying drone added to the simulator there is a named block of additional settings. In the above you see the default name \"PX4\". You can change this name from the Unreal Editor when you add a new BP_FlyingPawn asset. You will see these properties grouped under the category \"MavLink\". The MavLink node for this pawn can be remote over UDP or it can be connected to a local serial port. If serial, then set UseSerial to true; otherwise set UseSerial to false. For serial connections you also need to set the appropriate SerialBaudRate. The default of 115200 works with Pixhawk version 2 over USB. When communicating with the PX4 drone over a serial port, both the HIL_ messages and vehicle control messages share the same serial port. When communicating over UDP or TCP, PX4 requires two separate channels. If UseTcp is false, then UdpIp, UdpPort are used to send HIL_ messages, otherwise the TcpPort is used. TCP support in PX4 was added in 1.9.2 with the lockstep feature, because the guarantee of message delivery that TCP provides is required for the proper functioning of lockstep. Cosys-AirSim becomes a TCP server in that case, and waits for a connection from the PX4 app. The second channel for controlling the vehicle is defined by (ControlIp, ControlPort) and is always a UDP channel. The Sensors section can provide customized settings for simulated sensors, see Sensors . The Parameters section can set PX4 parameters during initialization of the PX4 connection. See Setting up PX4 Software-in-Loop for an example. Using ArduPilot ArduPilot Copter & Rover vehicles are supported in the latest Cosys-AirSim main branch & releases v1.3.0 and later. For settings and how to use them, please see ArduPilot SITL with Cosys-AirSim Other Settings EngineSound To turn off the engine sound use the setting \"EngineSound\": false . Currently this setting applies only to car.
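For PX4 software-in-the-loop, the serial defaults above are usually switched to the TCP channel that lockstep requires. Below is a sketch of such a vehicle block, built only from keys listed in the defaults above; the values are illustrative, and 'Setting up PX4 Software-in-Loop' remains the authoritative procedure.

```python
# Sketch: write a PX4 SITL-style vehicle block using keys from the defaults
# above (UseSerial/UseTcp/TcpPort/Lockstep). Values here are illustrative.
import json
import pathlib

settings = {
    "SettingsVersion": 2.0,
    "SimMode": "Multirotor",
    "Vehicles": {
        "PX4": {
            "VehicleType": "PX4Multirotor",
            "UseSerial": False,      # no Pixhawk over USB
            "UseTcp": True,          # PX4 SITL connects to AirSim's TCP server
            "TcpPort": 4560,
            "ControlIp": "127.0.0.1",
            "ControlPortLocal": 14540,
            "ControlPortRemote": 14580,
            "Lockstep": True,        # requires the TCP channel (PX4 >= 1.9.2)
        }
    },
}

path = pathlib.Path.home() / "Documents" / "AirSim" / "settings.json"
path.parent.mkdir(parents=True, exist_ok=True)
path.write_text(json.dumps(settings, indent=2))
```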
PawnPaths This allows you to specify your own vehicle pawn blueprints; for example, you can replace the default car in AirSim with your own car. Your vehicle BP can reside in the Content folder of your own Unreal project (i.e. outside of the AirSim plugin folder). For example, if you have a car BP located in the file Content\MyCar\MySedanBP.uasset in your project then you can set \"DefaultCar\": {\"PawnBP\":\"Class'/Game/MyCar/MySedanBP.MySedanBP_C'\"} . The XYZ.XYZ_C is a special notation required to specify the class for BP XYZ . Please note that your BP must be derived from the CarPawn class. By default this is not the case, but you can re-parent the BP using the \"Class Settings\" button in the toolbar in the UE editor after you open the BP, and then choosing \"Car Pawn\" for the Parent Class setting in Class Options. It is also a good idea to disable \"Auto Possess Player\" and \"Auto Possess AI\" as well as set AI Controller Class to None in the BP details. Please make sure your asset is included for cooking in the packaging options if you are creating a binary. PhysicsEngineName For cars, we support only PhysX for now (regardless of the value in this setting). For multirotors, we support \"FastPhysicsEngine\" and \"ExternalPhysicsEngine\" . \"ExternalPhysicsEngine\" allows the drone to be controlled via setVehiclePose (), keeping the drone in place until the next call. It is especially useful for moving the AirSim drone using an external simulator or on a saved path. LocalHostIp Setting When connecting to remote machines you may need to pick a specific Ethernet adapter to reach those machines; for example, it might be over Ethernet or over Wi-Fi, or some other special virtual adapter or a VPN. Your PC may have multiple networks, and those networks might not be allowed to talk to each other, in which case the UDP messages from one network will not get through to the others. So the LocalHostIp allows you to configure how you are reaching those machines. The default of 127.0.0.1 is not able to reach external machines; this default is only used when everything you are talking to is contained on a single PC. ApiServerPort This setting determines the server port that is used by airsim clients; the default port is 41451. By specifying different ports, the user can run multiple environments in parallel to accelerate the data collection process. SpeedUnitFactor Unit conversion factor for speed related to m/s , default is 1. Used in conjunction with SpeedUnitLabel. This may only be used for display purposes, for example the on-display speed when the car is being driven. For example, to get speed in miles/hr use factor 2.23694. SpeedUnitLabel Unit label for speed, default is m/s . Used in conjunction with SpeedUnitFactor.","title":"Settings"},{"location":"settings/#cosys-airsim-settings","text":"A good basic settings file that works with many of the examples can be found here as settings_example.json . It shows many of the custom sensors and vehicles that were added by Cosys-Lab.","title":"Cosys-AirSim Settings"},{"location":"settings/#where-are-settings-stored","text":"Cosys-AirSim is searching for the settings definition in the following order. The first match will be used: Looking at the (absolute) path specified by the -settings command line argument. For example, in Windows: AirSim.exe -settings=\"C:\\path\\to\\settings.json\" In Linux ./Blocks.sh -settings=\"/home/$USER/path/to/settings.json\" Looking for a json document passed as a command line argument by the -settings argument.
For example, in Windows: AirSim.exe -settings={\"foo\":\"bar\"} In Linux ./Blocks.sh -settings={\"foo\":\"bar\"} Looking in the folder of the executable for a file called settings.json . This will be a deep location where the actual executable of the Editor or binary is stored. For example, with the Blocks binary, the location searched is /LinuxNoEditor/Blocks/Binaries/Linux/settings.json . Searching for settings.json in the folder from where the executable is launched. This is the top-level directory containing the launch script or executable. E.g. Linux: /LinuxNoEditor/settings.json , Windows: /WindowsNoEditor/settings.json . Note that this path changes depending on where it is invoked from. On Linux, if executing the Blocks.sh script from inside the LinuxNoEditor folder like ./Blocks.sh , then the previously mentioned path is used. However, if launched from outside the LinuxNoEditor folder such as ./LinuxNoEditor/Blocks.sh , then /settings.json will be used. Looking in the AirSim subfolder for a file called settings.json . The AirSim subfolder is located at Documents\\AirSim on Windows and ~/Documents/AirSim on Linux systems. The file is in the usual json format . On first startup Cosys-AirSim creates a settings.json file with no settings in the user's home folder. To avoid problems, always use ASCII format to save the json file.","title":"Where are Settings Stored?"},{"location":"settings/#how-to-chose-between-carskidvehiclemultirotor","text":"The default is to use multirotor. To use a car simply set \"SimMode\": \"Car\" like this: { \"SettingsVersion\": 2.0, \"SimMode\": \"Car\" } To choose a multirotor or a skid vehicle, set \"SimMode\": \"Multirotor\" or \"SimMode\": \"SkidVehicle\" respectively. If you want to prompt the user to select the vehicle type, use \"SimMode\": \"\" .","title":"How to Choose Between Car/SkidVehicle/Multirotor?"},{"location":"settings/#available-settings-and-their-defaults","text":"Below is a complete list of the settings available along with their default values. If any setting is missing from the json file, then its default value is used. Some default values are simply specified as \"\" , which means the actual value is chosen based on the vehicle you are using. For example, the ViewMode setting has default value \"\" , which translates to \"FlyWithMe\" for drones and \"SpringArmChase\" for cars. Note that this does not include most sensor types. WARNING: Do not copy-paste all of the below into your settings.json. We strongly recommend adding only those settings for which you don't want the default values. The only required element is \"SettingsVersion\" . 
{ \"SimMode\": \"\", \"ClockType\": \"\", \"ClockSpeed\": 1, \"LocalHostIp\": \"127.0.0.1\", \"ApiServerPort\": 41451, \"RecordUIVisible\": true, \"MoveWorldOrigin\": false, \"LogMessagesVisible\": true, \"ShowLosDebugLines\": false, \"ViewMode\": \"\", \"RpcEnabled\": true, \"EngineSound\": true, \"PhysicsEngineName\": \"\", \"SpeedUnitFactor\": 1.0, \"SpeedUnitLabel\": \"m/s\", \"Wind\": { \"X\": 0, \"Y\": 0, \"Z\": 0 }, \"CameraDirector\": { \"FollowDistance\": -3, \"X\": NaN, \"Y\": NaN, \"Z\": NaN, \"Pitch\": NaN, \"Roll\": NaN, \"Yaw\": NaN }, \"Recording\": { \"RecordOnMove\": false, \"RecordInterval\": 0.05, \"Folder\": \"\", \"Enabled\": false, \"Cameras\": [ { \"CameraName\": \"0\", \"ImageType\": 0, \"PixelsAsFloat\": false, \"VehicleName\": \"\", \"Compress\": true } ] }, \"CameraDefaults\": { \"CaptureSettings\": [ { \"ImageType\": 0, \"Width\": 256, \"Height\": 144, \"FOV_Degrees\": 90, \"AutoExposureSpeed\": 100, \"AutoExposureBias\": 0, \"AutoExposureMaxBrightness\": 0.64, \"AutoExposureMinBrightness\": 0.03, \"MotionBlurAmount\": 0, \"MotionBlurMax\": 10, \"TargetGamma\": 1.0, \"ProjectionMode\": \"\", \"OrthoWidth\": 5.12, \"ChromaticAberrationScale\": 2, \"IgnoreMarked\": false, \"LumenGIEnable\": true, \"LumenReflectionEnable\": true, \"LumenFinalQuality\": 1, \"LumenSceneDetail\": 1, \"LumenSceneLightningDetail\": 1 } ], \"NoiseSettings\": [ { \"Enabled\": false, \"ImageType\": 0, \"RandContrib\": 0.2, \"RandSpeed\": 100000.0, \"RandSize\": 500.0, \"RandDensity\": 2, \"HorzWaveContrib\": 0.03, \"HorzWaveStrength\": 0.08, \"HorzWaveVertSize\": 1.0, \"HorzWaveScreenSize\": 1.0, \"HorzNoiseLinesContrib\": 1.0, \"HorzNoiseLinesDensityY\": 0.01, \"HorzNoiseLinesDensityXY\": 0.5, \"HorzDistortionContrib\": 1.0, \"HorzDistortionStrength\": 0.002, \"LensDistortionEnable\": true, \"LensDistortionAreaFalloff\": 2, \"LensDistortionAreaRadius\": 1, \"LensDistortionInvert\": false } ], \"Gimbal\": { \"Stabilization\": 0, \"Pitch\": NaN, \"Roll\": NaN, \"Yaw\": NaN }, \"X\": NaN, \"Y\": NaN, \"Z\": NaN, \"Pitch\": NaN, \"Roll\": NaN, \"Yaw\": NaN, \"UnrealEngine\": { \"PixelFormatOverride\": [ { \"ImageType\": 0, \"PixelFormat\": 0 } ] } }, \"OriginGeopoint\": { \"Latitude\": 47.641468, \"Longitude\": -122.140165, \"Altitude\": 122 }, \"TimeOfDay\": { \"Enabled\": false, \"StartDateTime\": \"\", \"CelestialClockSpeed\": 1, \"StartDateTimeDst\": false, \"UpdateIntervalSecs\": 60 }, \"SubWindows\": [ {\"WindowID\": 0, \"CameraName\": \"0\", \"ImageType\": 3, \"VehicleName\": \"\", \"Visible\": false}, {\"WindowID\": 1, \"CameraName\": \"0\", \"ImageType\": 5, \"VehicleName\": \"\", \"Visible\": false}, {\"WindowID\": 2, \"CameraName\": \"0\", \"ImageType\": 0, \"VehicleName\": \"\", \"Visible\": false} ], \"PawnPaths\": { \"BareboneCar\": {\"PawnBP\": \"Class'/AirSim/VehicleAdv/Vehicle/VehicleAdvPawn.VehicleAdvPawn_C'\"}, \"DefaultCar\": {\"PawnBP\": \"Class'/AirSim/VehicleAdv/SUV/SuvCarPawn.SuvCarPawn_C'\"}, \"DefaultQuadrotor\": {\"PawnBP\": \"Class'/AirSim/Blueprints/BP_FlyingPawn.BP_FlyingPawn_C'\"}, \"DefaultComputerVision\": {\"PawnBP\": \"Class'/AirSim/Blueprints/BP_ComputerVisionPawn.BP_ComputerVisionPawn_C'\"} }, \"Vehicles\": { \"SimpleFlight\": { \"VehicleType\": \"SimpleFlight\", \"DefaultVehicleState\": \"Armed\", \"AutoCreate\": true, \"PawnPath\": \"\", \"EnableCollisionPassthrough\": false, \"EnableCollisions\": true, \"AllowAPIAlways\": true, \"EnableTrace\": false, \"RC\": { \"RemoteControlID\": 0, \"AllowAPIWhenDisconnected\": false }, 
\"Cameras\": { //same elements as CameraDefaults above, key as name }, \"X\": NaN, \"Y\": NaN, \"Z\": NaN, \"Pitch\": NaN, \"Roll\": NaN, \"Yaw\": NaN }, \"PhysXCar\": { \"VehicleType\": \"PhysXCar\", \"DefaultVehicleState\": \"\", \"AutoCreate\": true, \"PawnPath\": \"\", \"EnableCollisionPassthrough\": false, \"EnableCollisions\": true, \"RC\": { \"RemoteControlID\": -1 }, \"Cameras\": { \"MyCamera1\": { //same elements as elements inside CameraDefaults above }, \"MyCamera2\": { //same elements as elements inside CameraDefaults above } }, \"X\": NaN, \"Y\": NaN, \"Z\": NaN, \"Pitch\": NaN, \"Roll\": NaN, \"Yaw\": NaN } } }","title":"Available Settings and Their Defaults"},{"location":"settings/#simmode","text":"SimMode determines which simulation mode will be used. Below are the currently supported values: - \"\" : prompt the user to select a vehicle type (multirotor or car) - \"Multirotor\" : Use multirotor simulation - \"Car\" : Use car simulation - \"ComputerVision\" : Use only cameras, no vehicle or physics - \"SkidVehicle\" : Use skid-steering vehicle simulation","title":"SimMode"},{"location":"settings/#viewmode","text":"The ViewMode determines which camera to use as the default and how the camera will follow the vehicle. For multirotors, the default ViewMode is \"FlyWithMe\" while for cars the default ViewMode is \"SpringArmChase\" . FlyWithMe : Chase the vehicle from behind with 6 degrees of freedom GroundObserver : Chase the vehicle from 6' above the ground but with full freedom in the XY plane. Fpv : View the scene from the front camera of the vehicle Manual : Don't move the camera automatically. Use the arrow keys and WASD keys to move the camera manually. SpringArmChase : Chase the vehicle with a camera mounted on an (invisible) arm that is attached to the vehicle via a spring (so it has some latency in movement). NoDisplay : This will freeze rendering for the main screen; however, rendering for subwindows, recording and APIs remains active. This mode is useful to save resources in \"headless\" mode where you are only interested in getting images and don't care about what gets rendered on the main screen. This may also improve FPS for recording images.","title":"ViewMode"},{"location":"settings/#annotation","text":"The annotation system allows you to choose different ground truth labeling techniques to create more data from your simulation. Find more info here for defining the settings.","title":"Annotation"},{"location":"settings/#timeofday","text":"This setting controls the position of the Sun in the environment. By default Enabled is false, which means the Sun's position is left at whatever was the default in the environment and it doesn't change over time. If Enabled is true then the Sun's position is computed using the longitude, latitude and altitude specified in the OriginGeopoint section for the date specified in StartDateTime in the string format %Y-%m-%d %H:%M:%S , for example, 2018-02-12 15:20:00 . If this string is empty then the current date and time is used. If StartDateTimeDst is true then we adjust for daylight saving time. The Sun's position is then continuously updated at the interval specified in UpdateIntervalSecs . In some cases, it might be desirable to have the celestial clock run faster or slower than the simulation clock. This can be specified using CelestialClockSpeed ; for example, a value of 100 means that for every 1 second of simulation clock, the Sun's position is advanced by 100 seconds, so the Sun will move across the sky much faster. 
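For illustration, a minimal sketch enabling the time-of-day system with an accelerated celestial clock (all field names and the example date are taken from the description above): { \"SettingsVersion\": 2.0, \"TimeOfDay\": { \"Enabled\": true, \"StartDateTime\": \"2018-02-12 15:20:00\", \"CelestialClockSpeed\": 100, \"StartDateTimeDst\": false, \"UpdateIntervalSecs\": 60 } } 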
Also see the Time of Day API .","title":"TimeOfDay"},{"location":"settings/#origingeopoint","text":"This setting specifies the latitude, longitude and altitude of the Player Start component placed in the Unreal environment. The vehicle's home point is computed using this transformation. Note that all coordinates exposed via the APIs use the NED system in SI units, which means each vehicle starts at (0, 0, 0) in the NED system. Time of Day settings are computed for the geographical coordinates specified in OriginGeopoint .","title":"OriginGeopoint"},{"location":"settings/#subwindows","text":"This setting determines what is shown in each of the 3 subwindows, which are visible when you press the 1, 2 or 3 keys. WindowID : Can be 0 to 2 CameraName : is any available camera on the vehicle ImageType : integer value that determines what kind of image gets shown according to the ImageType enum . VehicleName : string that allows you to specify the vehicle to use the camera from; used when multiple vehicles are specified in the settings. The first vehicle's camera will be used if there are any mistakes, such as an incorrect vehicle name, or if there is only a single vehicle. Annotation : string that allows you to specify the annotation layer to use for the camera. This applies only when using the Annotation camera type for ImageType (value 10). For example, for a single car vehicle, the below shows the driver view, front bumper view and rear view as scene, depth and surface normals respectively. \"SubWindows\": [ {\"WindowID\": 0, \"ImageType\": 0, \"CameraName\": \"3\", \"Visible\": true}, {\"WindowID\": 1, \"ImageType\": 3, \"CameraName\": \"0\", \"Visible\": true}, {\"WindowID\": 2, \"ImageType\": 6, \"CameraName\": \"4\", \"Visible\": true} ] In case of multiple vehicles, different vehicles can be specified as follows: \"SubWindows\": [ {\"WindowID\": 0, \"CameraName\": \"0\", \"ImageType\": 3, \"VehicleName\": \"Car1\", \"Visible\": false}, {\"WindowID\": 1, \"CameraName\": \"0\", \"ImageType\": 5, \"VehicleName\": \"Car2\", \"Visible\": false}, {\"WindowID\": 2, \"CameraName\": \"0\", \"ImageType\": 0, \"VehicleName\": \"Car1\", \"Visible\": false} ]","title":"SubWindows"},{"location":"settings/#recording","text":"The recording feature allows you to record data such as position, orientation and velocity along with the captured image at specified intervals. You can start recording by pressing the red Record button on the lower right or the R key. The data is stored in the Documents\\AirSim folder (or the folder specified using Folder ), in a timestamped subfolder for each recording session, as a tab-separated file. RecordInterval : specifies the minimal interval in seconds between capturing two images. RecordOnMove : specifies that a frame is not recorded if the vehicle's position or orientation hasn't changed. Folder : Parent folder where the timestamped subfolders with recordings are created. The absolute path of the directory must be specified. If not used, then the Documents/AirSim folder will be used. E.g. \"Folder\": \"/home//Documents\" Enabled : Whether recording should start from the beginning itself; setting it to true will start recording automatically when the simulation starts. By default, it's set to false. Cameras : this element controls which cameras are used to capture images. By default the scene image from camera 0 is recorded in compressed png format. This setting is a json array so you can specify multiple cameras to capture images, each with potentially different image types . When PixelsAsFloat is true, the image is saved as a pfm file instead of a png file. 
The VehicleName option allows you to specify separate cameras for individual vehicles. If the Cameras element isn't present, the Scene image from the default camera of each vehicle will be recorded. If you don't want to record any images and just the vehicle's physics data, then specify the Cameras element but leave it empty, like this: \"Cameras\": [] You can also add the field Annotation , a string allowing you to specify the annotation layer to use for the camera. This applies only when using the Annotation camera type for ImageType . For example, the Cameras element below records scene & segmentation images for Car1 & scene for Car2: \"Cameras\": [ { \"CameraName\": \"0\", \"ImageType\": 0, \"PixelsAsFloat\": false, \"VehicleName\": \"Car1\", \"Compress\": true }, { \"CameraName\": \"0\", \"ImageType\": 5, \"PixelsAsFloat\": false, \"VehicleName\": \"Car1\", \"Compress\": true }, { \"CameraName\": \"0\", \"ImageType\": 0, \"PixelsAsFloat\": false, \"VehicleName\": \"Car2\", \"Compress\": true } ] Check out Modifying Recording Data for details on how to modify the kinematics data being recorded.","title":"Recording"},{"location":"settings/#clockspeed","text":"This setting allows you to set the speed of the simulation clock with respect to the wall clock. For example, a value of 5.0 means that 5 seconds of simulation clock elapse when 1 second of wall clock elapses (i.e. the simulation runs faster). A value of 0.1 means the simulation clock is 10X slower than the wall clock. A value of 1 means the simulation runs in real time. It is important to realize that the quality of the simulation may decrease as the simulation clock runs faster. You might see artifacts like objects moving past obstacles because collisions are not detected. However, slowing down the simulation clock (i.e. values < 1.0) generally improves the quality of the simulation.","title":"ClockSpeed"},{"location":"settings/#wind-settings","text":"This setting specifies the wind speed in the World frame, in the NED direction. Values are in m/s. By default, the speed is 0, i.e. no wind.","title":"Wind Settings"},{"location":"settings/#camera-director-settings","text":"This element specifies the settings used for the camera following the vehicle in the ViewPort. FollowDistance : Distance at which the camera follows the vehicle; the default is -8 (8 meters) for Car and -3 for others. X, Y, Z, Yaw, Roll, Pitch : These elements allow you to specify the position and orientation of the camera relative to the vehicle. Position is in NED coordinates in SI units with the origin set to the Player Start location in the Unreal environment. The orientation is specified in degrees. The CameraDefaults element at the root level specifies the defaults used for all cameras. These defaults can be overridden for an individual camera in the Cameras element inside Vehicles as described later.","title":"Camera Director Settings"},{"location":"settings/#main-settings","text":"Like other sensors, the pose of the sensor in the vehicle frame can be defined by the X, Y, Z, Roll, Pitch, Yaw parameters. Furthermore, there are some other settings available: * DrawSensor : Draw the physical sensor in the world on the vehicle, with 3D axes shown where the sensor is. * External : Uncouple the sensor from the vehicle. If enabled, the position and orientation will be relative to Unreal world coordinates. Note that if MoveWorldOrigin in the settings.json is set to true , the Unreal coordinates will be moved to have the same origin as the player start location, and as such this may affect where the sensor will spawn. 
* ExternalLocal : When in external mode, if this is enabled the retrieved pose of the sensor will be in local NED coordinates (relative to the vehicle's starting position) and not converted to Unreal NED coordinates, which is the default. Note that if MoveWorldOrigin in the settings.json is set to true , the Unreal coordinates will be moved to have the same origin as the player start location, and as such this may affect what coordinates are returned if this is set to false .","title":"Main settings"},{"location":"settings/#note-on-imagetype-element","text":"The ImageType element in a JSON array determines which image type the settings apply to. The valid values are described in the ImageType section . For example, the CaptureSettings element is a json array, so you can add settings for multiple image types easily.","title":"Note on ImageType element"},{"location":"settings/#capturesettings","text":"The CaptureSettings determines how the different image types, such as scene, depth, disparity, surface normals and segmentation views, are rendered. The Width, Height and FOV settings should be self-explanatory. The AutoExposureSpeed decides how fast eye adaptation works. We set it to a generally high value such as 100 to avoid artifacts in image capture. Similarly, we set MotionBlurAmount to 0 by default to avoid artifacts in ground truth images. The ProjectionMode decides the projection used by the capture camera and can take the value \"perspective\" (default) or \"orthographic\". If the projection mode is \"orthographic\" then OrthoWidth determines the width of the projected area captured, in meters. To disable the rendering of certain objects on specific cameras or on all of them, use the IgnoreMarked boolean setting. This requires marking the individual objects that have to be ignored using an Unreal tag called MarkedIgnore . You can also tweak the motion blur and chromatic aberration here. Unreal 5 introduces Lumen lighting. Because the cameras use scene capture components, enabling Lumen for them can be costly on performance. Settings have been added specifically for the scene camera to customize the usage of Lumen for Global Illumination and Reflections. The LumenGIEnable and LumenReflectionEnable settings enable or disable Lumen for the camera. The LumenFinalQuality (0.25-2) setting determines the quality of the final image. The LumenSceneDetail (0.25-4) setting determines the quality of the scene. The LumenSceneLightningDetail (0.25-2) setting determines the quality of the lighting in the scene. For an explanation of the other settings, please see this article .","title":"CaptureSettings"},{"location":"settings/#noisesettings","text":"The NoiseSettings allows adding noise to the specified image type, with the goal of simulating camera sensor noise, interference and other artifacts. By default no noise is added, i.e., Enabled: false . If you set Enabled: true then the following different types of noise and interference artifacts are enabled, and each can be further tuned using its settings. The noise effects are implemented as a shader created as a post-processing material in Unreal Engine called CameraSensorNoise . Demo of camera noise and interference simulation:","title":"NoiseSettings"},{"location":"settings/#random-noise","text":"This adds random noise blobs with the following parameters. * RandContrib : This determines the blend ratio of the noise pixel with the image pixel; 0 means no noise and 1 means only noise. * RandSpeed : This determines how fast the noise fluctuates; 1 means no fluctuation and higher values like 1E6 mean full fluctuation. 
* RandSize : This determines how coarse the noise is; 1 means every pixel has its own noise, while a higher value means more than 1 pixel shares the same noise value. * RandDensity : This determines how many pixels out of the total will have noise; 1 means all pixels, while a higher value means fewer pixels (exponentially).","title":"Random noise"},{"location":"settings/#horizontal-bump-distortion","text":"This adds a horizontal bump / flickering / ghosting effect. * HorzWaveContrib : This determines the blend ratio of the noise pixel with the image pixel; 0 means no noise and 1 means only noise. * HorzWaveStrength : This determines the overall strength of the effect. * HorzWaveVertSize : This determines how many vertical pixels are affected by the effect. * HorzWaveScreenSize : This determines how much of the screen is affected by the effect.","title":"Horizontal bump distortion"},{"location":"settings/#horizontal-noise-lines","text":"This adds regions of noise on horizontal lines. * HorzNoiseLinesContrib : This determines the blend ratio of the noise pixel with the image pixel; 0 means no noise and 1 means only noise. * HorzNoiseLinesDensityY : This determines how many pixels in a horizontal line get affected. * HorzNoiseLinesDensityXY : This determines how many lines on the screen get affected.","title":"Horizontal noise lines"},{"location":"settings/#horizontal-line-distortion","text":"This adds fluctuations on horizontal lines. * HorzDistortionContrib : This determines the blend ratio of the noise pixel with the image pixel; 0 means no noise and 1 means only noise. * HorzDistortionStrength : This determines how large the distortion is.","title":"Horizontal line distortion"},{"location":"settings/#radial-lens-distortion","text":"This adds radial lens distortion to the camera sensor. * LensDistortionEnable : Enable or disable this feature * LensDistortionAreaFalloff : The size of the area to distort * LensDistortionAreaRadius : The distortion radius * LensDistortionInvert : Set to true to invert and create a 'pincushion distortion', or false for a 'barrel distortion'","title":"Radial Lens Distortion"},{"location":"settings/#gimbal","text":"The Gimbal element allows freezing the camera orientation for pitch, roll and/or yaw. This setting is ignored unless ImageType is -1. Stabilization defaults to 0, meaning no gimbal, i.e. the camera orientation changes with the body orientation on all axes. A value of 1 means full stabilization. A value between 0 and 1 acts as a weight between the fixed angles specified (in degrees, in world frame) in the Pitch , Roll and Yaw elements and the orientation of the vehicle body. When any of the angles is omitted from the json or set to NaN, that angle is not stabilized (i.e. it moves along with the vehicle body).","title":"Gimbal"},{"location":"settings/#unrealengine","text":"This element contains settings specific to the Unreal Engine. These will be ignored in the Unity project. * PixelFormatOverride : This contains a list of elements that have both an ImageType and a PixelFormat setting. Each element allows you to override the default pixel format of the UTextureRenderTarget2D object instantiated for the capture specified by the ImageType setting. Specifying this element allows you to prevent crashes caused by unexpected pixel formats (see #4120 and #4339 for examples of these crashes). 
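As a minimal sketch of the shape of this element (mirroring the defaults listed earlier; the PixelFormat value 0 simply pins the default and is shown as a placeholder, not a recommendation): \"UnrealEngine\": { \"PixelFormatOverride\": [ { \"ImageType\": 0, \"PixelFormat\": 0 } ] } 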
A full list of pixel formats can be viewed here .","title":"UnrealEngine"},{"location":"settings/#vehicles-settings","text":"Each simulation mode will go through the list of vehicles specified in this setting and create the ones that have \"AutoCreate\": true . Each vehicle specified in this setting has a key, which becomes the name of the vehicle. If the \"Vehicles\" element is missing then this list is populated with a default car named \"PhysXCar\" and a default multirotor named \"SimpleFlight\".","title":"Vehicles Settings"},{"location":"settings/#common-vehicle-setting","text":"VehicleType : This could be either PhysXCar , ArduRover or BoxCar for the Car SimMode; SimpleFlight , ArduCopter or PX4Multirotor for the MultiRotor SimMode; ComputerVision for the ComputerVision SimMode; and CPHusky or Pioneer for the SkidVehicle SimMode. There is no default value, therefore this element must be specified. PawnPath : This allows overriding the pawn blueprint to use for the vehicle. For example, you may create a new pawn blueprint derived from ACarPawn for a warehouse robot in your own project outside the Cosys-AirSim code and then specify its path here. See also PawnPaths . Note that you have to specify your custom pawn blueprint class path inside the global PawnPaths object using your own defined object name, and quote that name inside the Vehicles setting. For example, { ... \"PawnPaths\": { \"CustomPawn\": {\"PawnBP\": \"Class'/Game/Assets/Blueprints/MyPawn.MyPawn_C'\"} }, \"Vehicles\": { \"MyVehicle\": { \"VehicleType\": ..., \"PawnPath\": \"CustomPawn\", ... } } } DefaultVehicleState : The possible values for multirotors are Armed or Disarmed . AutoCreate : If true then this vehicle will be spawned (if supported by the selected sim mode). RC : This sub-element allows specifying which remote controller to use for the vehicle using RemoteControlID . The value of -1 means use the keyboard (not supported yet for multirotors). A value >= 0 specifies one of many remote controllers connected to the system. The list of available RCs can be seen in the Game Controllers panel in Windows, for example. X, Y, Z, Yaw, Roll, Pitch : These elements allow you to specify the initial position and orientation of the vehicle. Position is in NED coordinates in SI units with the origin set to the Player Start location in the Unreal environment. The orientation is specified in degrees. Sensors : This element specifies the sensors associated with the vehicle; see the Sensors page for details. IsFpvVehicle : This setting allows specifying which vehicle the camera will follow and whose view will be shown when ViewMode is set to Fpv. By default, Cosys-AirSim selects the first vehicle in the settings as the FPV vehicle. Cameras : This element specifies the camera settings for the vehicle. The key in this element is the name of an available camera and the value is the same as CameraDefaults as described above. For example, to change the FOV of the front center camera to 120 degrees, you can use this for the Vehicles setting: \"Vehicles\": { \"FishEyeDrone\": { \"VehicleType\": \"SimpleFlight\", \"Cameras\": { \"front-center\": { \"CaptureSettings\": [ { \"ImageType\": 0, \"FOV_Degrees\": 120 } ] } } } }","title":"Common Vehicle Setting"},{"location":"settings/#using-px4","text":"By default we use simple_flight, so you don't have to do separate HITL or SITL setups. We also support \"PX4\" for advanced users. 
To use PX4 with Cosys-AirSim, you can use the following for the Vehicles setting: \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\" } }","title":"Using PX4"},{"location":"settings/#additional-px4-settings","text":"The default for PX4 is to enable the hardware-in-loop setup. There are various other settings available for PX4, shown below with their default values: \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\", \"Lockstep\": true, \"ControlIp\": \"127.0.0.1\", \"ControlPortLocal\": 14540, \"ControlPortRemote\": 14580, \"LogViewerHostIp\": \"127.0.0.1\", \"LogViewerPort\": 14388, \"OffboardCompID\": 1, \"OffboardSysID\": 134, \"QgcHostIp\": \"127.0.0.1\", \"QgcPort\": 14550, \"SerialBaudRate\": 115200, \"SerialPort\": \"*\", \"SimCompID\": 42, \"SimSysID\": 142, \"TcpPort\": 4560, \"UdpIp\": \"127.0.0.1\", \"UdpPort\": 14560, \"UseSerial\": true, \"UseTcp\": false, \"VehicleCompID\": 1, \"VehicleSysID\": 135, \"Model\": \"Generic\", \"LocalHostIp\": \"127.0.0.1\", \"Logs\": \"d:\\\\temp\\\\mavlink\", \"Sensors\": { ... }, \"Parameters\": { ... } } } These settings define the MavLink SystemId and ComponentId for the simulator (SimSysID, SimCompID), for the vehicle (VehicleSysID, VehicleCompID), and for the node that allows remote control of the drone from another app, called the offboard node (OffboardSysID, OffboardCompID). If you want the simulator to also forward MavLink messages to your ground control app (like QGroundControl), you can set the UDP address for that in case you want to run it on a different machine (QgcHostIp, QgcPort). The default is localhost, so QGroundControl should \"just work\" if it is running on the same machine. You can connect the simulator to the LogViewer app, provided in this repo, by setting the UDP address for that (LogViewerHostIp, LogViewerPort). For each flying drone added to the simulator there is a named block of additional settings; above you see the default name \"PX4\". You can change this name from the Unreal Editor when you add a new BP_FlyingPawn asset. You will see these properties grouped under the category \"MavLink\". The MavLink node for this pawn can be remote over UDP, or it can be connected to a local serial port. For a serial connection set UseSerial to true, otherwise set it to false. For serial connections you also need to set the appropriate SerialBaudRate. The default of 115200 works with Pixhawk version 2 over USB. When communicating with the PX4 drone over a serial port, both the HIL_ messages and the vehicle control messages share the same serial port. When communicating over UDP or TCP, PX4 requires two separate channels. If UseTcp is false, then UdpIp and UdpPort are used to send HIL_ messages; otherwise the TcpPort is used. TCP support in PX4 was added in 1.9.2 with the lockstep feature, because the guarantee of message delivery that TCP provides is required for the proper functioning of lockstep. Cosys-AirSim becomes a TCP server in that case and waits for a connection from the PX4 app. The second channel, for controlling the vehicle, is defined by (ControlIp, ControlPortLocal, ControlPortRemote) and is always a UDP channel. The Sensors section can provide customized settings for simulated sensors, see Sensors . The Parameters section can set PX4 parameters during initialization of the PX4 connection. 
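For instance, a hypothetical sketch that relaxes the RC-loss and datalink-loss failsafes during simulation testing ( NAV_RCL_ACT and NAV_DLL_ACT are standard PX4 parameters, shown here purely for illustration): \"Parameters\": { \"NAV_RCL_ACT\": 0, \"NAV_DLL_ACT\": 0 } 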
See Setting up PX4 Software-in-Loop for an example.","title":"Additional PX4 Settings"},{"location":"settings/#using-ardupilot","text":"ArduPilot Copter & Rover vehicles are supported in the latest Cosys-AirSim main branch & releases v1.3.0 and later. For settings and how to use, please see ArduPilot SITL with Cosys-AirSim","title":"Using ArduPilot"},{"location":"settings/#other-settings","text":"","title":"Other Settings"},{"location":"settings/#enginesound","text":"To turn off the engine sound use the setting \"EngineSound\": false . Currently this setting applies only to cars.","title":"EngineSound"},{"location":"settings/#pawnpaths","text":"This allows you to specify your own vehicle pawn blueprints; for example, you can replace the default car in AirSim with your own car. Your vehicle BP can reside in the Content folder of your own Unreal project (i.e. outside the AirSim plugin folder). For example, if you have a car BP located in the file Content\\MyCar\\MySedanBP.uasset in your project, then you can set \"DefaultCar\": {\"PawnBP\":\"Class'/Game/MyCar/MySedanBP.MySedanBP_C'\"} . The XYZ.XYZ_C is a special notation required to specify the class for the BP XYZ . Please note that your BP must be derived from the CarPawn class. By default this is not the case, but you can re-parent the BP using the \"Class Settings\" button in the toolbar of the UE editor after you open the BP, and then choosing \"Car Pawn\" for the Parent Class setting in Class Options. It is also a good idea to disable \"Auto Possess Player\" and \"Auto Possess AI\" as well as set the AI Controller Class to None in the BP details. Please make sure your asset is included for cooking in the packaging options if you are creating a binary.","title":"PawnPaths"},{"location":"settings/#physicsenginename","text":"For cars, we support only PhysX for now (regardless of the value in this setting). For multirotors, we support \"FastPhysicsEngine\" and \"ExternalPhysicsEngine\" . \"ExternalPhysicsEngine\" allows the drone to be controlled via setVehiclePose() , keeping the drone in place until the next call. It is especially useful for moving the AirSim drone using an external simulator or along a saved path.","title":"PhysicsEngineName"},{"location":"settings/#localhostip-setting","text":"When connecting to remote machines you may need to pick a specific network adapter to reach those machines; for example, it might be over Ethernet or over Wi-Fi, or some other special virtual adapter or a VPN. Your PC may have multiple networks, and those networks might not be allowed to talk to each other, in which case the UDP messages from one network will not get through to the others. The LocalHostIp setting allows you to configure how you are reaching those machines. The default of 127.0.0.1 is not able to reach external machines; this default is only used when everything you are talking to is contained on a single PC.","title":"LocalHostIp Setting"},{"location":"settings/#apiserverport","text":"This setting determines the server port used by AirSim clients; the default port is 41451. By specifying different ports, the user can run multiple environments in parallel to accelerate the data collection process.","title":"ApiServerPort"},{"location":"settings/#speedunitfactor","text":"Unit conversion factor for speed relative to m/s ; the default is 1. Used in conjunction with SpeedUnitLabel. This is only used for display purposes, for example the on-display speed when the car is being driven. 
For example, to get speed in miles/hr use the factor 2.23694.","title":"SpeedUnitFactor"},{"location":"settings/#speedunitlabel","text":"Unit label for speed; the default is m/s . Used in conjunction with SpeedUnitFactor.","title":"SpeedUnitLabel"},{"location":"simple_flight/","text":"simple_flight If you don't know what a flight controller does, see What is Flight Controller? . AirSim has a built-in flight controller called simple_flight and it is used by default. You don't need to do anything to use or configure it. AirSim also supports PX4 as another flight controller for advanced users. In the future, we also plan to support ROSFlight and Hackflight . Advantages The advantage of using simple_flight is that it requires zero additional setup and \"just works\". Also, simple_flight uses a steppable clock, which means you can pause the simulation and things are not at the mercy of the high-variance, low-precision clock that the operating system provides. Furthermore, simple_flight is simple, cross-platform and consists of 100% header-only, dependency-free C++ code, which means you can literally switch between the simulator and the flight controller code within the same code base! Design Normally flight controllers are designed to run on the actual hardware of vehicles, and their support for running in a simulator varies widely. They are often fairly difficult to configure for non-expert users and typically have a complex build, usually lacking cross-platform support. All these problems have played a significant part in the design of simple_flight. simple_flight is designed from the ground up as a library with a clean interface that can work onboard the vehicle as well as in the simulator. The core principle is that the flight controller has no way to specify a special simulation mode and therefore it has no way to know if it is running as a simulation or as a real vehicle. We thus view flight controllers simply as a collection of algorithms packaged in a library. Another key emphasis is to develop this code as dependency-free, header-only, pure standard C++11 code. This means there is no special build required to compile simple_flight. You just copy its source code to any project you wish and it just works. Control simple_flight can control vehicles by taking in the desired input as angle rate, angle level, velocity or position. Each axis of control can be specified with one of these modes. Internally, simple_flight uses a cascade of PID controllers to finally generate actuator signals. This means that the position PID drives the velocity PID, which in turn drives the angle level PID, which finally drives the angle rate PID. State Estimation In the current release, we are using the ground truth from the simulator for our state estimation. We plan to add a complementary filter-based state estimator for angular velocity and orientation using 2 sensors (gyroscope, accelerometer) in the near future. In the longer term, we plan to integrate another library to perform velocity and position estimation using 4 sensors (gyroscope, accelerometer, magnetometer and barometer) using an Extended Kalman Filter (EKF). If you have experience in this area, we encourage you to engage with us and contribute! Supported Boards Currently, we have implemented simple_flight interfaces for the simulated board. We plan to implement it for the Pixhawk V2 board and possibly the Naze32 board. 
We expect all our code to remain unchanged and the implementation would mainly involve adding drivers for various sensors, handling ISRs and managing other board-specific details. If you have experience in this area, we encourage you to engage with us and contribute! Configuration To have AirSim use simple_flight, you can specify it in settings.json as shown below. Note that this is the default, so you don't have to do it explicitly. \"Vehicles\": { \"SimpleFlight\": { \"VehicleType\": \"SimpleFlight\" } } By default, a vehicle using simple_flight is already armed, which is why you would see its propellers spinning. However, if you don't want that, then set DefaultVehicleState to Inactive like this: \"Vehicles\": { \"SimpleFlight\": { \"VehicleType\": \"SimpleFlight\", \"DefaultVehicleState\": \"Inactive\" } } In this case, you will need to either manually arm, by placing the RC sticks in the down-inward position, or arm using the APIs. For safety reasons, flight controllers disallow API control unless a human operator has consented to its use using a switch on their RC. Also, when RC control is lost, the vehicle should disable API control and enter hover mode for safety reasons. To simplify things a bit, simple_flight by default enables API control without human consent via the RC, and even when no RC is detected. However, you can change this using the following setting: \"Vehicles\": { \"SimpleFlight\": { \"VehicleType\": \"SimpleFlight\", \"AllowAPIAlways\": true, \"RC\": { \"RemoteControlID\": 0, \"AllowAPIWhenDisconnected\": true } } } Finally, simple_flight uses a steppable clock by default, which means that the clock advances when the simulator tells it to advance (unlike the wall clock, which advances strictly according to the passage of time). This means the clock can be paused, for example, if code hits a breakpoint, and there is zero variance in the clock (clock APIs provided by operating systems might have significant variance unless it is a \"real time\" OS). If you want simple_flight to use a wall clock instead, then use the following setting: \"ClockType\": \"ScalableClock\"","title":"Simple Flight"},{"location":"simple_flight/#simple_flight","text":"If you don't know what a flight controller does, see What is Flight Controller? . AirSim has a built-in flight controller called simple_flight and it is used by default. You don't need to do anything to use or configure it. AirSim also supports PX4 as another flight controller for advanced users. In the future, we also plan to support ROSFlight and Hackflight .","title":"simple_flight"},{"location":"simple_flight/#advantages","text":"The advantage of using simple_flight is that it requires zero additional setup and \"just works\". Also, simple_flight uses a steppable clock, which means you can pause the simulation and things are not at the mercy of the high-variance, low-precision clock that the operating system provides. Furthermore, simple_flight is simple, cross-platform and consists of 100% header-only, dependency-free C++ code, which means you can literally switch between the simulator and the flight controller code within the same code base!","title":"Advantages"},{"location":"simple_flight/#design","text":"Normally flight controllers are designed to run on the actual hardware of vehicles, and their support for running in a simulator varies widely. They are often fairly difficult to configure for non-expert users and typically have a complex build, usually lacking cross-platform support. 
All these problems have played a significant part in the design of simple_flight. simple_flight is designed from the ground up as a library with a clean interface that can work onboard the vehicle as well as in the simulator. The core principle is that the flight controller has no way to specify a special simulation mode and therefore it has no way to know if it is running as a simulation or as a real vehicle. We thus view flight controllers simply as a collection of algorithms packaged in a library. Another key emphasis is to develop this code as dependency-free, header-only, pure standard C++11 code. This means there is no special build required to compile simple_flight. You just copy its source code to any project you wish and it just works.","title":"Design"},{"location":"simple_flight/#control","text":"simple_flight can control vehicles by taking in the desired input as angle rate, angle level, velocity or position. Each axis of control can be specified with one of these modes. Internally, simple_flight uses a cascade of PID controllers to finally generate actuator signals. This means that the position PID drives the velocity PID, which in turn drives the angle level PID, which finally drives the angle rate PID.","title":"Control"},{"location":"simple_flight/#state-estimation","text":"In the current release, we are using the ground truth from the simulator for our state estimation. We plan to add a complementary filter-based state estimator for angular velocity and orientation using 2 sensors (gyroscope, accelerometer) in the near future. In the longer term, we plan to integrate another library to perform velocity and position estimation using 4 sensors (gyroscope, accelerometer, magnetometer and barometer) using an Extended Kalman Filter (EKF). If you have experience in this area, we encourage you to engage with us and contribute!","title":"State Estimation"},{"location":"simple_flight/#supported-boards","text":"Currently, we have implemented simple_flight interfaces for the simulated board. We plan to implement it for the Pixhawk V2 board and possibly the Naze32 board. We expect all our code to remain unchanged and the implementation would mainly involve adding drivers for various sensors, handling ISRs and managing other board-specific details. If you have experience in this area, we encourage you to engage with us and contribute!","title":"Supported Boards"},{"location":"simple_flight/#configuration","text":"To have AirSim use simple_flight, you can specify it in settings.json as shown below. Note that this is the default, so you don't have to do it explicitly. \"Vehicles\": { \"SimpleFlight\": { \"VehicleType\": \"SimpleFlight\" } } By default, a vehicle using simple_flight is already armed, which is why you would see its propellers spinning. However, if you don't want that, then set DefaultVehicleState to Inactive like this: \"Vehicles\": { \"SimpleFlight\": { \"VehicleType\": \"SimpleFlight\", \"DefaultVehicleState\": \"Inactive\" } } In this case, you will need to either manually arm, by placing the RC sticks in the down-inward position, or arm using the APIs. For safety reasons, flight controllers disallow API control unless a human operator has consented to its use using a switch on their RC. Also, when RC control is lost, the vehicle should disable API control and enter hover mode for safety reasons. To simplify things a bit, simple_flight by default enables API control without human consent via the RC, and even when no RC is detected. 
However, you can change this using the following setting: \"Vehicles\": { \"SimpleFlight\": { \"VehicleType\": \"SimpleFlight\", \"AllowAPIAlways\": true, \"RC\": { \"RemoteControlID\": 0, \"AllowAPIWhenDisconnected\": true } } } Finally, simple_flight uses a steppable clock by default, which means that the clock advances when the simulator tells it to advance (unlike the wall clock, which advances strictly according to the passage of time). This means the clock can be paused, for example, if code hits a breakpoint, and there is zero variance in the clock (clock APIs provided by operating systems might have significant variance unless it is a \"real time\" OS). If you want simple_flight to use a wall clock instead, then use the following setting: \"ClockType\": \"ScalableClock\"","title":"Configuration"},{"location":"skid_steer_vehicle/","text":"Unreal Skid Steering Vehicle Model For vehicles that can't use the normal WheeledVehicle setup with steering wheels and non-steering wheels, but instead use skid-steering/differential steering (like a tank), an alternative vehicle model was created. It is built using the Chaos engine of Unreal, which does not support this vehicle type natively; as such, it can behave unrealistically at times. http://www.robotplatform.com/knowledge/Classification_of_Robots/wheel_control_theory.html Creating a new skid steer vehicle The steps to set up the vehicle are largely the same as for a WheeledVehiclePawn , with some slight adjustments. 1. Follow this guide to create the skeletal mesh and physics asset. 2. For the wheel setup, the vehicle should have 4 wheels, 2 for the left side and 2 for the right side. Please use SkidWheel as the wheel class. 3. The vehicle blueprint to create the pawn is also largely the same as in that tutorial; however, as the class one should use the SkidVehiclePawn sub-class. The vehicle setup parameters are more simplified. 4. To have animated wheels, proper physics and correct steering behavior, please take a look at how the CPHusky is configured in the AirSim plugin. The Husky is a skid steer vehicle and can be used as a reference. Skid steer model within AirSim The skid steer model is a separate SimMode within AirSim. It is fully implemented in a similar fashion to the normal Car SimMode. There are already two vehicle types implemented: the ClearPath Husky and the Pioneer P3DX robots. To configure the SimMode and vehicle type see the settings.json file documentation . If you create a new vehicle using the Unreal skid steering vehicle model as described above, one can use the PawnPaths setting in the Common Vehicle Settings in the settings.json file to link the custom vehicle pawn. Note that, due to a bug in ChaosVehicles, setting raw YawInput values when rotating on its axis to the left will cause a small forward movement as well.","title":"Skid Steer Vehicles"},{"location":"skid_steer_vehicle/#unreal-skid-steering-vehicle-model","text":"For vehicles that can't use the normal WheeledVehicle setup with steering wheels and non-steering wheels, but instead use skid-steering/differential steering (like a tank), an alternative vehicle model was created. It is built using the Chaos engine of Unreal, which does not support this vehicle type natively; as such, it can behave unrealistically at times. 
http://www.robotplatform.com/knowledge/Classification_of_Robots/wheel_control_theory.html","title":"Unreal Skid Steering Vehicle Model"},{"location":"skid_steer_vehicle/#creating-a-new-skid-steer-vehicle","text":"The steps to set up the vehicle are largely the same as for a WheeledVehiclePawn , with some slight adjustments. 1. Follow this guide to create the skeletal mesh and physics asset. 2. For the wheel setup, the vehicle should have 4 wheels, 2 for the left side and 2 for the right side. Please use SkidWheel as the wheel class. 3. The vehicle blueprint to create the pawn is also largely the same as in that tutorial; however, as the class one should use the SkidVehiclePawn sub-class. The vehicle setup parameters are more simplified. 4. To have animated wheels, proper physics and correct steering behavior, please take a look at how the CPHusky is configured in the AirSim plugin. The Husky is a skid steer vehicle and can be used as a reference.","title":"Creating a new skid steer vehicle"},{"location":"skid_steer_vehicle/#skid-steer-model-within-airsim","text":"The skid steer model is a separate SimMode within AirSim. It is fully implemented in a similar fashion to the normal Car SimMode. There are already two vehicle types implemented: the ClearPath Husky and the Pioneer P3DX robots. To configure the SimMode and vehicle type see the settings.json file documentation . If you create a new vehicle using the Unreal skid steering vehicle model as described above, one can use the PawnPaths setting in the Common Vehicle Settings in the settings.json file to link the custom vehicle pawn. Note that, due to a bug in ChaosVehicles, setting raw YawInput values when rotating on its axis to the left will cause a small forward movement as well.","title":"Skid steer model within AirSim"},{"location":"steering_wheel_installation/","text":"Logitech G920 Steering Wheel Installation To use the Logitech G920 steering wheel with Cosys-AirSim follow these steps: Connect the steering wheel to the computer and wait until the driver installation completes. Install Logitech Gaming Software from here Before debugging, you\u2019ll have to normalize the values in the Cosys-AirSim code. Perform these changes in CarPawn.cpp (according to the current update in the git): In line 382, change \u201cVal\u201d to \u201c1 \u2013 Val\u201d (the complementary value in the range [0.0,1.0]). In line 388, change \u201cVal\u201d to \u201c5*Val - 2.5\u201d (changes the range of the given input from [0.0,1.0] to [-1.0,1.0]). In line 404, change \u201cVal\u201d to \u201c4*(1 \u2013 Val)\u201d (the complementary value in the range [0.0,1.0]). Debug the Cosys-AirSim project (while the steering wheel is connected \u2013 it\u2019s important). In the Unreal Editor, go to Edit->Plugins->Input Devices and enable \u201cWindows RawInput\u201d. Go to Edit->Project Settings->Raw Input, and add a new device configuration: Vendor ID: 0x046d (in the case of the Logitech G920; otherwise you might need to check it). Product ID: 0xc261 (in the case of the Logitech G920; otherwise you might need to check it). Under \u201cAxis Properties\u201d, make sure that \u201cGenericUSBController Axis 2\u201d, \u201cGenericUSBController Axis 4\u201d and \u201cGenericUSBController Axis 5\u201d are all enabled with an offset of 1.0. Explanation: axis 2 is responsible for the steering movement, axis 4 is for the brake and axis 5 is for the gas. If you need to configure the clutch, it\u2019s on axis 3. 
Go to Edit->Project Settings->Input; under Bindings, in \u201cAxis Mappings\u201d: Remove the existing mappings from the groups \u201cMoveRight\u201d and \u201cMoveForward\u201d. Add a new axis mapping to the group \u201cMoveRight\u201d; use GenericUSBController axis 2 with a scale of 1.0. Add a new axis mapping to the group \u201cMoveForward\u201d; use GenericUSBController axis 5 with a scale of 1.0. Add a new group of axis mappings, name it \u201cFootBrake\u201d, and add a new axis mapping to this group; use GenericUSBController axis 4 with a scale of 1.0. Play and drive! Pay Attention Notice that the first time we \"play\" after debugging, we need to touch the wheel to \u201creset\u201d the values. Tip In the gaming software, you can configure buttons as keyboard shortcuts; we used this to configure a shortcut to record a dataset or to play in full screen.","title":"Steering Wheel"},{"location":"steering_wheel_installation/#logitech-g920-steering-wheel-installation","text":"To use the Logitech G920 steering wheel with Cosys-AirSim follow these steps: Connect the steering wheel to the computer and wait until the driver installation completes. Install Logitech Gaming Software from here Before debugging, you\u2019ll have to normalize the values in the Cosys-AirSim code. Perform these changes in CarPawn.cpp (according to the current update in the git): In line 382, change \u201cVal\u201d to \u201c1 \u2013 Val\u201d (the complementary value in the range [0.0,1.0]). In line 388, change \u201cVal\u201d to \u201c5*Val - 2.5\u201d (changes the range of the given input from [0.0,1.0] to [-1.0,1.0]). In line 404, change \u201cVal\u201d to \u201c4*(1 \u2013 Val)\u201d (the complementary value in the range [0.0,1.0]). Debug the Cosys-AirSim project (while the steering wheel is connected \u2013 it\u2019s important). In the Unreal Editor, go to Edit->Plugins->Input Devices and enable \u201cWindows RawInput\u201d. Go to Edit->Project Settings->Raw Input, and add a new device configuration: Vendor ID: 0x046d (in the case of the Logitech G920; otherwise you might need to check it). Product ID: 0xc261 (in the case of the Logitech G920; otherwise you might need to check it). Under \u201cAxis Properties\u201d, make sure that \u201cGenericUSBController Axis 2\u201d, \u201cGenericUSBController Axis 4\u201d and \u201cGenericUSBController Axis 5\u201d are all enabled with an offset of 1.0. Explanation: axis 2 is responsible for the steering movement, axis 4 is for the brake and axis 5 is for the gas. If you need to configure the clutch, it\u2019s on axis 3. Go to Edit->Project Settings->Input; under Bindings, in \u201cAxis Mappings\u201d: Remove the existing mappings from the groups \u201cMoveRight\u201d and \u201cMoveForward\u201d. Add a new axis mapping to the group \u201cMoveRight\u201d; use GenericUSBController axis 2 with a scale of 1.0. Add a new axis mapping to the group \u201cMoveForward\u201d; use GenericUSBController axis 5 with a scale of 1.0. Add a new group of axis mappings, name it \u201cFootBrake\u201d, and add a new axis mapping to this group; use GenericUSBController axis 4 with a scale of 1.0. 
Play and drive!","title":"Logitech G920 Steering Wheel Installation"},{"location":"steering_wheel_installation/#pay-attention","text":"Notice that the first time we \"play\" after debugging, we need to touch the wheel to \u201creset\u201d the values.","title":"Pay Attention"},{"location":"steering_wheel_installation/#tip","text":"In the gaming software, you can configure buttons as keyboard shortcuts; we used this to configure a shortcut to record a dataset or to play in full screen.","title":"Tip"},{"location":"unreal_blocks/","text":"Setup Blocks Environment for AirSim The Blocks environment is available in the repo in the folder Unreal/Environments/Blocks and is designed to be lightweight in size. That means it is very basic, but fast. Here are the quick steps to get the Blocks environment up and running: Windows from Source Make sure you have built or installed Unreal and built AirSim . Navigate to the folder AirSim\\Unreal\\Environments\\Blocks and run update_from_git.bat . Double click on the generated .sln file to open it in Visual Studio. 
Make sure the Blocks project is the startup project and the build configuration is set to DevelopmentEditor_Editor and Win64 . Hit F5 to run. Press the Play button in the Unreal Editor. Also see the other documentation for how to use it.","title":"Windows from Source"},{"location":"unreal_blocks/#changing-code-and-rebuilding","text":"For Windows, you can just change the code in Visual Studio, press F5, and re-run. There are a few batch files available in the folder AirSim\\Unreal\\Environments\\Blocks that let you sync code, clean up, etc.","title":"Changing Code and Rebuilding"},{"location":"unreal_blocks/#linux-from-source","text":"Make sure you have built or installed the Unreal Engine and AirSim . Navigate to the folder AirSim\\Unreal\\Environments\\Blocks and run update_from_git.sh . Navigate to your UnrealEngine repo folder and run Engine/Binaries/Linux/UE4Editor , which will start the Unreal Editor. On first start you might not see any projects in the UE4 editor. Click on the Projects tab, then the Browse button, and navigate to AirSim/Unreal/Environments/Blocks/Blocks.uproject . If you get prompted about an incompatible version and conversion, select In-place conversion , which is usually under the \"More\" options. If you get prompted about missing modules, make sure to select No so you don't exit. Finally, when prompted about building AirSim, select Yes. Now it might take a while, so go get some coffee :). Press the Play button in the Unreal Editor. Also see the other documentation for how to use it.","title":"Linux from Source"},{"location":"unreal_blocks/#changing-code-and-rebuilding_1","text":"For Linux, make code changes in the AirLib or Unreal/Plugins folder and then run ./build.sh to rebuild. This step also copies the build output to the Blocks sample project. You can then follow the above steps again to re-run.","title":"Changing Code and Rebuilding"},{"location":"unreal_blocks/#choosing-your-vehicle-car-or-multirotor","text":"By default, AirSim spawns a multirotor. You can easily change this to a car and use all of AirSim's goodies. Please see the using car guide.","title":"Choosing Your Vehicle: Car or Multirotor"},{"location":"unreal_blocks/#faq","text":"","title":"FAQ"},{"location":"unreal_blocks/#i-see-warnings-about-like-_builtdata-file-is-missing","text":"These are intermediate files, and you can safely ignore them.","title":"I see warnings like \"_BuiltData\" file is missing."},{"location":"unreal_custenv/","text":"Creating and Setting Up Unreal Environment This page contains complete start-to-finish instructions for setting up an Unreal environment with AirSim. The Unreal Marketplace has several environments available that you can start using in just a few minutes. It is also possible to use environments available on websites such as turbosquid.com or cgtrader.com with a bit more effort (here's a tutorial video ). In addition, there are also several free environments available. Below we will use a freely downloadable environment from the Unreal Marketplace called Landscape Mountains , but the steps are the same for any other environment. Note for Linux Users There is no Epic Games Launcher for Linux, which means that if you need to create a custom environment, you will need a Windows machine to do that. Once you have the Unreal project folder, just copy it over to your Linux machine. Step-by-Step Instructions when using Cosys-AirSim from Precompiled Binaries It is assumed you downloaded the right precompiled Cosys-AirSim plugin from the GitHub releases page for the right Unreal version. 
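Whichever environment you use, once it is playing in the editor, a quick way to confirm the plugin is loaded and the RPC server is reachable is a short Python check — a sketch that assumes the cosysairsim Python client is installed:

```python
# Minimal sanity check (a sketch): with the environment playing in the editor,
# the AirSim RPC server should answer on its default port.
import cosysairsim as airsim

client = airsim.VehicleClient()
client.confirmConnection()  # prints client/server connection info
print("ping:", client.ping())
```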
In the Epic Games Launcher, click the Samples tab, then scroll down and find Landscape Mountains . Click Create Project and download this content (~2GB download). Open LandscapeMountains.uproject ; it should launch the Unreal Editor. !!!note The Landscape Mountains project is supported up to Unreal Engine version 4.24. If you do not have 4.24 installed, you should see a dialog titled `Select Unreal Engine Version` with a dropdown to select from installed versions. Select 5.4 to migrate the project to a supported engine version. If you have 4.24 installed, you can manually migrate the project by navigating to the corresponding .uproject file in Windows Explorer, right-clicking it, and selecting the `Switch Unreal Engine version...` option. Go to the LandscapeMountains project folder and create a new subfolder called Plugins . Now copy the precompiled AirSim plugin folder into this newly created folder. Your own Unreal project now has the AirSim plugin. Edit the LandscapeMountains.uproject to add the AirSim plugin to the list of plugins to load. json { ... \"Plugins\": [ { \"Name\": \"AirSim\", \"Enabled\": true } ] ... } Edit the Config\\DefaultGame.ini to add the following lines at the end: +MapsToCook=(FilePath=\"/AirSim/AirSimAssets\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/HUDAssets\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Beacons\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Blueprints\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Models\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Sensors\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/StarterContent\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/VehicleAdv\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Weather\") Doing this forces Unreal to include all necessary AirSim content in packaged builds of your project. If using Unreal Engine 5.3/5.4, check here for a fix to the camera scene rendering bug in these engine versions! Close the Unreal Editor and restart it by opening the uproject file again. In Window/World Settings as shown below, set the GameMode Override to AirSimGameMode : Go to 'Edit->Editor Preferences' in the Unreal Editor, type 'CPU' in the 'Search' box, and ensure that 'Use Less CPU when in Background' is unchecked. If you don't do this, UE will slow down dramatically when the UE window loses focus. Be sure to Save these edits. Hit the Play button in the Unreal Editor. See the other documentation pages for how to use it. Step-by-Step Instructions when using Cosys-AirSim from Source Build Make sure AirSim is built and Unreal 5.4 is installed as described in the installation instructions . In the Epic Games Launcher, click the Samples tab, then scroll down and find Landscape Mountains . Click Create Project and download this content (~2GB download). Open LandscapeMountains.uproject ; it should launch the Unreal Editor. !!!note The Landscape Mountains project is supported up to Unreal Engine version 4.24. If you do not have 4.24 installed, you should see a dialog titled `Select Unreal Engine Version` with a dropdown to select from installed versions. Select 5.4 to migrate the project to a supported engine version. If you have 4.24 installed, you can manually migrate the project by navigating to the corresponding .uproject file in Windows Explorer, right-clicking it, and selecting the `Switch Unreal Engine version...` option. From the File menu select New C++ class , leave the default None as the type of class, click Next , leave the default name MyClass , and click Create Class . 
We need to do this because Unreal requires at least one source file in the project. This should trigger a compile and open up the Visual Studio solution LandscapeMountains.sln . Go to your AirSim repo folder and copy the Unreal\\Plugins folder into your LandscapeMountains folder. Your own Unreal project now has the AirSim plugin. !!!note If the AirSim installation is fresh, i.e., hasn't been built before, make sure that you run `build.cmd` from the root directory once before copying the `Unreal\\Plugins` folder so that the `AirLib` files are also included. If you have made some changes in the Blocks environment, make sure to run `update_to_git.bat` from `Unreal\\Environments\\Blocks` to update the files in `Unreal\\Plugins`. Edit the LandscapeMountains.uproject so that it looks like this json { \"FileVersion\": 3, \"EngineAssociation\": \"\", \"Category\": \"Samples\", \"Description\": \"\", \"Modules\": [ { \"Name\": \"LandscapeMountains\", \"Type\": \"Runtime\", \"LoadingPhase\": \"Default\", \"AdditionalDependencies\": [ \"AirSim\" ] } ], \"TargetPlatforms\": [ \"MacNoEditor\", \"WindowsNoEditor\" ], \"Plugins\": [ { \"Name\": \"AirSim\", \"Enabled\": true } ] } Edit the Config\\DefaultGame.ini to add the following lines at the end: +MapsToCook=(FilePath=\"/AirSim/AirSimAssets\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/HUDAssets\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Beacons\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Blueprints\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Models\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Sensors\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/StarterContent\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/VehicleAdv\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Weather\") Doing this forces Unreal to include all necessary AirSim content in packaged builds of your project. If using Unreal Engine 5.3/5.4, check here for a fix to the camera scene rendering bug in these engine versions! Close Visual Studio and the Unreal Editor, right-click the LandscapeMountains.uproject in Windows Explorer, and select Generate Visual Studio Project Files . This step detects all plugins and source files in your Unreal project and generates the .sln file for Visual Studio. !!!tip If the `Generate Visual Studio Project Files` option is missing, you may need to reboot your machine for the Unreal shell extensions to take effect. If it is still missing, then open the LandscapeMountains.uproject in the Unreal Editor and select `Refresh Visual Studio Project` from the `File` menu. Reopen LandscapeMountains.sln in Visual Studio, and make sure the \"DebugGame Editor\" and \"Win64\" build configuration is the active build configuration. Press F5 to run . This will start the Unreal Editor. The Unreal Editor allows you to edit the environment, assets, and other game-related settings. The first thing you want to do in your environment is set up the PlayerStart object. In the Landscape Mountains environment, the PlayerStart object already exists, and you can find it in the World Outliner . Make sure its location is set up as shown. This is where the AirSim plugin will create and place the vehicle. If it's too high up, the vehicle will fall as soon as you press play, giving potentially random behavior. In Window/World Settings as shown below, set the GameMode Override to AirSimGameMode : Go to 'Edit->Editor Preferences' in the Unreal Editor, type 'CPU' in the 'Search' box, and ensure that 'Use Less CPU when in Background' is unchecked. If you don't do this, UE will slow down dramatically when the UE window loses focus. 
Be sure to Save these edits. Hit the Play button in the Unreal Editor. See the other documentation pages for how to use it. Congratulations! You are now running AirSim in your own Unreal environment. Updating Your Environment to Latest Version of AirSim Once you have your environment set up using the above instructions, you should frequently update your local AirSim code to the latest version from GitHub. Below are the instructions to do this: First put clean.bat (or clean.sh for Linux users) in the root folder of your environment. Run this file to clean up all intermediate files in your Unreal project. Do git pull in your AirSim repo, followed by build.cmd (or ./build.sh for Linux users). Replace the [your project]/Plugins folder with the AirSim/Unreal/Plugins folder. Right-click on your .uproject file and choose the \"Generate Visual Studio project files\" option. This is not required for Linux. Choosing Your Vehicle: Car or Multirotor By default, AirSim prompts the user for which vehicle to use. You can easily change this by setting SimMode . Please see the using car guide. Unreal 5.3/5.4 Scene camera bug Note that Unreal 5.3 and 5.4 break camera scene rendering when Effects is not set to the Epic scalability preset. You can use the console command r.DetailMode 2 to fix this at runtime! For the Blocks and other available environments, we have made a fix for this. By placing a DefaultScalability.ini file in the Config folder of your Unreal project, you can set the scalability settings to custom values for each preset (low, medium, high, epic, cine). As you can see in the Blocks environment, we have added the following to it to fix this bug automatically. You can find the DefaultScalability.ini file in the Unreal/Environments/Blocks folder. Copy this file to your Unreal project's Config folder. [EffectsQuality@0] r.DetailMode=2 [EffectsQuality@1] r.DetailMode=2 [EffectsQuality@2] r.DetailMode=2 [EffectsQuality@3] r.DetailMode=2 [EffectsQuality@Cine] r.DetailMode=2 FAQ What are other cool environments? The Unreal Marketplace has dozens of prebuilt, extraordinarily detailed environments ranging from Moon to Mars and everything in between. The one we have used for testing is called Modular Neighborhood Pack , but you can use any environment. Another free environment is the Infinity Blade series . Alternatively, if you look under the Learn tab in the Epic Games Launcher, you will find many free samples that you can use. One of our favorites is \"A Boy and His Kite\", which is 100 square miles of highly detailed environment (caution: you will need a very beefy PC to run it!). When I press the Play button, some kind of video starts instead of my vehicle. If the environment comes with a MatineeActor, delete it to avoid any startup demo sequences. There might be other ways to remove it as well; for example, click on the Blueprints button, then Level Blueprint , and then look at the Begin Play event in the Event Graph. You might want to disconnect any connections that may be starting \"matinee\". Is there an easy way to sync code in my Unreal project with code in the AirSim repo? Sure, there is! You can find a bunch of .bat files (for Linux, .sh ) in AirSim\\Unreal\\Environments\\Blocks . Just copy them over to your own Unreal project. Most of these are quite simple and self-explanatory. I get some error about map. You might have to set the default map for your project. For example, if you are using the Modular Neighborhood Pack, set the Editor Starter Map as well as the Game Default Map to Demo_Map in Project Settings > Maps & Modes. 
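Returning to the DefaultScalability.ini workaround above: if you maintain several projects, the file is easy to generate rather than copy by hand. A sketch, with the project path as an assumption:

```python
# Sketch: write the DefaultScalability.ini workaround described above into a
# project's Config folder. "MyUnrealProject" is a placeholder path.
import os

sections = ["0", "1", "2", "3", "Cine"]
ini_text = "\n\n".join(f"[EffectsQuality@{s}]\nr.DetailMode=2" for s in sections) + "\n"

config_dir = os.path.join("MyUnrealProject", "Config")
os.makedirs(config_dir, exist_ok=True)
with open(os.path.join(config_dir, "DefaultScalability.ini"), "w") as f:
    f.write(ini_text)
```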
I see \"Add to project\" option for environment but not \"Create project\" option. In this case, create a new blank C++ project with no Starter Content and add your environment in to it. I already have my own Unreal project. How do I use AirSim with it? Copy the Unreal\\Plugins folder from the build you did in the above section into the root of your Unreal project's folder. In your Unreal project's .uproject file, add the key AdditionalDependencies to the \"Modules\" object as we showed in the LandscapeMountains.uproject above. \"AdditionalDependencies\": [ \"AirSim\" ] and the Plugins section to the top level object: \"Plugins\": [ { \"Name\": \"AirSim\", \"Enabled\": true } ]","title":"Custom Unreal Environment"},{"location":"unreal_custenv/#creating-and-setting-up-unreal-environment","text":"This page contains the complete instructions start to finish for setting up Unreal environment with AirSim. The Unreal Marketplace has several environment available that you can start using in just few minutes. It is also possible to use environments available on websites such as turbosquid.com or cgtrader.com with bit more effort (here's tutorial video ). In addition, there also several free environments available. Below we will use a freely downloadable environment from Unreal Marketplace called Landscape Mountain but the steps are same for any other environments.","title":"Creating and Setting Up Unreal Environment"},{"location":"unreal_custenv/#note-for-linux-users","text":"There is no Epic Games Launcher for Linux which means that if you need to create custom environment, you will need Windows machine to do that. Once you have Unreal project folder, just copy it over to your Linux machine.","title":"Note for Linux Users"},{"location":"unreal_custenv/#step-by-step-instructions-when-using-cosys-airsim-from-precompiled-binaries","text":"It is assumed you downloaded the right precompiled Cosys-AirSim plugin from the GitHub releases page for the right Unreal version. In Epic Games Launcher click the Samples tab then scroll down and find Landscape Mountains . Click the Create Project and download this content (~2GB download). Open LandscapeMountains.uproject , it should launch the Unreal Editor. !!!note The Landscape Mountains project is supported up to Unreal Engine version 4.24. If you do not have 4.24 installed, you should see a dialog titled `Select Unreal Engine Version` with a dropdown to select from installed versions. Select 5.4 to migrate the project to a supported engine version. If you have 4.24 installed, you can manually migrate the project by navigating to the corresponding .uproject file in Windows Explorer, right-clicking it, and selecting the `Switch Unreal Engine version...` option. Go to the LandscapeMountains project folder and create a new subfolder called Plugins . Now copy the precompiled AirSim Plugin folder into this newly created folder. This way now your own Unreal project has AirSim plugin. Edit the LandscapeMountains.uproject so that you add the AirSim plugin to the list of plugins to load. json { ... \"Plugins\": [ { \"Name\": \"AirSim\", \"Enabled\": true } ] ... 
} Edit the Config\\DefaultGame.ini to add the following lines at the end: +MapsToCook=(FilePath=\"/AirSim/AirSimAssets\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/HUDAssets\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Beacons\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Blueprints\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Models\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Sensors\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/StarterContent\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/VehicleAdv\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Weather\") Doing this forces Unreal to include all necessary AirSim content in packaged builds of your project. If using Unreal Engine 5.3/5.4, check here for a fix to the camera scene rendering bug in these engine versions! Close the Unreal Editor and restart it by opening the uproject file again. In Window/World Settings as shown below, set the GameMode Override to AirSimGameMode : Go to 'Edit->Editor Preferences' in the Unreal Editor, type 'CPU' in the 'Search' box, and ensure that 'Use Less CPU when in Background' is unchecked. If you don't do this, UE will slow down dramatically when the UE window loses focus. Be sure to Save these edits. Hit the Play button in the Unreal Editor. See the other documentation pages for how to use it.","title":"Step-by-Step Instructions when using Cosys-AirSim from Precompiled Binaries"},{"location":"unreal_custenv/#step-by-step-instructions-when-using-cosys-airsim-from-source-build","text":"Make sure AirSim is built and Unreal 5.4 is installed as described in the installation instructions . In the Epic Games Launcher, click the Samples tab, then scroll down and find Landscape Mountains . Click Create Project and download this content (~2GB download). Open LandscapeMountains.uproject ; it should launch the Unreal Editor. !!!note The Landscape Mountains project is supported up to Unreal Engine version 4.24. If you do not have 4.24 installed, you should see a dialog titled `Select Unreal Engine Version` with a dropdown to select from installed versions. Select 5.4 to migrate the project to a supported engine version. If you have 4.24 installed, you can manually migrate the project by navigating to the corresponding .uproject file in Windows Explorer, right-clicking it, and selecting the `Switch Unreal Engine version...` option. From the File menu select New C++ class , leave the default None as the type of class, click Next , leave the default name MyClass , and click Create Class . We need to do this because Unreal requires at least one source file in the project. This should trigger a compile and open up the Visual Studio solution LandscapeMountains.sln . Go to your AirSim repo folder and copy the Unreal\\Plugins folder into your LandscapeMountains folder. Your own Unreal project now has the AirSim plugin. !!!note If the AirSim installation is fresh, i.e., hasn't been built before, make sure that you run `build.cmd` from the root directory once before copying the `Unreal\\Plugins` folder so that the `AirLib` files are also included. If you have made some changes in the Blocks environment, make sure to run `update_to_git.bat` from `Unreal\\Environments\\Blocks` to update the files in `Unreal\\Plugins`. 
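The .uproject edit that follows is a plain JSON change, so if you script your project setup you can apply it programmatically. A sketch, assuming the file is named LandscapeMountains.uproject and sits in the current directory:

```python
# Sketch: add the AirSim plugin entry to a .uproject file, equivalent to the
# manual edit described in this guide. The file path is an assumption.
import json

uproject_path = "LandscapeMountains.uproject"
with open(uproject_path) as f:
    project = json.load(f)

plugins = project.setdefault("Plugins", [])
if not any(p.get("Name") == "AirSim" for p in plugins):
    plugins.append({"Name": "AirSim", "Enabled": True})

with open(uproject_path, "w") as f:
    json.dump(project, f, indent=4)
```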
Edit the LandscapeMountains.uproject so that it looks like this json { \"FileVersion\": 3, \"EngineAssociation\": \"\", \"Category\": \"Samples\", \"Description\": \"\", \"Modules\": [ { \"Name\": \"LandscapeMountains\", \"Type\": \"Runtime\", \"LoadingPhase\": \"Default\", \"AdditionalDependencies\": [ \"AirSim\" ] } ], \"TargetPlatforms\": [ \"MacNoEditor\", \"WindowsNoEditor\" ], \"Plugins\": [ { \"Name\": \"AirSim\", \"Enabled\": true } ] } Edit the Config\\DefaultGame.ini to add the following lines at the end: +MapsToCook=(FilePath=\"/AirSim/AirSimAssets\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/HUDAssets\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Beacons\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Blueprints\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Models\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Sensors\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/StarterContent\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/VehicleAdv\") +DirectoriesToAlwaysCook=(Path=\"/AirSim/Weather\") Doing this forces Unreal to include all necessary AirSim content in packaged builds of your project. If using Unreal Engine 5.3/5.4, check here for a fix to the camera scene rendering bug in these engine versions! Close Visual Studio and the Unreal Editor, right-click the LandscapeMountains.uproject in Windows Explorer, and select Generate Visual Studio Project Files . This step detects all plugins and source files in your Unreal project and generates the .sln file for Visual Studio. !!!tip If the `Generate Visual Studio Project Files` option is missing, you may need to reboot your machine for the Unreal shell extensions to take effect. If it is still missing, then open the LandscapeMountains.uproject in the Unreal Editor and select `Refresh Visual Studio Project` from the `File` menu. Reopen LandscapeMountains.sln in Visual Studio, and make sure the \"DebugGame Editor\" and \"Win64\" build configuration is the active build configuration. Press F5 to run . This will start the Unreal Editor. The Unreal Editor allows you to edit the environment, assets, and other game-related settings. The first thing you want to do in your environment is set up the PlayerStart object. In the Landscape Mountains environment, the PlayerStart object already exists, and you can find it in the World Outliner . Make sure its location is set up as shown. This is where the AirSim plugin will create and place the vehicle. If it's too high up, the vehicle will fall as soon as you press play, giving potentially random behavior. In Window/World Settings as shown below, set the GameMode Override to AirSimGameMode : Go to 'Edit->Editor Preferences' in the Unreal Editor, type 'CPU' in the 'Search' box, and ensure that 'Use Less CPU when in Background' is unchecked. If you don't do this, UE will slow down dramatically when the UE window loses focus. Be sure to Save these edits. Hit the Play button in the Unreal Editor. See the other documentation pages for how to use it. Congratulations! You are now running AirSim in your own Unreal environment.","title":"Step-by-Step Instructions when using Cosys-AirSim from Source Build"},{"location":"unreal_custenv/#updating-your-environment-to-latest-version-of-airsim","text":"Once you have your environment set up using the above instructions, you should frequently update your local AirSim code to the latest version from GitHub. Below are the instructions to do this: First put clean.bat (or clean.sh for Linux users) in the root folder of your environment. Run this file to clean up all intermediate files in your Unreal project. 
Do git pull in your AirSim repo, followed by build.cmd (or ./build.sh for Linux users). Replace the [your project]/Plugins folder with the AirSim/Unreal/Plugins folder. Right-click on your .uproject file and choose the \"Generate Visual Studio project files\" option. This is not required for Linux.","title":"Updating Your Environment to Latest Version of AirSim"},{"location":"unreal_custenv/#choosing-your-vehicle-car-or-multirotor","text":"By default, AirSim prompts the user for which vehicle to use. You can easily change this by setting SimMode . Please see the using car guide.","title":"Choosing Your Vehicle: Car or Multirotor"},{"location":"unreal_custenv/#unreal-5354-scene-camera-bug","text":"Note that Unreal 5.3 and 5.4 break camera scene rendering when Effects is not set to the Epic scalability preset. You can use the console command r.DetailMode 2 to fix this at runtime! For the Blocks and other available environments, we have made a fix for this. By placing a DefaultScalability.ini file in the Config folder of your Unreal project, you can set the scalability settings to custom values for each preset (low, medium, high, epic, cine). As you can see in the Blocks environment, we have added the following to it to fix this bug automatically. You can find the DefaultScalability.ini file in the Unreal/Environments/Blocks folder. Copy this file to your Unreal project's Config folder. [EffectsQuality@0] r.DetailMode=2 [EffectsQuality@1] r.DetailMode=2 [EffectsQuality@2] r.DetailMode=2 [EffectsQuality@3] r.DetailMode=2 [EffectsQuality@Cine] r.DetailMode=2","title":"Unreal 5.3/5.4 Scene camera bug"},{"location":"unreal_custenv/#faq","text":"","title":"FAQ"},{"location":"unreal_custenv/#what-are-other-cool-environments","text":"The Unreal Marketplace has dozens of prebuilt, extraordinarily detailed environments ranging from Moon to Mars and everything in between. The one we have used for testing is called Modular Neighborhood Pack , but you can use any environment. Another free environment is the Infinity Blade series . Alternatively, if you look under the Learn tab in the Epic Games Launcher, you will find many free samples that you can use. One of our favorites is \"A Boy and His Kite\", which is 100 square miles of highly detailed environment (caution: you will need a very beefy PC to run it!).","title":"What are other cool environments?"},{"location":"unreal_custenv/#when-i-press-play-button-some-kind-of-video-starts-instead-of-my-vehicle","text":"If the environment comes with a MatineeActor, delete it to avoid any startup demo sequences. There might be other ways to remove it as well; for example, click on the Blueprints button, then Level Blueprint , and then look at the Begin Play event in the Event Graph. You might want to disconnect any connections that may be starting \"matinee\".","title":"When I press the Play button, some kind of video starts instead of my vehicle."},{"location":"unreal_custenv/#is-there-easy-way-to-sync-code-in-my-unreal-project-with-code-in-airsim-repo","text":"Sure, there is! You can find a bunch of .bat files (for Linux, .sh ) in AirSim\\Unreal\\Environments\\Blocks . Just copy them over to your own Unreal project. Most of these are quite simple and self-explanatory.","title":"Is there an easy way to sync code in my Unreal project with code in the AirSim repo?"},{"location":"unreal_custenv/#i-get-some-error-about-map","text":"You might have to set the default map for your project. 
For example, if you are using the Modular Neighborhood Pack, set the Editor Starter Map as well as the Game Default Map to Demo_Map in Project Settings > Maps & Modes.","title":"I get some error about map."},{"location":"unreal_custenv/#i-see-add-to-project-option-for-environment-but-not-create-project-option","text":"In this case, create a new blank C++ project with no Starter Content and add your environment into it.","title":"I see \"Add to project\" option for environment but not \"Create project\" option."},{"location":"unreal_custenv/#i-already-have-my-own-unreal-project-how-do-i-use-airsim-with-it","text":"Copy the Unreal\\Plugins folder from the build you did in the above section into the root of your Unreal project's folder. In your Unreal project's .uproject file, add the key AdditionalDependencies to the \"Modules\" object as we showed in the LandscapeMountains.uproject above. \"AdditionalDependencies\": [ \"AirSim\" ] and the Plugins section to the top-level object: \"Plugins\": [ { \"Name\": \"AirSim\", \"Enabled\": true } ]","title":"I already have my own Unreal project. How do I use AirSim with it?"},{"location":"unreal_proj/","text":"Unreal Environment Setting Up the Unreal Project Option 1: Built-in Blocks Environment To get up and running fast, you can use the Blocks project that already comes with Cosys-AirSim. This is not a very highly detailed environment (to keep the repo size reasonable), but we use it for various testing all the time, and it is the easiest way to get your feet wet in this strange land. Follow these quick steps . Option 2: Create Your Own Unreal Environment If you want to set up photo-realistic, high-quality environments, then you will need to create your own Unreal project. This is a little bit more involved, but worthwhile! Follow this step-by-step guide .","title":"Setting up Unreal Environment"},{"location":"unreal_proj/#unreal-environment","text":"","title":"Unreal Environment"},{"location":"unreal_proj/#setting-up-the-unreal-project","text":"","title":"Setting Up the Unreal Project"},{"location":"unreal_proj/#option-1-built-in-blocks-environment","text":"To get up and running fast, you can use the Blocks project that already comes with Cosys-AirSim. This is not a very highly detailed environment (to keep the repo size reasonable), but we use it for various testing all the time, and it is the easiest way to get your feet wet in this strange land. Follow these quick steps .","title":"Option 1: Built-in Blocks Environment"},{"location":"unreal_proj/#option-2-create-your-own-unreal-environment","text":"If you want to set up photo-realistic, high-quality environments, then you will need to create your own Unreal project. This is a little bit more involved, but worthwhile! Follow this step-by-step guide .","title":"Option 2: Create Your Own Unreal Environment"},{"location":"using_car/","text":"How to Use Car in Cosys-AirSim By default, Cosys-AirSim prompts the user for which vehicle to use. You can easily change this by setting SimMode . For example, if you want to use a car instead, just set the SimMode in your settings.json , which you can find in your ~/Documents/AirSim folder, like this: { \"SettingsVersion\": 2.0, \"SimMode\": \"Car\" } Now when you restart Cosys-AirSim, you should see the car spawned automatically. Manual Driving Please use the keyboard arrow keys to drive manually. Use the spacebar for the handbrake. In manual drive mode, gears are set to \"auto\". 
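As a concrete starting point for the APIs discussed next, here is a minimal Python sketch that drives the car briefly and reads its state back. It assumes the simulator is running with \"SimMode\": \"Car\" and the cosysairsim client is installed:

```python
# Minimal car-driving sketch; assumes the simulator is running in Car mode.
import time
import cosysairsim as airsim

client = airsim.CarClient()
client.confirmConnection()
client.enableApiControl(True)

controls = airsim.CarControls()
controls.throttle = 0.5   # gentle forward throttle
controls.steering = 0.0
client.setCarControls(controls)

time.sleep(3)  # let the car roll for a few seconds

state = client.getCarState()
print("speed:", state.speed, "gear:", state.gear)
```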
Using APIs You can control the car, get its state, and retrieve images by calling APIs in a variety of client languages, including C++ and Python. Please see the APIs doc for more details. Changing Views By default, the camera will chase the car from the back. You can get the FPV view by pressing the F key and switch back to the chase-from-back view by pressing the / key. More keyboard shortcuts can be seen by pressing F1. Cameras By default, the car is installed with 5 cameras: center, left, right, driver, and reverse. You can choose the images from these cameras by specifying the name .","title":"Car Mode"},{"location":"using_car/#how-to-use-car-in-cosys-airsim","text":"By default, Cosys-AirSim prompts the user for which vehicle to use. You can easily change this by setting SimMode . For example, if you want to use a car instead, just set the SimMode in your settings.json , which you can find in your ~/Documents/AirSim folder, like this: { \"SettingsVersion\": 2.0, \"SimMode\": \"Car\" } Now when you restart Cosys-AirSim, you should see the car spawned automatically.","title":"How to Use Car in Cosys-AirSim"},{"location":"using_car/#manual-driving","text":"Please use the keyboard arrow keys to drive manually. Use the spacebar for the handbrake. In manual drive mode, gears are set to \"auto\".","title":"Manual Driving"},{"location":"using_car/#using-apis","text":"You can control the car, get its state, and retrieve images by calling APIs in a variety of client languages, including C++ and Python. Please see the APIs doc for more details.","title":"Using APIs"},{"location":"using_car/#changing-views","text":"By default, the camera will chase the car from the back. You can get the FPV view by pressing the F key and switch back to the chase-from-back view by pressing the / key. More keyboard shortcuts can be seen by pressing F1.","title":"Changing Views"},{"location":"using_car/#cameras","text":"By default, the car is installed with 5 cameras: center, left, right, driver, and reverse. You can choose the images from these cameras by specifying the name .","title":"Cameras"},{"location":"voxel_grid/","text":"AirSim provides a feature that constructs ground truth voxel grids of the world directly from Unreal Engine. A voxel grid is a representation of the occupancy of a given world/map, obtained by discretizing it into cells of a certain size and recording a voxel if that particular location is occupied. The logic for constructing the voxel grid is in WorldSimApi.cpp->createVoxelGrid(). For now, the assumption is that the voxel grid is a cube, and the API call from Python has the structure: simCreateVoxelGrid(self, position, x, y, z, res, of) position (Vector3r): Global position around which the voxel grid is centered, in m x, y, z (float): Size of each voxel grid dimension, in m res (float): Resolution of the voxel grid, in m of (str): Name of the output file to save the voxel grid as Within createVoxelGrid() , the main Unreal Engine function that returns occupancy is OverlapBlockingTestByChannel . OverlapBlockingTestByChannel(position, rotation, ECollisionChannel, FCollisionShape, params); This function is called on the positions of all the 'cells' we wish to discretize the map into, and the returned occupancy result is collected into the array voxel_grid_ . The indexing of the cell occupancy values follows the convention of the binvox format. 
for (float i = 0; i < ncells_x; i++) { for (float k = 0; k < ncells_z; k++) { for (float j = 0; j < ncells_y; j++) { /* x varies fastest, then z, then y, per the binvox convention */ int idx = i + ncells_x * (k + ncells_z * j); FVector position = FVector((i - ncells_x /2) * scale_cm, (j - ncells_y /2) * scale_cm, (k - ncells_z /2) * scale_cm) + position_in_UE_frame; /* a box overlap test of one cell's size marks the cell as occupied */ voxel_grid_[idx] = simmode_->GetWorld()->OverlapBlockingTestByChannel(position, FQuat::Identity, ECollisionChannel::ECC_Pawn, FCollisionShape::MakeBox(FVector(scale_cm /2)), params); } } } The occupancy of the map is calculated iteratively over all discretized cells, which can make it an intensive operation depending on the resolution of the cells and the total size of the area being measured. If the user's map of interest does not change much, it is possible to run the voxel grid operation once on this map, save the voxel grid, and reuse it. For performance, or with dynamic environments, we recommend running the voxel grid generation for a small area around the robot and subsequently using it for local planning purposes. The voxel grids are stored in the binvox format, which can then be converted by the user into an octomap .bt or any other relevant, desired format. Subsequently, these voxel grids/octomaps can be used within mapping/planning. One nifty little utility to visualize a created binvox file is viewvox . Similarly, binvox2bt can convert the binvox to an octomap file. Example voxel grid in Blocks: Blocks voxel grid converted to Octomap format (visualized in rviz): As an example, a voxel grid can be constructed as follows, once the Blocks environment is up and running: import os import cosysairsim as airsim c = airsim.VehicleClient() center = airsim.Vector3r(0, 0, 0) output_path = os.path.join(os.getcwd(), \"map.binvox\") c.simCreateVoxelGrid(center, 100, 100, 100, 0.5, output_path) And visualized through viewvox map.binvox .","title":"Voxel Grid Generator"},{"location":"voxel_grid/#example-voxel-grid-in-blocks","text":"","title":"Example voxel grid in Blocks:"},{"location":"voxel_grid/#blocks-voxel-grid-converted-to-octomap-format-visualized-in-rviz","text":"As an example, a voxel grid can be constructed as follows, once the Blocks environment is up and running: import os import cosysairsim as airsim c = airsim.VehicleClient() center = airsim.Vector3r(0, 0, 0) output_path = os.path.join(os.getcwd(), \"map.binvox\") c.simCreateVoxelGrid(center, 100, 100, 100, 0.5, output_path) And visualized through viewvox map.binvox .","title":"Blocks voxel grid converted to Octomap format (visualized in rviz):"},{"location":"working_with_plugin_contents/","text":"How to use plugin contents Plugin contents are not shown in Unreal projects by default. To view plugin content, you need to click on a few semi-hidden buttons: Caution Changes you make in the content folder are changes to binary files, so be careful.","title":"Working with UE Plugin Contents"},{"location":"working_with_plugin_contents/#how-to-use-plugin-contents","text":"Plugin contents are not shown in Unreal projects by default. To view plugin content, you need to click on a few semi-hidden buttons: Caution Changes you make in the content folder are changes to binary files, so be careful.","title":"How to use plugin contents"},{"location":"xbox_controller/","text":"XBox Controller To use an XBox controller with AirSim, follow these steps: Connect the XBox controller so it shows up in your PC Game Controllers: Launch QGroundControl and you should see a new Joystick tab under settings: Now calibrate the radio and set up some handy button actions. 
For example, I set mine so that the 'A' button arms the drone, 'B' puts it in manual flight mode, 'X' puts it in altitude hold mode, and 'Y' puts it in position hold mode. I also prefer the feel of the controller when I check the box labelled \"Use exponential curve on roll,pitch, yaw\" because this gives me more sensitivity for small movements. QGroundControl will find your Pixhawk via the UDP proxy port 14550 set up by MavLinkTest above. AirSim will find your Pixhawk via the other UDP server port 14570, also set up by MavLinkTest above. You can also use all the QGroundControl controls for autonomous flying at this point too. Connect to the Pixhawk serial port using MavLinkTest.exe like this: MavLinkTest.exe -serial:*,115200 -proxy:127.0.0.1:14550 -server:127.0.0.1:14570 Run the AirSim Unreal simulator with these ~/Documents/AirSim/settings.json settings: \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\", \"SitlIp\": \"\", \"SitlPort\": 14560, \"UdpIp\": \"127.0.0.1\", \"UdpPort\": 14570, \"UseSerial\": false } } Advanced If the Joystick tab doesn't show up in QGroundControl: Click on the purple \"Q\" icon on the left of the toolbar to reveal the Preferences panel. Go to the General tab and check the Virtual Joystick checkbox. Go back to the settings screen (gears icon), click on the Parameters tab, type COM_RC_IN_MODE in the search box, and change its value to either Joystick/No RC Checks or Virtual RC by Joystick . Other Options See remote controller options","title":"XBox Controller"},{"location":"xbox_controller/#xbox-controller","text":"To use an XBox controller with AirSim, follow these steps: Connect the XBox controller so it shows up in your PC Game Controllers: Launch QGroundControl and you should see a new Joystick tab under settings: Now calibrate the radio and set up some handy button actions. For example, I set mine so that the 'A' button arms the drone, 'B' puts it in manual flight mode, 'X' puts it in altitude hold mode, and 'Y' puts it in position hold mode. I also prefer the feel of the controller when I check the box labelled \"Use exponential curve on roll,pitch, yaw\" because this gives me more sensitivity for small movements. QGroundControl will find your Pixhawk via the UDP proxy port 14550 set up by MavLinkTest above. AirSim will find your Pixhawk via the other UDP server port 14570, also set up by MavLinkTest above. You can also use all the QGroundControl controls for autonomous flying at this point too. Connect to the Pixhawk serial port using MavLinkTest.exe like this: MavLinkTest.exe -serial:*,115200 -proxy:127.0.0.1:14550 -server:127.0.0.1:14570 Run the AirSim Unreal simulator with these ~/Documents/AirSim/settings.json settings: \"Vehicles\": { \"PX4\": { \"VehicleType\": \"PX4Multirotor\", \"SitlIp\": \"\", \"SitlPort\": 14560, \"UdpIp\": \"127.0.0.1\", \"UdpPort\": 14570, \"UseSerial\": false } }","title":"XBox Controller"},{"location":"xbox_controller/#advanced","text":"If the Joystick tab doesn't show up in QGroundControl: Click on the purple \"Q\" icon on the left of the toolbar to reveal the Preferences panel. Go to the General tab and check the Virtual Joystick checkbox. Go back to the settings screen (gears icon), click on the Parameters tab, type COM_RC_IN_MODE in the search box, and change its value to either Joystick/No RC Checks or Virtual RC by Joystick .","title":"Advanced"},{"location":"xbox_controller/#other-options","text":"See remote controller options","title":"Other Options"}]}
\ No newline at end of file
diff --git a/sensors/index.html b/sensors/index.html
index d646881..8303e33 100755
--- a/sensors/index.html
+++ b/sensors/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/settings/index.html b/settings/index.html
index 3effa0a..9506f92 100755
--- a/settings/index.html
+++ b/settings/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/simple_flight/index.html b/simple_flight/index.html
index 9a4f84e..6d02df4 100755
--- a/simple_flight/index.html
+++ b/simple_flight/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/sitemap.xml.gz b/sitemap.xml.gz
index dc3ec12..dc78b1a 100755
Binary files a/sitemap.xml.gz and b/sitemap.xml.gz differ
diff --git a/skid_steer_vehicle/index.html b/skid_steer_vehicle/index.html
index 2da75c8..2538647 100755
--- a/skid_steer_vehicle/index.html
+++ b/skid_steer_vehicle/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/steering_wheel_installation/index.html b/steering_wheel_installation/index.html
index 142eb91..f9115c2 100755
--- a/steering_wheel_installation/index.html
+++ b/steering_wheel_installation/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
- Support
-
diff --git a/unreal_blocks/index.html b/unreal_blocks/index.html
index 8fdfb07..e3f9d30 100755
--- a/unreal_blocks/index.html
+++ b/unreal_blocks/index.html
@@ -59,6 +59,8 @@
Custom Unreal Environment
+ Dynamic Objects
+
Using AirSim
-
- Who is Using AirSim
Working with UE Plugin Contents
Formula Student Technion Self-drive
- Support
-
diff --git a/unreal_custenv/index.html b/unreal_custenv/index.html
index 9ca44be..c3bfd91 100755
--- a/unreal_custenv/index.html
+++ b/unreal_custenv/index.html
@@ -91,6 +91,8 @@
+ Dynamic Objects
+
Using AirSim
- Support
-
@@ -508,7 +501,7 @@