The gym simulation environment used in our IEEE-IV paper "Uncertainty-Aware DRL for Autonomous Vehicle Crowd Navigation in Shared Space"



PedMove_gym

This repository provides a 2D gym environment that simulates pedestrian trajectories in the presence of vehicles within a shared space. It is designed for training and testing decision-making algorithms for the ego vehicle navigating among pedestrians. The pedestrian trajectory behaviors are derived from the HBS dataset, which was captured in a shared space in Germany.

A unique feature of this gym environment is its integration with an uncertainty-aware, data-driven pedestrian trajectory prediction algorithm called the Uncertainty-aware Polar Collision Grid (UAW-PCG). As a result, the UAW-PCG algorithm's predictions are available as part of the states provided by this simulation environment.

This gym environment was used in our paper to train a deep reinforcement learning-based navigation algorithm for an autonomous vehicle navigating among pedestrians in a shared space. It is part of the complete code provided for implementing our algorithm within the paper and can also be readily used for the development and evaluation of other reinforcement learning algorithms for autonomous vehicle navigation in crowded environments.
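As a sketch of how such an environment is typically driven, the standard gym-style interaction loop is shown below. The class name, observation fields, and action format here are illustrative placeholders standing in for the real environment in `envs/`, not the repository's actual API:

```python
# Illustrative only: a minimal stand-in for the PedMove_gym environment,
# showing the standard gym-style reset/step loop it supports. The real
# class, observation contents, and action space live in envs/ and differ.
class CrowdNavEnvStub:
    """Hypothetical stand-in: a fixed-length episode with dummy states."""

    def __init__(self, n_steps=5):
        self.n_steps = n_steps
        self.t = 0

    def reset(self):
        self.t = 0
        # A real observation would include pedestrian states and,
        # when enabled, UAW-PCG trajectory predictions.
        return {"robot_state": [0.0, 0.0], "ped_predictions": []}

    def step(self, action):
        self.t += 1
        obs = {"robot_state": [float(self.t), 0.0], "ped_predictions": []}
        reward = 0.0
        done = self.t >= self.n_steps
        return obs, reward, done, {}


env = CrowdNavEnvStub()
obs = env.reset()
done = False
steps = 0
while not done:
    action = (0.0, 0.0)  # placeholder ego-vehicle action (e.g. from a policy)
    obs, reward, done, info = env.step(action)
    steps += 1
print(steps)  # -> 5
```

In the actual repository the ego-vehicle action would come from a trained DRL policy (or from the replayed human-driver trajectory, as described under Usage).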

Figure: Scenario 295 from the HBS dataset [1]

[1]: Trajectories of a shared space in Hamburg (HBS dataset)

Image taken from: Cheng, Hao, Fatema T. Johora, Monika Sester, and Jörg P. Müller. "Trajectory modelling in shared spaces: Expert-based vs. deep learning approach?" In Multi-Agent-Based Simulation XXI: 21st International Workshop, MABS 2020, Auckland, New Zealand, May 10, 2020, Revised Selected Papers 21, pp. 13-27. Springer International Publishing, 2021.

Installation

Create a virtual environment or conda environment using Python 3.9, and install the required Python packages:

pip install -r requirements.txt

Install PyTorch version 1.12.1 (see the official PyTorch installation instructions):

pip install torch==1.12.1 torchvision==0.13.1 torchaudio==0.12.1

Overview

This repository is organized into two main folders:

  • envs/: Contains the files for the simulation environment.

  • ped_pred/: Includes files for running inference on our pedestrian trajectory prediction model, UAW-PCG.

This simulation environment consists of 310 scenarios extracted from the HBS dataset, each scenario corresponding to one of the vehicles present in this dataset. These scenarios are divided into training, testing, and validation sets for robust evaluation.

Pedestrian trajectory behaviors are realistically simulated using real-world data from the HBS dataset. Meanwhile, the ego vehicle's actions can be determined by any deep reinforcement learning (DRL) algorithm.

A key advantage of this environment is its incorporation of real human-driven trajectories for each scenario. This allows for direct comparison between human decisions and the trajectories generated by trained autonomous vehicle navigation policies.

Usage

This section explains the configuration options available in two key files:

  • PedMove_gym/config.py: This file allows you to adjust various simulation parameters. Here are some key options:

    • sim.predict_method: This parameter controls the pedestrian trajectory prediction integration within the simulation environment. It offers three options:

      • inferred: Generates pedestrians' predicted trajectories using the UAW-PCG predictor model.
      • none: Disables any prediction, relying only on the current state.
      • truth: Outputs the ground truth prediction from the dataset (for evaluation purposes).
    • sim.render: Set this parameter to True to visualize the simulation environment. Scenarios will be stored as GIFs in the Simulated_scenarios/gifs directory.

    • action_space.kinematics: This parameter defines the kinematics option for the ego vehicle (robot): holonomic or unicycle.

  • PedMove_gym/arguments.py: This file allows you to specify scenarios and execution phases. Here are some key options:

    • phase: This parameter defines the execution phase for the scenarios: (train, val, test)

    • test_case: This parameter specifies the scenarios to run within the chosen phase:

      • -1: Runs all scenarios within the chosen phase.

      • A single number in the range [0-310]: Runs the specified scenario number.

        Note: Enter a valid scenario number based on the following split for train, validation, and test sets:

        • validation: scenario numbers: 0 - 48
        • train: scenario numbers: 48 - 247
        • test: scenario numbers: 248 - 310
    • consider-veh: This boolean parameter specifies whether to include and visualize other vehicles in the scenario besides the ego vehicle.
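For illustration, the phase split listed above can be encoded as a small helper. This function is hypothetical and not part of the repository; since the listed validation and training ranges both mention scenario 48, the boundary index 48 is assigned to the training set here as an assumption:

```python
def scenario_phase(n):
    """Map a scenario index to its phase per the split listed above.
    Assumption (not confirmed by the repo): boundary index 48 -> train."""
    if 0 <= n < 48:
        return "val"
    if 48 <= n <= 247:
        return "train"
    if 248 <= n <= 310:
        return "test"
    raise ValueError(f"scenario {n} is outside the valid range [0, 310]")


print(scenario_phase(30), scenario_phase(100), scenario_phase(300))
# -> val train test
```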

This repository focuses on providing a gym simulation environment for pedestrian trajectory behavior in the presence of vehicles within a shared space. It also explores the integration of this environment with a pedestrian trajectory prediction model. Therefore, the environment currently uses real driver trajectories from the dataset for action calculation of the ego vehicle (by setting robot.human_driver to True). However, the framework allows this behavior to be replaced with a decision-making algorithm. An example of such an algorithm can be found in UncertaintyAware_DRL_CrowdNav.
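The swap described above can be pictured as choosing between two action sources. The function names and signatures below are hypothetical illustrations of that design, not the repository's actual API:

```python
# Illustrative only: the two action sources that a flag like
# robot.human_driver conceptually toggles between. Names and signatures
# are hypothetical, not the repository's API.
def replayed_action(t, logged_actions):
    """Replay the recorded human driver's action at timestep t."""
    return logged_actions[t]


def policy_action(obs, policy):
    """Query a decision-making algorithm (e.g. a trained DRL policy)."""
    return policy(obs)


# A trivial stand-in policy that always commands zero controls:
zero_policy = lambda obs: (0.0, 0.0)

log = [(1.0, 0.1), (0.8, 0.0)]  # made-up (acceleration, steering) log
print(replayed_action(1, log))        # -> (0.8, 0.0)
print(policy_action({}, zero_policy))  # -> (0.0, 0.0)
```

Keeping both sources behind the same interface is what lets the environment serve both for replaying human behavior and for training or evaluating new policies.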

Citation

@article{golchoubian2024uncertainty,
  title={Uncertainty-Aware DRL for Autonomous Vehicle Crowd Navigation in Shared Space},
  author={Golchoubian, Mahsa and Ghafurian, Moojan and Dautenhahn, Kerstin and Azad, Nasser Lashgarian},
  journal={IEEE Transactions on Intelligent Vehicles},
  year={2024},
  publisher={IEEE}
}
