Abstract: |
Autonomous ground robot navigation in outdoor unstructured environments is challenging due to the complexity of real-world scenes. Gathering training data in these conditions is costly and risky, which underscores the need for simulation; however, few simulators target such settings. To fill this gap, we introduce MIDGARD, an open-source platform for outdoor navigation based on Unreal Engine. It offers photorealistic environments, procedural scene generation, and OpenAI Gym compatibility, overcoming limitations of existing platforms such as CARLA, AirSim, HABITAT, and OAISYS.
MIDGARD is a flexible, high-performance, open-source platform designed for outdoor navigation in unstructured environments. Users can configure and extend the core engine, which ships with a wide suite of ready-to-use sensors and exposes internal state variables for training or reward computation in a reinforcement learning setting. One of MIDGARD's key features is its ability to generate varied, dynamic simulated scenes on the fly. Scene generation is fully procedural and requires no human intervention beyond an initial setup stage: scene descriptors define the base map for each scene type and a set of placeable world objects, each specified by its 3D polygonal mesh, attributes, and instantiation constraints. The virtual agent in the simulated scene is composed of two modules: the perception module and the control suite. The perception module provides a full set of sensors designed for navigation, including vision sensors and low-level sensors for measuring the agent's state. The control suite includes two types of four-wheeled vehicles.