See what the European Rover Challenge Simulation Task is actually about.
What does performing a space mission simulation for the European Rover Challenge competition look like? You’re about to find out. And what if I told you that you can take the wheel and navigate a Martian rover across the Red Planet’s simulated surface yourself? Read on to learn how :).
To qualify for the remote formula of the ERC competition, candidates have to complete the Simulation Task using Gazebo. They showcase their efforts by creating a 2-minute screencast video, which is then evaluated. In the video, they’re expected to present a simulation of the Leo Rover robot traversing the terrain with some added features, as well as a simulation of a robotic arm. The maximum score for this task is 45 points.
The ERC Simulation Task aims to help teams strengthen and verify their software concepts, hone their teleoperation skills before using them in the field on a real robot, and test their ability to describe and demonstrate their work in a short video.
The Gazebo simulation we’ve made for the ERC consists of a model of the Mars yard, the Leo Rover robot with attached hardware, and the software. We used a drone to 3D-scan the area and create the three-dimensional terrain model, so it is an exact copy of the actual ERC Mars yard.
Below, we present three selected teams’ approaches to the ERC 2022 Simulation Task.
In their short video, the DJS Antariksh team from Dwarkadas J. Sanghvi College of Engineering in India showed how they controlled the Leo Rover using a keyboard teleoperation node.
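A keyboard teleoperation node boils down to mapping key presses to velocity commands. Here’s a minimal sketch of that mapping in Python; the key bindings and speed limits are our assumptions for illustration, not the team’s actual code (in a real ROS node these values would be published as a `geometry_msgs/Twist` message).

```python
# Hypothetical key-to-velocity mapping for a keyboard teleop node.
# In ROS, the returned pair would fill the linear.x and angular.z
# fields of a Twist message sent to the rover's velocity topic.

KEY_BINDINGS = {
    "w": (1.0, 0.0),   # forward
    "s": (-1.0, 0.0),  # backward
    "a": (0.0, 1.0),   # turn left
    "d": (0.0, -1.0),  # turn right
}

MAX_LINEAR = 0.4          # m/s -- assumed speed cap for the rover
MAX_ANGULAR = 1.0         # rad/s -- assumed turn-rate cap

def key_to_twist(key):
    """Return (linear_x, angular_z) velocities for a pressed key.

    Unbound keys stop the rover, which is a common safety default.
    """
    lin, ang = KEY_BINDINGS.get(key, (0.0, 0.0))
    return lin * MAX_LINEAR, ang * MAX_ANGULAR
```

A real teleop node would additionally read raw keystrokes from the terminal and publish at a fixed rate, but the command logic is essentially the table above.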
To detect AR tags on the landmarks and relay their positions, the robot used the ar_track_alvar library (see how to add Alvar to a Leo Rover). Using the onboard odometry sensors, the rover localized itself when given a goal. With the RTAB-Map algorithm, the robot created a 3D map of the area.
Then, it identified obstacles and planned a route. In the simulation of the UR3 robotic arm, the team used MoveIt for motion planning, navigation, and 3D perception. They controlled the arm with a joystick. MoveIt generated an occupancy grid based on depth data from the RealSense camera. It avoided collisions with the Flexible Collision Library (FCL), and the Open Motion Planning Library (OMPL) allowed it to plan a safe path for the robotic arm. AR tags were identified using the ar_track_alvar library.
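The core idea behind building an occupancy grid from depth data is simple: project the camera’s depth readings into 3D points and bucket them into voxels. The sketch below shows that discretization step in plain Python; it is a simplified stand-in for the octree-based map MoveIt actually builds, with the voxel size chosen arbitrarily.

```python
def points_to_occupancy(points, voxel_size=0.05):
    """Discretize 3D points (e.g. from a depth camera) into a set of
    occupied voxel indices.

    This is a simplified stand-in for the occupancy map a motion
    planner consults: any voxel containing at least one point is
    treated as an obstacle the arm must avoid.
    """
    occupied = set()
    for x, y, z in points:
        # Integer voxel index along each axis (floor division).
        occupied.add((int(x // voxel_size),
                      int(y // voxel_size),
                      int(z // voxel_size)))
    return occupied
```

A planner like OMPL would then reject any arm configuration whose geometry intersects an occupied voxel, which is the collision check FCL performs.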
The team scored 28 points out of 45 for their video.
Another team that presented their Simulation Task video is Mars Rover Manipal from Manipal Academy of Higher Education in India. They spawned the Leo Rover mobile robot in the Gazebo simulator, with the Hazcam and Navcam feeds visualized in RViz. An RViz plugin allowed them to visualize the IMU data. The team teleoperated the rover with an Xbox 360 controller. By converting the depth information from the camera to laser scan data, the simulated rover was able to detect and avoid obstacles.
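Converting a depth image to a laser scan means taking one row of depth pixels and turning each into an (angle, range) pair using the camera intrinsics. Below is a hedged sketch of that geometry in Python; the function name and interface are ours, but the math mirrors what tools like the ROS depthimage_to_laserscan package do.

```python
import math

def depth_row_to_scan(depths, fx, cx):
    """Convert one row of a depth image (metres) to laser-scan style
    (angle, range) pairs.

    depths -- per-pixel depth along the camera's optical axis
    fx     -- focal length in pixels
    cx     -- principal point (optical centre column)
    """
    scan = []
    for u, z in enumerate(depths):
        if z <= 0 or math.isinf(z):
            continue                 # skip invalid depth readings
        x = (u - cx) * z / fx        # lateral offset via pinhole model
        angle = math.atan2(x, z)     # bearing of the ray
        rng = math.hypot(x, z)       # Euclidean range to the point
        scan.append((angle, rng))
    return scan
```

The resulting flat scan can feed the same 2D obstacle-avoidance stack a real lidar would, which is exactly why this conversion is so handy on a camera-only rover.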
Thanks to this solution, the team could better analyze the terrain. For autonomous traversal, they developed a custom implementation of the A* algorithm using SLAM. The designated route contained all the required waypoints where the robot dropped the probes. The UR3 robotic arm was simulated with Gazebo, RViz, and MoveIt; the team attached a Robotiq 2F gripper to it and designed a panel. They teleoperated the arm with an Xbox One controller.
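A* is a natural fit here: the SLAM map discretizes into an occupancy grid, and A* finds the shortest obstacle-free route through it. The team’s implementation is their own; the sketch below is a generic, self-contained version of the algorithm on a 4-connected grid, just to show the mechanics.

```python
import heapq

def astar(grid, start, goal):
    """A* shortest path on a 4-connected occupancy grid.

    grid[r][c] == 1 marks an obstacle. Returns a list of (row, col)
    cells from start to goal, or None if no path exists.
    """
    rows, cols = len(grid), len(grid[0])

    def h(cell):  # Manhattan distance: admissible on a 4-connected grid
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_set = [(h(start), 0, start)]   # (f = g + h, g, cell)
    came_from, g_score, closed = {}, {start: 0}, set()
    while open_set:
        _, g, cur = heapq.heappop(open_set)
        if cur in closed:
            continue                     # stale queue entry
        closed.add(cur)
        if cur == goal:                  # reconstruct path backwards
            path = [cur]
            while cur in came_from:
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        r, c = cur
        for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nb
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_score.get(nb, float("inf")):
                    g_score[nb] = ng
                    came_from[nb] = cur
                    heapq.heappush(open_set, (ng + h(nb), ng, nb))
    return None
```

On a real rover the grid cells would come from the SLAM map and the cost of a cell would reflect terrain difficulty, but the search itself is unchanged.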
The team scored 33 points out of 45 for their video.
And here’s what Project RED from the University of Modena and Reggio Emilia in Italy demonstrated in their video. To show waypoint locations, the team added extra features to the simulated terrain. The Leo Rover robot was teleoperated manually with various controllers.
The team then presented live feeds from the depth sensor and RGB cameras, and demonstrated reading the IMU data. For obstacle avoidance, they generated both local and global costmaps for path planning.
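A costmap is more than a binary obstacle map: cells near obstacles get an elevated cost so the planner keeps a safety margin. The snippet below is a simplified, assumption-laden version of the inflation idea used in ROS costmaps (real implementations decay cost gradually with distance; here we use a single inflated cost value for brevity).

```python
def inflate_obstacles(grid, radius):
    """Build a toy costmap from a binary occupancy grid.

    Obstacle cells (grid[r][c] == 1) get cost 100; cells within
    `radius` (Chebyshev distance) of an obstacle get cost 50.
    This mimics, very roughly, a costmap inflation layer.
    """
    rows, cols = len(grid), len(grid[0])
    cost = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1:
                for nr in range(max(0, r - radius), min(rows, r + radius + 1)):
                    for nc in range(max(0, c - radius), min(cols, c + radius + 1)):
                        new = 100 if (nr, nc) == (r, c) else 50
                        cost[nr][nc] = max(cost[nr][nc], new)
    return cost
```

The “global” costmap covers the whole known map for long-range planning, while the “local” one is a small rolling window around the robot fed by live sensor data; both are typically built with the same inflation logic.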
The rover detected AR tags and estimated their positions. For autonomous navigation (see how to perform autonomous navigation on a Leo Rover) and probe deployment, the Project RED team used custom interfaces. In their simulation of the Maintenance Task, they used additional features to test various routines. They teleoperated the robotic arm with a joystick. For constrained path planning, the team added fictitious obstacles. They also presented a simulated pick-and-place operation, as well as a custom interface for performing competition operations autonomously.
The team scored 37 points out of 45 for their video.
The teams presented above are merely three of the many participants in the ERC 2022 Simulation Task. Feel free to check out the other teams’ videos:
ADVAIT from India
Alma-X Rover Team from Italy
ERIG from Germany
GTU Rover from Turkey
IIT Bombay Mars Rover Team from India
INFERNO DTU from India
IUT Mars Rover - Team Avijatrik from Bangladesh
Kapsul Rover Team from Turkey
Mind Cloud from Egypt
OzU Rover from Turkey
Project Scorpio from Poland
Red Giant Rover Team from Turkey
Robocol from Colombia
SHUNYA from India
Team Anveshak from India
TEAM AURORA from India
Team Interplanetar from Bangladesh
Team Phoenix from Bangladesh
Team Robocon IITR from India
Do you want to play with the Leo Rover robot in the simulated Mars yard too? Say no more! The Gazebo simulation for the ERC task is free and available to everyone. You don’t even have to be an ERC contestant :). Go to our tutorial to see how to run the simulation on your computer. Enjoy!