Unity is hosting a workshop at ROS World 2021 to showcase robotics simulation using Unity and ROS2.
Many industrial scenarios, such as order fulfillment in warehouses and distribution centers, require multiple robots to operate in a coordinated fashion. Instead of relying on one complex humanoid robot to perform every task, these systems use multiple specialized robots. Advances brought by ROS2 have made it easier to coordinate complex multi-robot systems, but testing them in simulation remains challenging.
In this workshop, you’ll learn how to use the Unity engine and ROS2 to simulate multiple mobile robots that must perform object pose estimation in order to complete a “find-and-ferry” task in a dynamic warehouse environment.
The workshop will take place on October 19, 2021, from 10 am to 2 pm CT, and is open to ROS World attendees. It is broken into four main phases.
| Workshop Agenda | |
| --- | --- |
| Phase I | Unity and environment setup |
| Phase II | Single robot navigation |
| Phase III | Find-and-ferry |
| Phase IV | Using object pose estimation |
Phase I: Unity and environment setup
The workshop will begin with an introduction to the Unity Editor and the download of all required packages and assets. Attendees will then learn how to import a robot using the URDF Importer to set up the FindBot, and will load and configure the warehouse environment and add a cube. Finally, attendees will set up the ROS-TCP-Connector and ROS-TCP-Endpoint to establish communication between ROS2 and Unity (a minimal sketch of this connection appears after the takeaways below).
Takeaways:
- How to create a Unity Scene
- How to use the Package Manager to download and install Unity packages
- How to manipulate objects in the scene
- How to set up ROS-TCP communication
- How to use the URDF Importer to import a robot
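To give a flavor of the ROS-TCP communication you will set up, here is a minimal sketch of a ROS2 publisher whose messages a Unity subscriber created with the ROS-TCP-Connector could receive once the ROS-TCP-Endpoint is running. The node and topic names are illustrative assumptions, not the workshop's actual names.

```python
# Minimal sketch: a ROS2 publisher that a Unity-side ROS-TCP-Connector
# subscriber could listen to. Node and topic names are assumptions.
import rclpy
from rclpy.node import Node
from std_msgs.msg import String


class UnityHello(Node):
    def __init__(self):
        super().__init__('unity_hello')
        # The Unity scene would subscribe to this topic through the endpoint.
        self.pub = self.create_publisher(String, 'unity_test', 10)
        self.timer = self.create_timer(1.0, self.tick)

    def tick(self):
        msg = String()
        msg.data = 'hello from ROS2'
        self.pub.publish(msg)


def main():
    rclpy.init()
    rclpy.spin(UnityHello())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```

On the ROS2 side, the ROS-TCP-Endpoint acts as the bridge: once it is running, topics published by ordinary nodes like this one become visible to the Unity scene, and vice versa.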
Phase II: Single robot navigation
With the scene and robot properly loaded, this phase of the workshop will focus on autonomous navigation. Attendees will learn to connect various sensors to the robot, and ROS2 will run Nav2 using the sensor data generated from the Unity scene. At the end of this phase, you will have a robot that can autonomously navigate from point A to point B (a minimal goal-sending sketch follows the takeaways below).
Takeaways:
- How to add sensors to a robot
- How to set up Nav2 publishers and subscribers
- How to set up message visualizers
- How to run the Nav2 simulation in Unity
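As a sketch of what "point A to B" looks like in code, here is a minimal rclpy action client that sends a NavigateToPose goal to Nav2. The node name, frame, and target coordinates are illustrative assumptions.

```python
# Minimal sketch: send a NavigateToPose goal to Nav2 from a ROS2 node.
# Node name, frame, and coordinates are assumptions for illustration.
import rclpy
from rclpy.action import ActionClient
from rclpy.node import Node
from geometry_msgs.msg import PoseStamped
from nav2_msgs.action import NavigateToPose


class GoToPoint(Node):
    def __init__(self):
        super().__init__('go_to_point')
        # Nav2 exposes navigation through the 'navigate_to_pose' action.
        self._client = ActionClient(self, NavigateToPose, 'navigate_to_pose')

    def send_goal(self, x, y):
        goal = NavigateToPose.Goal()
        pose = PoseStamped()
        pose.header.frame_id = 'map'  # assumes the usual Nav2 map frame
        pose.header.stamp = self.get_clock().now().to_msg()
        pose.pose.position.x = x
        pose.pose.position.y = y
        pose.pose.orientation.w = 1.0
        goal.pose = pose
        self._client.wait_for_server()
        return self._client.send_goal_async(goal)


def main():
    rclpy.init()
    node = GoToPoint()
    future = node.send_goal(1.5, 0.5)  # "point B", in map coordinates
    rclpy.spin_until_future_complete(node, future)
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```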
Phase III: Find-and-ferry
Now that a single robot (the “FindBot”) can move around the Unity scene autonomously, you will bring in a second robot (the “FerryBot”), which is equipped with a manipulator and will be responsible for executing the pick-and-place task. Finally, attendees will learn how to put all the pieces together: deploy the trained pose estimation model to a ROS2 node and simulate a fully autonomous find-and-ferry task, in which the robotic arm picks up the object and places it on the FerryBot for ferrying to the desired location (a minimal sketch of such an inference node follows the takeaways below).
Takeaways:
- How to add a manipulator to a mobile robot to create the FerryBot
- How to set up inverse kinematics for the FerryBot
- How to deploy an inference model for object pose estimation
- How to set up and run the Nav2 simulation with multiple robots simultaneously
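To show the rough shape of the final piece, here is a minimal sketch of a ROS2 node that would wrap the trained pose estimation model: it subscribes to camera images streamed from the Unity scene and publishes an estimated object pose. The topic names are assumptions, and the actual model inference is left as a commented placeholder.

```python
# Minimal sketch: a ROS2 node that would host the pose estimation model.
# Topic names are assumptions; model inference is a placeholder.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from geometry_msgs.msg import PoseStamped


class PoseEstimationNode(Node):
    def __init__(self):
        super().__init__('pose_estimation')
        # Camera images arrive from the Unity scene over the ROS-TCP connection.
        self.sub = self.create_subscription(Image, 'camera/image_raw', self.on_image, 10)
        self.pub = self.create_publisher(PoseStamped, 'estimated_object_pose', 10)

    def on_image(self, msg):
        pose = PoseStamped()
        pose.header = msg.header
        # In the workshop, the trained model's inference would fill pose.pose
        # from the image; here it is left at the identity as a placeholder.
        pose.pose.orientation.w = 1.0
        self.pub.publish(pose)


def main():
    rclpy.init()
    rclpy.spin(PoseEstimationNode())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```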
Phase IV: Using object pose estimation
In this phase, attendees will use the Unity Perception Package to collect synthetic data for training a computer vision model that performs object pose estimation (a minimal sketch of loading the generated dataset follows the takeaways below).
Takeaways:
- How to use the Unity Perception Package for data collection
- How to set up domain randomization parameters
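Once the Perception Package has written a synthetic dataset to disk, the training side reads the generated capture annotations. Here is a minimal sketch of iterating over those labels; it assumes the captures_*.json output layout, and the dataset path is a hypothetical placeholder.

```python
# Minimal sketch: iterate over Perception Package capture annotations.
# Assumes a captures_*.json output layout; the path is a placeholder.
import json
from pathlib import Path


def load_captures(dataset_dir):
    """Yield (image filename, annotations) pairs from capture JSON files."""
    for capture_file in sorted(Path(dataset_dir).glob('**/captures_*.json')):
        with open(capture_file) as f:
            data = json.load(f)
        for capture in data.get('captures', []):
            yield capture.get('filename'), capture.get('annotations', [])


if __name__ == '__main__':
    # Count how many labeled captures were generated for training.
    pairs = list(load_captures('path/to/perception/output'))
    print(f'{len(pairs)} labeled captures found')
```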
Level up your robot!
As the workshop draws to a close, we will show you how you can expand this project to include additional robots and more cubes for the object pose estimation portion, and we will answer any questions you may have about making further changes.
Requirements
Presenters