Tutorial Contents
We cover the “big picture” ideas of Mixed Reality and how we envision it will transform the way we interact with robots, along with technical details on several ways to do colocalization, which lets any Mixed or Augmented Reality device share a coordinate frame with a robot. Finally, a practical portion introduces some of the tools needed to create full Mixed Reality experiences with robotics. It takes the form of several demos that attendees can build and run on their own, and adapt to their own robots.
The tutorial features five videos on the IROS 2020 streaming site:
- Introduction to MR and Robotics
- Interaction
  - Mixed Reality as an intuitive bridge between robots and humans
  - MR, AR, VR, a brief overview of differences and sample devices
  - Modes of Interaction in MR
- Colocalization
  - Co-localization with Mixed Reality devices (a minimal transform sketch follows this list)
    - AR-tag-based
    - Vision-based
    - Shared-map-based
  - Azure Spatial Anchors
    - Technical introduction
    - How to use ASA to colocalize different devices
- Demo 1: Interaction [Source Code]
  - Writing and deploying phone and HoloLens apps
    - Unity
    - ROS# and ROS bridge for interfacing with ROS
  - Interacting with a virtual robot through AR and MR
- Demo 2: Colocalization
  - Azure Spatial Anchors SDK for localization of robots and MR devices
  - Creating and querying spatial anchors using sample data
  - How to use this code with your own camera
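To give a flavor of the AR-tag-based approach listed above: if the robot and the MR device each observe the same fiducial tag, the tag's pose in each frame is enough to express poses from one device in the other's coordinate frame. Below is a minimal sketch using NumPy homogeneous transforms; the variable names (`T_device_tag`, `T_robot_tag`) and the example values are our own illustration, not code from the tutorial.

```python
import numpy as np

def invert(T):
    """Invert a 4x4 rigid-body transform."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

# Hypothetical measurements of the same AR tag from both sides.
# T_device_tag: tag pose in the MR device's world frame.
# T_robot_tag:  tag pose in the robot's map frame.
T_device_tag = np.eye(4)
T_device_tag[:3, 3] = [1.0, 0.2, 0.5]

T_robot_tag = np.eye(4)
T_robot_tag[:3, 3] = [2.0, -0.3, 0.0]

# Transform that maps robot-frame coordinates into the device frame:
# T_device_robot = T_device_tag * inv(T_robot_tag)
T_device_robot = T_device_tag @ invert(T_robot_tag)

# Any pose expressed in the robot frame (e.g., the robot base) can now
# be rendered in the MR device's coordinate system.
p_robot = np.array([0.0, 0.0, 0.0, 1.0])   # robot base, robot frame
p_device = T_device_robot @ p_robot        # same point, device frame
print(p_device[:3])
```

Vision-based and shared-map-based colocalization (including Azure Spatial Anchors) replace the tag detection with feature matching or a shared anchor, but the frame algebra that establishes the common coordinate system is the same.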
Demo Materials
Demo 1 – Interaction
Sample code for the exercises in this demo can be found here: https://github.com/microsoft/mixed-reality-robot-interaction-demo
This repo contains an extensive wiki with instructions on how to run the demo with pre-built apps and docker containers, how to set up your system to develop and deploy MR apps, and how to adapt the sample code to your own robot.
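ROS# communicates with the robot through rosbridge, which exposes ROS topics over a WebSocket as JSON messages; the demo's Unity apps use this mechanism to exchange messages with ROS. The sketch below shows that protocol from the Python side, assuming a rosbridge server is running on the default port 9090; the topic names `/cmd_vel` and `/joint_states` are examples, not necessarily what the demo uses.

```python
import json
from websocket import create_connection  # pip install websocket-client

# Connect to a rosbridge_server WebSocket (default port 9090).
ws = create_connection("ws://localhost:9090")

# Advertise and publish a geometry_msgs/Twist on an example topic
# (the demo apps do the equivalent through ROS# inside Unity).
ws.send(json.dumps({
    "op": "advertise",
    "topic": "/cmd_vel",
    "type": "geometry_msgs/Twist",
}))
ws.send(json.dumps({
    "op": "publish",
    "topic": "/cmd_vel",
    "msg": {
        "linear": {"x": 0.1, "y": 0.0, "z": 0.0},
        "angular": {"x": 0.0, "y": 0.0, "z": 0.2},
    },
}))

# Subscribe to a topic and read one incoming message.
ws.send(json.dumps({"op": "subscribe", "topic": "/joint_states"}))
print(json.loads(ws.recv()))

ws.close()
```

ROS# wraps this same protocol in C# classes, so a phone or HoloLens app built in Unity can publish commands and subscribe to robot state in the same way.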
Demo 2 – Colocalization
This demo relied on Microsoft’s Azure Spatial Anchors (ASA) service, which has been retired (see announcement). All code and instructions associated with this demo have also been taken down.
Conclusion
We hope this information and these tools help you incorporate Mixed Reality into your robotics projects, whether for colocalization, human-robot interaction, or both. We encourage you to send us feedback on your experience with the tutorial: please engage with us on GitHub by filing issues (for questions or problems not covered in the wiki) or by contributing to the repository.