
The Hidden Role of Virtual Reality in Humanoid Robot Learning and Training

VIVE POST-WAVE Team • Aug. 15, 2024


3-minute read

In our previous article, we explored the cutting-edge humanoid robot, Figure 02, backed by major players like NVIDIA, OpenAI, and Microsoft. However, you might not be aware of the intricate connections between humanoid robots and technologies like virtual reality (VR), digital twins, and the metaverse. Simply put, VR is crucial for training humanoid robots and enabling their autonomous learning.

From large language models (LLMs) and diffusion models to Mixture of Experts (MoE) and multimodal models, all AI models you've heard of rely on algorithms, data, and computing power. While data's importance might not be immediately apparent—since training data for AI models is readily accessible from publicly available digital knowledge and our online digital footprints—acquiring data for training humanoid robots is more challenging. VR headsets serve as a direct gateway for this purpose.


VR Headsets: From Remote Control to Data Feeding

Sci-fi fantasies about controlling massive robots, like in "Neon Genesis Evangelion" or "Pacific Rim," are nothing new. One of my personal favorites, "Vox Machinae," is a VR game that involves piloting a mech and is incredibly fun to play.

We previously discussed the German team from NimbRo, who used VR headsets and robotic arms to remotely control robots, successfully completing multiple tasks and winning the ANA Avatar XPrize. Recently, we also covered a Japanese team that used the VIVE Pro Eye with a facial tracker to enable robots to mimic human facial expressions in their research.

A humanoid robot's ability to mimic facial expressions affects how naturally it can interact with humans. (Source: MS1-アバター共生社会)

Another example of using VR headsets to control robots comes from the startup IHMC Robotics, which previously used VR headsets to operate its humanoid robot, Nadia, as a boxing sparring partner (one that doesn't get hurt when punched). The team also recently demonstrated Nadia playing ping-pong with humans, controlled through VR headsets under extremely low-latency conditions.

These examples illustrate that VR headsets are not only capable of controlling robots but also serve as a medium for feeding data into robots, facilitating their training.

Metaverse: The Hyperbolic Time Chamber for Humanoid Robots

Beyond head-mounted displays that immerse us in virtual environments, the most powerful application of virtual reality lies in "simulation." Recently, NVIDIA released a video showcasing the training of humanoid robots using the Apple Vision Pro, which is no coincidence.

We previously discussed the potential of VR in simulation and training, although those cases involved humans, not robots. If you vividly remember the image of Jensen Huang transforming into Tony Stark with a backdrop of robots, that's the best example of applying VR's simulation potential to robot training.

The name is a tribute to Isaac Asimov, creator of the "Three Laws of Robotics": Isaac Sim, NVIDIA's robot training platform, which is built on its metaverse platform Omniverse. In essence, NVIDIA uses digital twins and GPU power to simulate countless scenarios for robot training. Its users include well-known names like Boston Dynamics and even Figure 02, which trains in Isaac Sim. (Is it a coincidence that NVIDIA invested in Figure? Probably not.)
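To make the idea of "training in simulation" concrete, here is a deliberately toy sketch in plain Python. It is not Isaac Sim's actual API; the environment, dynamics, and numbers are all invented for illustration. The core loop is the same in spirit, though: run a candidate control policy through a simulated physics step, score it, and keep the changes that improve the score, all without risking real hardware.

```python
import random

def simulate(gain, steps=50):
    """Toy 'walking' simulation: the robot applies torque proportional to
    the gap between a target forward speed and its current speed.
    Reward is distance covered; 'falling over' (runaway speed) is penalized."""
    pos, vel, target = 0.0, 0.0, 1.0
    reward = 0.0
    for _ in range(steps):
        torque = gain * (target - vel)
        vel += 0.1 * torque - 0.02 * vel   # crude dynamics with friction
        if abs(vel) > 3.0:                 # unstable gait: penalty, episode ends
            return reward - 10.0
        pos += 0.1 * vel
        reward = pos
    return reward

def train(episodes=200, seed=0):
    """Random-search policy improvement: perturb the control gain,
    keep any candidate that scores better in simulation."""
    rng = random.Random(seed)
    gain = 0.0
    best = simulate(gain)
    for _ in range(episodes):
        candidate = gain + rng.gauss(0, 0.5)
        score = simulate(candidate)
        if score > best:
            gain, best = candidate, score
    return gain, best
```

Real platforms like Isaac Sim replace this toy physics with GPU-accelerated rigid-body simulation and run thousands of such environments in parallel, which is why a robot can accumulate years of "practice" in days.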

Watching this little robot walk is especially therapeutic. (Source: YouTube)

An interesting recent example is the "Berkeley Humanoid," a small, somewhat clumsy-walking robot that has been frequently spotted around the UC Berkeley campus. This project, initiated by a doctoral student at UC Berkeley, also utilized Isaac Sim for training.

According to the research, the Berkeley Humanoid was built for less than $10,000, a relatively low cost for a limited-production robot. Impressively, after only a short training period in Isaac Sim, its walking ability is already comparable to a human's, or at least better than a toddler's first steps. In a demonstration video of a downhill test, even the camera operator fell, yet the robot never stumbled and kept walking steadily. I highly recommend watching the video for its entertainment value alone.

After reading this article, it's up to you to decide: will you welcome the arrival of humanoid robots with open arms, or do you feel we need to be even more careful?