2-minute read
Imagine controlling a biomimetic robot that not only perceives its surroundings but also conveys your expressions with the help of the VIVE Facial Tracker. Wouldn't that make operating a remote-controlled robot feel more like "possessing" a machine? Let's explore this fascinating concept further.
Tokyo Denki University recently unveiled the biomimetic robot system 'Yui', which not only replicates the operator's gaze, mouth movements, and voice for communication, but also transmits images and sound back to the operator through the VIVE Pro Eye headset.
This bidirectional and synchronous perception allows the robot to achieve a level of interaction similar to video calls but with a touch of human presence. For instance, in telemedicine, a doctor could use Yui to interact with a patient from afar, as if the patient were speaking directly to the doctor in person. Or, when you're stuck at work, you could catch up with your children through Yui at home, seeing and hearing about their day or viewing their schoolwork and drawings. Your expressions of joy and satisfaction can also be conveyed to your children.
Yui is designed with a child-like, gender-neutral face (perhaps to leverage our natural affinity for cuteness), and its head and neck contain up to 28 deformable parts for expressing emotions. It is also equipped with stereo cameras in its eyes and microphones in its ears to capture images and sound, which are presented to the operator through a VR headset. In the other direction, the operator's head movements, gaze, facial expressions, and voice are picked up by the headset's sensors and microphone together with the VIVE Facial Tracker and synchronized with Yui.
Doesn't it look a bit smug? (Source: MS1-アバター共生社会)
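To make that data flow concrete, here is a minimal Python sketch of the bidirectional loop described above, purely for illustration: the class and method names (HeadsetInterface, RobotInterface, read_expression, and so on) are hypothetical placeholders, not the team's actual code or the official VIVE SDK.

```python
import time

EXPRESSION_RATE_HZ = 60  # assumed update rate for expression data


class HeadsetInterface:
    """Placeholder for the operator-side headset, eye tracker, and facial tracker."""

    def read_expression(self):
        # Would return head pose, gaze, and lip blend-shape weights (0.0-1.0);
        # hard-coded values stand in for real tracker output here.
        return {"jaw_open": 0.2, "smile_left": 0.6, "smile_right": 0.6}

    def show_frame(self, stereo_frame, audio_chunk):
        # Would render the robot's stereo camera view and play back its audio.
        pass


class RobotInterface:
    """Placeholder for Yui's deformable face parts, cameras, and microphones."""

    def apply_expression(self, weights):
        # Would map blend-shape weights onto the head and neck actuators.
        pass

    def capture(self):
        # Would return the latest stereo camera frame and microphone audio.
        return None, None


def teleoperation_loop(headset, robot):
    """Run the two data streams in lockstep: expressions down, sight and sound up."""
    period = 1.0 / EXPRESSION_RATE_HZ
    while True:
        robot.apply_expression(headset.read_expression())  # operator -> robot
        frame, audio = robot.capture()                      # robot -> operator
        headset.show_frame(frame, audio)
        time.sleep(period)


# teleoperation_loop(HeadsetInterface(), RobotInterface())  # runs until interrupted
```

In a real system the video and audio would stream over a network, but the essential idea is the same: two synchronized streams flowing in opposite directions.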
The Tokyo Denki University team has enhanced communication by integrating VR headsets, moving beyond traditional one-way robot operation and message reception. Their demonstration video shows the VIVE Pro Eye and VIVE Facial Tracker in use. The newer VIVE Full Face Tracker adds dual eye-tracking cameras and a facial tracker camera that captures 38 blend shapes across the lips, jaw, cheeks, chin, teeth, and tongue, at a smooth 60 Hz tracking rate. Could that make these interactions even more realistic?
The team currently uses the VIVE Pro Eye; we'd love to see them try the latest VIVE Full Face Tracker. (Source: MS1-アバター共生社会)
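As a rough illustration of why richer blend-shape data could matter, here is a hypothetical sketch of how a few lip blend-shape weights might be turned into servo targets on a robot face. The blend-shape names, servo IDs, and angle ranges are invented for the example; they are not the actual VIVE SDK identifiers or Yui's real actuator layout.

```python
# Each entry: blend shape -> (servo id, angle at weight 0.0, angle at weight 1.0)
BLEND_SHAPE_TO_SERVO = {
    "jaw_open":      ("jaw_servo",         0.0, 25.0),
    "mouth_smile_l": ("mouth_left_servo",  0.0, 15.0),
    "mouth_smile_r": ("mouth_right_servo", 0.0, 15.0),
    "tongue_out":    ("tongue_servo",      0.0, 10.0),
}


def blend_shapes_to_servo_angles(weights):
    """Linearly map clamped blend-shape weights (0.0-1.0) to servo angles in degrees."""
    targets = {}
    for shape, (servo, low, high) in BLEND_SHAPE_TO_SERVO.items():
        w = max(0.0, min(1.0, weights.get(shape, 0.0)))
        targets[servo] = low + w * (high - low)
    return targets


# One tracking tick with a half-open jaw and a broad smile:
print(blend_shapes_to_servo_angles(
    {"jaw_open": 0.5, "mouth_smile_l": 0.9, "mouth_smile_r": 0.9}))
```

More blend shapes and a higher tracking rate simply mean more entries in a table like this being updated more often, which is why finer facial detail could translate into more lifelike expressions on the robot's side.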
The research team hopes this technology will provide a more tangible sense of "in-person interaction" than video calls. Even from a distance, adding "expressions" can lead to more satisfying communication quality, applicable in interviews, medical consultations, meetings, or other scenarios where expressions help us judge and understand one another. A study published last year by the University of Montreal showed that 15 minutes of face-to-face interaction is more efficient than an hour of online communication.
These are Yui's expressions. Look below and try to guess its emotion. (Source: IEEE Access)
Yui can express basic emotions, but there is room for improvement in terms of precision. (Source: IEEE Access)
However, the team also acknowledges in their paper that several challenges remain, including the accurate reproduction of facial expressions: Yui can express basic emotions, but there is still room for improvement in precision. Moreover, synchronization between the operator and the biomimetic robot relies heavily on pre-determined parameters, which remains a significant barrier to widespread adoption.
The research team named the robot Yui (ゆい), hoping it would serve as a link between people, though written in Chinese as '結衣,' the name might evoke a rather different image and prove distracting. In the post-pandemic era, imagine a world where anyone could embody a robot like Yui to communicate remotely. What other uses can you think of for such technology?