One of the challenges in developing a comprehensive metaverse is how to emulate tactile sensations.
Accepting this challenge, a team at Meta’s research lab developed a haptic glove that is comfortable, customizable, and, most importantly, capable of producing a range of sensations in the virtual world, including textures, pressure and vibration.
The glove is still in the early stages of research, and Meta presented it in a slightly odd video. Later, as the technology becomes more practical, Meta hopes to sell these gloves for use alongside its AR headsets or glasses. The company says the gloves will transform the user’s mixed-reality experience and make it even more engaging.
Meta said that development of the glove has been under way for seven years, and that the effort has produced new techniques, technologies and even new disciplines. Meta shared some examples of the innovations made along the way:
Perceptual Science: Since current technology can’t fully recreate real-world touch in VR, we explored combining sound, visuals and motion to produce effects such as giving the glove’s wearer a sense of the weight of a virtual object.
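One well-known perceptual trick for suggesting weight without force feedback is pseudo-haptics: letting the rendered hand lag behind the tracked hand while an object is held. The sketch below illustrates the idea only; the function name and the 0–1 "heaviness" scale are assumptions for illustration, not Meta's actual technique or API.

```python
# Pseudo-haptic weight sketch: scale the visual hand's motion relative to the
# grab point, so heavier objects make the rendered hand respond more sluggishly.
# All names and the heaviness scale are illustrative assumptions.

def visual_hand_position(tracked_pos, grab_anchor, heaviness):
    """Lag the rendered hand behind the real hand while holding an object.

    heaviness: 0.0 (weightless, visual hand follows exactly) to
               1.0 (very heavy, visual hand barely leaves the anchor).
    """
    gain = 1.0 - heaviness  # heavier object -> smaller control-display gain
    return tuple(a + gain * (t - a) for t, a in zip(tracked_pos, grab_anchor))

# Holding a heavy object: the rendered hand covers only 30% of the real motion.
print(visual_hand_position((0.2, 0.5, 0.0), (0.0, 0.5, 0.0), heaviness=0.7))
```

Combined with matching audio and visual cues, even this small mismatch between real and rendered motion can noticeably change how heavy an object feels.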
Soft Robotics: Conventional mechanical actuators generate too much heat for a glove like this, making it uncomfortable to wear all day. To solve this, we created new soft actuators: small, air-driven pads distributed across the glove that work together to deliver sensation to the wearer.
Microfluidics: We developed the world’s first high-speed microfluidic processor, a microfluidic chip that controls the air driving the actuators. Using air rather than electronics meant we could fit more actuators into the glove than conventional circuitry would allow.
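The control idea behind a pneumatic actuator array can be sketched as mapping a target pressure for each actuator to how long its valve stays open in each control period. The valve model, actuator names and pressure units below are invented for illustration; on the real device this switching happens in the fluid domain on the microfluidic chip, not in software.

```python
# Illustrative pneumatic control sketch: convert per-actuator target pressures
# into valve duty cycles (fraction of each control period the valve is open).
# Names and units (kPa) are assumptions, not Meta's actual interface.

def valve_duty_cycles(target_pressures, supply_pressure):
    """Return the fraction of each control period a valve should stay open."""
    return {
        actuator: min(1.0, max(0.0, p / supply_pressure))
        for actuator, p in target_pressures.items()
    }

targets = {"index_tip": 12.0, "thumb_tip": 30.0, "palm": 0.0}  # kPa (assumed)
print(valve_duty_cycles(targets, supply_pressure=30.0))
```

Clamping to the 0–1 range keeps a request above the supply pressure from producing a nonsensical duty cycle.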
Hand Tracking: Even with a way to move air through the glove, the system needs to know when and where to deliver each sensation. We built advanced hand-tracking technology so the device can identify precisely where the user’s hand is, whether it is touching a virtual object, and how it is interacting with it.
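A greatly simplified version of the "is the hand touching a virtual object?" query is a proximity test between tracked fingertip positions and an object's bounding volume. Real hand tracking estimates a full skeletal pose from cameras; this sketch assumes fingertip coordinates are already available, and all names are illustrative.

```python
# Toy contact query: report which fingertips lie inside a virtual object's
# bounding sphere. Assumes fingertip positions (in meters) are already tracked.
import math

def touching(fingertips, obj_center, obj_radius):
    """Return the names of fingertips inside the object's bounding sphere."""
    return [
        name for name, pos in fingertips.items()
        if math.dist(pos, obj_center) <= obj_radius
    ]

tips = {"index": (0.02, 0.0, 0.0), "thumb": (0.5, 0.5, 0.5)}
print(touching(tips, obj_center=(0.0, 0.0, 0.0), obj_radius=0.05))  # → ['index']
```

In practice the tracker must answer this query every frame, for every object near the hand, which is why precise and fast pose estimation matters.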
Haptic Rendering: Our haptic renderer sends precise instructions to the actuators in the glove, based on its understanding of where the hand and the virtual objects are.
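A toy haptic-rendering step might turn contact information into per-actuator pressure commands. The stiffness model below (pressure proportional to how deeply each part of the hand presses into a virtual surface) is a standard simplification from the haptics literature, and every name, unit and constant is an assumption for illustration, not Meta's renderer.

```python
# Toy haptic-rendering step: map penetration depth (mm) per actuator to a
# clamped pressure command (kPa). Stiffness and limits are made-up values.

def render_haptics(contacts, stiffness=5.0, max_pressure=30.0):
    """contacts: {actuator_name: penetration_depth_mm} -> pressures in kPa."""
    return {
        name: min(max_pressure, stiffness * depth)
        for name, depth in contacts.items()
    }

# The index finger rests lightly on a surface; the thumb presses hard.
print(render_haptics({"index_tip": 2.0, "thumb_tip": 10.0}))
```

Capping the command at a maximum pressure mirrors the physical limit of the actuators and keeps a deep penetration from producing an uncomfortable output.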