Microsoft details four ways it's experimenting with haptic VR
VR has gained so much traction in the last couple of years that its futuristic, sci-fi sheen has faded for many consumers. Most people now expect to be interacting with the technology in some form within the next decade.
Haptic feedback, the ability to touch and feel virtual objects, is still likely to elicit shocked gasps from consumers. Unsurprisingly, a wealth of companies are currently racing to create the killer piece of haptic hardware.
In a recent blog post, Microsoft’s research team detailed four distinct approaches to haptic controllers. The tech giant is trying to move past the current range of tactile devices, which are mostly limited to buzzes from internal motors. As the blog states:
“There are many reasons why haptic is such a hurdle. Anyone who’s been to the movies understands that the eye and the ear can be tricked; film after all, at 24 frames per second, isn’t true motion. But haptic is different and represents a challenge many orders of magnitude larger in complexity. Some of the challenges lie in the area of hardware. Laboratory prototypes such as exoskeletons and other hand mounted devices tend to be cumbersome, both to fit to individual users as well as to don and remove.”
Many of the known prototype devices attempt to simulate a single type of sensation, such as heat or weight, and their complex mechanics are likely to make them expensive, fragile and cumbersome.
“What you really want is the impression of virtual shapes when you interact with objects in VR, not just the binary feedback you get from current devices,” said Microsoft researcher Christian Holz.
“Our controllers render such shapes, giving your fingers and hands continuous and dynamic haptic feedback while you interact.”
Here are the four prototypes detailed in the blog post.
The CLAW looks to extend the current paradigm for VR controllers to include haptic capabilities. The user’s index finger is placed into a mount that uses a motor to move the finger in a way that simulates force feedback.
The controller can also adapt its haptic rendering based on sensing differences in the user’s grasp and the virtual context in which they are interacting. This could allow for incredibly accurate grasping and grabbing sensations.
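To make the idea of grasp-adaptive force feedback concrete, here is a minimal sketch of how a controller like the CLAW might compute resistive motor torque from a virtual object's stiffness. This is purely illustrative; the function name, units and spring model are assumptions, not Microsoft's implementation.

```python
def resistance_torque(finger_pos, surface_pos, stiffness, max_torque=1.0):
    """Return the motor torque opposing the finger's grasp.

    finger_pos and surface_pos are scalar positions along the finger's
    arc of travel (cm); stiffness is in torque units per cm.
    All values here are hypothetical.
    """
    penetration = finger_pos - surface_pos
    if penetration <= 0:  # finger has not yet reached the virtual surface
        return 0.0
    # Hooke's-law style resistance, clamped to the motor's limit,
    # so a stiffer virtual object feels harder to squeeze
    return min(stiffness * penetration, max_torque)
```

Varying the `stiffness` parameter per virtual object is what would let the same hardware make a sponge feel different from a stone.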
Here is a product introduction video.
The Haptic Wheel looks to simulate friction and the feel of different materials. The controller uses an actuated wheel that moves up and down to simulate running your finger over a surface, and spins to “render shear forces and motion as the user slides along a virtual surface.”
The user’s index finger rests in a groove in the wheel, so all of the haptic sensation is localised to their fingertip.
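The shear effect described above can be sketched with a simple kinematic relationship: to make a stationary fingertip feel as though it is sliding, the wheel's rim speed should match the hand's speed across the virtual surface, in the opposite direction. The function below is a hypothetical illustration of that idea, not Microsoft's code.

```python
def wheel_angular_velocity(hand_velocity_cm_s, wheel_radius_cm=2.0):
    """Angular velocity (rad/s) whose rim speed matches the slide speed.

    The negative sign makes the rim move "backwards" under the fingertip,
    mimicking the relative motion of a real surface. Radius is assumed.
    """
    return -hand_velocity_cm_s / wheel_radius_cm
```

For example, a hand sliding at 4 cm/s over a virtual tabletop would need the 2 cm wheel to spin at 2 rad/s in the opposite direction.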
You can see the Haptic Wheel in action here.
These dual controllers aim to solve the problem of using both hands in virtual environments, such as carrying an object in both hands or using a device like a bow and arrow. Two controllers are connected by a mechanism, with a range of sensors, that is capable of dynamically altering the forces felt in the user’s hands. So, for example, the hand pulling back the bowstring would feel a realistic build-up of pressure.
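The bow example comes down to a simple spring relationship: resistance grows with draw distance up to the hardware's force limit. The sketch below illustrates that build-up of pressure; the constants are invented for illustration and do not come from the blog post.

```python
def bow_draw_force(draw_cm, stiffness_n_per_cm=0.4, max_force_n=30.0):
    """Resistive force (N) rising linearly with draw distance.

    Clamped at max_force_n, the most the linked controllers could
    plausibly resist. Both parameter values are hypothetical.
    """
    return min(stiffness_n_per_cm * draw_cm, max_force_n)
```

A 10 cm draw would resist with 4 N, while a full 100 cm draw would saturate at the 30 N hardware limit.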
Watch a demonstration of the device here.
The team are also exploring ways to make virtual experiences accessible to all. The Canetroller allows people who use a cane in real life to transfer that ability to virtual environments.
The controller provides three kinds of haptic feedback: physical resistance from a programmable brake mechanism that makes it harder to move the cane along the floor; vibrotactile feedback simulating the feel of hitting objects; and 3D auditory effects.
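A simple way to picture how those three channels might be coordinated is a dispatch step that fires each one based on the cane tip's state. Everything below, including the names and the brake/vibration/sound split, is a hypothetical sketch of the behavior the article describes, not the Canetroller's actual software.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Feedback:
    brake: float             # 0..1 braking force resisting cane movement
    vibration: bool          # vibrotactile pulse on impact
    sound: Optional[str]     # 3D audio cue naming the struck material

def cane_feedback(tip_inside_object: bool, just_hit: bool,
                  material: str) -> Feedback:
    """Choose which feedback channels to drive for the current frame."""
    if not tip_inside_object:
        # Free sweep: no resistance, no pulse, no audio
        return Feedback(brake=0.0, vibration=False, sound=None)
    return Feedback(
        brake=1.0,                          # brake hard so the cane "stops"
        vibration=just_hit,                 # pulse only on initial impact
        sound=material if just_hit else None,
    )
```

On the frame the cane first strikes a virtual wall, all three channels fire at once; on later frames only the brake keeps resisting, which is roughly how a real cane feels against a real obstacle.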
Watch a demo video here.