Nvidia is trying to put AI in charge of haptics in games.


Every few years, there is a new effort to bring immersive, meaningful haptics to the forefront of gaming; the DualSense is the latest step in a journey that has led from rumble packs to full-body haptic vests. So far, however, most of these solutions rely on pre-programmed or audio-responsive haptics. According to a group of researchers at Nvidia, a little something they like to call machine learning could make haptics far more dynamic and flexible.

Yes, Nvidia is intent on finding other uses for machine learning.

In a recently published patent filed in September 2019, a team of Nvidia researchers advocates a different approach to generating accurate haptics through machine learning. They believe that an intelligent algorithm can learn to detect certain "features" in content, such as games, and generate appropriate haptic responses in any hardware to which it is connected.

For a use of machine learning in gaming, this sounds pretty exciting, at least if you ask me. The patent explains: "Haptic effects have long been offered to enhance content, such as vibrations, buzzing, etc., delivered to the remote control or other device the user is holding while watching or listening to content. To date, haptic effects have been provided either by programming controls for haptic effects within the content itself or by providing an audio interface that simply maps a particular haptic effect to a particular audio frequency. The present disclosure provides a haptic control interface that intelligently induces haptic effects for content, in particular by using machine learning to detect specific features in the content and then inducing specific haptic effects for those features."

The patent, as is often the case, is a bit thin on specifics. It notes that the haptic control interface could be implemented with custom circuitry, a CPU or GPU, or a combination of hardware and software.

The patent's broad vision of possible applications leaves the door open to a wide range of devices: wired and wireless units, and haptic interfaces that run locally or in the cloud. There can be one or multiple haptic devices, and the system could accommodate different content sources, such as games or movies, without further training.

The haptic control interface, whatever form it takes, would require preliminary training to learn to recognize video images, objects, audio signals, and so on. From there, the rest would be picked up on the fly, without prior knowledge of the particular game or movie at hand.

"Haptic effects can be predefined to correspond to features, such as a specific haptic effect for gunshots. The haptic control interface then causes the remote controller to provide the determined haptic effect, thereby coordinating the haptic effect experienced by the user with the gunfire experienced by the user in the video game," the patent states.
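In rough pseudocode terms, the pipeline the patent describes boils down to two stages: a trained model labels features in the incoming content, and a lookup table maps each label to a predefined haptic effect sent to the controller. Here is a minimal Python sketch of that idea; every name, the toy "classifier," and the effect parameters are my own illustrative assumptions, not anything from Nvidia's filing.

```python
# Hypothetical sketch: detect a feature in a content frame, then map the
# detected label to a predefined haptic effect (the patent's gunshot example).
from dataclasses import dataclass

@dataclass
class HapticEffect:
    intensity: float   # vibration strength, 0.0-1.0 (assumed parameterization)
    duration_ms: int   # how long the effect lasts

# Predefined effects keyed by feature label.
EFFECTS = {
    "gunshot":   HapticEffect(intensity=1.0, duration_ms=120),
    "explosion": HapticEffect(intensity=0.8, duration_ms=400),
    "footstep":  HapticEffect(intensity=0.2, duration_ms=60),
}

def detect_feature(frame_audio):
    """Stand-in for the trained model; a real system would run inference
    on audio/video frames here. Toy rule: a loud spike is a 'gunshot'."""
    peak = max(abs(s) for s in frame_audio)
    return "gunshot" if peak > 0.9 else None

def haptics_for_frame(frame_audio):
    """Return the predefined effect for whatever feature was detected."""
    label = detect_feature(frame_audio)
    return EFFECTS.get(label)  # None when no feature was detected

effect = haptics_for_frame([0.1, 0.95, 0.3])
print(effect)  # HapticEffect(intensity=1.0, duration_ms=120)
```

The interesting part in the patent is, of course, the detection model itself; the lookup step at the end is the simple part that any controller SDK could already do today.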

As with many up-and-coming machine learning concepts in gaming, the reality may be somewhat different from the original concept. However, of all the uses of machine learning out there, I have to say that clever haptics feel like a sure bet to actually make it into our gaming rigs someday.
