The Quest 2 Supernatural fitness app tracks knee strikes by analyzing head and arm movements


Fitness app Supernatural captures knee strikes. How is this possible on Meta Quest 2, which has no lower-body tracking? The studio's founder explains.

Supernatural is one of the most successful VR apps for Meta Quest 2. It’s so successful that Meta bought the studio in late 2021. Or at least wanted to: The US competition authority FTC is currently trying to prevent the acquisition.

Supernatural is reminiscent of Beat Saber: you smash flying objects to the beat of the music. Unlike Beat Saber, however, the focus is entirely on fitness. The VR app offers hundreds of workouts and real trainers, but requires a $20 monthly subscription.

Outside of the US and Canada, Supernatural is not yet available due to licensing issues: the VR app offers well-known music tracks from all genres that so far have been licensed only for North America.

The latest update introduces a new type of target: to score a hit, you must smash it with your knee. This is meant to activate your lower body and core.

The question is how the VR application achieves this. Meta Quest 2 only captures head and hand movements. The device can’t really recognize what the lower body is doing.

There has been a lot of speculation surrounding the feature. According to one theory, the VR app uses sensor data from the bottom tracking cameras to detect the position of the knees. In the Supernatural Facebook group, studio founder Chris Milk responded:


“Knee strikes are tracked using an algorithm that analyzes the movement of your head and arms to infer whether or not you’re executing the move,” Milk wrote.

According to Milk, Supernatural is not using a new interface. An algorithm trained to recognize knee strikes from subtle head and arm movement patterns is evidently sufficient.
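The idea can be illustrated with a minimal sketch. The heuristic below is purely an assumption for illustration, not Supernatural's actual algorithm: raising a knee tends to make the head dip slightly while a hand pulls down as a counterbalance, so both signals together hint at a knee strike. All names and thresholds are hypothetical.

```python
# Hypothetical sketch: inferring a knee strike from head and controller
# motion alone. Thresholds and signal names are illustrative assumptions,
# not Supernatural's actual algorithm.

from dataclasses import dataclass


@dataclass
class Pose:
    head_y: float        # headset height in meters
    left_hand_y: float   # left controller height in meters
    right_hand_y: float  # right controller height in meters


def detect_knee_strike(prev: Pose, curr: Pose, dt: float,
                       head_dip_thresh: float = 0.25,
                       hand_pull_thresh: float = 0.8) -> bool:
    """Return True if frame-to-frame motion looks like a knee strike.

    Heuristic: the head dips (downward velocity exceeds head_dip_thresh)
    while at least one hand pulls down sharply (exceeds hand_pull_thresh).
    """
    head_vel = (curr.head_y - prev.head_y) / dt
    left_vel = (curr.left_hand_y - prev.left_hand_y) / dt
    right_vel = (curr.right_hand_y - prev.right_hand_y) / dt

    head_dipping = head_vel < -head_dip_thresh
    hand_pulling = min(left_vel, right_vel) < -hand_pull_thresh
    return head_dipping and hand_pulling


# Example: over one 90 Hz frame, the head drops 3 cm and the right
# hand drops 10 cm -- consistent with driving a knee upward.
prev = Pose(head_y=1.70, left_hand_y=1.20, right_hand_y=1.20)
curr = Pose(head_y=1.67, left_hand_y=1.19, right_hand_y=1.10)
print(detect_knee_strike(prev, curr, dt=1 / 90))  # prints True
```

A real detector would presumably be learned from motion-capture data rather than hand-tuned, but the input signals available to it are the same: only head and controller trajectories.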

AI motion prediction: a lot of untapped potential

Milk admits the tracking isn’t perfect. “If you’re just standing there, they should hit you, but if you’re making an attempt to strike them, even with the opposite knee, they’re still likely to explode. Overall, it works surprisingly well, giving you a great sense of accomplishment when the targets explode just as you drive your knees through them,” Milk wrote.

Milk also says the studio hopes to offer more accurate lower-body tracking once new tracking capabilities become available.

In a research paper, Meta recently demonstrated how well artificial intelligence can predict movement. According to the paper, an AI model can plausibly animate a full-body avatar based solely on sensor data from the headset and two controllers. Such technology could help simulate legs for Meta avatars in the near future, although latency is still high.
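The core of that idea is a learned mapping from three tracked points to unobserved lower-body joints. The sketch below uses a closed-form linear regression on synthetic data as a stand-in for the neural model described in the paper; the data, dimensions, and model choice are all illustrative assumptions.

```python
# Hypothetical sketch of three-point-to-full-body inference: regress
# lower-body joint positions from headset and controller signals.
# Synthetic data and a linear model stand in for Meta's learned model.

import numpy as np

rng = np.random.default_rng(0)

# Fake dataset: 3 tracked points (x, y, z each) -> 2 knee positions.
X = rng.normal(size=(256, 9))                 # head + two controllers
true_W = rng.normal(size=(9, 6)) * 0.1        # ground-truth mapping
Y = X @ true_W + rng.normal(scale=0.01, size=(256, 6))  # noisy knees

# Fit the mapping by least squares (closed form).
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# The recovered mapping predicts knee positions from upper-body input.
pred = X @ W
err = np.abs(pred - Y).mean()
print(f"mean abs error: {err:.4f}")  # small residual from the noise term
```

A linear map obviously cannot capture real leg dynamics; the point is only that the lower body is statistically predictable from upper-body sensors, which is what the paper's far more capable model exploits.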
