WiFi New Solution For Metaverse Body Tracking

Full body tracking in the metaverse may now be possible using only WiFi signals, according to a new study by three Carnegie Mellon University researchers.

Jiaqi Geng, Dong Huang and Fernando De la Torre found that “common WiFi antennas [or 1D sensors] can be used as the sole source of active sensing to track fine human movements in a room.”

Mapping the human body

The researchers developed a deep neural network that maps the phase and amplitude of WiFi signals to UV coordinates within 24 human body regions. They used three WiFi transmitters and three receiver antennas to track several people in a room.
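The study itself does not publish code, but the input/output shapes it describes can be sketched as a toy network. Everything below is a hypothetical illustration: the antenna and subcarrier counts, layer sizes, and function names are assumptions, not the authors' architecture.

```python
import numpy as np

# Hypothetical sketch (not the authors' model): a tiny network mapping
# WiFi channel data -- amplitude and phase across 3 transmitter x 3
# receiver antenna pairs -- to (u, v) coordinates for 24 body regions.
N_TX, N_RX, N_SUB = 3, 3, 30   # assumed antenna pairs and subcarriers
N_REGIONS = 24                  # body regions, as in the study

rng = np.random.default_rng(0)
in_dim = 2 * N_TX * N_RX * N_SUB          # amplitude + phase features
hidden = 64
W1 = rng.normal(0, 0.1, (in_dim, hidden)) # untrained toy weights
W2 = rng.normal(0, 0.1, (hidden, N_REGIONS * 2))

def predict_uv(amplitude, phase):
    """Map one WiFi frame to a (24, 2) array of UV coordinates in [0, 1]."""
    x = np.concatenate([amplitude.ravel(), phase.ravel()])
    h = np.maximum(0, x @ W1)              # ReLU hidden layer
    uv = 1 / (1 + np.exp(-(h @ W2)))       # sigmoid keeps outputs in [0, 1]
    return uv.reshape(N_REGIONS, 2)

amp = rng.random((N_TX, N_RX, N_SUB))
pha = rng.uniform(-np.pi, np.pi, (N_TX, N_RX, N_SUB))
uv = predict_uv(amp, pha)
print(uv.shape)   # (24, 2)
```

The real system is trained end to end on paired WiFi and camera data; this sketch only shows how per-region UV predictions could be shaped from amplitude and phase inputs.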

Results of the 13-page study, published in December, show that “our model can estimate the dense pose of multiple subjects by utilizing WiFi signals as the only input.”

“This paves the way for low-cost, broadly accessible, and privacy-preserving algorithms for human sensing. We believe that WiFi signals can serve as a ubiquitous substitute for RGB images for human sensing in certain instances,” it said.

UV coordinates are used when projecting a two-dimensional (2D) image onto the surface of a 3D model, a process known as texture mapping. DensePose is a technique that maps all human pixels in an RGB image to the 3D surface of the human body, according to experts.
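The idea behind UV coordinates can be shown in a few lines: a (u, v) pair in the unit square indexes into a 2D texture image to pick the color for a point on a 3D surface. The texture size and nearest-neighbour lookup below are illustrative choices, not from the study.

```python
import numpy as np

# Toy 16x16 RGB "texture": each texel holds three consecutive integers,
# just so lookups are easy to verify.
texture = np.arange(16 * 16 * 3).reshape(16, 16, 3)

def sample_texture(u, v):
    """Nearest-neighbour lookup: map (u, v) in [0, 1] to a texel."""
    h, w, _ = texture.shape
    row = min(int(v * h), h - 1)   # clamp so u = v = 1.0 stays in range
    col = min(int(u * w), w - 1)
    return texture[row, col]

corner = sample_texture(0.0, 0.0)   # top-left texel of the texture
```

Real renderers interpolate between texels (bilinear filtering), but the mapping from (u, v) to image position is the same idea.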

RGB stands for the primary colors red, green and blue. It typically refers to three hues of light that can be mixed together to create different colors. Mixing the three is the standard method of producing color images on TVs, computer monitors and smartphones.
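A quick illustration of additive RGB mixing: each channel is an 8-bit intensity from 0 to 255, and the familiar web hex notation simply packs the three channels together. This helper is a generic example, not tied to the study.

```python
def rgb_to_hex(r, g, b):
    """Pack three 8-bit channel intensities into a CSS-style hex color."""
    return f"#{r:02x}{g:02x}{b:02x}"

red = rgb_to_hex(255, 0, 0)        # "#ff0000" (pure red)
yellow = rgb_to_hex(255, 255, 0)   # "#ffff00" (red + green mix to yellow)
white = rgb_to_hex(255, 255, 255)  # "#ffffff" (all three at full intensity)
```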

Expanding human tracking research


The first row illustrates the hardware setup. The second and third rows are the clips of amplitude and phase of the input WiFi signal. The fourth row contains the dense pose estimation of the algorithm from only the WiFi signal.
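The amplitude and phase mentioned in the caption come from complex-valued channel estimates: each WiFi channel sample is a complex number whose magnitude is the amplitude and whose angle is the phase. A minimal sketch with made-up sample values:

```python
import numpy as np

# Toy complex channel samples (illustrative values, not real WiFi data).
csi = np.array([3 + 4j, 1 - 1j])

amplitude = np.abs(csi)    # magnitudes: [5.0, sqrt(2)]
phase = np.angle(csi)      # angles in radians, in (-pi, pi]
```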

The Carnegie Mellon University study expands on the use of WiFi signals in combination with so-called “deep learning architectures”, commonly used in computer vision, to estimate dense human pose correspondence.

A great deal of progress has already been made in this area over the last few years using 2D and 3D sensors, fueled by applications in autonomous driving and augmented reality (AR). That work includes approaches based on RGB sensors and radar.

However, traditional sensors are criticized as too pricey for the ordinary user; a simple radar sensor, for example, costs between $200 and $600. They are also blamed for consuming too much power and have been flagged as a threat to user privacy in private spaces.

Many people would not install such technology in, say, home bathrooms and other private areas. For RGB cameras, a narrow field of view and poor lighting conditions, such as glare and darkness, can severely impact camera-based approaches, said the study.

It also mentioned occlusion as another obstacle that prevents camera-based models from generating reasonable pose predictions in images. This is “especially worrisome for indoors scenarios, where furniture typically occludes people,” the study added.

WiFi body tracking protects user privacy, say researchers


According to the researchers, illumination and occlusion have little effect on WiFi-based solutions used for interior monitoring. The equipment required for this sort of tech is cheaper and can protect user privacy, they claimed.

“Compared to video or lidar, WiFi offers a better privacy-preserving signal,” said Fernando De La Torre, a co-author of the study who researches computer vision and machine learning at Carnegie Mellon University. He was speaking to AiBreakfast in a recent interview.

“This is critical in applications such as monitoring the well-being of elderly people at home (e.g., detecting falls, computing the amount of social interaction, or detecting potential health concerns), where many users will feel uncomfortable using cameras,” he said, adding:

“WiFi adds an additional layer of anonymity because it cannot be immediately interpreted by humans and it can only recover shape information rather than texture.”

Body tracking in the metaverse

Tracking your body movements in the metaverse is a big deal. It helps to make an otherwise virtual experience feel real. Companies such as Sony have been at the forefront of trying to bring this sense of reality to the metaverse.

In November 2022, the Japanese electronics company launched “Mocopi”, a device that allows users to translate their body movements onto a metaverse avatar. The unit is made up of six motion-tracking bands worn on the wrists, ankles, head and hip.

Priced at around $350, Mocopi is a play on the term “mocap”, meaning motion capture. It is designed to “track your body to create videos or operate avatars in real time with metaverse apps like VRChat.” It offers tools that let users “import motion data into 3D animation apps.”


“Normally, video production using motion capture requires dedicated equipment and operators,” said Sony in a statement at the time.

“By utilizing our proprietary algorithm, ‘Mocopi’ realizes highly accurate motion measurement with a small number of sensors, freeing VTubers [virtual YouTubers] and creators involved in movie and animation production from time and place constraints.”

Downside of WiFi human tracking

The Carnegie Mellon University WiFi-based full body tracking tool could hold some promise for future use in the metaverse. However, the current version is “trained for our scenario and specific hardware,” said Fernando De La Torre.

There are also a few downsides to the tech. The separate transmitter and receiver devices must be positioned precisely, and the system may interfere with other WiFi networks.

“It is important to keep in mind that a number of variables, like the position and orientation of WiFi devices, the presence of objects, and the movement of people and things in the environment, can alter the WiFi signals,” De La Torre explained.

“Therefore, more research is required to make the approach robust to these factors, before releasing the code/model.”

This article is originally from MetaNews.
