Artificial Intelligence to Track Body Shapes in Real Time

AI is rapidly making its way into our homes, phones, smartwatches and more. But now, artificial intelligence is coming for something else too: our clothes. Or, to put it more accurately, seeing through our clothes. Though that might sound creepy and weird, it definitely has its uses.

The motivation behind this new technology is capturing a human being in motion. Body tracking is already used for movies and augmented reality games, and for that, motion capture artists have to wear skin-tight suits, because ordinary clothing makes it hard for the system to tell how the body underneath is moving.

And so the idea is simple: the more clothing a person wears, the more difficult it is for a system to identify the body beneath it. The multi-institutional project (PDF), to be presented at CVPR in Salt Lake City, combines depth data with smart assumptions about how a body is shaped and what a body can do.

The result is a kind of X-ray vision that reveals the shape and position of a person's body underneath their clothes. The software works in real time, even during quick movements like dancing or running.


So, how does this actually work? The paper builds on two previous methods, DynamicFusion and BodyFusion. The first estimates a body's pose from single-camera depth data; the second uses a skeleton model to estimate pose. Both lose track during fast motion, so the researchers combined them into something called DoubleFusion.

DoubleFusion creates a plausible skeleton from the data and shrink-wraps it with a skin surface at an appropriate distance from the body's core. Bringing the two processes together produces much better results than either method alone, and the combined system can see and track body movements even when clothing and accessories get in the way.
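To give a rough feel for why combining two trackers helps, here is a toy sketch of confidence-weighted fusion of joint estimates. This is not the paper's actual algorithm; the function, confidence values, and coordinates below are all illustrative assumptions.

```python
# Toy sketch (NOT the DoubleFusion algorithm): fuse two noisy 3D joint
# estimates -- one from depth data, one from a skeleton prior -- by
# confidence-weighted averaging. All names and numbers are illustrative.

def fuse_joint(depth_est, skel_est, depth_conf, skel_conf):
    """Confidence-weighted average of two (x, y, z) joint estimates."""
    total = depth_conf + skel_conf
    return tuple(
        (depth_conf * d + skel_conf * s) / total
        for d, s in zip(depth_est, skel_est)
    )

# During fast motion the depth tracker is less reliable, so the skeleton
# prior gets more weight; during slow motion the depth data dominates.
fast = fuse_joint((0.0, 1.0, 2.0), (0.1, 1.1, 2.1), depth_conf=0.2, skel_conf=0.8)
slow = fuse_joint((0.0, 1.0, 2.0), (0.1, 1.1, 2.1), depth_conf=0.9, skel_conf=0.1)
```

The design point is simply that each estimator covers the other's weakness: when one source of evidence degrades, its weight drops and the fused estimate leans on the other.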

But that does not mean the software is without shortcomings. It tends to overestimate a person's size when they wear a lot of clothing, and it can struggle to tell whether someone is wearing a bulky jacket or simply has a larger body.

At present, the system also struggles to capture fine movements, such as a person's hands operating a console or game controller. Tracking such minute motions has been set aside for future work.


The project drew researchers and engineers including Tao Yu of Tsinghua University in China, along with collaborators from Google, Beihang University, and the Max Planck Institute.

The authors believe that this accuracy in tracking body movements will benefit a range of applications, and that the project marks the first time users can digitize themselves easily.

The potential applications of this technology are broad. Beyond entertainment, it could be used in physiotherapy, bodybuilding, and many other fields.