CSAIL/Fadel Adib

Sorry Superman, We Have Our Own X-Ray Vision Now

Superman just got a little less super, sorry.

It was crazy enough when we heard this week that scientists are close to making a real-life tractor beam. Very cool.

But, because science never sleeps, today we bring you the MIT engineers who have clearly put in their 10,000 hours reading Superman graphic novels. Are we about to fly? Shoot heat lasers from our eyes? Nope. Just X-Ray vision.

You heard me, freakin' X-ray vision.

For the past two years, researchers at MIT's Computer Science and Artificial Intelligence Lab (CSAIL) have been telling anyone who would listen that WiFi signals could be used to "see" things that are invisible to the eye. They've been in the lab working on wireless signals that can track human motion, and now they've submitted a paper to an upcoming scientific conference where they will unveil a new technology they're calling "RF Capture."

What does it do? It "picks up wireless reflections of the human body to see the silhouette of a human being standing behind a wall." NBD, just x-ray vision, basically. According to MIT News, "by tracking the silhouette, the device can trace a person’s hand as he writes in the air and even distinguish between 15 different people through a wall with nearly 90 percent accuracy. In other words, from the opposite side of a building, RF Capture can determine where you are, who you are, and even which hand you are moving."

Researchers say it could have tons of applications, from monitoring a baby's breathing remotely to helping firefighters determine whether there are survivors in a building, and even revolutionizing Hollywood's motion capture technology.

"Today actors have to wear markers on their bodies and move in a specific room full of cameras," PhD student and lead author of the paper Fadel Adib said. "RF Capture would enable motion capture without body sensors and could track actors’ movements even if they are behind furniture or walls."

CSAIL

Another one of the paper's authors, Dina Katabi, said the team is already working on an in-home device that can call 911 if it detects an unconscious family member or adjust your heating and air conditioning based on where you are in the house. The technology could also be used in gaming, allowing you to "interact with a game from different rooms or even trigger distinct actions based on which hand you move."

How does it work?

The device transmits wireless signals through walls that bounce off a person's body and back to the device, which captures the reflections, analyzes them, and determines the person's silhouette. According to the team, it scans 3-D space "to capture wireless reflections off objects in the environment, including the human body."
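To get a feel for why captured reflections carry spatial information, here's a hedged back-of-the-envelope sketch (not from the paper, and the function name is our own): a radio reflection's round-trip travel time tells you how far away the reflector is, which is part of what lets a device place reflections in 3-D space.

```python
# Hypothetical illustration: converting a reflection's round-trip
# time into a distance. Radio waves travel at the speed of light,
# and the signal covers the distance twice (out and back).

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def reflection_distance(round_trip_seconds):
    """Distance to a reflector, given the signal's round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A reflection that returns after 40 nanoseconds came from ~6 m away.
d = reflection_distance(40e-9)
print(round(d, 2))  # 6.0
```

Timing differences this small are why the real system needs careful signal processing rather than a stopwatch.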

But, since only a few body parts reflect the signal back at any given point in time, it monitors how those parts vary as someone moves and then "intelligently stitches the person's reflections across time to reconstruct his silhouette into a single image." Over time, the technology is able to differentiate between different bodies and create "silhouette fingerprints" for each individual.
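The stitching idea can be sketched in a toy form. Below is a hedged, greatly simplified illustration (not the authors' actual algorithm): each instant yields a sparse "snapshot" grid where only a few body parts reflected the signal, and combining snapshots by keeping the strongest reflection ever seen at each location accumulates the partial views into one silhouette.

```python
# Toy sketch of "stitching reflections across time". Each snapshot is
# a 2-D grid of reflection strengths; at any instant only a few cells
# (body parts) are lit. Stitching keeps the strongest value seen at
# each cell across all snapshots.

def stitch_silhouette(snapshots):
    """Combine per-instant reflection maps into a single silhouette."""
    rows, cols = len(snapshots[0]), len(snapshots[0][0])
    silhouette = [[0.0] * cols for _ in range(rows)]
    for snap in snapshots:
        for r in range(rows):
            for c in range(cols):
                silhouette[r][c] = max(silhouette[r][c], snap[r][c])
    return silhouette

# Three sparse snapshots of a 4x4 scene: a different body part
# reflects back in each one.
t0 = [[0, 1, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]]
t1 = [[0, 0, 0, 0], [0, 1, 1, 0], [0, 0, 0, 0], [0, 0, 0, 0]]
t2 = [[0, 0, 0, 0], [0, 0, 0, 0], [0, 1, 0, 0], [0, 1, 0, 0]]

combined = stitch_silhouette([t0, t1, t2])
lit = sum(v > 0 for row in combined for v in row)
print(lit)  # 5 grid cells lit in the stitched silhouette
```

The real system works with noisy radio measurements and moving bodies, so it has to track how parts move between snapshots rather than simply overlaying grids, but the accumulation-over-time intuition is the same.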

MIT CSAIL

All images used with permission of MIT's Computer Science and Artificial Intelligence Lab.