The technology lets researchers study patients with Parkinson's disease and monitor their mobility. Artificial intelligence renders the movement of a person behind a wall as a stick figure representing their body. (Image via MIT)


Researchers have made it possible to see a person in motion on the opposite side of a wall using radio signals and artificial intelligence. The technology isn't entirely new; it has already been used to monitor the movements of patients with Parkinson's disease in their own homes.

Interest in this technology surfaced a few decades ago. Early versions of the project could reveal only a vague blob of a person behind a wall. Recent advancements go far beyond that: the system now renders the person as a skeletal stick figure and tracks their mobility as they move around, focusing on joints like the elbows, hips, and feet. Everything the person does is shown live.

 

The radio signal they use for this is not as strong as a WiFi signal, but it's quite similar. The radio waves pass through solid objects - like a wall - then bounce off a human body and return to the device. Because humans are made mostly of water, radio waves can't pass through a human body; they reflect off it instead. A machine learning tool called a neural network then interprets the returned signal and renders the person on screen.
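The basic physics here is simple echo-ranging: a pulse goes out, reflects off the body, and comes back, and the round-trip time tells you how far away the reflector is. A minimal sketch (the names and numbers are illustrative, not from the MIT system):

```python
# A radio pulse passes through the wall, reflects off the water-rich
# human body, and returns. Halving the round-trip travel time (times
# the speed of light) gives the distance to the reflecting body.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_echo(round_trip_seconds: float) -> float:
    """Distance to a reflecting body from a round-trip echo time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A reflection arriving ~20 nanoseconds after transmission
# corresponds to a body roughly 3 meters away.
print(round(distance_from_echo(20e-9), 2))  # ~3.0
```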

 

Researchers train this neural network by feeding data into it and letting it work out its own rules - a process called supervised learning. Neural networks are most often used to interpret images, but they can also carry out more complicated tasks, such as translating text from one language to another or generating new text from the text they've been given.
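The core of supervised learning is that the model is never handed rules, only (input, label) pairs, and it adjusts its own parameters to shrink its prediction error. This toy example learns the rule y = 2x from labeled examples with a single weight and gradient descent - a deliberately tiny illustration of the training idea, not the MIT model:

```python
# Supervised learning in miniature: given labeled (x, y) examples,
# the model repeatedly nudges its one parameter w to reduce the
# error between its prediction (w * x) and the label y.

def train(examples, lr=0.05, epochs=200):
    w = 0.0  # the single "rule" the model invents for itself
    for _ in range(epochs):
        for x, y in examples:
            pred = w * x
            w -= lr * (pred - y) * x  # nudge w to shrink the error
    return w

labeled_data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # samples of y = 2x
w = train(labeled_data)
print(round(w, 2))  # learned weight, close to 2.0
```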

 

One problem is that labeling precisely where the head, joints, etc. are isn't a simple task: humans can label those positions in an image, but there's no way to directly label the radio data that bounces off a person.

 

The solution was to combine a camera with the radio: label the images captured by the camera, and correlate those labels with the simultaneous radio measurements for the neural network. This was done without a wall obstructing the person, so the camera could see exactly what should be displayed. The camera was used only during the training stage - pairing its labels with the wireless signal.
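The cross-modal trick boils down to pairing each radio measurement with the joint labels extracted from the camera frame captured at the same instant, producing (radio, joints) training examples. A minimal sketch of that pairing step, with illustrative names and fake data:

```python
# During training (no wall), camera and radio run side by side.
# Joint positions labeled from the camera frames become the training
# labels for the radio frames captured at the same moment.

def build_training_pairs(radio_frames, camera_joint_labels):
    """Pair each radio frame with the camera-derived joint labels
    captured at the same instant (matched here by index)."""
    assert len(radio_frames) == len(camera_joint_labels)
    return list(zip(radio_frames, camera_joint_labels))

radio_frames = [[0.1, 0.3], [0.2, 0.4]]  # fake RF measurements
joint_labels = [                          # pixel positions from camera
    {"elbow": (120, 80)},
    {"elbow": (122, 81)},
]
pairs = build_training_pairs(radio_frames, joint_labels)
print(len(pairs))  # 2 (radio, joints) training examples
```

Once the pairs exist, the network trains on radio input alone, so at inference time the camera - and the need for line of sight - can be dropped.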

 

It was discovered that after the training period, the system could detect not only people who weren't blocked by a wall or any other form of obstruction, but also those who were out of view - creating a stick figure of the human behind the wall.

 

The system can also tell people apart just by how they walk. Using an additional neural network, the system was shown examples of people walking and learned to associate different walking sequences with the same person. It used this data to identify individuals by their gait, even when a wall was present.
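Identification by gait amounts to mapping a walking sequence to a feature vector (a "gait signature") and matching it against stored signatures of known people. Here a nearest-neighbor match stands in for the extra neural network; all names and numbers are illustrative:

```python
# Match a new gait signature to the closest known person's signature.
# A squared-distance nearest-neighbor lookup stands in for the second
# neural network described in the article.

def nearest_person(signature, known_signatures):
    """Return the name whose stored gait signature is closest."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(known_signatures,
               key=lambda name: sq_dist(signature, known_signatures[name]))

known = {"alice": [1.0, 0.2], "bob": [0.3, 0.9]}  # stored gait signatures
print(nearest_person([0.9, 0.25], known))  # alice
```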

 

Researchers have previously used this system on people with Parkinson's disease, in a study of seven patients over eight weeks. The device sat in each patient's home - allowing them to be monitored in a comfortable setting without cameras - making it far less invasive. It gave the researchers insight into each patient's behavior and an idea of their quality of life. These devices aren't used on people who don't give their consent, of course. But... who knows how it'll be used.

 

 

Have a story tip? Message me at: cabe(at)element14(dot)com

http://twitter.com/Cabe_Atwell