MIT develops ‘magic’ carpet that can detect if person is doing sit-ups or other exercises

Researchers at the Massachusetts Institute of Technology have developed an ‘intelligent’ carpet that can sense human movement and poses without using cameras, opening up a whole new world in both gaming and health care. 

At this week’s Conference on Computer Vision and Pattern Recognition, a team from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) demonstrated a 36′ by 2′ mat that can extrapolate a person’s posture, movement and relationship to the ground into a 3D model.

If a user steps onto the mat and performs a sit-up, the system can produce an image of a figure doing a sit-up.

The mat is trained on synchronized tactile and visual data, such as video footage paired with a pressure heatmap of a volunteer doing a sit-up or pushup.
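The paper does not publish its data pipeline, but pairing the two streams comes down to aligning tactile frames with camera-derived pose annotations by timestamp. A minimal sketch of that alignment step, with made-up frame and pose placeholders and a hypothetical `pair_by_timestamp` helper:

```python
from bisect import bisect_left

def pair_by_timestamp(tactile_frames, video_poses, tolerance=0.05):
    """Align tactile frames with camera-derived pose annotations.

    Each input is a list of (timestamp_seconds, data) tuples, sorted by
    timestamp. Returns (tactile, pose) training pairs whose timestamps
    differ by at most `tolerance` seconds (an assumed sync window).
    """
    video_ts = [t for t, _ in video_poses]
    pairs = []
    for t, frame in tactile_frames:
        i = bisect_left(video_ts, t)
        # Check the nearest candidate on each side of the insertion point.
        best = None
        for j in (i - 1, i):
            if 0 <= j < len(video_ts):
                dt = abs(video_ts[j] - t)
                if best is None or dt < best[0]:
                    best = (dt, j)
        if best and best[0] <= tolerance:
            pairs.append((frame, video_poses[best[1]][1]))
    return pairs

# Toy example: three tactile frames, three pose annotations.
tactile = [(0.00, "frame0"), (0.10, "frame1"), (0.20, "frame2")]
poses = [(0.01, "pose0"), (0.12, "pose1"), (0.55, "pose2")]
print(pair_by_timestamp(tactile, poses))
# [('frame0', 'pose0'), ('frame1', 'pose1')]
```

Here `frame2` is dropped because no pose annotation falls within the tolerance window, which is the behavior you want when the camera drops frames.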


Researchers at MIT are working on a mat embedded with thousands of sensors that can detect pressure from feet, limbs and other body parts. The tech could be used to track workouts or detect falls

It’s larger than the average 8′ x 10′ area rug, but it’s still scalable and would be economically feasible for home use, the engineers say.

Made of commercial-grade pressure-sensitive film and conductive thread, the mat has over 9,000 sensors that detect pressure from feet, limbs and other body parts and convert it into an electrical signal.
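The sensor readout itself can be pictured as a grid of voltages converted into a 2D pressure map. The sketch below is a loose illustration, not the team's implementation: the grid size and normalization are assumptions, chosen so the sensor count roughly matches the article's figure of over 9,000.

```python
import numpy as np

# Hypothetical readout grid: 96 x 96 = 9,216 sensors, in the spirit of
# the article's "over 9,000 sensors" woven from pressure-sensitive film
# and conductive thread.
ROWS, COLS = 96, 96

def read_pressure_map(raw_voltages):
    """Convert raw sensor voltages (a flat array) into a normalized 2D
    pressure map -- the tactile 'heatmap' fed to the neural network."""
    grid = np.asarray(raw_voltages, dtype=float).reshape(ROWS, COLS)
    # Normalize to [0, 1]; a real system would calibrate each sensor.
    lo, hi = grid.min(), grid.max()
    if hi > lo:
        grid = (grid - lo) / (hi - lo)
    else:
        grid = np.zeros_like(grid)
    return grid

# Simulate two 'footprints' pressing on the mat.
frame = np.zeros(ROWS * COLS)
frame[40 * COLS + 30 : 40 * COLS + 36] = 3.0  # left foot
frame[40 * COLS + 60 : 40 * COLS + 66] = 3.0  # right foot
pressure = read_pressure_map(frame)
print(pressure.shape)  # (96, 96)
print(pressure.max())  # 1.0
```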

In tests, the system was able to predict a volunteer’s motion with 97 percent accuracy and determine their pose within an error of less than four inches.

Other efforts at monitoring movements have mostly relied on external cameras, either worn by the subject or mounted nearby.

A camera filming live action (left) works with tactile sensor data to train the mat's neural network to recognize sit ups, pushups and other movements

But cameras raise privacy concerns, the team says, and don’t always have a clear view.

The CSAIL team only used cameras to train the ‘intelligent’ carpet’s initial dataset—once that was achieved, its neural network could determine if someone was stretching, lying prone, or doing something else, based solely on tactile information.
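At inference time, then, the model is a classifier from tactile frames to activity labels, with no camera input. The toy nearest-centroid classifier below stands in for the paper's neural network purely to illustrate that idea; the 64-element frames and the labels are fabricated for the example.

```python
import numpy as np

class TactileActivityClassifier:
    """Toy nearest-centroid classifier standing in for the paper's
    neural network: maps a tactile pressure frame to an activity label
    using only touch data (no cameras at inference time)."""

    def __init__(self):
        self.centroids = {}

    def fit(self, frames, labels):
        # Average the frames belonging to each label into a centroid.
        frames = np.asarray(frames, dtype=float)
        for label in set(labels):
            mask = [l == label for l in labels]
            self.centroids[label] = frames[mask].mean(axis=0)

    def predict(self, frame):
        # Return the label whose centroid is closest to the frame.
        frame = np.asarray(frame, dtype=float)
        return min(self.centroids,
                   key=lambda l: np.linalg.norm(frame - self.centroids[l]))

# Tiny synthetic example: 'standing' concentrates pressure in two
# spots; 'lying' spreads a long, faint band of pressure.
standing = np.zeros(64); standing[[20, 40]] = 1.0
lying = np.zeros(64); lying[10:50] = 0.2

clf = TactileActivityClassifier()
clf.fit([standing, standing, lying, lying],
        ["standing", "standing", "lying", "lying"])
print(clf.predict(standing))  # standing
```

A real system would replace the centroid lookup with a convolutional network trained on the camera-labeled dataset, but the input/output contract is the same: pressure map in, activity label out.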

The 'tactile-sensing' carpet (top right) is embedded with more than 9,000 sensors that interpret a variety of human poses and activities, including walking, sitting, forward thrusts and waist bends

Predictably, the mat was better at identifying leg and lower-body motions than movements above the torso.

It was also unable to predict gestures that lacked direct floor contact, ‘like free-floating legs during sit-ups or a twisted torso while standing up,’ the researchers reported.

Co-author Yunzhu Li, a Ph.D. student at MIT, envisions the mat being incorporated into gaming or home workouts.

‘Based solely on tactile information, it can recognize the activity, count the number of reps, and calculate the amount of burned calories,’ Li said in a statement.

The mat interprets data from the pressure map to construct a 3D model of the person's action. It could be used to determine how many calories were burned in a workout or if a person has fallen and needs assistance

But it could also have implications for health monitoring of the elderly, such as tracking physical rehab routines or detecting falls, added lead author Yiyue Luo.

The researchers hope to evolve the system to determine more granular info including height and weight, and to generate metrics for multiple users at once, like a couple dancing.

In March, another CSAIL team debuted clothing with sensors that could similarly track a person’s movement and determine if the wearer was sitting, walking, or doing particular poses.

Their ‘smart’ wardrobe—which included a vest, socks, and other clothing—could one day be used by coaches to improve athletes’ posture and performance or even to monitor residents in assisted-care facilities.

In a statement, CSAIL graduate student Yiyue Luo said that, unlike many existing wearable electronics, their ‘machine-knitted tactile textiles’ are soft and breathable and could be easily incorporated into mass-produced clothing.

‘When you manufacture lots of sensor arrays, some of them will not work and some of them will work worse than others,’ said Luo, lead author of a report in the journal Nature Electronics. ‘So we developed a self-correcting mechanism that uses a self-supervised machine learning algorithm to recognize and adjust when certain sensors in the design are off base.’

The technology could be used on a humanoid robot’s skin to provide the kind of ‘tactile sensing’ that humans enjoy, said material engineer Wan Shou, co-author of the Nature Electronics study. 
