MIT Algorithm Can Predict When People Will Hug, Kiss Or Shake Hands

Posted: Jun 22 2016, 4:07am CDT | Updated: Jun 22 2016, 2:14pm CDT, in News | Latest Science News


Credit: Carl Vondrick/MIT CSAIL

MIT researchers have taught an artificial intelligence system to understand body language patterns and to predict human interactions.

When two individuals meet, people around them can sense what is going to happen next: hug, handshake, high five or even kiss. 

Thanks to our ability to read body language patterns, which comes with years of experience, we can anticipate human interactions accurately more often than not.

Now MIT researchers want AI systems to predict these human interactions too. They have taught a computer system to understand the body language patterns between two people in order to guess their next move.

The new algorithm is trained on YouTube videos and TV shows such as “The Office” and “Desperate Housewives,” and it looks for physical cues such as a raised hand, an outstretched arm, or eye coordination to predict how two people will interact.

“Humans automatically learn to anticipate actions through experience, which is what made us interested in trying to imbue computers with the same sort of common sense. We wanted to show that just by watching large amounts of video, computers can gain enough knowledge to consistently make predictions about their surroundings,” lead author Carl Vondrick of MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) said in a statement.

For this purpose, the researchers fed the artificial intelligence system 600 hours’ worth of video and let it analyze each image pixel by pixel. They then applied a deep-learning technique known as neural networks to teach the system to sift through huge amounts of data and find patterns on its own. The AI system was eventually able to predict four basic actions: a hug, handshake, high five, or kiss.
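For readers curious what that kind of classifier looks like in code, here is a minimal, hypothetical sketch in PyTorch of a network that maps a single video frame to a score for each of the four actions. It is not the CSAIL model, which was trained on 600 hours of video and learns far richer representations; the class and variable names (ActionAnticipator, ACTIONS) are made up purely for illustration of the frames-in, prediction-out structure the article describes.

```python
# Illustrative sketch only -- NOT MIT's actual system. It shows the general idea:
# a neural network looks at a video frame and outputs a guess over four actions.
import torch
import torch.nn as nn

ACTIONS = ["hug", "handshake", "high five", "kiss"]

class ActionAnticipator(nn.Module):
    """Tiny convolutional classifier over a single 64x64 RGB frame (hypothetical)."""
    def __init__(self, num_actions: int = len(ACTIONS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                             # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                             # 32x32 -> 16x16
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_actions)

    def forward(self, frame: torch.Tensor) -> torch.Tensor:
        x = self.features(frame)
        return self.classifier(x.flatten(start_dim=1))   # raw logits, one per action

if __name__ == "__main__":
    model = ActionAnticipator()
    frame = torch.rand(1, 3, 64, 64)            # stand-in for a frame one second before the action
    probs = torch.softmax(model(frame), dim=1)  # convert logits to a probability per action
    print("predicted next interaction:", ACTIONS[int(probs.argmax())])
```

The real system works from video rather than a single toy frame, but the overall shape of the problem is the same: pixels go in, and a probability for each possible interaction comes out.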

“A video is not like a ‘Choose Your Own Adventure’ book where you can see all of the potential paths,” said Vondrick. “The future is inherently ambiguous, so it’s exciting to challenge ourselves to develop a system that uses these representations to anticipate all of the possibilities.”

So how accurate was the algorithm in predicting human interactions?

The algorithm correctly anticipated the action more than 43 percent of the time when videos were paused one second before the action occurred. That is far from perfect, but it beats existing algorithms, which manage only 36 percent accuracy. Even humans are not error-free at judging upcoming interactions; they predict the correct action about 71 percent of the time.

“There is a lot of subtlety to understanding and forecasting human interactions. We hope to be able to work off of this example to soon predict even more complex tasks,” Vondrick said.

Though the AI system is not yet accurate enough for real-world applications, it is still a significant step and could eventually have implications for robots navigating human environments and for emergency response systems.

The Author

Hira Bashir covers daily affairs around the world.



