MIT Researchers 'Eavesdrop' On Conversation By Looking At A Bag Of Potato Chips

MIT researchers have developed an algorithm that can reconstruct an audio signal from the minute vibrations an object makes in a video.

In one trial, the research team recovered intelligible speech using only the vibrations of a potato chip bag filmed from 15 feet away through soundproof glass, MIT reported. The method also worked on the surface of a glass of water, aluminum foil, and the leaves of a potted plant.

"When sound hits an object, it causes the object to vibrate," said Abe Davis, a graduate student in electrical engineering and computer science at MIT and first author on the new paper. "The motion of this vibration creates a very subtle visual signal that's usually invisible to the naked eye. People didn't realize that this information was there."

Recovering intelligible speech requires filming the object with a high-speed camera that captures between 2,000 and 6,000 frames per second. But even standard digital cameras could still reveal details about a speaker, such as gender and voice quality.
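
The frame-rate requirement follows from sampling theory: the camera effectively becomes the audio sampler, so the highest frequency it can represent is half the frame rate. Here is a minimal sketch of that arithmetic in Python; the speech-band figures in the comments are general acoustics rules of thumb, not numbers from the MIT paper:

```python
# Back-of-the-envelope: what audio bandwidth can a given frame rate capture?
# A camera used this way is effectively an audio sampler, and by the Nyquist
# criterion a signal sampled at F Hz can represent frequencies only up to F/2.

def nyquist_limit_hz(frames_per_second: float) -> float:
    """Highest audio frequency recoverable from video at this frame rate."""
    return frames_per_second / 2.0

for fps in (60, 2000, 6000):
    print(f"{fps:>5} fps -> frequencies up to {nyquist_limit_hz(fps):,.0f} Hz")

# 60 fps   -> up to 30 Hz: far below the speech band, so only coarse cues survive
# 2000 fps -> up to 1,000 Hz: enough for much of the energy in human speech
# 6000 fps -> up to 3,000 Hz: covers most of the intelligible speech band
```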

The method could be used in forensics and law enforcement, but it could also lead to "a new kind of imaging." In the experiments, the researchers measured motions of the filmed objects as small as a tenth of a micrometer, which corresponds to five thousandths of a pixel in a close-up image. By tracking how a single pixel's color value changes over time, the researchers can infer motions far smaller than a pixel: a pixel that straddles the boundary between a dark and a bright region blends the two, so even a tiny shift of that boundary changes the pixel's recorded value.
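
To see how a single pixel can report sub-pixel motion, here is a toy Python sketch of the edge idea above. The linear edge model, the 5 Hz oscillation, and the plus-or-minus 0.005-pixel amplitude are illustrative assumptions, not the researchers' actual algorithm; the real system presumably aggregates and filters such signals over many pixels.

```python
import numpy as np

# Toy model: a single unit-width pixel straddles a vertical edge. Everything
# left of edge_position is dark (0.0), everything right is bright (1.0), and
# the pixel records the bright fraction of its area. A sub-pixel shift of the
# edge therefore changes the recorded intensity linearly.

def pixel_intensity(edge_position: float) -> float:
    """Recorded value of the pixel when the edge sits at edge_position (0..1)."""
    return 1.0 - float(np.clip(edge_position, 0.0, 1.0))

# Oscillate the edge around the pixel center by a few thousandths of a
# pixel -- the scale of the motions reported in the experiment.
t = np.linspace(0.0, 1.0, 200)
displacement = 0.005 * np.sin(2 * np.pi * 5 * t)          # +/- 0.005 pixel
intensity = np.array([pixel_intensity(0.5 + d) for d in displacement])

# Because intensity = 0.5 - displacement in this model, inverting it
# recovers the sub-pixel motion from the pixel's value alone.
recovered = 0.5 - intensity
print(np.allclose(recovered, displacement))   # True
```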

"We're recovering sounds from objects," Davis said. "That gives us a lot of information about the sound that's going on around the object, but it also gives us a lot of information about the object itself, because different objects are going to respond to sound in different ways."

The researchers are now trying to determine "material and structural properties of objects from their visible response to short bursts of sound," MIT reported.
