Disney’s Research Lab Figures Out How To Put Words In Your Mouth


Here’s another absurd one out of Disney’s research labs. We’ve seen them build adorable robots that draw on beaches, 3D printers that make huggable objects out of felt, and spinning tops in seemingly impossible shapes.

Their latest trick? Using algorithms to make people say things they didn’t actually say.

Think of it a bit like lip reading — or like those bad lip dub videos that are ever so popular. Lip reading well is tough because a lot of the shapes we make with our mouths when speaking are quite similar to one another, and much of what we think we hear is really visual and auditory cues smashing together in our brains. Take away the audio, and “bah” and “vah” and “gah” all look remarkably similar.

It’s what enables the McGurk effect, as demonstrated here:

Taking that to an extreme, one of Disney’s research teams has figured out a way to generate a list of things you could have been saying, based on what you actually said.

A recording of someone saying “clean swatches”, for example, could be redubbed to say some 9,000 different phrases, all while fitting quite well with the way their lips originally moved. Not all 9,000 phrases make sense, of course; in fact, most of them are gibberish. You get everything from “need no pots” to the fairly creepy “like to watch you” — but when the stars align, it breaks your brain a little.
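
If you’re wondering how a search like that might work, here’s a toy sketch of the general idea in Python: collapse each word’s phonemes into viseme classes (the mouth shapes that look identical on the lips), then brute-force other word sequences that produce the exact same viseme string. To be clear, the phoneme and viseme tables below are invented for illustration, and this is nowhere near Disney’s actual pipeline; it just captures the flavor of it.

```python
# Toy sketch of the idea: map words to viseme (mouth-shape) classes and
# search for other phrases that produce the same viseme sequence.
# The phoneme and viseme tables below are invented for illustration --
# they are NOT Disney's actual data or algorithm.

from itertools import product

# Hypothetical phoneme transcriptions for a handful of words.
PHONEMES = {
    "clean":    ["K", "L", "IY", "N"],
    "swatches": ["S", "W", "AA", "CH", "IH", "Z"],
    "glean":    ["G", "L", "IY", "N"],
    "watches":  ["W", "AA", "CH", "IH", "Z"],
    "need":     ["N", "IY", "D"],
    "no":       ["N", "OW"],
    "pots":     ["P", "AA", "T", "S"],
}

# Hypothetical phoneme -> viseme grouping. Sounds that look alike on the
# lips (e.g. K/G, S/Z, T/D/N) collapse into the same class.
VISEME = {
    "K": "kg", "G": "kg",
    "L": "l",
    "IY": "iy", "IH": "iy",
    "N": "tdn", "T": "tdn", "D": "tdn",
    "S": "sz", "Z": "sz",
    "W": "w", "OW": "w",
    "AA": "aa",
    "CH": "ch",
    "P": "pbm",
}

def visemes(word):
    """Collapse a word's phonemes into its sequence of viseme classes."""
    return tuple(VISEME[p] for p in PHONEMES[word])

target = visemes("clean") + visemes("swatches")

# Brute-force every two-word phrase and keep the ones whose viseme
# sequence matches the original lip movements exactly.
words = list(PHONEMES)
matches = [
    f"{a} {b}"
    for a, b in product(words, repeat=2)
    if visemes(a) + visemes(b) == target and (a, b) != ("clean", "swatches")
]

print(matches)  # ['glean swatches'] : same mouth shapes, different words
```

Swap in a real pronunciation dictionary with tens of thousands of entries and the same brute-force matching spits out thousands of candidate phrases, which is exactly why most of those 9,000 redubs come out as nonsense.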

Here’s the demo video (remember — they’re using a robotic voice here for efficiency, but the illusion would be even more convincing with proper human dubbing):

What’s the point? Beyond just being a damned neat example of how quirky our brains are, it’s not hard to think of practical uses. Dubbing swear words out of movies without resorting to “Yippee Kay Yay, Mr. Falcon”, perhaps. But it’s mostly just friggin’ cool.