At the TechCrunch Disrupt NY 2016 Hackathon, a team of engineers built an app that helps travelers visiting a foreign country better understand what food is being described on a menu. Just by pointing their phone at the restaurant’s menu, the app recognizes the text and translates it into an image of the food item in question, which is then overlaid on top of the camera’s viewfinder.
The members of the team behind the hack, Ocean Huang, Brent Bevolo, Jose Portcarrero and Bret Deasy, came up with the idea for a menu-scanning app after first toying with one that would scan the text in books. But they soon realized that wasn’t the best use of the technology at hand.
“This was a natural pivot from the first idea,” noted Bevolo.
The group started work on the MenuMe Android app at midnight last night, so it’s fair to say that MenuMe isn’t ready for an upload to Google Play just yet.
The app takes advantage of the Vuforia Android SDK to implement the text and image recognition functionality and to display the 3D objects in augmented reality. It’s a simple but practical implementation of AR: translating a language you don’t understand into the universal language of photos.
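The team’s actual implementation uses the Vuforia Android SDK, which wasn’t shared publicly; as a rough illustration only, the recognize-translate-overlay flow could be sketched like this, with the translation table, image index, and function names all hypothetical stand-ins:

```python
from typing import Optional

# Hypothetical translation table: foreign menu term -> English dish name.
# In the real app this step would come from a translation service.
TRANSLATIONS = {
    "boeuf bourguignon": "beef stew braised in red wine",
    "croque monsieur": "grilled ham and cheese sandwich",
}

# Hypothetical image index: English dish name -> overlay image asset.
DISH_IMAGES = {
    "beef stew braised in red wine": "assets/beef_stew.png",
    "grilled ham and cheese sandwich": "assets/croque.png",
}

def recognize_text(frame_text: str) -> str:
    """Stand-in for the SDK's text recognition; assumes OCR already ran."""
    return frame_text.strip().lower()

def overlay_for(menu_text: str) -> Optional[str]:
    """Return the image asset to draw over the recognized menu line."""
    dish = TRANSLATIONS.get(recognize_text(menu_text))
    if dish is None:
        return None  # unrecognized item: draw nothing on the viewfinder
    return DISH_IMAGES.get(dish)

print(overlay_for("Boeuf Bourguignon"))  # assets/beef_stew.png
```

The AR step itself (positioning the image over the menu line in the camera view) is handled by the SDK and is omitted here.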
MenuMe doesn’t just tell you what the menu says; it could also help you decide whether the meal looks tasty to you.
While the hack demonstrated today was a bit thrown together, the team said that if they decide to complete the project, they would use Google Image Search to source food imagery matching the text.
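The article doesn’t say how the team would query Google Image Search; one plausible route (an assumption, not their stated plan) is Google’s Custom Search JSON API with `searchType=image`. A minimal sketch of building such a request, with the API key and engine ID as placeholders:

```python
from urllib.parse import urlencode

# Google Custom Search JSON API endpoint (image results via searchType=image).
API_ENDPOINT = "https://www.googleapis.com/customsearch/v1"

def image_search_url(dish_name: str, api_key: str, engine_id: str) -> str:
    """Build a Custom Search request URL for one image result of a dish."""
    params = {
        "key": api_key,        # placeholder: your API key
        "cx": engine_id,       # placeholder: your search engine ID
        "q": dish_name,
        "searchType": "image",
        "num": 1,
    }
    return f"{API_ENDPOINT}?{urlencode(params)}"

print(image_search_url("pad thai", "MY_KEY", "MY_CX"))
```

The app would then fetch this URL and pull the first result’s image link out of the JSON response.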
But will the app make it to Google Play one day?
“We’ll see,” said Huang.