AI-fueled Synqq app update lets you use your voice to add notes and calendar entries

If there is one thing we learned last week at CES, it’s that the power of voice is coming soon to a device near you, even your fridge. Synqq released an update in the iOS App Store this week that brings voice and artificial intelligence to its calendar and notes app.

What’s more, the app works with Apple’s AirPods wireless headphones, introduced last summer, giving a practical business application to Apple’s latest gizmo.

Synqq CEO Ramu Sunkara believes that voice will increasingly be the way we interact with our mobile devices going forward. His company’s product lets you navigate by voice to find information about meetings, notes, and people, or to create new meetings (filling in the forms with your voice).

“Recording voice notes is not new, but Synqq offers a compelling new level of personal productivity, with or without Apple AirPods,” he said.

It’s built on an artificial intelligence foundation that combines machine learning (ML), natural language processing (NLP) and named-entity recognition (NER) to improve the results from the speech recognition engine. While it’s built on top of Google’s voice recognition today, the AI platform has been architected in such a way that the company could replace it in the future if the need arose, Sunkara told TechCrunch.
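To illustrate the kind of decoupling Sunkara describes, here is a minimal sketch, not Synqq’s actual code, of how a speech engine can sit behind a protocol so that the downstream ML, NLP and NER layers never depend on a particular vendor. The names and types below are hypothetical.

```swift
import Foundation

// Sketch only: the protocol and types below are hypothetical, not Synqq's API.
protocol SpeechRecognizer {
    func transcribe(_ audio: Data) -> String
}

// Stand-in for an adapter around the Google recognizer the app uses today.
struct GoogleSpeechRecognizer: SpeechRecognizer {
    func transcribe(_ audio: Data) -> String {
        // A real adapter would call the speech-to-text service; stubbed here.
        return "create a meeting tomorrow at 9 am with Bob"
    }
}

// The rest of the pipeline depends only on the protocol, so the engine
// could be swapped for another provider without touching the NLP layers.
struct CommandPipeline {
    let recognizer: SpeechRecognizer

    func handle(_ audio: Data) -> String {
        let transcript = recognizer.transcribe(audio)
        // ML / NLP / NER steps would interpret the transcript here.
        return transcript
    }
}

let pipeline = CommandPipeline(recognizer: GoogleSpeechRecognizer())
print(pipeline.handle(Data()))
```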

All of this technology, taken together, enables users to simply speak their commands. There is no need for a trigger word, as with Amazon Echo (which uses “Alexa” as its trigger word). Instead, you open the app and tap the volume-up button when using the phone, or tap an AirPod when using the app in conjunction with the wireless headphones.

Once it’s active, you see a blue bar at the top of the screen that vibrates as you speak. It picks up any spoken words, but it only responds to commands it understands. “The Synqq platform understands such natural language requests as ‘Add a note,’ ‘Share a note,’ ‘Create a meeting,’ or ‘When was my last meeting with Bob?’” Sunkara explained. It will even understand more complex commands like “Create a new meeting for tomorrow morning at 9 am with Bob,” working out that “tomorrow morning” means the next day and adding the event to the calendar at the appropriate day and time.
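As a toy illustration of that last step, here is a small sketch, using a hypothetical resolver rather than Synqq’s real parser, of turning a relative phrase like “tomorrow at 9 am” into a concrete calendar date:

```swift
import Foundation

// Toy example only: resolve "tomorrow at 9 am" to a concrete Date.
// This is an illustrative sketch, not Synqq's implementation.
func resolveTomorrow(atHour hour: Int, calendar: Calendar = .current) -> Date? {
    // "Tomorrow" is simply one day after the current date.
    guard let tomorrow = calendar.date(byAdding: .day, value: 1, to: Date()) else {
        return nil
    }
    // Pin the time of day to the requested hour on that next day.
    return calendar.date(bySettingHour: hour, minute: 0, second: 0, of: tomorrow)
}

if let meetingStart = resolveTomorrow(atHour: 9) {
    print("Meeting with Bob scheduled for \(meetingStart)")
}
```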

What’s more, because of the built-in intelligence, the more you use the program, the better it should understand your voice and the faster it should respond to your requests.

The app requires you to use its built-in notes and calendar, but if you rely on an external notes program like Evernote, you can push content there to maintain an archive if you wish.

I tested it briefly and it works pretty much as described. The voice recognition is decent; you can even hitch a bit and hem and haw, and it waits and catches up with your command. It wasn’t foolproof, but it did well, and with the machine learning component, it should improve over time.

“Voice control is pretty addictive. You don’t want to use touch or keyboard because it [is] so addictive. [We believe] it will become [the] default way of interacting with mobile,” Sunkara said.