It’s still hard to say if the voice-controlled aluminum can will be the next smartphone or the next Segway, but either way, brands are preparing. This morning Adobe launched a new set of analytics tools, Adobe Sensei for Voice, to help brands take advantage of conversational data to improve targeting and, ideally, conversions.
Adobe says it can consume data from Alexa, Siri, Google Assistant, Cortana and Bixby (lol one day). The company captures both user intent and contextual data, which brands can then put to use when targeting customers across other channels like social and email.
This means Adobe can track the actions users most often take with their conversational AI of choice and the things they regularly interact with — think calling an Uber versus listening to the latest Portugal. The Man album.
In practice, developers and brands will have access to the frequency of user interactions and the sequential actions taken after engaging with a particular service. Similarly, Amazon offers its own tool to help gather and aggregate comparable metrics, like total customer interactions, usage time and frequency of intent.
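To make those two metrics concrete, here is a minimal sketch of how interaction frequency and follow-on actions could be computed from a raw event log. The event schema and intent names are purely illustrative assumptions, not Adobe's or Amazon's actual APIs.

```python
from collections import Counter, defaultdict

# Hypothetical event log of voice-assistant interactions, ordered by time.
# Each record is (user_id, intent); field names are illustrative only.
events = [
    ("u1", "request_ride"), ("u1", "check_eta"),
    ("u2", "play_music"), ("u2", "adjust_volume"),
    ("u1", "request_ride"), ("u1", "check_eta"),
    ("u2", "play_music"), ("u2", "skip_track"),
]

def intent_frequency(events):
    """Count how often each intent is invoked across all users."""
    return Counter(intent for _, intent in events)

def follow_on_actions(events):
    """For each intent, count which intent the same user issues next."""
    per_user = defaultdict(list)
    for user, intent in events:
        per_user[user].append(intent)
    transitions = defaultdict(Counter)
    for sequence in per_user.values():
        for current, nxt in zip(sequence, sequence[1:]):
            transitions[current][nxt] += 1
    return transitions

freq = intent_frequency(events)
trans = follow_on_actions(events)
print(freq["request_ride"])                 # how often rides are requested
print(trans["request_ride"].most_common(1)) # what users do right afterward
```

In this toy log, "request_ride" is always followed by "check_eta", which is exactly the kind of sequential signal a brand could act on in another channel.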
Adobe has been paying a lot of attention to the market potential for conversational AI embedded in speaker hardware. The company released metrics today that painted the Amazon Echo Dot as the market leader in the growing space.
Regardless of what happens to this market, Adobe has a reasonable competitive edge thanks to its existing relationships with brands. And unlike tools built by companies with skin in the game, like Amazon, Adobe is a relatively market-agnostic player.