With the introduction of the Firefly and Dynamic Perspective SDKs, mobile app developers who have grown tired of iterating on activities like messaging, photo-sharing and socializing now have new ways to differentiate themselves from a growing field of app store competitors.
With Amazon’s newly opened-up visual recognition technology, developers will not only be able to make their existing apps smarter and more aware of objects in the real world, but can potentially develop entirely new kinds of applications. And with Dynamic Perspective, they have a whole new way of navigating within those applications, too.
Firefly was introduced as a key feature in Amazon’s just-announced Fire Phone – the company’s first-ever smartphone.
Using a dedicated hardware button on the side of the phone, mobile consumers are able to identify almost anything they see or hear using the smartphone’s camera and other sensors, including text, products, movies, TV shows, books, games, CDs, business cards, web addresses, barcodes, QR codes and more.
The feature is similar to Flow, a visual recognition technology several years in development that was added to Amazon’s flagship shopping app only a few months ago. Firefly’s audio and video recognition capabilities also lend it a bit of a Shazam-like function.
What’s interesting, however, is that Amazon is not just making Firefly a part of its smartphone software and hardware. It’s attempting to seed an app ecosystem with new types of apps and services that competitors Apple and Google won’t have.
Third-party adopters of Firefly technology already include several popular mobile app makers, like StubHub, MyFitnessPal, iHeartRadio and Vivino.
But Amazon isn’t just hoping for better and smarter versions of existing apps, thanks to Firefly. It’s looking for a whole new category to emerge. “The Firefly SDK is available starting today so developers can invent new ways to use this advanced technology,” the announcement reads (emphasis ours). “Later this year, Firefly will include artwork recognition, foreign language translation, and wine label recognition powered by Vivino.”
Most of the potential use cases for Firefly, from Amazon’s perspective, are all about driving more sales. See or hear anything, identify it automatically, and Amazon will help you acquire it instantly. Hear a good song? Push the Firefly button and buy. See some nice shoes? Buy. Drink a great wine? Buy.
Naturally, one of the default built-in actions in the Firefly SDK is to point the end user to purchase the item in question from Amazon. But the SDK documentation indicates that developers can add additional actions to that list, too, like pointing the user to a detail page within the developer’s own application or launching a website, for instance.
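The action model the documentation describes — a default “buy on Amazon” action plus whatever actions a developer appends, such as a deep link into their own app — can be pictured as a simple registry. This is a minimal illustrative sketch, not the real Firefly SDK API; every name in it (`FireflyAction`, `ActionRegistry`, the URI schemes) is hypothetical.

```python
# Hypothetical sketch of Firefly's action model. The class names,
# URI schemes and structure below are illustrative assumptions,
# not the actual SDK API.

from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Identification:
    """What Firefly recognized: e.g. a product, song or web address."""
    kind: str   # "product", "music", "url", ...
    label: str  # human-readable name of the identified item

@dataclass
class FireflyAction:
    title: str
    handler: Callable[[Identification], str]  # returns a target URI

class ActionRegistry:
    """Built-in actions plus any the developer registers."""
    def __init__(self) -> None:
        # Default built-in action: buy the identified item on Amazon.
        self.actions: List[FireflyAction] = [FireflyAction(
            "Buy on Amazon",
            lambda item: f"amazon://buy?q={item.label}")]

    def register(self, action: FireflyAction) -> None:
        """Developers can append their own actions to the list."""
        self.actions.append(action)

    def options_for(self, item: Identification) -> List[Tuple[str, str]]:
        return [(a.title, a.handler(item)) for a in self.actions]

# A developer adds an action that deep-links into their own app.
registry = ActionRegistry()
registry.register(FireflyAction(
    "View in MyApp",
    lambda item: f"myapp://detail/{item.label.replace(' ', '-')}"))

shoes = Identification(kind="product", label="running shoes")
for title, uri in registry.options_for(shoes):
    print(title, "->", uri)
```

Under this model, identification and action are decoupled: Firefly decides *what* the user is looking at, and the registered actions decide what the app offers to do about it.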
Identification can also serve as just the first step in directing a user to more information beyond just the “what.” For example, a sample plug-in called “Exempli,” also described in the developer documentation, first identifies the musical artist behind a song being heard, then searches an external service to determine if there are any upcoming shows by that artist within a 50-mile radius of the user’s current location.
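The Exempli flow — recognize first, then enrich with an external lookup filtered by distance — can be sketched in a few lines. Everything here is a stand-in: the recognition and concert-listing functions are hypothetical placeholders, with only the 50-mile-radius filter taken from the description above.

```python
# Illustrative sketch of the "Exempli" plug-in flow: identify the
# artist behind a song, then check an external service for shows
# within 50 miles of the user. All function names and data here are
# hypothetical stand-ins, not real APIs.

from math import radians, sin, cos, asin, sqrt

def miles_between(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    h = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 3959 * 2 * asin(sqrt(h))  # Earth's radius is ~3,959 mi

def identify_artist(audio_sample):
    # Stand-in for Firefly's audio recognition step.
    return "Example Artist"

def upcoming_shows(artist):
    # Stand-in for an external concert-listing service.
    return [
        {"artist": artist, "city": "Seattle",  "lat": 47.61, "lon": -122.33},
        {"artist": artist, "city": "Portland", "lat": 45.52, "lon": -122.68},
    ]

def nearby_shows(audio_sample, user_lat, user_lon, radius_miles=50):
    artist = identify_artist(audio_sample)
    return [s for s in upcoming_shows(artist)
            if miles_between(user_lat, user_lon,
                             s["lat"], s["lon"]) <= radius_miles]

# A user in downtown Seattle hears a song: the Seattle show is within
# 50 miles, the Portland show (~145 miles away) is filtered out.
print(nearby_shows(b"...", 47.60, -122.33))
```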
To what extent Amazon will allow competitors to build on top of this technology is unknown. Can Netflix point users to its own movies, for instance? That seems iffy. (We asked Amazon to clarify this and will update if it responds.)
Dynamic Perspective is clever, too. This new technology is about responding to how an end user holds and moves the phone. For instance, Zillow is using this to let users zoom in on pictures just by moving the phone closer to them. And they can move their head to peek around the corner in photos of a home’s interior.
But the technology seems more promising in terms of game development. Just as video games evolved in the living room from mashing buttons to moving your body, Dynamic Perspective takes the next step from just tilting the phone back and forth — as the major platforms support today — to actually getting more of your body involved in the game.
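The core rendering trick behind this kind of head-tracked UI is parallax: on-screen layers shift by an amount proportional to the head’s offset from straight-on viewing, with nearer layers moving more than distant ones. The sketch below is purely illustrative — the function, its parameters and the tuning constant are assumptions, not the Dynamic Perspective SDK.

```python
# A minimal sketch of the idea behind head-tracked parallax: shift
# each on-screen layer in proportion to the head's angular offset
# from the screen's center, with "nearer" layers moving more.
# Illustrative only; not the Dynamic Perspective SDK API.

def parallax_offset(head_x, head_y, head_z, layer_depth):
    """
    head_x, head_y: head position relative to screen center (inches).
    head_z: head distance from the screen (inches).
    layer_depth: 0.0 for the background, 1.0 for the nearest layer.
    Returns the (dx, dy) shift to apply to the layer, in pixels.
    """
    scale = 40.0  # pixels of shift per unit of angular offset (tunable)
    # Angular offset of the head from straight-on viewing:
    ax = head_x / head_z
    ay = head_y / head_z
    # Nearer layers shift more, creating the depth illusion.
    return (scale * layer_depth * ax, scale * layer_depth * ay)

# Head 3 inches left of center, 12 inches from the screen: the
# foreground layer shifts five times as far as a distant layer.
fg = parallax_offset(-3.0, 0.0, 12.0, layer_depth=1.0)
bg = parallax_offset(-3.0, 0.0, 12.0, layer_depth=0.2)
print(fg, bg)
```

The same head-position input that drives this cosmetic depth effect can, as the game examples show, be treated as a control axis in its own right.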
For example, Ezone.com created an “endless runner”-type game (think Temple Run), which you navigate with just your head. One of its games also includes a special flip move that you trigger by flicking your head.
The goal is to introduce a new way of interacting with on-screen content in a way that’s as remarkable as the “pinch-and-zoom” and swiping gestures were when the iPhone first debuted. (Whether Amazon has succeeded here remains to be seen.)
In fact, both technologies are about moving away from tapping, swiping and typing altogether. Dynamic Perspective is about movement and tilting and things you can do with one hand. Firefly, meanwhile, lets you find things without Googling, turning the real world into an Amazon search engine.
These technologies can be combined, too — for example, a music app that lets you tilt to skip tracks, “hear” songs from the radio to build out a playlist and alert you to nearby concerts after listening.
Amazon, of course, will be challenged in a number of ways with its smartphone debut. The Fire Phone is currently limited to AT&T, and it will have to compete with entrenched players Apple and Google in today’s smartphone ecosystem.
But with these two additions, the company has clearly been thinking beyond the specs and the look and feel of its device. It’s been thinking about the biggest selling point for smartphones: the app ecosystem. These SDKs are not just user interface updates; they are new ways of using a phone. Maybe they will turn out to be new ways that nobody wants. Maybe these technologies will be too restrictive to third parties whose apps and companies compete with Amazon. Maybe they will be buggy and weird. Maybe developers won’t even bite. But at least Amazon has shown us something new.