Designing New Mobile Experiences For The Music-Loving Generation

As music fans go mobile, music festivals are following suit. The kids in America are rocking out to T-Swift with a Natty Lite in one hand and a mobile phone open to native festival apps in the other.

Schedules, venue maps, artists and vendors are all in the palm of partygoers' hands. And festivals are even designing other features specifically to engage the throngs of eager event-goers, diehard fans and dispassionate bystanders weeks before the events actually begin.

Imagine for a second if these apps (or a new app, for that matter) took the mobile experience a bit further in engaging people on an ongoing basis.

Consider this context: You and your best friend are at a café when a song comes on in the background that catches both of your attention. Without even saying a word, you look at each other in firm agreement that you like the song.

Normally, one of you would launch two or three different apps to figure out who the artist is, look up biographical information, search for more of their music, and maybe ask your social network who's familiar with the artist or post something about the moment.

What if an app, upon hearing the first few seconds of the song, recognized that you like this type of music, took care of all those search-and-share activities on its own, and then simply prompted you with a personalized call to action?

This isn't rocket science. This is modern-day computing power.

It could prompt you with a notification of the artist's show at a nearby venue in the next few weeks and direct you to purchase tickets. It could also suggest flights, hotels, restaurants and artist merchandise. Notification-driven smartwatches could play a part in this interaction, too.
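As a rough sketch of this listen-recognize-recommend loop, here is what the app's decision logic might look like. Everything here is a hypothetical placeholder: `recognize_song` and `fetch_nearby_shows` stand in for real audio-fingerprinting and ticketing APIs, and the hard-coded return values exist only to make the example runnable.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class UserProfile:
    liked_genres: set = field(default_factory=set)

def recognize_song(audio_sample: bytes) -> dict:
    # Stand-in for an audio-fingerprinting service (a Shazam-like API).
    return {"artist": "Example Artist", "genre": "indie"}

def fetch_nearby_shows(artist: str) -> list:
    # Stand-in for a ticketing/events API lookup.
    return [{"artist": artist, "venue": "Local Hall", "weeks_away": 3}]

def on_song_heard(audio_sample: bytes, profile: UserProfile) -> Optional[str]:
    """Return a personalized call to action, or None if the song isn't relevant."""
    track = recognize_song(audio_sample)
    if track["genre"] not in profile.liked_genres:
        return None  # don't interrupt users with music they don't care about
    shows = fetch_nearby_shows(track["artist"])
    if not shows:
        return None
    show = shows[0]
    return (f"{show['artist']} plays {show['venue']} in "
            f"{show['weeks_away']} weeks. Get tickets?")

profile = UserProfile(liked_genres={"indie", "folk"})
print(on_song_heard(b"...", profile))
```

The key design choice is that the relevance check happens before any prompt is generated, so the app stays silent for music the listener doesn't care about.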

Simply put, it knows you just as well as your best friend does (or maybe even better) and has figured out the most efficient way to communicate with you and direct your next steps.

Humanistic Experience Design

SDKs, APIs, sensors and all of the other technologies needed to construct this type of seamless experience are already out there. StubHub’s recent partnership with Spotify allows for the integration of a person’s music library and upcoming concerts and events. As users build their libraries and StubHub learns their listening behaviors, more tailored recommendations come their way.
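A toy version of that recommendation step might rank upcoming events by how often the listener has played each artist. The data shapes below are invented for illustration; the real integration runs through Spotify's and StubHub's own APIs.

```python
from collections import Counter

def rank_events(listen_history, upcoming_events):
    """Order events by the user's play count for the headlining artist."""
    plays = Counter(listen_history)  # artist -> number of plays (0 if unseen)
    return sorted(upcoming_events, key=lambda e: plays[e["artist"]], reverse=True)

history = ["Alt-J", "Alt-J", "Alt-J", "Hozier", "Hozier", "Beck"]
events = [
    {"artist": "Beck", "city": "Austin"},
    {"artist": "Alt-J", "city": "Denver"},
    {"artist": "Hozier", "city": "Chicago"},
]
print([e["artist"] for e in rank_events(history, events)])
# → ['Alt-J', 'Hozier', 'Beck']
```

As the library grows, the play counts shift and so does the ranking, which is exactly the "more tailored recommendations over time" behavior described above.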

Beyond the emergence of mobile devices as people's primary gateway to the connected world, there's a whole gamut of technologies that make on-the-go digital experiences much more "human."

Connectivity being ubiquitous is no longer news. Anyone building and designing mobile apps, at companies big and small alike, should use such technology to create customer touch points that collect and act on behavioral information.

History lessons aside, this approach to designing mobile experiences is predicated on the humanistic vision of the world in which everything revolves around mankind and how our evolving rationality should shape our experiences.

Through the use of measurable behavioral data, empirical analysis, dynamic mental models and rational decision-making methodologies, well-designed mobile experiences can alleviate problems and improve human experiences.

Tech Bigwigs Paving The Way

Consider where the Facebook Messenger app might be heading. By releasing an SDK for developers and integrating third-party tools, the tech giant is building an ecosystem of rich data sources to further define and predict what kinds of content people share at any given mobile moment.

A Siri-like experience that acts more like a personal assistant, foreseeing which messages and content you need to send in a given situation, is not too remote a possibility.


Facebook's partners may have limited access to people's overall usage outside the tools and content they integrate into the Messenger app, but Facebook itself will likely have full visibility into all of this data.

Facebook could layer this information with everything else it knows about someone's social activities and personal profile in designing a messaging app that does the job of curating, creating and even conceptualizing shareable content all on its own. Talk about cutting through the clutter.

Contextual intelligence powering a stack of user interactions during on-the-fly moments plays a central part in how Google redesigned its Now on Tap service. Android-based mobile phones are now smarter and more efficient at helping users reach their intended tasks in fewer steps.

By piecing together information from apps like Fandango, OpenTable and IMDb, Now on Tap lets users accomplish tasks that would otherwise span several apps. These efforts in machine learning and deep neural networks "humanize" digital experiences.
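The fan-out pattern behind this can be sketched as follows: extract an entity from the visible screen, then query several providers for one combined answer. The entity detector and provider lookups here are invented stand-ins, not the actual Fandango, OpenTable or IMDb APIs.

```python
def detect_entity(screen_text: str) -> dict:
    # Stand-in for on-device entity extraction over the visible screen.
    return {"type": "movie", "title": "The Martian"}

# Each provider would normally be a separate app's API; these lambdas
# return canned strings so the sketch runs on its own.
PROVIDERS = {
    "showtimes": lambda title: f"Showtimes for {title}: 7:00pm, 9:30pm",
    "rating":    lambda title: f"{title} is rated 8.0/10",
    "dinner":    lambda title: "Tables nearby at 6:00pm before the show",
}

def assist(screen_text: str) -> list:
    """One tap returns results a user would otherwise gather app by app."""
    entity = detect_entity(screen_text)
    return [fetch(entity["title"]) for fetch in PROVIDERS.values()]

for card in assist("Have you seen The Martian yet?"):
    print(card)
```

The point of the pattern is the single entry point: the user expresses intent once, and the aggregation across providers happens behind the scenes.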

IBM is also building computing platforms using a similar partnership model. The Watson Health Cloud is pulling data from more than 700 million iPhone users whose mobile devices are collecting data from other personal electronic devices, such as fitness trackers, connected medical devices, implantables and other sensors.

With the news that Facebook is also ramping up its repertoire of AI-driven products, it’s possible that user interactions and experiences across different platforms and devices will be more humanistic.

Content sharing and consumption can become more personalized, intelligently responding to and predicting each person's emotions, thoughts and even physical conditions.

Sounds like an exclusive science club? Not anymore. Earlier in April, Amazon announced that it's making its AI engine, dubbed Amazon Machine Learning, available to software developers who are new to the field.

The predictive models it creates can solve a wide range of problems, from detecting rogue financial transactions to improving customer experiences. The APIs and wizards Amazon is releasing are also scalable.
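Amazon Machine Learning exposes this kind of supervised prediction as a managed service; as an illustrative, hand-rolled stand-in, the snippet below trains a tiny logistic-regression classifier on made-up transaction features (amount and a foreign-merchant flag) to flag rogue transactions. Real workloads would go through the service's own APIs rather than code like this.

```python
import math

def predict(weights, bias, x):
    # Logistic model: probability that the transaction is rogue.
    z = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1.0 / (1.0 + math.exp(-z))

def train(data, labels, lr=0.5, epochs=2000):
    """Stochastic gradient descent on log loss."""
    weights, bias = [0.0] * len(data[0]), 0.0
    for _ in range(epochs):
        for x, y in zip(data, labels):
            err = predict(weights, bias, x) - y  # gradient of log loss
            weights = [w - lr * err * xi for w, xi in zip(weights, x)]
            bias -= lr * err
    return weights, bias

# Made-up features: [amount in $1,000s, 1 if foreign merchant else 0]
data = [[0.05, 0], [0.12, 0], [0.08, 0], [9.5, 1], [7.2, 1], [8.8, 0]]
labels = [0, 0, 0, 1, 1, 1]  # 1 = rogue transaction

weights, bias = train(data, labels)
print(predict(weights, bias, [0.10, 0]))  # small domestic purchase: low score
print(predict(weights, bias, [8.0, 1]))   # large foreign purchase: high score
```

The managed-service pitch is essentially this loop, plus the data plumbing and scaling, behind a wizard and an API call.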

An Equally Intelligent And “Humanized” Mobile World

Humanistic design, in principle, is framed around people's emotional, physical, social and cognitive behaviors in different contexts. As Facebook stretches its arms and reaches into the data pools of its growing partner base, the amount of contextual data it can leverage will only continue to grow.

Gartner outlined many of these technology pieces in a recent report, noting that because connectivity is everywhere, companies can collect real-time, contextual data about their customers. The next step is using a humanistic framework in piecing together all the moving parts.

Returning to the example of music festival apps, humanistic design would incorporate the finer elements of people's experiences: interactions that not only gather information that's already available, but thread it together in a way that adds meaning to the larger experience of enjoying music.

Sounds like a no-brainer? Then why aren’t there more apps out there that are intelligent enough to predict where we’d like to eat, shop, visit and entertain ourselves without having users go through the same information-gathering interactions over and over again?

The challenge here is not a scarcity or lack of technology. Quite the opposite: developers and companies alike should ground these advancements in a deeper belief that technology should work for us humans (for our happiness, productivity and well-being), not the other way around.

Humanizing technology can lead to digital interactions that feel more natural, anticipate and predict our needs, and respond to new information about our complex minds in real time.