The Future Of Algorithmic Personalization

Personalization algorithms influenced what you chose yesterday, influence what you choose today, and will shape what you choose tomorrow.

At the same time, something seems to be wrong with personalization. We continuously bump into obtrusive, uninteresting ads. Our digital personal assistants aren’t all that personal. We’ve lost friends to the algorithmic abyss of the news feed. The content we encounter online seems to repeat the same things again and again.

Personalization’s image of us is like looking at yourself in a funfair’s house of mirrors: it caricaturizes us and creates a striking gap between our real interests and their digital reflection.

Personalization Gaps

There are five main reasons why personalization remains broken.

The data gap means that any algorithmic environment has only a limited amount of data about you. The system can understand you only on its own terms, based on the experience it provides and the feedback loops it uses. Even if the system draws on external data sources, it still has only a partial understanding of your interests and preferences.

The computing gap refers to the limitations of computing power and machine-learning technologies. Even today’s fastest systems become too slow when they try to model the complexity of an individual on that individual’s own terms. At the same time, even the most advanced machine-learning solutions haven’t yet given computers a way to learn from us and adapt to us seamlessly.


The interest gap arises from the conflicting interests of users, platforms and third-party actors (e.g., marketers). It comes down to this: whose interests and preferences are prioritized when deciding what you can see and do? You might not be interested in ads, but they are shown to you nevertheless. When someone else is paying for your attention, your ability to choose for yourself diminishes.

The action gap denotes the mismatch between the actions you would like to take and the ones at your disposal. For example, you might want to press a non-existent “Not Cool” button, or you might never want to see a certain image again, but there’s no way to make that happen. Your actions are simplified to fit the environment’s limited feedback loops.

The content gap denotes that no platform or application always has content that serves your exact intentions or needs, and that the diversity of the served content may be very limited. Be it sports news or restaurant tips, the app or website might run out of relevant content. The more niche the topic, the smaller the chance that you continuously get content that works for you.

Additionally, there lies a more general paradox at the very heart of personalization.

Personalization promises to modify your digital experience based on your personal interests and preferences. Simultaneously, personalization is used to shape you, to influence you and to guide your everyday choices and actions. Inaccessible and incomprehensible algorithms make autonomous decisions on your behalf. They reduce the number of visible choices, thus restricting your personal agency.

Because of these personalization gaps and this internal paradox, personalization remains unfulfilling and incomplete. It leaves us with the feeling that it serves someone else’s interests better than our own.

Toward Human-Centered Algorithmic Personalization

There are three design and development paths that would help personalization serve us better as unique individuals.

Personalization needs a new user interface paradigm and interaction model. To bridge the data gap, the personalization interface should learn efficiently from both direct and indirect actions. Similarly, the computing gap grows smaller as the system learns from what you do and what you don’t do. To solve the interest gap, you should be able to control directly what is visible to you. The interface would let you intuitively view different alternatives, mixing in relevant content from third parties, and you would be able to see why certain things are shown to you. This transparency enhances your ability to shape your preferences, benefiting the platform and third parties as well.
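As one illustration of what learning from direct and indirect actions could look like, here is a minimal Python sketch that blends explicit signals (a like, a hide) with implicit ones (lingering on an item, scrolling past it) into a single preference score. The signal names and weights are hypothetical assumptions for illustration, not any platform’s actual API.

```python
# Hypothetical signal weights: explicit actions (like, hide) speak
# louder than implicit ones (dwell time, scrolling past).
SIGNAL_WEIGHTS = {
    "like": 1.0,        # direct positive action
    "hide": -1.0,       # direct negative action
    "long_dwell": 0.3,  # indirect: the user lingered on the item
    "skip": -0.1,       # indirect: the user scrolled past quickly
}

def preference_score(events):
    """Aggregate a user's observed events for one item into a score."""
    return sum(SIGNAL_WEIGHTS.get(e, 0.0) for e in events)

# Example: one explicit like outweighs two quick skips.
print(round(preference_score(["like", "skip", "skip"]), 2))  # 0.8
```

In a real system the weights would themselves be learned per user, which is one way the interface could keep narrowing the data gap over time.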

To shrink the action gap, an adaptive user interface would enable context-aware interactions, such as custom emojis or gestures, based on your real intentions and reactions. From a content-gap point of view, the system would notify you when something interesting is available and actionable, with a little vibration on your wrist or a smart notification on your device’s screen. The new interface for personalization puts “my-time” above real-time.

Personalization should be able to provide a mix of relevant and surprising, timely and well-aged content. From the data-gap and computing-gap perspective, the system can gain a more granular understanding of your real interests by offering a diverse set of alternative options. This allows you to express your personal interests in more detail, while the system learns to recognize previously unknown or unorthodox connections based on your activity.

Interest gap-wise, remixing relevant information with surprises lets you decide for yourself how to prioritize information. By introducing relevant yet diverse alternatives, the system avoids caging you in a restrictive information silo, and even occasionally irrelevant content does not spoil your experience. Both relevance and serendipity are subjective and contextual: the algorithms should recognize when you are open to exploration and when you’re more goal-oriented, searching for a specific piece of information.
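The balance between relevance and surprise described above can be sketched as a simple exploration-rate mix, in the spirit of epsilon-greedy recommenders. The function, its parameters and the sample topics are illustrative assumptions, not a production algorithm; in a real system the exploration rate would itself adapt to whether you currently seem exploratory or goal-oriented.

```python
import random

def recommend(relevant, diverse, exploration_rate=0.2, k=5, seed=None):
    """Fill k feed slots with top relevant items plus serendipitous picks.

    `relevant` is assumed sorted best-first; `diverse` is a pool of
    out-of-profile items. `exploration_rate` is the fraction of slots
    reserved for surprises.
    """
    rng = random.Random(seed)
    n_explore = round(k * exploration_rate)
    picks = relevant[: k - n_explore]
    picks += rng.sample(diverse, min(n_explore, len(diverse)))
    return picks

feed = recommend(
    relevant=["marathon tips", "trail shoes", "running plans", "race calendar"],
    diverse=["bonsai care", "jazz history", "sourdough baking"],
    seed=42,
)
print(feed)  # four running items plus one surprise from the diverse pool
```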

To diminish the action gap, diverse smart recommendations let you define yourself on your own terms. The system starts to understand your short-term and long-term interests in order to anticipate your information needs. Timeliness does not equal relevance: a huge pool of content ages well without losing its appeal or meaning. The content gap shrinks as the pool of interesting information grows wider and deeper.
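One common way to model the short-term versus long-term interests mentioned above, sketched here with assumed names and numbers, is to decay older interactions exponentially: a short half-life captures transient interests, a long one captures stable interests.

```python
import math

def interest_weight(interactions, now, half_life_days=30.0):
    """Score a topic from timestamped interactions (in days).

    Older signals decay exponentially, so recent activity dominates.
    A short half-life models transient interests; a long half-life
    models stable, long-term ones.
    """
    decay = math.log(2) / half_life_days
    return sum(math.exp(-decay * (now - t)) for t in interactions)

# The same three interactions scored on two time horizons.
clicks = [0.0, 5.0, 60.0]  # days on which the user engaged with a topic
short_term = interest_weight(clicks, now=61.0, half_life_days=7.0)
long_term = interest_weight(clicks, now=61.0, half_life_days=90.0)
```

With a 7-day half-life, only the most recent click still counts for much; with a 90-day half-life, the older clicks keep contributing, so the long-term score comes out higher for this history.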

Personalization should bring together collective intelligence and artificial intelligence. Networks are becoming faster, and computers smarter and more efficient. To decrease the computing gap, the focus should be on enhancing the information flow between humans and machines. Humans are (still) the best pattern-recognition systems in the known universe; we can help each other find and discover meaningful signals. Artificial intelligence should empower this sense-making by powering adaptive interfaces and predictive learning systems.

Human-centered personalization brings together human-curated signals and adaptive machine-learning solutions. In this way, intelligent systems mature by learning from our individual and collective interactions and insights, and human imagination and irrationality can outplay the restrictions of algorithmic determinism.

What about the personalization paradox? There is no objectivity (nor should there be claims of objectivity) in the realm of personalization. We shape the algorithms, and the algorithms shape us, continuously. To serve us better, personalization algorithms should be able to understand our subjective ways of seeing connections and meanings around us.

In the end, personalization as a concept derives from the world of industrial mass production and marketing. Maybe, in order to mark a new era of algorithmically assisted decision-making — and to emphasize the importance of personal agency — we should talk about choice algorithms instead of personalization algorithms.

Who would build such choice algorithms for us?