Unless you’ve used it, chances are you’ve never heard of Adobe’s Experience Manager. Inside large enterprises, though, it’s among the most popular cross-platform content management systems for marketing teams and those who manage large online stores. Today, the company is launching an update to Experience Manager that brings a number of new machine learning-based tools to the service.
As Loni Stark, Adobe’s senior director of strategy and product for Experience Manager, told me, the overarching theme here is that the company wants to use artificial intelligence and machine learning to help brands better connect with consumers, using the data they have about their customers to personalize the user experience and make it more relevant.
In this context, the most interesting new feature is probably Smart Layout. This feature allows brands to segment their customers into a number of personas and create layouts that work best for them. To do this by hand would be quite a hassle, though, so Experience Manager can automatically generate the most effective layouts and pick the best assets for these groups. The brands simply have to decide what they want to optimize for (maybe social shares) and the system will then automatically A/B test and optimize the experience.
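Adobe hasn’t said how this optimization works under the hood, but the “pick a metric and let the system test” idea can be sketched as a simple epsilon-greedy A/B test. Everything here is hypothetical — the variant names, counters, and the choice of epsilon-greedy are illustrative, not Adobe’s actual method:

```python
import random

def epsilon_greedy_choice(stats, epsilon=0.1):
    """Pick a layout variant: explore a random one with probability
    epsilon, otherwise exploit the best observed share rate."""
    if random.random() < epsilon:
        return random.choice(list(stats))
    # exploit: the variant with the highest shares-per-view so far
    return max(stats, key=lambda v: stats[v]["shares"] / max(stats[v]["views"], 1))

# hypothetical per-variant counters (optimizing for social shares)
stats = {
    "layout_a": {"views": 1000, "shares": 40},
    "layout_b": {"views": 1000, "shares": 55},
}
```

Each time a visitor arrives, the system serves the chosen variant and updates that variant’s counters, so traffic gradually shifts toward whichever layout drives the most shares.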
Similarly, another new feature now allows marketers to take a fragment of a page, say the page header, and then personalize that for specific audiences.
These audiences will interact with this content on various devices, but not every image works well across desktop and mobile. With this update, Experience Manager can now suggest the best crops for images by automatically detecting their focal points. It can also adjust the image based on the available bandwidth and device type, so you don’t send a high-res image to a low-res screen.
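Once a focal point has been detected, producing the crop itself is simple geometry: center the target window on the focal point and clamp it so it stays inside the image. This is a minimal sketch of that step, not Adobe’s implementation; the focal-point detection itself (the machine learning part) is assumed to have already happened:

```python
def smart_crop(img_w, img_h, focal_x, focal_y, crop_w, crop_h):
    """Return a (left, top, right, bottom) crop window of size
    crop_w x crop_h centered on the focal point, clamped to the image."""
    left = min(max(focal_x - crop_w // 2, 0), img_w - crop_w)
    top = min(max(focal_y - crop_h // 2, 0), img_h - crop_h)
    return (left, top, left + crop_w, top + crop_h)

# focal point near the top-right corner: the window slides inward
# so it never extends past the image edge
box = smart_crop(4000, 3000, 3600, 500, 1200, 1200)
```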
Unless you regularly have to tag images, you don’t know what drudgery it is, but there’s no way around it if you want to find images in a large catalog. Most people do a terrible job at this, though (and the TechCrunch media library is a prime example of this). Since image recognition is one of the problems machine learning handles best, Experience Manager can now automatically tag images. The system learns as you add new tags, so it can go beyond generic terms and learn the names of your products, for example.
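The “learns as you add new tags” behavior can be illustrated with a toy nearest-centroid classifier over image embeddings: each manually applied tag contributes an example, and new images get the tag whose average embedding is closest. This is purely a sketch of the general technique — the three-dimensional embeddings and the class itself are made up for illustration, and Adobe’s actual model is certainly more sophisticated:

```python
from collections import defaultdict

class TagLearner:
    """Toy nearest-centroid tagger over fixed-size image embeddings."""

    def __init__(self, dim=3):
        self.sums = defaultdict(lambda: [0.0] * dim)
        self.counts = defaultdict(int)

    def add_example(self, tag, embedding):
        """Record a manually tagged image, updating that tag's centroid."""
        for i, x in enumerate(embedding):
            self.sums[tag][i] += x
        self.counts[tag] += 1

    def suggest(self, embedding):
        """Suggest the known tag whose centroid is nearest the embedding."""
        def dist(tag):
            centroid = [s / self.counts[tag] for s in self.sums[tag]]
            return sum((a - b) ** 2 for a, b in zip(centroid, embedding))
        return min(self.sums, key=dist)
```

Adding a custom tag like a product name (say, a specific sneaker line) is just another call to `add_example`, which is how the system can grow past generic labels.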
Also new in Experience Manager are forms conversion, which identifies and digitizes input fields on traditional PDF forms; better integration with Creative Cloud and Adobe Stock for pulling in assets; and support for Dimension CC 3D images.