Next IT has been building natural language processing and artificial intelligence-based virtual assistant systems for enterprise customers for years. Its tools are currently used by major airlines and insurance companies, whose voice- and text-driven virtual assistants answer customers’ questions. Today, after almost 12 years of working on these problems, it’s expanding into healthcare.
Alme for Healthcare is a virtual assistant for patients that focuses on disease management and can respond to common patient and customer requests using both text and voice prompts. Using this tool, healthcare providers will be able to create personalized health assistants for their patients that will help ensure these patients stick to a treatment plan and regularly report their status. The tool is currently targeted at pharmaceutical companies, healthcare providers, government organizations and accountable care organizations. Next IT hopes these organizations will then make its tools available to consumers.
As Next IT founder and CEO Fred Brown told me earlier this month, the team decided that it had to go back to the drawing board to bring its services into the healthcare world. Given the vagaries of speech recognition and the potential dangers of giving patients wrong advice, the team added a number of options for users to talk to a real person when necessary. If the technology notices that somebody could be in danger of a relapse or another serious medical issue, it can also direct the user to talk to an expert directly.
Unlike other systems the company has previously built, Alme for Healthcare also offers the institutions that use it more personalization options. Doctors can, for example, create disease-management plans for individual users. These, Brown noted, tend to be well-structured and lend themselves to the kind of virtual assistant systems the company offers because they work within well-defined parameters.
It’s worth noting that the platform itself consists of three pieces: a comprehensive patient ontology that forms the foundation of the system, support for goal-based conversations that help patients stick to their treatment plans, and interactive concept illustrations that can show a patient where to do an at-home injection, for example.
Brown showed me a demo, for example, that quizzed a user about how well he was managing his diabetes. What’s impressive about Next IT’s systems is how well its natural language processing algorithms – which were all developed in-house – recognize even casual voice responses (though users can also type responses, and on phones and tablets the system can show multiple-choice prompts).
As Brown also noted, because the system can regularly prompt patients to self-report how they feel or to enter certain diagnostic information, doctors will have far more information about them when patients do eventually come into the office in person. Traditionally, a doctor focusing on MS, for example, will ask virtually the same questions about a patient’s status every time he or she comes in. Now, doctors will already have this information and can use their time with the patient more efficiently.
Given the privacy implications of gathering this data, the Next IT team tells me that it went to great lengths to ensure that all the data that passes through its systems remains safe. Its customers can also install the system on their own servers and networks, which are most likely already compliant with HIPAA and other regulations.
The system is now out of beta, and the company has already lined up a number of large partners. Sadly, though, Brown wasn’t ready to announce any of them yet.