How to choose and deploy industry-specific AI models

As artificial intelligence becomes more advanced, previously cutting-edge but generic AI models, such as Google Cloud’s Vision AI or Amazon Rekognition, are becoming commonplace.

While effective in some use cases, these solutions do not suit industry-specific needs right out of the box. Organizations that seek the most accurate results from their AI projects will simply have to turn to industry-specific models.

There are a few ways that companies can generate industry-specific results. One would be to adopt a hybrid approach — taking an open-source generic AI model and training it further to align with the business’ specific needs. Companies could also look to third-party vendors, such as IBM or C3, and access a complete solution right off the shelf. Or — if they really needed to — data science teams could build their own models in-house, from scratch.

Let’s dive into each of these approaches and how businesses can decide which one works for their distinct circumstances.

Generic models alone often don’t cut it

Generic AI models like Vision AI or Rekognition and open-source ones from TensorFlow or Scikit-learn often fail to produce sufficient results when it comes to niche use cases in industries like finance or the energy sector. Many businesses have unique needs, and models that don’t have the contextual data of a certain industry will not be able to provide relevant results.

Building on top of open-source models

At ThirdEye Data, we recently worked with a utility company to tag and detect defects in electric poles by using AI to analyze thousands of images. We started off using the Google Vision API and found that it could not produce our desired results: the precision and recall values of the AI models were completely unusable. The models failed to read the characters within the tags on the electric poles 90% of the time because they could not handle the nonstandard font and varying background colors used in the tags.

So, we took base computer vision models from TensorFlow and optimized them to the utility company’s precise needs. After two months of developing AI models to detect and decipher tags on the electric poles, and another two months of training those models, we are seeing accuracy levels of over 90%, and the results will continue to improve with retraining iterations.
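A minimal sketch of this fine-tuning pattern, assuming a Keras workflow: freeze a generic pretrained backbone and train only a new domain-specific classification head. The MobileNetV2 backbone, input size and class count below are illustrative stand-ins, not ThirdEye’s actual models.

```python
from tensorflow import keras

NUM_TAG_CLASSES = 36  # hypothetical: one class per tag character (A-Z, 0-9)

# A generic backbone stands in for the "base computer vision model".
# weights=None keeps this sketch offline; in practice you would start from
# pretrained weights (e.g. weights="imagenet") before fine-tuning.
base = keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights=None
)
base.trainable = False  # freeze the generic features; train only the new head

model = keras.Sequential([
    base,
    keras.layers.GlobalAveragePooling2D(),
    keras.layers.Dropout(0.2),
    keras.layers.Dense(NUM_TAG_CLASSES, activation="softmax"),  # domain head
])
model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)

# model.fit(tag_images, tag_labels, epochs=...)  # train on labeled tag crops
```

Once the new head converges, a common second pass is to unfreeze some of the backbone’s top layers and continue training at a low learning rate, which is where further retraining iterations tend to recover the remaining accuracy.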

Any team looking to expand its AI capabilities should first apply its data and use cases to a generic model and assess the results. Open-source algorithms to start from can be found in AI and ML frameworks like TensorFlow, Scikit-learn or Microsoft Cognitive Toolkit. At ThirdEye Data, we used convolutional neural network (CNN) algorithms on TensorFlow.

Then, if the results are insufficient, the team can extend the algorithm by training it further on their own industry-specific data.

Let’s take an example: A small to medium-sized tax services company based in Los Angeles is leveraging AI models to power its customer service chatbot. A generic chatbot would be limited in its ability to answer domain-specific questions. However, once the company trains it on up-to-date, industry-specific data, such as federal and state-level information on IRS processes and forms, the model will be able to accurately answer customers’ specific questions about filing taxes in 2021 and all the caveats that brings.
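A toy illustration of that training step, using a scikit-learn text classifier rather than a full chatbot stack: map domain-specific questions to intents the bot can answer. The questions, intent labels and pipeline below are invented for the sketch; a real deployment would train on thousands of examples drawn from IRS forms, state guidance and past support tickets.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical toy training set: tax questions labeled by intent.
questions = [
    "When is the 2021 federal filing deadline?",
    "What date are taxes due this year?",
    "How do I report freelance income on Form 1040?",
    "Which form do I use for self-employment income?",
    "What is the standard deduction for single filers?",
    "How much is the standard deduction in 2021?",
]
intents = [
    "deadline", "deadline",
    "forms", "forms",
    "deduction", "deduction",
]

# TF-IDF features plus a linear classifier: a common baseline for intent routing.
clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(questions, intents)

pred = clf.predict(["What's the deadline to file my taxes?"])[0]
print(pred)
```

The point of the sketch is the data, not the algorithm: the same generic pipeline only becomes useful once it is fitted on the company’s own domain-specific questions.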

Deploying off-the-shelf industry-specific AI solutions

As the need for more tailored AI solutions grows, so do the market opportunities to leverage models that have been trained on industry-specific data. Dozens of these offerings now exist, with many of them coming from Big Tech and AI companies, and we should expect to see that number explode over the next five years or so.

For example, IBM offers a full slate of industry-specific AI models, including Watson solutions for customer service, supply chain, financial operations and security — to name but a few. IBM also recently collaborated with Atos to build industry-specific automation solutions that use AI and hybrid cloud technologies.

C3 AI, the enterprise AI company, recently released a new suite of industry-specific AI solutions, including ones for energy management, anti-money laundering and production schedule optimization. The organization has an impressive 4.8 million AI models in use, which, according to the company website, provide “powerful predictions across the most complex organizations.”

H2O’s solutions range from financial services and insurance to manufacturing and retail. In the retail sphere, the company’s AI models have been able to help leading brands like Macy’s, Walgreens and eBay forecast product demand, boost customer experience through personalized encounters and drive advanced inventory planning.

While developers don’t get full access to the model itself, the investment in these types of solutions gives them access to many other valuable services that they can use while implementing the AI models. For example, they might need assistance with setting up secure data pipelines — a necessary feature given how many data sources a team may be handling at a given time. These services might also include help with preparing the data, data privacy and security assurance, or version management of the AI models.

So, how do companies know which is the best approach for them?

Choosing the right approach

When selecting the best way forward to deploy industry-specific AI solutions, it’s important to keep a few things in mind.

Companies deciding whether to build on top of an open-source model should note that it’s easier to expand on the existing capabilities of AI models than build them from scratch. With the original model available as open-source, the end product belongs to the data science team to continuously tweak and develop.

However, they should remember that it’s necessary to build on and train the model in an environment that lets them do so. Azure ML, for example, is a developer environment that lets you import an open-source algorithm and build a model on top of it.

Still, taking the open-source approach can come with some setbacks. If models are running across different clouds, they can be time- and resource-consuming to manage. And because it’s not a paid service, there is no guarantee of the model’s performance and no service-level agreement; the data science team managing and training the model must stay on top of maintenance.

If a business is seeking guaranteed performance and a full stack of accompanying services, they might consider leveraging an off-the-shelf tool.

Investing in these pre-trained AI models is a good idea for companies looking to deploy AI models for an extended period of time that require scalability and reliability. These are often enterprises that have their AI needs largely figured out and require the guarantee of service-level agreements.

However, with the AI models ultimately residing in the vendor’s environment, developers working with the model won’t be able to take those AI projects elsewhere. That’s why opting for an out-of-the-box solution should be a long-term commitment for companies prepared to invest in an advanced solution at the expense of ownership over the model.

For a select few organizations, building from scratch might actually be the best option — though given the amount of time and resources necessary to build an industry-specific solution from zero, it’s only advisable to do this when the other options don’t suffice.

For example, extremely niche industries doing truly novel work, such as drug manufacturing or medical research, will have to build and train their own algorithms from the ground up. Take researchers using AI to identify patterns in the genetic code for a certain disease; a use case like this is unlikely to have pre-built models that can serve as a base for further training.

Whatever path a company takes will depend on the nature of the AI project. Companies should carefully consider their technology and maintenance needs, how much budget and time they are willing to spend, and from there decide which approach to adopt.

One thing is for certain, though: Industry-specific AI models are only going to boom in popularity over the next few years, and businesses from across sectors will realize their power in delivering accurate and powerful insights.