Generative AI’s future in enterprise could be smaller, more focused language models



The amazing abilities of OpenAI’s ChatGPT wouldn’t be possible without large language models, which are trained on billions, sometimes trillions, of examples of text. The idea is that the model understands language so well it can plausibly predict the next word in a split second. That takes a ton of training data, compute resources and developer savvy to make happen.

But maybe the future of these models is more focused than the boil-the-ocean approach we’ve seen from OpenAI and others, who want to be able to answer every question under the sun. What if each industry, or even each company, had its own model trained to understand its particular jargon, language and approach? Perhaps then we would get fewer completely made-up answers, because those answers would come from a more limited universe of words and phrases.

In the AI-driven future, each company’s own data could be its most valuable asset. An insurance company has a completely different lexicon than a hospital, an automotive company or a law firm, and when you combine that lexicon with your customer data and the full body of content across the organization, you have the makings of a language model. It wouldn’t be large in the truly large language model sense, but it would be just the model you need: a model created for one and not for the masses.

This will also require a set of tools to collect, aggregate and constantly update the corporate dataset in a way that makes it ingestible for these smaller large language models (sLLMs).

Building these models could pose a challenge. Builders will probably start with an existing LLM, whether open source or licensed from a private company, and then fine-tune it on industry or company data to bring it into focus, all in a more secure environment than the generic LLM variety offers.

This represents a huge opportunity for the startup community, and we are seeing lots of companies with a head start on this idea.

May Habib, co-founder and CEO at Writer, a generative AI startup, says that is exactly what her firm is trying to do: customize the model for each customer, their words and way of working. She says her company is going to market “in a hyperverticalized way,” and this should result in more accurate and tailored content.

“We are essentially building that last mile of allowing them to use LLMs that are informed by their data and things they’ve written before. [It’s] their information and everything that we put in our models at the retrieval layer,” Habib recently told TechCrunch+.

She says that this involves a kind of product underneath the base Writer product that basically turns the firehose of a large language model into something more focused and useful for each individual customer. “The way that we talk to customers about it is that it’s like having small language models on top of large language models,” she said.
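The “small language models on top of large language models” framing Habib describes resembles retrieval augmentation: index the company’s own documents, pull the most relevant passages at query time, and prepend them to the prompt sent to a base LLM. Here is a minimal sketch of that pattern using a toy bag-of-words retriever; a production system would use embeddings and a vector store, and all names here are illustrative rather than Writer’s actual implementation:

```python
import math
import re
from collections import Counter

def bow(text: str) -> Counter:
    """Lower-case bag-of-words vector for a piece of text."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Return the k company documents most similar to the query."""
    qv = bow(query)
    return sorted(docs, key=lambda d: cosine(qv, bow(d)), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Prepend retrieved company context to the user's question
    before handing the combined prompt to a base LLM."""
    context = "\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

# Toy corporate corpus; a real one would hold the firm's full body of content.
docs = [
    "Our deductible for collision claims is 500 dollars.",
    "Quarterly revenue grew 12 percent year over year.",
]
prompt = build_prompt("What is the deductible for collision claims?", docs)
```

The base model never has to be retrained here; the company-specific knowledge lives in the retrieval layer, which is what keeps the approach cheap to update as the corporate dataset changes.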

Hello, Dolly

Databricks, mostly known as a hot, highly valued startup building a cloud data lakehouse, recently released an sLLM it calls Dolly, named after the first cloned sheep (not the musical) and built on a 2-year-old model. Why build on top of an older model, one that on its own produces mostly garbage, according to company CEO Ali Ghodsi?

Because Databricks is training that model on smaller, more focused corpuses of data, producing answers the company claims are more accurate and focused. “The model underlying Dolly only has 6 billion parameters, compared to 175 billion in GPT-3, and is 2 years old, making it particularly surprising that it works so well. This suggests that much of the qualitative gains in state-of-the-art models like ChatGPT may owe to focused corpuses of instruction-following training data, rather than larger or better-tuned base models,” the company wrote in a blog post announcing the availability of Dolly.

The beauty of this approach, the company claims, is that it trained Dolly in three hours on a single machine and it cost just $30, compared to the hundreds of thousands to millions of dollars it likely cost to train ChatGPT.

Your cost will probably vary depending on the size of your dataset, but the idea is to feed Dolly your data, then put it to work understanding your particular company’s information and answering questions in a ChatGPT fashion, all while keeping that data private.
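The “instruction-following training data” Databricks credits for Dolly’s performance is just a collection of instruction/response pairs rendered into a consistent prompt template. A rough sketch of preparing such records from company data might look like the following; the template layout and field names are illustrative assumptions, not Databricks’ actual schema:

```python
def format_record(instruction: str, context: str, response: str) -> str:
    """Render one training example as a single prompt/response string,
    in the Alpaca-style instruction layout Dolly's training data used."""
    parts = [
        "Below is an instruction that describes a task.",
        f"### Instruction:\n{instruction}",
    ]
    if context:  # some examples carry supporting input, some don't
        parts.append(f"### Input:\n{context}")
    parts.append(f"### Response:\n{response}")
    return "\n\n".join(parts)

# Hypothetical internal records pulled from a company knowledge base.
records = [
    {
        "instruction": "Summarize the claims policy for a new customer.",
        "context": "Collision claims carry a 500 dollar deductible.",
        "response": "New customers pay a 500 dollar deductible on collision claims.",
    },
]
training_texts = [format_record(**r) for r in records]
```

Each rendered string would then be fed to a standard fine-tuning loop over the base model; the point of the article’s argument is that a few thousand focused examples like these can matter more than billions of extra parameters.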

“Every company on the planet has a corpus of information related to their [organization]. Maybe it’s [customer] interactions, customer service; maybe it’s documents; maybe it’s the material that they published over the years. And ChatGPT does not have all of that and can’t do all of that.”

“With Dolly you can actually train the model to understand and be specialized on your dataset, and you keep it. You don’t need to give it to the rest of the world. It’s your proprietary information that you can use in your competition with other folks in your industry,” Ghodsi said.

That’s important as we think about putting this data to use. It’s the same point Habib makes about her customers: They not only want the wow factor we get from ChatGPT, they also want practical, secure applications of AI against their own data.

Where could we go from here?

As it becomes more about the data and less about the model, and as startups and established companies continue to build the tooling, the hard part will be collecting the information, making it available in a format the model can use and keeping it constantly updated.

Jeetu Patel, executive vice president and general manager of security and collaboration at Cisco, believes the future is not necessarily sLLMs, but it definitely involves feeding your company’s data into some sort of existing LLM.

“To be clear, every company will have some sort of a custom dataset based on which they will do inference that actually gives them a unique edge that no one else can replicate. But that does not require every company to build a large language model. What it requires is [for companies to take advantage of] a language model that already exists,” he said.

He sees a future in which companies use more specific models than ChatGPT and feed them their own data, not unlike what Databricks is trying to do with Dolly.

“Where I think there’ll be a difference is that there are going to be some AI models that are going to be generic, like what you see with ChatGPT, and then there will be some which are just company specific,” he said.

Using his own company as an example, Patel suggests that in the future, you could interact with Cisco applications like WebEx and get a summary of all your meetings from that day simply by asking it. As a security executive, he is keenly aware that such an approach would have to have careful permissions built in, but it provides a possible scenario where this type of application could be put to work on top of a specific company’s products and services in a very practical way.
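The permissions concern Patel raises can be sketched simply: before any summary request reaches the model, the application checks that the requester actually attended each meeting, so transcripts never leak across permission boundaries. Everything below (names, structure, the placeholder for the model call) is illustrative, not Cisco’s design:

```python
from dataclasses import dataclass

@dataclass
class Meeting:
    title: str
    participants: set[str]
    transcript: str

def meetings_visible_to(user: str, meetings: list[Meeting]) -> list[Meeting]:
    """Permission gate: only meetings the user attended may be
    summarized on their behalf."""
    return [m for m in meetings if user in m.participants]

def summarize_day(user: str, meetings: list[Meeting]) -> str:
    visible = meetings_visible_to(user, meetings)
    if not visible:
        return "No meetings to summarize."
    # Placeholder for the model call; a real system would send each
    # visible transcript to an LLM and return its summaries here.
    return "; ".join(m.title for m in visible)

meetings = [
    Meeting("Roadmap sync", {"alice", "bob"}, "..."),
    Meeting("Board prep", {"carol"}, "..."),
]
```

The key design choice is that filtering happens before the model sees anything, so the AI layer never has to be trusted to enforce access control itself.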

All of this is moving so fast, it’s hard to make any clear predictions about where this technology will go tomorrow or next week. But there is some thinking that in order to work in the enterprise, the models will have to be flexible enough to deal with proprietary company data for model training, and if that’s the case, the future could involve smaller and more focused models.
