PitchBook’s new tool uses AI to predict which startups will successfully exit

Can an algorithm predict whether a startup will successfully exit? PitchBook believes so.

The venture capital and private equity database today launched VC Exit Predictor, a tool trained on PitchBook data that attempts to suss out a startup’s growth prospects. Given the name of a VC-backed company, VC Exit Predictor generates scores for the probability that it’ll be acquired, go public or not exit at all, whether because it becomes self-sustaining or because an event (e.g., bankruptcy) prevents an exit.

“The VC Exit Predictor was developed using a proprietary machine learning algorithm developed by PitchBook’s quantitative research team, trained exclusively on data available within the PitchBook platform including deal activity, active investors and company details,” McKinley McGinn, product manager of market intelligence at PitchBook, told TechCrunch in an email interview. “To ensure accuracy, predictions are made for venture-backed companies that have received at least two rounds of venture financing deals.”
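PitchBook hasn’t published the model itself, but the description above maps onto a standard multiclass classification setup. Below is a minimal sketch, assuming hypothetical feature names, the two-round filter and a gradient-boosted classifier; none of these choices are confirmed to match PitchBook’s actual implementation.

```python
# Hypothetical sketch of a three-outcome "exit" model in the spirit of PitchBook's
# description. Feature names, data layout and model choice are illustrative
# assumptions, not PitchBook's implementation.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

# Assumed feature columns derived from deal activity, active investors and company details.
FEATURES = ["num_vc_rounds", "total_raised_usd", "num_active_investors",
            "years_since_founding", "last_round_size_usd"]

def prepare(companies: pd.DataFrame) -> pd.DataFrame:
    """Keep only VC-backed companies with at least two rounds of venture financing."""
    return companies[companies["num_vc_rounds"] >= 2].copy()

def train(history: pd.DataFrame) -> GradientBoostingClassifier:
    """Fit a multiclass model where `outcome` is one of: 'acquired', 'ipo', 'no_exit'."""
    data = prepare(history)
    model = GradientBoostingClassifier()
    model.fit(data[FEATURES], data["outcome"])
    return model

def exit_scores(model: GradientBoostingClassifier, company: pd.DataFrame) -> dict:
    """Return one probability per outcome, e.g. {'acquired': 0.41, 'ipo': 0.22, 'no_exit': 0.37}."""
    probs = model.predict_proba(company[FEATURES])[0]
    return dict(zip(model.classes_, probs))
```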

PitchBook certainly isn’t the first to develop an algorithmic tool to inform investment decisions. For years, investors have been clamoring for an AI-driven competitive advantage; Gartner predicts that by 2025, more than 75% of venture capital and early-stage investor executive reviews will be informed by AI and data analytics.

VC firms, including SignalFire, EQT Ventures and Nauta Capital, already use AI-powered platforms to flag promising startups. In 2021, a team of researchers used public CrunchBase data to build a tool quite similar to VC Exit Predictor, one that could predict whether a startup would exit successfully through an IPO or acquisition, fail or remain private.

But do these tools actually work?

McGinn says that PitchBook back-tested VC Exit Predictor on a historical set of companies with known exits, which included firms such as Blockchain.com, Revolut and Bitso. Averaged across the set, the tool was 74% accurate in predicting a successful exit, McGinn claims.
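PitchBook hasn’t detailed the back-test beyond that figure, but the underlying check is simple: score companies whose outcomes are already known and compare predictions against reality. A minimal sketch, assuming a `resolved` DataFrame with an `outcome` column (both names hypothetical):

```python
import pandas as pd
from sklearn.metrics import accuracy_score

def backtest(model, resolved: pd.DataFrame, feature_cols: list[str]) -> float:
    """Accuracy on companies whose exit outcome (acquired / IPO / no exit) is already known.

    `resolved` and its `outcome` column are assumed names, not PitchBook's schema.
    """
    predicted = model.predict(resolved[feature_cols])
    return accuracy_score(resolved["outcome"], predicted)
```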

“The VC Exit Predictor can be leveraged by venture capitalists looking for a data-driven approach for their initial evaluation of a venture-backed company,” he added. “However, we anticipate a long tail of use cases for industry players searching for upcoming IPO candidates, monitoring competitors in the market or seeking validation for an investment in their next round.”

VC Exit Predictor might perform well on PitchBook’s test set. But the question is whether it’s resilient to black swan events like a pandemic, global conflicts (such as the war in Ukraine) and natural disasters that can’t be anticipated. Algorithms have historically struggled with such events, which are largely absent from the data they’re trained on.

PitchBook’s new tool attempts to predict which startups will be successful, drawing on historical data. Image Credits: PitchBook

A VentureBeat piece (written by yours truly) details how a company in the frozen foods industry, for instance, struggled to use an algorithm to predict where sales would ultimately settle during the COVID-19 pandemic. In the first three to four months of the health crisis, when most regions had dining restrictions in place, frozen food sales went up significantly as customers chose to eat at home. But as some countries later loosened their quarantine rules quickly while others reopened more slowly, demand shifted in ways that made the company’s algorithm less reliable.

McGinn admits that VC Exit Predictor suffers from similar flaws — for example, holding a favorable outlook on crypto companies despite the industry-wide decline. “There are limitations at the market-level predictions that the algorithm can make,” he said. “Since it’s reliant on timely updates in a slower moving market space, it takes time for the model to adjust to rising or failing segments.”

There’s also the bias problem: Inevitably, algorithms amplify the biases in the data on which they’re trained.

In an experiment in November 2020, Harvard Business Review (HBR) built an investment recommendation algorithm and compared its performance with the returns of angel investors. According to HBR, the algorithm tended to pick white entrepreneurs rather than entrepreneurs of color and preferred investing in startups with male founders, likely because women and founders from other underrepresented groups tend to be disadvantaged in the funding process and ultimately raise less venture capital.

Experts found similar issues with CB Insights’ Mosaic tool, which scores early-stage founders and management teams to support investment, purchasing and M&A decisions. Tech Brew reported that four out of the six disclosed “signals” that CB Insights uses to inform a person’s likelihood of success are proxies for race, socioeconomic status, gender and disability. That’s significant, given that just 8% of MBA graduates are Black; early-stage hires at tech giants tend to skew white, Asian and male; and fewer than 2% of enterprise software startups in the U.S. have a female founder.

McGinn makes the bold assertion that VC Exit Predictor is “blind to the race, gender and education of founders,” but acknowledges that PitchBook itself found a slight difference, about 1%, in the success predictions it produces for companies led by male versus female CEOs.
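That kind of gap is what a basic group-level audit of a model’s outputs surfaces. A minimal sketch, assuming a DataFrame of scored companies with hypothetical `ceo_gender` and `pred_success` columns:

```python
import pandas as pd

def success_score_gap(scored: pd.DataFrame) -> float:
    """Gap in mean predicted success probability between CEO-gender groups.

    Column names (`ceo_gender`, `pred_success`) are illustrative assumptions.
    """
    group_means = scored.groupby("ceo_gender")["pred_success"].mean()
    return float(group_means.max() - group_means.min())
```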

“While no tool or person can predict the company exits with complete accuracy, the VC Exit Predictor’s ability to process large amounts of data and identify patterns can give investors an edge in making informed investment decisions,” he said. “We plan to continue building on this tool to improve the accuracy of predictions and add new functionality to deliver even more insights.”

The takeaway is that no predictive tool is perfect, and to his credit, McGinn doesn’t deny this. We only hope that investors don’t rely on VC Exit Predictor exclusively to make their financial decisions, particularly in the absence of a third-party audit of the algorithm.