Data privacy regulations like GDPR, the CCPA and HIPAA present a challenge to training AI systems on sensitive data, like financial transactions, patient health records and user device logs. Historical data is what “teaches” AI systems to identify patterns and make predictions, but there are technical hurdles to using it without compromising a person’s identity.
One workaround that’s gained currency in recent years is federated learning. The technique trains a system across multiple devices or servers holding data without ever exchanging it, enabling collaborators to build a common system without sharing data. Intel recently partnered with Penn Medicine to develop a brain tumor–classifying system using federated learning, while a group of major pharma companies, including Novartis and Merck, built a federated learning platform to accelerate drug discovery.
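The core mechanism here is federated averaging: each participant trains on its own data and sends only updated model weights to a coordinating server, which averages them into a shared model. The toy sketch below illustrates that idea with a one-parameter linear model; the function names, learning rate, and simulated clients are illustrative assumptions, not any vendor's actual implementation.

```python
# Minimal sketch of federated averaging (FedAvg), the idea underlying
# federated learning. The toy 1-D linear model and constants are
# illustrative assumptions only.
import random

def local_train(weights, data, lr=0.01, epochs=5):
    """Each client fits y = w*x on its own local data via SGD."""
    w = weights
    for _ in range(epochs):
        for x, y in data:
            grad = 2 * (w * x - y) * x  # gradient of squared error
            w -= lr * grad
    return w

def federated_round(global_w, client_datasets):
    """Clients train locally; only updated weights leave each device."""
    client_weights = [local_train(global_w, d) for d in client_datasets]
    return sum(client_weights) / len(client_weights)  # server averages

# Three simulated clients whose raw data is never pooled centrally.
random.seed(0)
clients = [[(x, 3.0 * x + random.gauss(0, 0.1)) for x in range(1, 6)]
           for _ in range(3)]
w = 0.0
for _ in range(20):
    w = federated_round(w, clients)
print(round(w, 2))  # converges near the true slope of 3.0
```

The raw `(x, y)` pairs stay on each simulated client throughout; only the scalar weight crosses the client/server boundary each round.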
Tech giants, including Nvidia (via Clara), offer federated learning as a service. But a new startup, DynamoFL, hopes to take on the incumbents with a federated learning platform that focuses on performance, ostensibly without sacrificing privacy.
“DynamoFL was founded by two MIT Department of Electrical Engineering and Computer Science PhDs, Christian Lau and myself, who spent the last five years working on privacy-preserving machine learning and hardware for machine learning,” CEO Vaikkunth Mugunthan told TechCrunch in an email interview. “We discovered an enormous market for federated learning after we received repeated work offers from leading finance and technology companies that were trying to build out federated learning internally in light of emerging privacy regulations like GDPR and CCPA. During this process, it was clear that these organizations were struggling to stand up federated learning internally and we built DynamoFL to address this gap in the market.”
DynamoFL — which claims to have key customers in the automotive, internet of things, and finance sectors — is in the early stages of its go-to-market strategy. (The startup has four employees currently, with plans to hire 10 by the end of the year.) But DynamoFL has focused on refining novel AI techniques to stand out against the competition, offering capabilities that putatively boost system performance while combating attacks and vulnerabilities in federated learning — like membership inference attacks, which make it possible to determine whether a given record was used to train a system.
“Our personalized federated learning technology … enable[s] machine learning teams to fine-tune their models to improve performance on individual cohorts. This gives C-suite executives higher confidence when deploying machine learning models that were previously considered black-box solutions,” Mugunthan said. “This [also] differentiates us from competitors like Devron, Rhino Health, Owkin, NimbleEdge and FedML that struggle with the common challenges of traditional federated learning.”
DynamoFL also advertises its platform as cost-efficient compared with other privacy-preserving AI point solutions. Since federated learning doesn’t necessitate the mass collection of data on a central server, DynamoFL can cut data transfer and computation costs, Mugunthan asserts — for example, allowing a customer to send only small, incremental model updates rather than petabytes of raw data. As an added benefit, this can reduce the risk of data leaks by eliminating the need to store large volumes of data on a single server.
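The transfer-cost argument is easy to see with back-of-the-envelope arithmetic: a model update scales with parameter count, not with the size of the raw dataset sitting on each client. All the figures below are illustrative assumptions, not DynamoFL's numbers.

```python
# Rough comparison of bytes moved: centralizing raw data vs. exchanging
# model updates. Every figure here is an illustrative assumption.
num_clients = 1_000
raw_data_per_client_gb = 50          # e.g. device logs held locally
model_params = 10_000_000            # a 10M-parameter model
bytes_per_param = 4                  # float32 weights
rounds = 100                         # training rounds

centralized_transfer_gb = num_clients * raw_data_per_client_gb
update_gb = model_params * bytes_per_param / 1e9      # one model update
federated_transfer_gb = num_clients * rounds * update_gb

print(centralized_transfer_gb)       # 50000 GB of raw data, once
print(round(federated_transfer_gb))  # 4000 GB of updates over 100 rounds
```

Under these assumptions the federated setup moves roughly an order of magnitude less data, and techniques like update compression or sending only weight deltas shrink the gap further.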
“Common privacy-enhancing technologies like differential privacy and federated learning have suffered from a perennial ‘privacy versus performance’ tradeoff, where using more robust privacy-preserving techniques during model training inevitably results in poorer model accuracy. This critical bottleneck challenge has prevented many machine learning teams from adopting privacy-preserving machine learning technologies that are needed to safeguard user privacy while complying with regulatory frameworks,” Mugunthan said. “DynamoFL’s personalized federated learning solution tackles a critical hurdle to machine learning adoption.”
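The tradeoff Mugunthan describes can be seen concretely in differentially private training: DP-SGD clips each per-example gradient and adds calibrated Gaussian noise, and stronger privacy (more noise) degrades accuracy. The sketch below shows only that general mechanism on a toy model; the constants and model are assumptions for illustration and say nothing about how DynamoFL's own technique works.

```python
# Sketch of the "privacy versus performance" tradeoff: DP-SGD-style
# training clips per-example gradients and adds Gaussian noise. The toy
# 1-D model and all constants are illustrative assumptions.
import random

def dp_sgd_step(w, batch, lr, clip, noise_std, rng):
    """One step: clip each per-example gradient, average, add noise."""
    grads = []
    for x, y in batch:
        g = 2 * (w * x - y) * x              # per-example gradient
        g = max(-clip, min(clip, g))         # clip to bound sensitivity
        grads.append(g)
    avg = sum(grads) / len(grads)
    return w - lr * (avg + rng.gauss(0, noise_std / len(grads)))

def train(noise_std, seed=0):
    rng = random.Random(seed)
    data = [(x, 3.0 * x) for x in range(1, 6)]  # true slope is 3.0
    w = 0.0
    for _ in range(500):
        w = dp_sgd_step(w, data, lr=0.01, clip=1.0,
                        noise_std=noise_std, rng=rng)
    return abs(w - 3.0)  # final error vs. the true slope

print(train(noise_std=0.0))   # no privacy noise: small error
print(train(noise_std=50.0))  # heavy noise: larger error
```

Dialing `noise_std` up buys stronger privacy guarantees at the cost of a noisier, less accurate model — the bottleneck the quote refers to.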
Recently, DynamoFL closed a small seed round ($4.15 million at a $35 million valuation) with participation from Y Combinator, Global Founders Capital and Basis Set; the startup is a part of Y Combinator’s Winter 2022 batch. Mugunthan says that the proceeds will mainly be put toward recruiting product managers who can integrate DynamoFL’s technologies into future, user-friendly products.
“The pandemic has highlighted the importance of rapidly leveraging diverse data for emerging crises in healthcare. In particular, the pandemic underscored how critical medical data needs to be made more accessible during times of crisis, while still protecting patient privacy,” Mugunthan continued. “We are well-positioned to weather the slowdown in tech. We currently have three to four years of runway, and the tech slowdown has actually assisted our hiring efforts. The largest tech companies were hiring the majority of leading federated learning scientists, so the slowdown in hiring in big tech has presented an opportunity for us to hire top federated learning and machine learning talent.”