D-ID nabs $4M to protect images from being read by facial recognition algorithms

As people become more aware of how their data is used (and abused) in our digital world, and as regulations like Europe's GDPR lead more organizations to implement better data protection policies, a startup hoping to protect the data found in images, by making it impossible for algorithms to read, has raised a seed round to help bring its services to market.

D-ID, a startup based out of Tel Aviv that has built tools to block facial recognition algorithms from “reading” photos and videos, has closed out a seed round of $4 million. Its investors include Pitango Venture Capital (which led), Foundation Capital, Fenox Venture Capital, Maverick Ventures, and two angels. Y Combinator, where D-ID was in a cohort earlier this year, also participated.

D-ID taps into a new trend in the world of data protection called “de-identification” (which happens to be the basis of D-ID’s name, too). De-identification has two aims: one is to protect data that is already being used for authentication; the other is to never let that data get “read” in the first place.

At issue here is the growth of technology that identifies people for various services. It might be useful for authentication (such as in a hospital or a banking service), or simply for making your life a little easier (say, to help you log into your phone, download apps quickly, or easily tag all your photos on a social network). In the wrong hands (for example, as the result of a data breach), that data can prove to be your undoing.

De-identification aims to help protect some of that data, as CEO and co-founder Gil Perry describes it, by “playing on the gap between the brain and human eye and what the machine understands.” From what I understand, to the naked eye, the pictures will look no different, but there will be small, imperceptible differences that will essentially scramble what a facial recognition algorithm can see. 

He would not explain how this works in any more detail, nor would he provide any before/after pictures at this point, citing proprietary information. But he said it is akin to systems you may have heard about that emit sounds that confuse machines but that humans cannot hear. Perry said that a pilot of the service is getting underway soon, with a version of the product shipping in May.
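The idea Perry describes, imperceptible pixel changes that confuse a machine but not the eye, can be sketched in code. D-ID has not disclosed its actual method, so the following is only a toy illustration of the broader “adversarial perturbation” technique it resembles, using a made-up linear match score in place of a real face recognizer; every name and value here is hypothetical.

```python
# Toy sketch of adversarial image perturbation (NOT D-ID's method):
# nudge each pixel by an amount far below what the eye can notice,
# in the direction that most reduces a recognizer's match score.
import numpy as np

rng = np.random.default_rng(0)

# A 32x32 grayscale "photo" with pixel values in [0, 1].
photo = rng.random((32, 32))

# Stand-in for a face recognizer: a linear match score w . x.
# (A real system would be a deep network; the principle is the same.)
w = rng.standard_normal((32, 32))

def match_score(img):
    return float(np.sum(w * img))

# Move each pixel by at most eps against the score's gradient,
# which for this linear model is simply sign(w).
eps = 2 / 255  # roughly one brightness step out of 255: imperceptible
protected = np.clip(photo - eps * np.sign(w), 0.0, 1.0)

print("max pixel change:", np.abs(protected - photo).max())
print("score before:", match_score(photo))
print("score after: ", match_score(protected))
```

To a viewer the two images are indistinguishable (no pixel moves by more than eps), yet the recognizer's match score drops. Against a real deep-network recognizer the gradient would come from backpropagation rather than `sign(w)`, but the trade being exploited is the same one Perry mentions: the gap between what the eye perceives and what the machine measures.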

Perry and his two co-founders, COO Sella Blondheim and CTO Eliran Kuta, are all ex-Israeli special forces, and this is somewhat a matter of knowing your enemy, so to speak: all three have experience in AI algorithms and security identification services. But all three have also been on the other side of the issue, from a consumer point of view: they were prohibited from posting pictures on social networks while working for the intelligence forces, and, knowing what facial recognition algorithms do when they “read” an image, they believed they could figure out a way to live like “normal” consumers online while still not falling afoul of their security requirements.

As surveillance and data troves continue to expand, and as malicious hacking becomes a more persistent problem, startups are popping up to address this gap, although most appear to have focused on the big data opportunity rather than biometrics and images. Privitar in the UK raised $16 million last year to build out its business, and Privacy Analytics is another, apparently targeting the healthcare vertical first.

Yair Cassuto, a principal at Pitango, said that D-ID appears to be the first, however, to be focusing on blocking image recognition algorithms.

“We consulted with the top facial recognition researchers and thought leaders and they looked into the tech and the research behind D-ID and gave us the assurance that for the upcoming years, their tech will prove sufficiently strong against different types of AI attacks and learning algorithms,” he said.

“People want a more seamless experience, and using biometrics as authentication is something that does that, but people are worried about services holding a lot of information, especially when cyberdata is a big target for hackers. They didn’t feel like keeping images is safe enough with current solutions.”

Featured Image: Johnson76/Shutterstock (IMAGE HAS BEEN MODIFIED)