Mythic launches a chip to enable computer vision and voice control on any device

Hardware that responds to voice commands is already out there and probably in your hand or house right now. Whether it’s a smartphone, smart speaker or wearable, it has to connect to the cloud to deliver answers. Now, a startup called Mythic (formerly known as Isocline) is launching a chip and software that will change all that, putting voice control, computer vision and other kinds of AI into our devices locally, no cloud required.

Headquartered in Austin, with offices in Redwood City, Calif., Mythic has raised $9.3 million in venture funding, including angel capital and a Series A round, according to CEO and co-founder Mike Henry. Mythic’s Series A was led by Draper Fisher Jurvetson, joined by Lux Capital, Data Collective and AME Cloud Ventures. DFJ’s Steve Jurvetson and Lux Capital’s Shahin Farshchi have joined the startup’s board of directors.

Prior to its Series A, Mythic had raised about $2.5 million in government grants. The new funding will help the startup begin to commercialize its chips, which are about the size of a small shirt button, and the proprietary software that makes them work alongside other processors and memory.

Mythic’s chip enables AI features to run locally on any device.

DFJ Partner Steve Jurvetson, an early backer of SpaceX, Tesla and Nervana Systems (now owned by Intel), said:

“What Mythic has built is a deep learning or neural network chip that implements learning algorithms at a radically lower price point, chip size and power consumption level than anything we have today.

“With this, you could put machine intelligence into a toaster, a Roomba, a security camera or all kinds of devices where it wouldn’t make sense before, because you’d need a persistent internet connection to make it useful.”

The influential and self-proclaimed “VC geek” views Mythic’s market opportunity as wide-ranging. He predicted, “The industrial internet of things is becoming the sensory cortex of the planet. Industrial and enterprise customers will want to use this to do inspection and quality control wherever they have any sort of sensor node, whether it’s a camera, a microphone, a temperature sensor or something else.”

The inspiration for Mythic’s technology dates back to 2012, when Henry and co-founder and CTO Dave Fick were both completing doctorates in computer science in different labs. The friends made an early bet that deep neural networks and machine learning approaches to software development would eventually require more powerful compute resources in the devices and machines we use daily. Nobody was calling it “AI” yet, but insiders were talking about deep learning as the key to making devices that see, hear and interact with the world with human-like qualities.

Mythic CEO and co-founder Mike Henry.

Today, the CEO said, that bet is starting to pay off. “A lot of devices and applications are taking in too much data to send to the cloud. Think about drones with multiple high-res images coming through the cameras. All they need to find is a crack on a turbine or dry patches of farmland. If they process all that data through the cloud and run analytics with complex algorithms, they’ll drain their batteries and have to land. We saw an opportunity to do processing locally inside a device.”

Henry said the company has been hiring aggressively since closing its Series A round at the end of 2016, and it will continue to do so while reaching out to potential pilot customers. Long-term, Mythic’s co-founders hope to bring their technology to automakers building autonomous vehicles. However, the CEO said, consumer electronics, drone and robotics companies, which tend to move from design to production more quickly, will most likely be among Mythic’s earliest users.