
New investors in this round include ARCH Venture Partners and Sunny Optical Technology Group, joined by early investors Atlantic Bridge Capital, AIB Seed Capital Fund, Capital-E, DFJ Esprit and Robert Bosch Venture Capital.
The company, which has offices in Silicon Valley, Ireland and Romania, plans to use the new funding to push its R&D efforts forward, hire more engineers and improve the software tools that help developers get the most out of its accelerated computer vision processor.

In the realm of VR (and augmented reality projects like Microsoft’s HoloLens), Movidius can power the positional tracking and eye tracking needed for a low-latency experience, which in turn creates the sense of presence that makes VR truly immersive. El-Ouazzane tells me the company is currently working with three of the five main head-mounted display manufacturers.
Interestingly, Movidius is willing to cede the market for vision processing in cars to others. The lead times to get products to market there are simply too long for a small startup that needs to generate some cash flow. “We are going for high-growth markets at the cutting edge instead,” El-Ouazzane told me.
Specifically, he believes the Chinese market will have a major impact on his company. “I have a firm belief that China is going to take some steps forward to take the lead in some markets — and especially in markets that matter to us,” he said. “One of the leaders in the drone industry [DJI] is Chinese; the largest camera module manufacturer is Chinese.” So to accelerate its presence in China, the company specifically looked for lead investors for this round with a strong background there, and found them in WestSummit and Atlantic Bridge.
As for the company’s roadmap, El-Ouazzane tells me that the next version of Movidius’ vision processor is coming soon. “We are entering the Golden Age of accelerated computer vision,” he believes, and with this new funding round, Movidius is well-positioned to help usher in a future where our devices are more aware than ever of what’s going on in their surroundings.