Petah Tikva-based Oryx Vision announced a $17 million Series A funding round Wednesday led by Bessemer Venture Partners for a new kind of object-detection technology aimed at the booming self-driving car market. The round also included participation from Trucks VC and Maniv Mobility.
“Our technology enables us to design a system with a noise floor which is 100K-1M times lower than conventional LiDAR,” Rani Wellingstein, Oryx co-founder and CEO, tells Geektime. “It means that we are not bound to complex mechanical or optical designs for steering our beam, but rather we have the ability to illuminate the entire field of view and still detect and identify objects which are 3-4 times farther than any other LiDAR can detect.”
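As a rough illustration of how a lower noise floor buys range, the radar equation says that for a point target the returned power falls off as 1/R⁴, so an N-fold sensitivity gain scales range by roughly N^(1/4). The sketch below is a back-of-envelope calculation under that standard assumption, not Oryx's actual design figures:

```python
# Back-of-envelope sketch (standard radar-equation scaling, not Oryx's
# design data): returned power from a point target falls off as 1/R^4,
# so an N-fold improvement in the noise floor buys ~N^(1/4) more range
# -- if the transmit power stays concentrated in a scanned beam.

def range_gain(noise_floor_improvement: float) -> float:
    """Relative detection-range gain under 1/R^4 scaling."""
    return noise_floor_improvement ** 0.25

print(range_gain(1e5))  # ~17.8x with a scanned beam
# A flash system spends most of that sensitivity budget on illuminating
# the whole field of view at once rather than one spot at a time, which
# is consistent with a net range advantage (3-4x) well below 17.8x.
```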
The company boasts a depth-sensing mechanism it says requires one-fiftieth the energy of LiDAR, the most common sensor system being installed in autonomous vehicles for object detection.
“Autonomous vehicles need much more powerful depth sensing capabilities than what was originally thought; existing technologies simply cannot deliver them,” said Wellingstein. “We have taken a completely different approach to artificial depth sensing and managed to create a solution that will truly enable autonomous driving.”
“It’s more or less an enhanced form of radar,” explains Wellingstein. “Our system is, like LiDAR, an active system which uses lasers to illuminate the scene, but that’s where the resemblance ends. We’re building a Coherent Optical Radar System.” That is, their system illuminates the entire environment with light (optically) and analyzes the reflected signals all at once (coherently).
“For the sake of simplifying it, it’s like the difference between a situation where a driver picking you up from the airport was given an amorphous description of what you look like (traditional LiDAR) and one in which he’s watching a live video stream of you coming through arrivals.”
The company was founded in 2009 by David Ben-Bassat, now VP of R&D, and had received only $300,000 from angel investors up until this point. Wellingstein joined the company last year.
Moving forward from this round, the company is looking to use the new funding to hire computer scientists, electrical engineers and physicists.
A growing number of companies are working to develop the technology that will form the backbone of the autonomous vehicle revolution. Israeli startup Innoviz raised $9 million in August for its LiDAR product. Ford has gotten into the game, co-investing with Baidu this past summer in LiDAR-producing startup Velodyne. Driven by self-driving cars, the LiDAR market could be worth $3.22 billion by 2022, but that figure likely doesn’t anticipate alternatives to the technology. While Innoviz and Velodyne bet on the future being in LiDAR, Oryx is trying to leapfrog that tech.
Wellingstein says that LiDAR sensors are “limited” performance-wise to 60 meters in ideal conditions and merely 30 meters in “broad daylight,” while radar is limited in resolution. Because of this limited range in object recognition, Wellingstein asserts, Google self-driving cars are limited in speed (25 mph according to Google) and only function in already highly-mapped areas.
“In Oryx’s case, in addition to providing range, we provide speed, reflectivity and the equivalent of temperature for each pixel, in a single frame (whereas depth-sensors such as LiDAR provide range and reflectivity and require several frames to generate speed).”
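One way to see how a coherent sensor can read out speed in a single frame is the Doppler shift of the returned light, f_d = 2v/λ. The numbers below are illustrative (the 30 m/s target speed is an assumption, not from Oryx), using the 10 µm wavelength the company cites:

```python
# Illustrative only: a coherent detector can measure per-pixel radial speed
# in one frame from the Doppler shift of the return, f_d = 2 * v / lambda.

def doppler_shift_hz(radial_speed_mps: float, wavelength_m: float) -> float:
    """Doppler shift of light reflected from a target closing at radial_speed."""
    return 2.0 * radial_speed_mps / wavelength_m

# A car closing at ~30 m/s (108 km/h), seen at the 10 micrometer
# wavelength Oryx says it operates at:
print(doppler_shift_hz(30.0, 10e-6))  # 6.0 MHz
```

An incoherent depth sensor has no access to this frequency information, which is why it must difference range across several frames to estimate speed.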
Treating light like radio
Their most interesting pitch, which needs some further explanation, is that they treat “light waves like radio waves.” They use nano-antennas to detect long light waves, rather than relying on photodetectors as cameras and LiDAR do. When a camera absorbs light’s smallest discrete elements (photons), it generates an electrical signal. Oryx’s antennas, on the other hand, work differently.
“Oryx Vision sensors are based on antennas,” Wellingstein explained. “For us, light is not a photonic flux – it is an electro-magnetic wave. Thus, Oryx designed antennas (tiny antennas, but still – antennas) that absorb the light and transduce it to electrical signals. The antenna approach allows us to utilize features that cannot be easily implemented with photo-electric detectors like choosing our wavelength and the ability to mix signals.”
They can choose which frequency to operate in depending on the size of the antennas. The antennas they currently use work at a 10 micrometer (µm) wavelength, which, Wellingstein says, “is several orders of magnitude farther away from visible light than the 0.9-1.4µm LiDAR use and allows Oryx to function in changing weather conditions or any power limitations.”
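The link between wavelength and antenna size follows from standard antenna theory: a resonant half-wave dipole is about λ/2 long, so the operating wavelength directly sets the antenna dimensions. The sketch below works through those textbook relations for the wavelengths mentioned above (it is not based on Oryx's design data):

```python
# Rough numbers behind the antenna approach (standard EM relations,
# not Oryx's design data).
C = 299_792_458.0  # speed of light, m/s

def frequency_thz(wavelength_m: float) -> float:
    """Frequency of light of a given wavelength, in terahertz."""
    return C / wavelength_m / 1e12

def half_wave_dipole_um(wavelength_m: float) -> float:
    """Length of a resonant half-wave dipole, in micrometers."""
    return wavelength_m / 2.0 * 1e6

print(frequency_thz(10e-6))        # ~30 THz at the 10 um wavelength Oryx cites
print(half_wave_dipole_um(10e-6))  # ~5 um: nano-scale, but fabricable
print(frequency_thz(0.9e-6))       # ~333 THz at the short end of LiDAR's band
```

At 10 µm the required antenna is around 5 µm long; at LiDAR's near-infrared wavelengths it would need to be roughly ten times smaller, which helps explain why antenna-based detection becomes more practical at longer wavelengths.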
At this point, the company has already shown off its product to a number of potential partners, but has not entered into (or at least announced) any big sales or partnership agreements yet. Wellingstein says that should change soon.
“We’re in continuous discussions with OEMs, Tier-1s and technology players. We’ve demonstrated our technology to a selected list of auto manufacturers and technology players, and will single out design partners in the coming quarter.”