Improving LiDAR – or defeating it

The buzz at Sensors Expo 2019 pitted LiDAR-tech optimism against the reality of an impending shakeout.

For more than a decade, the LiDAR market has been led by Velodyne and its spinning towers of 64 or 128 laser beams. But the relatively high cost, delicate architecture, and beefy form factor have made it a competitive target. The growing ranks of competitors fall roughly into two camps. There are the fledgling LiDAR-makers with novel approaches to distance measurement, laser wavelength, and beam steering. And there are those with radar and vision systems promising LiDAR-level sensing capabilities at a lower cost.

The jockeying for position was on full display at the 2019 Sensors Expo and Conference, held in San Jose, Calif. “The greatest amount of discussion is about LiDAR right now because there are a lot of startups in that space,” observed Jim Hines, an independent consultant working on semiconductor and sensor technologies for connected and autonomous vehicles. Hines, who is based in nearby Palo Alto, told SAE’s Autonomous Vehicle Engineering that the field of self-driving sensors is showing “continual progress,” but that “nothing was really revelatory” at Sensors Expo 2019.

Amir Hosseini, a founding engineer at Santa Clara-based Ours Technology Inc., might disagree, though he conceded that the industry is a “LiDAR-congested environment.” His company is developing a LiDAR module that steers a small set of 1550-nanometer-wavelength beams and measures distance with frequency modulation; Velodyne, by contrast, uses 905-nanometer lasers and time-of-flight measurement. Ours calls its approach 5D technology because it directly measures the velocity of every pixel. Hosseini pegged the device’s price at “a couple of hundred dollars” and said the form factor would be “smaller than a phone.” The company has a development demo but is not yet discussing production details.
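The difference between the two ranging approaches can be sketched in a few lines of Python. This is an illustrative simplification, not Ours Technology’s actual signal chain: time-of-flight converts a pulse’s round-trip delay to distance, while frequency-modulated continuous-wave (FMCW) LiDAR mixes the returned chirp with the outgoing one and recovers both range and per-point radial velocity from the up- and down-chirp beat frequencies.

```python
C = 299_792_458.0  # speed of light, m/s

def tof_range(round_trip_s):
    """Time-of-flight (e.g. pulsed 905 nm LiDAR): range from echo delay."""
    return C * round_trip_s / 2.0

def fmcw_range_velocity(f_beat_up, f_beat_down, chirp_slope_hz_per_s,
                        wavelength_m=1.55e-6):
    """FMCW (e.g. 1550 nm): the up- and down-chirp beat frequencies
    encode both range and radial velocity (the Doppler shift)."""
    f_range = (f_beat_up + f_beat_down) / 2.0    # range term
    f_doppler = (f_beat_down - f_beat_up) / 2.0  # Doppler term
    rng = C * f_range / (2.0 * chirp_slope_hz_per_s)
    vel = wavelength_m * f_doppler / 2.0         # positive = approaching
    return rng, vel
```

The per-pixel velocity is what the “5D” label refers to: each return carries Doppler information for free, which a time-of-flight system must instead infer by differencing successive frames.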

300 lines of resolution at 10 fps

Nearby on the expo floor, Jason Ferns, director of marketing and applications at Seyond, said, “To get 128 scan lines, you need 128 laser emitters and 128 detectors. So, it’s hard to manufacture – and the cost will never scale down.” Seyond is based in Los Altos, Calif., with a 20,000-square-foot assembly and test facility in Sunnyvale. Its technology uses beams that move both horizontally and vertically. “With two beams, we can get 300 lines of resolution across the field of view while maintaining 10 frames per second. Compare that to 64 lines or 128 lines,” asserted Steve Ehrsam, the company’s VP of global marketing and sales.

The vertical movement is important, he said, because vehicles travel horizontally; the vertical scan, enabled by Seyond’s dual rotating polygon beam pattern, minimizes horizontal motion blur. Ehrsam said the price at scale for Seyond’s LiDAR would be less than $1,000 if built in quantities of 100,000 units per year, a volume the company hopes to reach in three to four years. Seyond has shipped early versions for testing to an undisclosed set of carmakers and robotaxi companies.
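The resolution claim can be put in rough numbers. A quick sketch, using an assumed 40-degree vertical field of view purely for comparison (Seyond has not published exact specifications):

```python
def vertical_resolution_deg(fov_deg, scan_lines):
    """Angular spacing between adjacent scan lines across a vertical FOV."""
    return fov_deg / scan_lines

# Assumed 40-degree vertical field of view, for comparison only.
for lines in (64, 128, 300):
    spacing = vertical_resolution_deg(40.0, lines)
    print(f"{lines:3d} lines -> {spacing:.3f} degrees between lines")
```

Under that assumption, 300 lines spaces returns roughly 0.13 degrees apart vertically, versus about 0.31 degrees for a 128-line unit; the finer spacing is what lets a scanned design resolve small objects at range without one emitter-detector pair per line.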

War chest size matters

At the expo hall’s “Automotive Technology Theater,” the Autotech Council held a pitch session in the style of the TV show Shark Tank. Four automotive-sensing startup companies vied for attention from council members, a consortium of 100 companies based in Silicon Valley looking to invest in promising auto technologies. One of the presenters was Aditya Srinivasan, North American general manager for Innoviz, an Israel-based LiDAR company that has raised about $250 million in its three-year existence. Its second product, the InnovizOne, will be used in the BMW iNext self-driving SUV, expected in 2021.

Srinivasan promoted the advantages of Innoviz’s solid-state LiDAR with MEMS-based beam steering. He also explained that the company successfully tackled the challenge of ensuring eye safety at the 905-nanometer wavelength, which draws a relatively low 20 watts compared to power-hungry 1550-nanometer designs. Srinivasan also played up how the InnovizOne was built from the ground up to be automotive grade. An Autotech Council review panelist, Dan Smith, the chief executive of Capstone Financial Group, a San Jose-based investment bank focusing on auto technologies, asked, “Other than your choice of wavelength, what’s your main differentiator?”

“The war chest,” replied Srinivasan, referring to the company’s fundraising successes. “This is an industry in which you need a long runway.” He added that the partnerships with BMW and Tier-1 supplier Aptiv are critical. “They forced us to change from being a dinky little startup to an automotive supplier, and that’s not an easy transition,” said Srinivasan.

Other startup presenters not yet on the path to scale included:

• David Slemp, president and chief engineer at Aeres Em, presented a holographic radar-imaging solution. Slemp said its resolution would be “second to none.” He added, “The breakthroughs are done. It’s now just engineering.”

• Matt Harrison, head of artificial intelligence at Metawave, emphasized his company’s beam-steering capabilities, which allow objects to be detected and classified beyond 300 meters. Harrison said that Metawave, which counts Hyundai as an investor, has “surpassed traditional radar signal processing by using a neural network approach.”

• Semyon Nisenzon, chief executive of Cluster Imaging, has raised about $650,000 for a system using six to eight cameras for computing depth. “We can generate real-time accurate depth at an order of magnitude lower cost and better efficiency than LiDAR,” said Nisenzon.
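Depth from multiple cameras rests on triangulation. A minimal two-camera sketch illustrates the principle (Cluster Imaging’s six-to-eight-camera pipeline is certainly more sophisticated, and these parameter values are hypothetical):

```python
def stereo_depth_m(focal_px, baseline_m, disparity_px):
    """Classic two-camera triangulation: depth = focal * baseline / disparity.
    Multi-camera rigs extend this by matching across several pairs, which
    improves robustness and long-range accuracy."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical rig: 1000 px focal length, 30 cm baseline, 12 px disparity
print(stereo_depth_m(1000.0, 0.30, 12.0))  # 25.0 m
```

Because depth error grows as disparity shrinks, adding cameras and baseline length is the usual lever for matching LiDAR accuracy at distance with commodity image sensors.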

Toward sensor fusion

Smith of Capstone Financial believes that much of current industry investment is “purely FOMO,” a popular term for “the fear of missing out.” Smith stated that there are as many as 250 LiDAR companies in business, although not all are focused on auto applications. “We doubt that more than 10 will survive for auto use,” wrote Smith in an email. Despite the disproportionate attention given to LiDAR at Sensors Expo 2019, Smith believes that it will continue to exist as one of several sensors in the stack. “That’s why there’s so much money going now to sensor fusion,” he said. Smith argued that real-world deployment of AVs will remain limited for perhaps 10 to 15 more years, with the rollout starting with delivery robots and vehicles, then long-haul trucks and finally robotaxis.

Jim Hines, the industry analyst, agrees about the rising importance of software and fast processing to handle multi-sensor platforms. “A lot of the innovation is in application of AI neural nets to process the data that comes from the sensors to do all the object detection, classification and path planning,” he said. “There are too many startup companies going after the same prize in lidar,” he concluded. “There’s clearly going to be a shakeout in this market.”