The Ford Motor Company and Chinese search engine provider Baidu have recently jointly invested USD 150 million in Velodyne, a LIDAR (Light Detection and Ranging) manufacturer based in Silicon Valley.
The companies hope that the investment will help increase production and lower Velodyne’s cost per unit from USD 8,000 to roughly USD 200 in the next few years.
At this year’s Consumer Electronics Show, Ford announced that LIDAR would be used going forward in its Ford Fusion Hybrid vehicles, which are already being tested in three states across the US. Baidu is already testing a fleet of driverless vehicles in China and has announced that it wants a self-driving car on the road by 2021.
What is LIDAR?
LIDAR is becoming a key technology in the driverless vehicle race. LIDAR uses laser beams to create a highly detailed 3D map of a car’s surrounding environment and, combined with cameras and the requisite software, enables the car to drive autonomously.
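The 3D map is built from individual laser returns: the sensor knows the direction each pulse was fired and the range it measured, and converts that pair into a point in space. The sketch below illustrates that geometric step only; it is a simplified illustration, not Velodyne’s actual processing pipeline, and the function name and angle conventions are assumptions for the example.

```python
import math

def lidar_return_to_point(range_m, azimuth_deg, elevation_deg):
    """Convert one LIDAR return (direction + measured range) into a
    Cartesian (x, y, z) point in the sensor's frame.

    Illustrative only: a real sensor applies per-laser calibration and
    motion correction on top of this basic spherical-to-Cartesian step.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return (x, y, z)

# A pulse fired straight ahead (0 degrees azimuth and elevation) that
# measures 10 m places a point 10 m in front of the sensor.
point = lidar_return_to_point(10.0, 0.0, 0.0)
```

Repeating this for hundreds of thousands of returns per second, across a full 360-degree sweep, is what produces the dense "point cloud" map of the car’s surroundings.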
“LIDAR is a relatively new technology in automotive applications, so investment in companies in this nascent segment will be necessary to fund both the development of the technology, as well as the scale of manufacturing,” says Jeremy Carlson, IHS Markit principal analyst covering autonomous driving and mobility. “Since LIDAR is competing in some ways with relatively well-established radar and camera sensors, refinement of the technology combined with economies of scale will be helpful in defining LIDAR’s unique position in the market.”
Tesla, by contrast, is focused on using a radar-based system to detect objects. Most automakers, however, are opting for LIDAR, cameras, radar and ultrasonic sensors as key components of a fully autonomous car. LIDAR cannot penetrate extreme weather conditions such as heavy rain, snowfall or dust; in these conditions, radar performs better. As a result, autonomous vehicles of the future will likely use both technologies.
“LIDAR technology can be an important input to automated and autonomous driving systems, and as those systems assume responsibility for more or all of the vehicle’s operation, even more information will be required. This is the premise behind ‘sensor fusion’ in which multiple, diverse sensors create a robust, diverse view of the environment,” explained Carlson.
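One simple way to picture sensor fusion is combining independent measurements of the same quantity, weighting each sensor by how much it can currently be trusted, so that a degraded sensor (e.g. LIDAR in heavy rain) contributes less. The inverse-variance weighting below is a minimal textbook sketch of that idea, not the proprietary fusion used by any automaker; the function name and the example variances are assumptions.

```python
def fuse_estimates(measurements):
    """Inverse-variance weighted fusion of independent sensor estimates.

    measurements: list of (value, variance) pairs, e.g. a range to the
    same obstacle reported by LIDAR and by radar. Sensors with lower
    variance (higher confidence) receive proportionally more weight.
    Returns the fused value and its (reduced) variance.
    """
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    fused = sum(w * v for (v, _), w in zip(measurements, weights)) / total
    fused_var = 1.0 / total  # fused estimate is more certain than either input
    return fused, fused_var

# Hypothetical readings: LIDAR reports 25.0 m with low variance, while
# radar in heavy rain reports 24.0 m with higher variance, so the fused
# range lands much closer to the LIDAR value.
distance, variance = fuse_estimates([(25.0, 0.04), (24.0, 0.25)])
```

Real systems fuse far richer data (full object tracks, not single ranges) with filters such as the Kalman filter, but the principle is the same: diverse sensors with complementary failure modes yield a more robust combined view than any one sensor alone.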
Carolyn Mathas is a technology writer/editor for a number of industry publications. She writes for the LED and Wireless Networking Design Centers on EDN, and previously wrote for several DesignLines and CommsDesign for EE Times. She was also a Senior Editor at Lightwave Magazine and a West Coast correspondent for CleanRooms Magazine.