Last post, we learned how LIDAR technology helps cars envision their surroundings, but we also noted the technology's drawbacks. I mentioned that Elon Musk and Tesla, a leader in autonomous vehicles, have chosen not to use LIDAR while developing an autonomous vehicle. So how are future Teslas and other autonomous vehicles going to “see”?
Cameras! Just like you can look through a camera and see a clear picture of the environment around you, cars will be able to do the same thing. By outfitting cars with cameras at all angles, vehicles can maintain a 360-degree view of their surroundings. Tesla outfits its cars that have the Full Self-Driving hardware with eight cameras around the vehicle. These cameras feed their 2D images into an onboard computer that is 40 times more powerful than the one in previous Tesla models. The computer builds a 3D map of the surroundings, allowing the car to navigate itself on the road.
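To get a feel for how flat 2D images can yield 3D information at all, here is a minimal sketch of classic stereo triangulation: two cameras a known distance apart see the same point at slightly different horizontal pixel positions, and that shift reveals depth. This is a generic textbook illustration, not Tesla's actual algorithm, and the focal length, camera spacing, and pixel values are made-up numbers.

```python
# Illustrative stereo triangulation: the same point appears shifted
# (the "disparity") between two side-by-side camera images, and that
# shift encodes distance: depth = focal_length * baseline / disparity.
# All numbers here are hypothetical, chosen only for illustration.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth in meters of a point, from its pixel disparity between two cameras."""
    if disparity_px <= 0:
        raise ValueError("point must appear shifted between the two views")
    return focal_px * baseline_m / disparity_px

# Hypothetical rig: 700-pixel focal length, cameras 30 cm apart.
# A point seen 10 pixels apart in the two images is 21 meters away.
print(depth_from_disparity(700, 0.3, 10))  # 21.0
```

Notice how smaller disparities mean greater distance: a far-off object barely shifts between the two views, which is also why depth estimates from cameras get less precise the farther away an object is.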
Above is the video featured on Tesla’s website showing a fully autonomous car driving, along with the feeds from the cameras installed in the car.
Advantages of Cameras
As you can see in the video above, the cameras pick up a lot, from lane markings to road signs. This gives cameras an advantage over other technologies employed in self-driving cars, because the ability to read signs and see colors lets cars navigate modern roads without driver input. Cameras also have a big advantage for both the consumer and the manufacturer of the vehicle – price. When learning about LIDAR in the last post, we saw that one of the technology's biggest hurdles was cost, with prices reaching into the tens of thousands of dollars. Cameras, on the other hand, cost only a couple hundred to a few thousand dollars. The processing power needed to analyze all the camera data can be expensive, but the whole package is still much cheaper than LIDAR, making it a more suitable candidate for wide adoption. As the quality of cameras and of the software interpreting their images advances, cameras are seen as the number one technology for self-driving cars, mainly in conjunction with other sensors.
Can Cameras Stand Alone?
Nearly everyone in the autonomous vehicle industry envisions the AVs of the future incorporating cameras in some capacity to drive the car, but can cameras be the only technology helping cars see? Cameras have been found to have trouble navigating in adverse weather conditions, as their “vision” can be blocked or clouded. Many companies' current attempts at self-driving cars suggest that cameras will need to be complemented with either radar or LIDAR. Tesla outfits its cars with a radar on the front, and many others use LIDAR to help the car get a clearer picture of its surroundings. Mobileye, an Israeli company backed by Intel that works on helping AVs “see”, states on its website that “From the outset, Mobileye’s philosophy has been that if a human can drive a car based on vision alone – so can a computer.” Mobileye, which used to work with Tesla, believes cameras can be the sole source of vision, but it seems to be in the minority. More likely, cameras will be one of the most important pieces of the puzzle of a truly autonomous vehicle, but only a piece.