Cameras: The Eyes of Autonomous Vehicles

Last post, we learned how LIDAR technology helps cars perceive their surroundings, but we also noted the technology's drawbacks. I mentioned that Elon Musk and Tesla, a leader in autonomous vehicles, have chosen not to use LIDAR in developing their autonomous vehicles. So how are future Teslas and other autonomous vehicles going to “see”?

Cameras! Just like you can look through a camera and see a clear picture of the environment around you, cars will be able to do the same thing. By outfitting cars with cameras at all angles, the vehicles can maintain a 360-degree view of their surroundings. Tesla equips cars that carry its Full Self-Driving hardware with eight cameras positioned around the vehicle. These cameras feed their 2D images into an onboard computer that is 40 times more powerful than the one in Tesla's previous hardware. The computer builds a 3D map of the surroundings, allowing the car to navigate itself on the road.

Above is the video featured on Tesla's website showing a fully autonomous car driving, along with the feeds from the cameras installed in the car.
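To give a rough sense of how flat 2D images can yield depth information, here is a minimal sketch using OpenCV's stereo block matcher on a pair of overlapping camera views. This is not Tesla's actual pipeline; the file names, focal length, and camera baseline below are hypothetical placeholder values.

```python
# A minimal sketch (not Tesla's pipeline) of recovering depth from two
# overlapping camera views with OpenCV block matching.
import cv2
import numpy as np

# Load a pair of rectified grayscale frames from two forward-facing cameras.
# These file names are placeholders for illustration.
left = cv2.imread("left_frame.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right_frame.png", cv2.IMREAD_GRAYSCALE)

# Block matching compares patches between the two views to estimate disparity
# (how far each pixel shifts between the cameras).
matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0

# Depth is inversely proportional to disparity: depth = f * B / disparity,
# where f is the focal length in pixels and B is the camera baseline.
focal_length_px = 700.0   # hypothetical calibration value
baseline_m = 0.5          # hypothetical spacing between the two cameras
depth_m = np.zeros_like(disparity)
valid = disparity > 0
depth_m[valid] = focal_length_px * baseline_m / disparity[valid]

print("Estimated depth (m) at image center:",
      depth_m[depth_m.shape[0] // 2, depth_m.shape[1] // 2])
```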


Advantages of Cameras

As you can see in the above video, the cameras pick up a lot, from lane markings to road signs. This gives cameras an advantage over other technologies employed in self-driving cars, because the ability to read signs and see colors will allow cars to navigate modern roads without driver input. Cameras also have a big advantage for both the consumer and the manufacturer of the vehicle: price. When learning about LIDAR in the last post, we saw that one of the biggest hurdles for the technology was cost, with prices reaching into the tens of thousands of dollars. Cameras, on the other hand, cost only a couple hundred to a few thousand dollars. The processing power needed to analyze all the data from the cameras can be expensive, but the whole package is still much cheaper than LIDAR, making it a more suitable candidate for wide adoption. As the quality of cameras and of the software interpreting their images advances, cameras are seen as the number one technology for self-driving cars, mainly in conjunction with other sensors.
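As a simple illustration of what software can pull out of a single camera frame, the sketch below uses classical OpenCV operations (Canny edge detection plus a Hough transform) to find lane-line candidates. This is a toy example, not any manufacturer's production system; the file name and thresholds are assumptions.

```python
# A simplified, hypothetical lane-marking detector for one camera frame,
# using classical OpenCV operations (not a production self-driving system).
import cv2
import numpy as np

frame = cv2.imread("road_frame.png")             # placeholder dashcam frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Edge detection highlights high-contrast boundaries such as painted lane lines.
edges = cv2.Canny(gray, threshold1=50, threshold2=150)

# Keep only the lower half of the image, where the road surface appears.
mask = np.zeros_like(edges)
mask[edges.shape[0] // 2:, :] = 255
road_edges = cv2.bitwise_and(edges, mask)

# The probabilistic Hough transform finds straight line segments in the edges.
lines = cv2.HoughLinesP(road_edges, rho=1, theta=np.pi / 180, threshold=50,
                        minLineLength=40, maxLineGap=20)

# Draw the detected segments back onto the frame for visual inspection.
if lines is not None:
    for x1, y1, x2, y2 in lines[:, 0]:
        cv2.line(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
cv2.imwrite("road_frame_lanes.png", frame)
```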


Can Cameras Stand Alone?

Nearly everyone in the autonomous vehicle industry envisions the AVs of the future incorporating cameras in some capacity to drive the car, but can cameras be the only technology helping the cars see? Cameras have been found to struggle in adverse weather conditions, as their “vision” can be blocked or clouded. Many companies' current attempts at self-driving cars suggest that cameras will need to be complemented with either radar or LIDAR. Tesla outfits its cars with a forward-facing radar, and many others use LIDAR to help the car get a clearer picture of its surroundings. Mobileye, an Israeli company backed by Intel that works on helping AVs “see”, states on its website that “From the outset, Mobileye’s philosophy has been that if a human can drive a car based on vision alone – so can a computer.” Mobileye, which used to work with Tesla, believes cameras can be the sole source of vision, but it seems to be in the minority. More likely, cameras will be one of the most important pieces of the puzzle of a truly autonomous vehicle, but only a piece.
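To make the idea of complementary sensors concrete, here is a toy, hypothetical sketch of camera/radar fusion: the camera says what an object is, the radar says how far away and how fast it is, and a fused track is only created when both sensors roughly agree on direction. Every type and threshold here is invented for illustration and is not based on any real vehicle's software.

```python
# A toy sketch of camera/radar fusion (hypothetical types and thresholds).
from dataclasses import dataclass

@dataclass
class CameraDetection:
    label: str          # e.g. "car" or "pedestrian" from an image classifier
    bearing_deg: float  # direction of the object relative to the vehicle

@dataclass
class RadarReturn:
    bearing_deg: float  # direction of the radar echo
    range_m: float      # distance to the object
    speed_mps: float    # closing speed (negative means approaching)

def fuse(cam: CameraDetection, radar: RadarReturn,
         max_bearing_gap_deg: float = 5.0):
    """Pair a camera detection with a radar return if their bearings agree."""
    if abs(cam.bearing_deg - radar.bearing_deg) <= max_bearing_gap_deg:
        return {"label": cam.label, "range_m": radar.range_m,
                "speed_mps": radar.speed_mps}
    return None  # the sensors disagree, so no fused track is created

# Example: the camera sees a car at 2 degrees, the radar echo is at 1.5 degrees.
print(fuse(CameraDetection("car", 2.0), RadarReturn(1.5, 42.0, -3.0)))
```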


References:

  • https://www.tesla.com/autopilot
  • http://www.mobileye.com/our-technology/
  • http://www.telegraph.co.uk/technology/2016/11/21/watch-how-teslas-self-driving-cars-see-the-road/
  • https://www.engadget.com/2017/10/09/mit-tech-helps-cameras-see-around-corners/
  • https://www.theverge.com/2016/11/20/13693120/tesla-self-driving-car-elon-musk
  • https://blog.nxp.com/automotive/radar-camera-and-lidar-for-autonomous-cars
  • https://www.technologyreview.com/s/539841/one-camera-is-all-this-self-driving-car-needs/

4 thoughts on “Cameras: The Eyes of Autonomous Vehicles”

  1. This post made me think about the two speakers we’ve had from the MIT Media Lab and how they record drivers. What are your thoughts on cameras inside the car (i.e. ones that watch the driver and make sure they are staying aware)? They are useful for driver vigilance but they can also collect huge amounts of data, which is a big privacy concern.

  2. Why do they always speed up the video in the driving shots? Sure doesn't contribute to one's trust in the system.

    Good point about ‘cameras only’. Autonomous vehicles have to improve upon human vision before they are trusted!

    Very nice blog!

  3. Mobileye’s camera-only philosophy seems flawed to my intuition, but their point that humans drive cars purely on sight is (mostly) true, so it makes me question my assumption that there need to be supplementary sensors. However, humans also have hearing, a sense of touch, and the ability to move and view things that a fixed camera may not be able to see. I am sure that a camera-only system would work, but radar provides so many benefits (such as seeing underneath cars) that I cannot understand why one would not use the technology that is available, since these cars will have to be beyond safe to be accepted.
