Overview of Select Incidents
With a rising number of self-driving cars on the road, reports of their failures appear frequently and are met with concerns about the safety of these vehicles. Tesla advertises the Autopilot capability of some of its models, and while the general public may take "Autopilot" to mean a completely hands-off experience, Tesla insists that the driver never remove their hands from the steering wheel. Just as airline pilots are still required to man planes equipped with autopilot, Tesla acknowledges that the technology it has implemented cannot operate entirely without human oversight without risking the safety of everyone on the road. A few particular incidents, though rare compared to accidents in ordinary, manually driven cars, illustrate the bugs that self-driving cars face today.
The most talked-about crash in recent news involved the first death in a partially self-driving car, a Tesla Model S. The following model lays out the basic circumstances of the crash:
The cause of the crash, which occurred while the car was in Autopilot mode, is assumed to be a failure of the obstacle-detection system. A camera and a radar system work together to identify obstacles around the car. The camera may have missed the truck because its bright white exterior was too similar to the brightness of the sky that day to tell the two apart. As Tesla's CEO explained shortly afterward, the radar system may have dismissed the obstacle because of the trailer's high ride height, confusing it with an overhead road sign of the kind it is trained to ignore to prevent false braking events. This fatal combination of bugs, together with the theory that the driver was not paying attention to the road as the car instructs you to do at all times regardless of mode, led to the first death involving a self-driving car.
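To make that failure mode concrete, here is a minimal sketch in Python of how two independent sensor checks could each dismiss a real obstacle, so the combined system never brakes. Nothing here reflects Tesla's actual software; the classes, functions, and thresholds are invented purely for illustration.

```python
# Hypothetical, greatly simplified sketch of a camera + radar obstacle check.
# All thresholds and structure are invented for illustration only.

from dataclasses import dataclass

@dataclass
class CameraDetection:
    contrast_to_sky: float       # 0.0 = same brightness as the sky, 1.0 = very distinct

@dataclass
class RadarReturn:
    height_above_road_m: float   # height of the reflecting object above the road

def camera_sees_obstacle(det: CameraDetection, min_contrast: float = 0.2) -> bool:
    # A white trailer against a bright sky could fall below the contrast threshold.
    return det.contrast_to_sky >= min_contrast

def radar_sees_obstacle(ret: RadarReturn, overhead_sign_height_m: float = 3.5) -> bool:
    # Returns mounted high above the road are discarded as overhead signs
    # to prevent false braking events.
    return ret.height_above_road_m < overhead_sign_height_m

def should_brake(cam: CameraDetection, radar: RadarReturn) -> bool:
    # If both filters reject the object, the system never brakes.
    return camera_sees_obstacle(cam) or radar_sees_obstacle(radar)

# The fatal combination: low contrast AND a high-riding trailer.
print(should_brake(CameraDetection(contrast_to_sky=0.1),
                   RadarReturn(height_above_road_m=4.0)))   # -> False: no braking
```

The point of the toy example is that each filter is individually reasonable, yet together they leave a blind spot that only shows up for one specific kind of obstacle.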
A second widely reported crash involved a Google self-driving car and thankfully did not result in injury. The following video shows footage of the crash:
Google explains in its statement that the car detected sandbags near the right side of the road while planning to turn right, waited for traffic to pass, saw the bus and assumed it would yield to allow the car to maneuver around the sandbags, and then moved directly into the side of the bus. Google also notes that human drivers make these same assumptions all the time, which raises important questions going forward in autonomous vehicle development: do we want self-driving cars to act like human drivers, or do we want them to be perfect? Should the vehicle make these assumptions at all, or should this aspect of driving be set aside in favor of absolute safety?
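As a rough illustration of the kind of assumption Google describes, here is a hypothetical "will the other vehicle yield?" decision in Python. The probabilities, vehicle categories, and threshold are all made up; this is not Google's code, just the shape of the reasoning.

```python
# Hypothetical sketch of a yield-assumption decision, loosely modeled on
# Google's description of the incident. All numbers are invented.

def estimate_yield_probability(other_vehicle_type: str, gap_seconds: float) -> float:
    # Larger vehicles and tighter gaps make yielding less likely;
    # these values exist only to show the shape of the decision.
    base = {"car": 0.9, "bus": 0.6, "truck": 0.6}.get(other_vehicle_type, 0.7)
    return base if gap_seconds > 2.0 else base * 0.5

def proceed_around_obstacle(other_vehicle_type: str, gap_seconds: float,
                            threshold: float = 0.5) -> bool:
    # The car pulls out if it believes the other driver will probably yield.
    return estimate_yield_probability(other_vehicle_type, gap_seconds) >= threshold

# With a bus approaching and a comfortable-looking gap, the assumption holds --
# until the bus driver does not actually yield.
print(proceed_around_obstacle("bus", gap_seconds=3.0))  # -> True: the car pulls out
```

Any threshold-based version of this decision accepts some risk by design, which is exactly the trade-off between human-like driving and absolute safety raised above.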
Pertinence to Computer Science
The routines performed by these self-driving cars are laid out entirely in computer programming. Millions upon millions of lines of code guide the car to the driver's chosen destination, ideally with little to no intervention on their part. With programming come bugs, as we've seen even in the very simple projects we've done. But when these programs hold human lives in the balance, there is no room for error. Before fully self-driving cars can be released to the public, extensive bug testing will have to occur and many new regulations will have to be met. Any mistake can prove, quite literally, to be fatal.
In computer science courses, instructors stress that all conditions be considered and that thorough testing be done. The incidents described above show why thoroughness is such an important concept in this field: no matter how tedious it is to consider every possible input and outcome, it pays off when you end up with a flawless finished product (that doesn't kill anyone).
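To tie this back to coursework, here is a small, self-contained example of the kind of boundary-case testing the incidents above argue for. The classifier and its thresholds are made up for illustration; the point is that the boundary values are tested explicitly rather than assumed.

```python
# A toy obstacle classifier with explicit edge-case tests.
# The function and its thresholds are invented for illustration.

import unittest

def is_obstacle(height_m: float, distance_m: float) -> bool:
    """Flag anything at driving height within a 50 m horizon."""
    return 0.0 < height_m < 3.5 and distance_m <= 50.0

class TestObstacleClassifier(unittest.TestCase):
    def test_typical_car_ahead(self):
        self.assertTrue(is_obstacle(height_m=1.5, distance_m=30.0))

    def test_overhead_sign_is_ignored(self):
        self.assertFalse(is_obstacle(height_m=5.0, distance_m=30.0))

    def test_boundary_height_is_decided_explicitly(self):
        # Boundary values are exactly where untested assumptions hide.
        self.assertFalse(is_obstacle(height_m=3.5, distance_m=30.0))

    def test_object_just_inside_horizon(self):
        self.assertTrue(is_obstacle(height_m=1.5, distance_m=50.0))

if __name__ == "__main__":
    unittest.main()
```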
Sources:
http://www.nytimes.com/interactive/2016/07/01/business/inside-tesla-accident.html?_r=0
https://electrek.co/2016/07/01/understanding-fatal-tesla-accident-autopilot-nhtsa-probe/
https://www.engadget.com/2016/02/29/google-self-driving-car-accident/
