November 2, 2015
Millions of Americans celebrated Halloween this past weekend, and thankfully the vast majority of them made it home safely. Unfortunately, however, some did not. And contrary to popular perception, the real danger didn’t come from tainted candy but from auto accidents. In fact, there were 115 pedestrian fatalities of children ages 18 and under on Halloween between 1990 and 2010, making it the deadliest day of the year for such accidents.

With that in mind, the Google Self-Driving Car Project is taking steps to ensure that its self-driving vehicles don’t contribute to those scary Halloween statistics. In a Halloween-eve blog post, the company explained how it asked costumed children to parade in front of one of its self-driving vehicles so that its sensors could learn to recognize them. Those with young children in their lives know all too well how swift and unpredictable they can be; should one dart out in front of a Google self-driving vehicle, the vehicle will be able to stop suddenly. If only the same could always be said for human-driven cars.

Indeed, people commonly believe that self-driving cars are somehow less safe than those driven by people. Here again, the perception doesn’t quite match reality. A recent study from the University of Michigan’s Transportation Research Institute compared the road safety records of Google, Audi and Delphi (the three companies with self-driving vehicles currently engaged in road tests) with the safety records of all conventional vehicles in 2013. The authors of the study acknowledged that there isn’t a neat apples-to-apples comparison between self-driving and conventional vehicles, given that self-driving vehicles have logged comparatively fewer miles and driven in less hazardous conditions.
Even with those caveats in mind, the study concluded that self-driving vehicles had a higher crash rate per million miles traveled than conventional vehicles, but that those accidents were less severe and tended not to be the fault of the self-driving vehicles themselves. Google’s self-driving vehicles have been in 14 reported traffic accidents over the past six years, all of them minor and typically rear-end collisions caused by distracted human drivers. As Chris Urmson, the head of Google’s Self-Driving Car Project, remarked in a blog post on the subject, “the clear theme is human error and inattention. We’ll take all this as a signal that we’re starting to compare favorably with human drivers.”

Regardless, industry observers have expressed concern about Google’s lack of transparency and accountability regarding its vehicle safety record. To address those concerns, the company began posting monthly reports online in May of this year detailing the activities of its self-driving vehicles. Google has even encouraged the public to contact the company with positive or negative feedback about its vehicles on the road.

Find out more about Google’s self-driving vehicles by following Auto Publishers!