Automotive Design and Production

JAN 2016

STOP TALK

Autonomous Cars: Safe at Any Speed, or Lack Thereof?

STEVE PLUMB, Senior Editor, AutoBeat Daily

In November, a police officer pulled over one of Google's prototype self-driving cars that the company was testing near its headquarters in Mountain View, California. The reason? Reckless driving? Running a red light? Texting? Road rage? Or how about plain old speeding? The answers are no, no, no and not even close.

In fact, the rule the car ran afoul of was 22400(a) of the California Vehicle Code, which states in part that "no person shall drive at such a slow speed as to impede or block the normal movement of traffic," which in layman's terms basically translates into the Sunday Driver Law. The Google car, which was motoring along at a Yugo-like 24 mph in a 35-mph zone, was causing a lengthy backup on a busy thoroughfare where speeds typically run 10 mph above the posted limit, not below it.

This is all starting to sound like one of The Onion's satirical news stories or the premise for a Saturday Night Live skit. Google even joked about it on its company blog: "Driving too slowly. Bet humans don't get pulled over for that too often." Not necessarily something you brag about to friends. But in the end, no ticket was issued to the engineers who were riding along as passengers to evaluate the vehicle. So no harm, no foul. Right?

Autonomous cars tested in California currently are allowed to operate only on streets with a posted speed limit of 35 mph or less. Google takes this a step further by restricting the top speed of its vehicles to 25 mph. The thinking is to play it safe until the technology is fully vetted and any potential safety issues are addressed and fixed. After all, improved safety is one of the primary drivers of self-driving vehicles in the first place, along with better traffic flow and letting drivers relax, work or do other things while their cars take them from A to B.

But can autonomous cars really do everything a human can do, including reacting quickly and logically to the virtually infinite range of situations that drivers encounter and routinely handle? And who is liable if things do go wrong?

Driving too slowly is way down the list of concerns about what can go wrong with self-driving cars. But it underscores just how important it is to fully test emerging technologies under all possible scenarios.

Longtime consumer advocate Ralph Nader, whose seminal 1965 book Unsafe at Any Speed helped launch the automotive safety movement 50 years ago, warns that automated technologies could have the opposite effect on safety from what's intended. While recognizing the benefits of adaptive cruise control, blind-spot detection and collision-avoidance systems, Nader and other skeptics point out the dangers of temporarily removing people from the driving process and turning cars into entertainment pods and mobile offices.

Until vehicles become fully autonomous, which isn't expected anytime soon, control will pass back and forth between drivers and on-board computers, depending on where the vehicle is being driven and other factors. Not only does this increase the potential for driver distraction, Nader notes, it also could diminish a driver's skill level, especially when it comes to dealing with emergency situations.

Tesla ran into some problems shortly after it downloaded its Autopilot software, which enables automated steering, braking, throttle control and lane changes during highway driving, into 40,000 of its Model S electric cars in October.
After several motorists posted videos of the system not functioning properly, and of themselves driving with their hands off the wheel or while sitting in the back seat, Tesla CEO Elon Musk vowed to implement additional safeguards to prevent drivers from doing "crazy things."

Musk says there are no reports of Autopilot causing any accidents, and he suggests there is already evidence the system has helped prevent some crashes. But he also cautions drivers to be careful when using automated features and emphasizes that users are ultimately responsible for their own safety.

Sounds like good advice. As technology continues to progress, perhaps everyone should take a page from Google's playbook and go slow when it comes to implementing autonomous features, until everything is properly engineered, tested, validated, repeated and confirmed under all conditions so we can all reap the safety benefits.

With more than 25 years of experience, Steve has covered every aspect of the auto industry as a writer, editor and marketing professional. He was the founding editor of AutoTech Daily and rejoined the AutoBeat team in 2015. He previously was the editorial director for a leading public relations company.
