Self-driving car technologies raise array of legal questions

on Dec 28, 12 • by Chris Bubinas

The technological barriers for putting self-driving cars on the road are becoming less significant – Google recently claimed that its autonomous vehicles have logged 300,000 miles without an accident – but the legal ramifications of introducing such cars are just beginning to become clear. According to a recent article on The Verge, questions of liability and reliability are likely to fall on software developers as lawmakers attempt to dissect the implications of putting automated cars on the road.

Self-driving cars are currently legal in three states, and an array of automated features already exist in many automobiles. Nonetheless, many proposed features are difficult to reconcile with existing vehicle codes and prevailing attitudes about automobile safety. For instance, Google's idea of cars that drop passengers off and leave to park automatically conflicts with provisions of the 1949 Geneva Convention on Road Traffic that require drivers to control their vehicles at all times, The Verge noted.

While many issues could be resolved by minor language revisions – updating the term "driver" to include computers, for instance – translating driving laws for automated cars becomes more challenging in areas where driver discretion plays a role, according to Stanford researcher Bryant Walker Smith, who wrote a paper on the subject. For instance, speed limits are often subject to good judgement, and human drivers may drive slightly above or below posted limits based on road conditions.

"Once the 'driver' can be a computer, Smith puzzles over what to make of a phrase like 'the driver shall exercise due care,'" The Verge's Russell Brandom noted. "The laws are full of terms like 'prudent' and 'reasonable' that make sense for humans, but become frustratingly vague once you're trying to convert them to code."

Assigning responsibility
The current legal framework also makes determining issues of liability a challenge, The Verge explained. If drivers are in an accident, for instance, they are required to stop and help the injured, which would be a challenge if the car were unmanned. Even more difficult might be an accident involving some degree of human negligence, particularly once questions such as who is responsible for patching software systems are factored in.

"For automated drivers, most of these rules have yet to be written, and they'll need to be handled extremely delicately," Brandom wrote. "If the liability laws are too punitive towards driver bots, letting [human drivers] join in a suit against the self-driving-tech developer, then companies might avoid the sector entirely. On the other hand, if the laws leave car-owners on the hook for anything the new gadgets do, consumers may be scared away from buying them."

The Verge reported that current auto liability claims total around $54 billion a year in the U.S., noting that, if even a fraction of that sum fell on developers, the results could flatten business for self-driving cars. Self-driving car advocates note that human drivers are probably less safe – in 2011, an average of 88 people died in auto collisions every day. However, according to the publication, more testing may be needed to ensure automated cars can guarantee the degree of reliability needed to match human drivers, who average one fatal crash per three million hours of driving. To improve software reliability, developers may need to strengthen their coding practices through techniques such as source code analysis.
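Source code analysis, mentioned above, works by flagging known defect patterns before code ever runs. A small illustration of the kind of latent bug such tools catch (the function and its names are hypothetical, not from any real automotive codebase):

```c
#include <string.h>

/* A classic defect class static analyzers flag: writing past the end
 * of a fixed-size buffer. The commented-out line shows the original
 * bug an analyzer would report; the bounded copy below is the fix. */
void copy_vehicle_id(char *dest, size_t dest_size, const char *id)
{
    /* strcpy(dest, id);                 <- possible buffer overflow */
    strncpy(dest, id, dest_size - 1);    /* copy at most dest_size-1 */
    dest[dest_size - 1] = '\0';          /* guarantee termination    */
}
```

Catching defects like this statically, rather than after a field failure, is precisely the reliability argument the article gestures at.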

Currently, embedded software in cars is subject to MISRA compliance, but many electronic systems already in use are still being tested and remain ungoverned by National Highway Traffic Safety Administration (NHTSA) guidelines, NextGov reported. The site warned of the potential for malicious takeovers of poorly secured embedded systems and advocated for more research into cyber risks. Debate is likely to continue over how best to regulate automated systems and whether the responsibility for ensuring safety falls on government agencies, auto manufacturers or subcontracted developers, and creating standards for new technologies is likely to take years, the site noted.
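MISRA C, the compliance standard referenced above, is largely a catalog of defensive-coding rules, such as requiring that every switch statement handle unexpected values rather than fall through silently. A sketch of that style (the enum and function are invented for illustration; this shows the flavor of the guidelines, not an actual rule citation):

```c
/* MISRA-style defensive coding: the switch must handle out-of-range
 * input explicitly, so a corrupted sensor value cannot slip through.
 * All names here are illustrative. */
typedef enum { GEAR_PARK, GEAR_REVERSE, GEAR_DRIVE } gear_state;

int is_safe_to_exit(gear_state g)
{
    int safe;
    switch (g) {
    case GEAR_PARK:
        safe = 1;
        break;
    case GEAR_REVERSE:
    case GEAR_DRIVE:
        safe = 0;
        break;
    default:            /* required: fail safe on unexpected values */
        safe = 0;
        break;
    }
    return safe;
}
```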

Software news brought to you by Klocwork Inc., dedicated to helping software developers create better code with every keystroke.
