
AI 'drivers' will gain same legal status as human drivers in autonomous vehicles, NHTSA rules

The artificial intelligence in a self-driving vehicle can be treated as the "driver" from a legal standpoint, the U.S. National Highway Traffic Safety Administration said in a decision that could set an important precedent for Apple's own self-driving car.

The position was announced in a letter to Google, published to the NHTSA's website, Reuters reported on Wednesday. The missive was written in response to a design submitted by Google on Nov. 12, which specified a vehicle that has "no need for a human driver."

The chief counsel for the NHTSA informed Google that the agency "will interpret 'driver' in the context of Google's described motor vehicle design as referring to the (self-driving system), and not to any of the vehicle occupants." Current vehicle regulations are based on the assumption of human control, which has created problems in establishing a legal framework for self-driving technology.

In January, U.S. Transportation Secretary Anthony Foxx announced that the federal government would be bending its interpretation of the rules in some cases to accelerate self-driving car development. "Best practices" guidelines should be established in the first half of the year, and the NHTSA will exempt up to 2,500 vehicles from safety standards for the sake of testing.

Although officially still under wraps, Apple is widely believed to be working on an electric car for launch in 2019 or 2020. While the first model may potentially lack self-driving systems, the company is at least thought to be working on the concept for subsequent vehicles.



74 Comments

SpamSandwich 20 Years · 32917 comments

That's the Federal government for you. They pass laws they then determine are inconvenient, then don't rescind them.

JinTech 10 Years · 1066 comments

Does this mean that a driver's license is not required to operate these vehicles?

jfc1138 13 Years · 3090 comments

My understanding is that this referred to design standards where "the driver" is specified for things like rear-view mirror height, angle, and position, accessibility to the brake pedals, and that sort of thing. A CPU's physical position doesn't link to those sorts of statutory design requirements.

SpamSandwich 20 Years · 32917 comments

JinTech said:
Does this mean that a driver's license is not required to operate these vehicles?

If you were a non-vehicle operating passenger in an autonomous vehicle that you did not own, it'd be no different from you taking a cab.

If you owned the vehicle, presumably you'd still need insurance.

zimmie 10 Years · 651 comments

JinTech said:
Does this mean that a driver's license is not required to operate these vehicles?
If you were a non-vehicle operating passenger in an autonomous vehicle that you did not own, it'd be no different from you taking a cab.

If you owned the vehicle, presumably you'd still need insurance.

More interesting to me is the question of whether you would need insurance listing you as the driver, or the car. It's entirely plausible that a self-driving car could talk to insurance companies and refuse to engage its self-driving mode unless it had a valid insurance check-in for the day or the month or whatever. That deals with the issue of manufacturer liability rather nicely from the manufacturer's point of view. The problem is that various areas don't require insurance. Private property is the one most people think about, but countries where cars are still rare often don't have much of a legal framework for their use outside of cities. It also doesn't address the potential for a bond in lieu of insurance in regions where that is allowed.
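Purely as a back-of-the-envelope sketch, the gating check itself would be simple; the real complexity is in the check-in exchange with the insurer. All of the names and the fixed validity window below are made up for illustration, not any real manufacturer or insurer API:

import Foundation

// Hypothetical record of the most recent insurance check-in the car obtained.
struct InsuranceCheckin {
    let policyID: String
    let issuedAt: Date
    let validFor: TimeInterval   // validity window, e.g. one day or one month
}

// The car would only engage its self-driving mode while a sufficiently
// recent check-in is on record.
func canEngageSelfDriving(checkin: InsuranceCheckin?, now: Date = Date()) -> Bool {
    guard let checkin = checkin else { return false }         // no check-in at all
    let expiresAt = checkin.issuedAt.addingTimeInterval(checkin.validFor)
    return now < expiresAt                                    // still inside the window
}

// Example: a daily check-in obtained an hour ago is still valid.
let todays = InsuranceCheckin(policyID: "POLICY-0000",
                              issuedAt: Date().addingTimeInterval(-3600),
                              validFor: 24 * 3600)
print(canEngageSelfDriving(checkin: todays))   // true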

There's also the question of how such an insurance check-in would work. I can't imagine any two insurance companies agreeing on a protocol. They're worse than banks, and we've seen the nonsense involved in dealing with banks in the Apple Pay rollout.