The trial over the Massachusetts Apple Store crash is well underway, and the defense is claiming that bad AI was the culprit, not the driver.
Bradley Rein drove his car into the Apple Derby Street store in November 2022 at about 60 miles per hour, with the vehicle only stopping when it hit the back wall. The crash killed one person and injured 22.
Rein originally claimed it was an accident. That story has now changed.
The defense team filed a motion in court on Tuesday, according to WCVB 5, claiming that AI software installed in the car caused the unintended acceleration.
The charges were upgraded in 2023 from reckless homicide by a motor vehicle to second-degree murder, 18 counts of aggravated assault and battery with a dangerous weapon, four counts of assault and battery with a dangerous weapon, and motor vehicle homicide by reckless operation of a motor vehicle.
In April 2023, Rein pled not guilty to the charges and remained free on $100,000 bail.
In Massachusetts, there is a strong precedent for drivers being held responsible for crashes involving AI. Should the motion be granted, though, the charges would likely have to be changed to reflect the nature of the incident.
It's not clear what "AI software" was installed in the 4Runner that could force acceleration and prevent the driver from hitting the brakes. The 2022 and earlier 4Runner models appear to lack factory self-driving technology beyond lane-keeping assist and adaptive cruise control.
Rein initially told police at the scene that his foot was stuck on the gas pedal, which would obviously not be AI-related. Witnesses saw no deceleration, and the investigation found no evidence that the driver tried to brake.
The prosecution has asked for more time to review the claim. Toyota told WCVB that it was not a party to the litigation, and has no comment.
Apple is also being sued over the crash. The suit alleges that Apple didn't do enough to prevent the incident.
"Our experts tell us that this catastrophe was 100% preventable," Sheff & Cook lawyer Doug Sheff said at the time. "They simply needed to place a few barriers or bollards between the parking lot traffic, which was busy holiday traffic, and the public."
It's not clear how the suit against Apple is going.
7 Comments
Is Apple the property owner? If they lease or rent, they shouldn't be held accountable; the property owner would be. If they do own it, and the storefront sits directly on city property, such as a front door opening directly onto a sidewalk, then depending on the district, they would be responsible for installing those.
I can't believe a court would allow such rubbish 😡
They should add at least 10 years to his sentence for this pathetic attempt at a defence. The driver has clearly shown no remorse or accountability for his actions and is clearly lying 😡
Unintended acceleration has been an argument since 1986. No AI!
In the end, the problem was the driver standing on the gas pedal when they thought they were on the brake.
https://www.thetruthaboutcars.com/2007/05/in-defense-of-the-audi-5000/
As for requesting information from Toyota: the police should have tools to read the black box (EDR) data, especially for an incident like this. I doubt the defense will be happy with what they find.
http://forensicdynamics.com/edr-black-box-downloads
This sounds exactly like a ruse intended to gain sympathy from a jury who sees "AI" as some sort of bogeyman that can subvert a driver's ability to control the operation of their own vehicle. It's not like the media hasn't gone out of their way to portray "AI" as a looming threat to the survival of the human species.
It's yet another attempt to shift blame away from the accused. Blame shifting has existed for as long as humans have existed in quantities greater than one. Pointing a finger at "AI" is simply a modern spin on the same theme. If the "AI" blame shift doesn't stick to the wall they could try "blinded by the glow of a UFO." And don't forget "temporary insanity." That's always an option.
I don't see how Apple should be held in some way responsible for not taking action to prevent this kind of accident.
If the property has received all the necessary approvals, that's it.
If they had opted to set up a table outside to give away promotional gear (with permission), you wouldn't expect it to have crash protection.