An intoxicated woman driving a Ford Mustang Mach-E, who struck and killed two people in Philadelphia last March, is being charged with homicide. However, the vehicle’s suite of advanced driving aids has muddled the case and is apparently being used as part of the defense. Meanwhile, the technology itself is being investigated by the National Transportation Safety Board (NTSB) for suspected operational failures.
Based upon reports from the Associated Press (h/t AutoBlog), Dimple Patel was allegedly driving under the influence with her Ford’s BlueCruise system activated. Authorities have asserted that she struck a stationary Honda CR-V, which had stopped on the left shoulder of I-95 to assist the driver of a broken-down vehicle, at around 3:00 AM.
Both the Honda driver and the man he was assisting were killed.
While the Pennsylvania State Police are treating the incident as a DUI, the department also took time on Tuesday to issue a public statement reminding drivers of vehicles equipped with so-called advanced driving technologies that they need to be prepared to resume control at all times.
“No partially automated vehicle technology should ever be left alone to perform the driving tasks that are required to safely navigate the roads of the commonwealth,” police stated.
We’ve grown accustomed to stories about drivers misusing Tesla’s Autopilot, with the resulting incidents frequently becoming NTSB investigations. But Tesla is hardly the only automaker offering such technologies, and this is the second instance of the federal agency looking into Ford’s BlueCruise, which provides hands-free driving on select highways. The caveat is that equipped vehicles come with driver monitoring systems that are supposed to force drivers to retake control of the vehicle whenever road conditions require it or the system believes the operator isn’t paying full attention.
A previous fatal incident, from February, involved a Mach-E striking another Honda CR-V along I-10 in San Antonio, Texas. Investigators were under the impression that the Ford had hit the Honda because the CR-V was stopped in the center lane with its lights off.
There’s an aspect of this that makes it feel like the NTSB is wasting its time even investigating these types of incidents. Whatever manufacturers are promising, the resulting technologies always have blind spots. Independent testing has shown repeatedly that the efficacy of these systems varies wildly between manufacturers, road conditions, vehicle speeds, and even times of day.
Your author is likewise obligated to point out that numerous studies have found that modern vehicle technologies actually diminish skills behind the wheel while promoting distracted driving. There’s a chance they’re having a net negative effect on roadway safety, reflected in a noteworthy increase in per capita roadway fatalities over the last decade.
We even have a prime example. The very first fatality involving an “automated vehicle” was the result of the safety driver having totally checked out to watch videos on their phone.
While it’s difficult to say how much of the above came into play in this specific situation (she was drunk), there are prior examples of intoxicated drivers telling police they were under the impression that their vehicle could basically drive itself. But this is the first instance where a person’s legal defense actually seems to be running with that premise.
From AP:
Investigators in Philadelphia believe that Mach-E driver Dimple Patel was driving about 71 mph (114 kph), using both BlueCruise and Adaptive Cruise Control, when the crash occurred. A fourth vehicle was also struck.
The 23-year-old Patel, a pre-med student from Philadelphia, faces multiple charges, including homicide by vehicle while driving under the influence and involuntary manslaughter. She turned herself in to police Tuesday on the charges filed last week, state police said.
Defense lawyer Zak Goldstein said he had not yet seen the criminal complaint or any reports on the crash, and called the deaths a tragedy. However, he noted that, broadly speaking, Pennsylvania law on DUI-related homicides requires “that the DUI caused the homicide.”
“If in fact it’s a failure in a self-driving or a driving system, that may not be a homicide by DUI even if the driver is intoxicated,” he said, adding that he has not seen any case law on the issue in Pennsylvania.
Ford has said it was collaborating with both the state police and the National Highway Traffic Safety Administration in reviewing the crash, which killed Aktilek Baktybekov, who had broken down, and Tolobek Esenbekov, who had presumably stopped in the shoulder to assist him.
Goldstein is apparently hoping to downgrade the charges against Patel by claiming that the vehicle itself is partially at fault. While this seems ludicrous on its face, the fact that Ford’s BlueCruise system allows drivers to take their hands off the wheel while the car handles most of the work on the highway muddies the waters, especially since it’s supposed to have a fail-safe by way of driver monitoring.
If you’ve read anything I’ve ever written about connected-car technologies, you’ll know that my assumption was that driver monitoring would open a Pandora’s box of privacy and legal issues for literally every party involved. Automakers certainly aren’t going to want to take responsibility for wrecks and presumably want driver monitoring for lucrative data-acquisition purposes. Insurance agencies would likewise want to use the tech to deny coverage and have previously used connected-vehicle technologies to raise rates. The whole thing is becoming a complete mess for drivers.
That said, it could likewise be argued that equipping the vehicle with such hardware should have prevented an incident like this from happening in the first place. The federal government claims to want this technology mandated and installed in all modern vehicles as a preventative measure for this exact kind of incident. But BlueCruise clearly failed here, assuming the system was indeed activated.
The Department of Transportation’s National Highway Traffic Safety Administration (NHTSA) has said that, from January 2018 to August 2023, it recorded 956 crashes involving Autopilot and Tesla’s “Full Self-Driving” systems, resulting in 29 deaths. It even forced the manufacturer to recall the system to upgrade driver monitoring protocols.
However, other automakers are now utilizing similar systems with nearly identical safety nets and are likewise seeing problems as they attempt to make these technologies commonplace. It seems plausible that there’s an issue with the systems themselves.
In both incidents involving Ford’s BlueCruise, the crashes happened at night. Previous testing of other so-called advanced driving systems has shown that they’re often less effective when lighting is subpar, and that may be the case here as well. At any rate, federal agencies have said they plan on looking into how both BlueCruise and the camera-based driver monitoring system perform as part of their investigation.
[Images: Ford Motor Co.]