Technology and innovation are moving at a breakneck pace, from Amazon Dash to smart refrigerators to driverless cars. As these technologies advance, will there be new and unforeseen consequences for negligence and tort law, and will tech (particularly social media) play a role in how these legal issues are raised? On today’s show, we check out the torts of tomorrow with an article from the DMV on driverless cars, then we talk scooters and robots with forward-leaning law professor Tracy Pearl from Texas Tech University.
Who Is Going To Be Responsible For Driverless Car Accidents and Injuries?
Today’s Hot Take is an article from the DMV – no, we are not making this up – called “How Do You Sue A Self Driving Car” by Bridget Clerkin. In it, she writes:
“Autonomous cars may be easily acquiring the skills needed to drive, but it’s much harder for the technology to take responsibility for any accidents it should cause on the road.
To start laying the legal groundwork needed to navigate through such uncertainty, many in the world of law have turned to the idea of strict liability, which would hold a vehicle manufacturer accountable for any incident caused by a defect with a car.
However, some speculate that autonomous vehicles could be designated a “service,” which would make the rides subject to contract law—a body of rules that could skew very favorable for businesses. This is not without potential complications – for contract law, you’d have to accept the terms and conditions which could include mandatory arbitration agreement, preventing users from suing service providers in the case of an accident or joining a class action lawsuit.
The technology of tomorrow will quickly test the laws of today, and it won’t be long before the consensus of industry titans or even the perfect video recall of an event data recorder won’t be enough to determine who’s at fault in a world where vehicles can think for themselves.”
While fully automated cars may still be just over the horizon, it’s interesting to start thinking about how they are going to change society and the laws that will be needed to regulate them. Right now, we don’t even know whether an accident would be treated as a service issue or a manufacturing issue, or what the assumptions about driving capabilities even are, so how can we begin to lay the groundwork for damage claims and litigation? The article offers some great approaches, and it’s a very interesting read on what lawyers and law firms might be facing in the future. Check it out here:
Teaching Negligence with Twitter and The Future of Negligence and Liability Claims
Tracy Pearl is a professor of law at Texas Tech University. She is a nationally recognized scholar on emerging technology and the law and researches and writes extensively about risk, regulation, and tort law in the areas of driverless vehicles, the Internet of Things, and other new forms of technology. Professor Pearl is admitted to practice in Massachusetts, the District of Columbia, and the United States Courts of Appeals for the First, Fourth, and Tenth Circuits.
We were first introduced to her on Twitter, where she is using the platform in a unique way to interact with her students and the community, and that led us down the path to the present, where she has graciously taken the time out of her day to join us on LAWsome.
Will Technological Breakthroughs Lead To A Flood of New Negligence Lawsuits?
In this excerpt from the podcast, Jake and Paul talk with Tracy about emerging technologies, the opportunities for legal action when it’s unclear who or what is responsible, and where she thinks tort law and injury claims are headed:
Paul: It’s interesting to hear you say that, particularly about e-discovery and stuff like that because we’ve had guests on the show and we’ve talked about machine learning, AI, you know, document preparation, e-discovery, stuff like that before and one thing that’s been brought up is responsibility. Who’s responsible for that? And I’m curious, how does that go in kind of a larger sense? I mean, if these things were deep into fully automated cars or whatever, do we sue an engineer? Like – who’s responsible?
Tracy: Yeah, yeah, was it program error or is it nobody at all? That’s a great question and there are, coming up on hundreds of law review articles about, well, who is responsible? Is it the driver? A lot of state laws that are being passed in and around driverless vehicles actually make the person who engages the autonomous system responsible for what that system does, which is terrifying, right? I can’t think of a quicker way to convince people not to use driverless cars than to tell them that they’re going to be the ones on the hook if something goes wrong and there’s a software malfunction.
So I don’t know that our traditional models of liability work particularly well in an AI or machine learning context. I just don’t know that we even have jurisprudence that’s meaningful. Like if you look for instance, at basically all automobile-related jurisprudence for the last hundred years, it has all assumed that there was a human making decisions behind the wheel. And so when you have a fully autonomous vehicle, none of that’s applicable anymore. So where do we go from there? And that’s why I’m dubious by the sort of traditionalist assertion that, “Hey, tort law has dealt with all kinds of things before and it’ll continue to deal with these things well in the future.” Like, I just don’t know that that’s the case when we remove human decision making from the fundamental calculus.
Jake: So in a society that’s already perceived by some as overly litigious…[for example] just recently that Lion Air crash, that plane went down because there was something that went wrong with an autopilot auto-correct thing that none of the pilots knew about. Who is to blame for those things? Who would the families of the victims in that plane crash sue? Who would the victims of all of these data breaches, certain things that are happening right now – is all this emerging tech just going to give more people the opportunity to take advantage of the system? Or is it going to lean towards sue-happy, or if we don’t know who to sue, how could you be happy about it?
Tracy: I’m actually sort of optimistic on that front. I don’t see there being floodgates of litigation suddenly opened here for two reasons. Number one, my hope is that a lot of this emerging technology is going to decrease personal injury pretty significantly. And I certainly believe that about driverless cars. If you look at sort of what the auto insurance industry is saying right now amongst themselves, they’re all terrified by fully driverless cars because it’s going to put them out of business, right? I mean 94% of all car crashes are caused by human driver error. So when you take the human out of the driver’s seat, you know, presumably we may reduce the number of accidents on the road by upwards of 90%. And my hope is that other forms of new technology are going to decrease risk as well. I think secondly, these are going to be thorny, expensive cases to litigate. And so I don’t know that anybody’s going to be super excited about having to plow new ground on a machine learning case in a court. I mean, that’s going to be an uphill battle for any plaintiff. So my hope is that there’s not such a flood of litigation, but that it stays the same, if not decreases.
Get More Information About Technology, Negligence and the Torts of Tomorrow
Be sure to listen to the entire podcast episode for more insights and thoughts about emerging technology, negligence claims, and the ability of regulations and law to keep up with the rapidly changing realities of driverless cars and other innovations. If you want more LAWsome, subscribe to the show on your favorite podcast platform, and for the latest in legal marketing insights and information, be sure to subscribe to the Consultwebs Newsletter here: SUBSCRIBE