A California jury may soon have to decide. In December 2019, a Tesla operating with an artificial-intelligence driving system struck and killed two people in Gardena. The driver faces several years in prison. In light of this and other incidents, both the National Highway Traffic Safety Administration (NHTSA) and the National Transportation Safety Board are investigating Tesla crashes, and NHTSA has recently broadened its probe to explore how drivers interact with Tesla systems.
Getting the liability landscape right is essential to unlocking AI’s potential. Uncertain rules and potentially costly litigation will discourage investment in, and development and adoption of, AI systems. The wider adoption of AI in health care, autonomous vehicles and other industries depends on the framework that determines who, if anyone, ends up liable for an injury caused by an artificial intelligence system.
Yet liability too often focuses on the easiest target: the end user of the algorithm. Liability inquiries often start, and end, with the driver of the car that crashed or the physician who made a faulty treatment decision. The key is to ensure that all stakeholders—users, developers and everyone else along the chain from product development to use—bear enough liability to ensure AI safety and effectiveness, but not so much that they give up on AI.
Second, some AI errors should be litigated in special courts with expertise in adjudicating AI cases. These specialized tribunals could develop expertise in particular technologies or issues, such as the interaction of two AI systems. Such specialized courts are not new: in the U.S., for example, specialist courts have protected childhood vaccine manufacturers for decades by adjudicating vaccine injuries and developing a deep knowledge of the field.
SciAm! Got any suggestions? It WILL happen you know. Hopefully the regulations in place by then will help with accountability and compensation 😌
The Kaylon from The Orville, clearly. Seth MacFarlane
Those who program the code are still human, so they would be held at fault.
In airline pilot licensing, the FAA's oversight ensures there is a 'truth': the PIC, pilot in command, is always identified and ultimately responsible for the safety of any flight or operation of the aircraft. I suspect, push come to shove, the same holds here.
If I shoot a gun - I am guilty, but if the gun shoots itself? 🤔