Paper Title

Unsafe At Any Level: NHTSA's levels of automation are a liability for autonomous vehicle design and regulation

Paper Authors

Marc Canellas, Rachel Haga

Paper Abstract

Walter Huang, a 38-year-old Apple Inc. engineer, died on March 23, 2018, after his Tesla Model X crashed into a highway barrier in Mountain View, California. Tesla immediately disavowed responsibility for the accident. "The fundamental premise of both moral and legal liability is a broken promise, and there was none here: [Mr. Huang] was well aware that the Autopilot was not perfect [and the] only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so." This is the standard response from Tesla and Uber, the manufacturers of the automated vehicles involved in the six fatal accidents to date: the automated vehicle isn't perfect, the driver knew it wasn't perfect, and if only the driver had been paying attention and heeded the vehicle's warnings, the accident would never have occurred. However, as researchers focused on human-automation interaction in aviation and military operations, we cannot help but wonder if there really are no broken promises and no legal liabilities. Science has a critical role in determining legal liability, and courts appropriately rely on scientists and engineers to determine whether an accident, or harm, was foreseeable. Specifically, a designer could be found liable if, at the time of the accident, scientists knew there was a systematic relationship between the accident and the designer's untaken precaution. Nearly 70 years of research provides an undeniable answer: It is insufficient, inappropriate, and dangerous to automate everything you can and leave the rest to the human. There is a systematic relationship between the design of automated vehicles and the types of accidents that are occurring now and will inevitably continue to occur in the future. These accidents were not unforeseeable and the drivers were not exclusively to blame.
