TESLA NOW HAS another fatality to hang on its semi-autonomous driving system. The company just revealed that its Autopilot feature was turned on when a Model X SUV slammed into a concrete highway lane divider and burst into flames on the morning of Friday, March 23. The driver, Wei Huang, died shortly afterwards at the hospital.
This is the second confirmed fatal crash on US roads in which Tesla's Autopilot system was controlling the car. It raises now-familiar questions about this novel and imperfect system, which could make driving easier and safer but relies on constant human supervision.
In a blog post published this evening, Tesla says the logs in the car's computer show Autopilot was on, with the adaptive cruise control distance set to the minimum. The car stays in its lane and keeps a fixed distance from the vehicle ahead, but the driver is supposed to keep his hands on the wheel and monitor the road, too. Take your hands off the wheel for too long, and you get a visual warning on the dashboard. Ignore that, and the system will get your attention with a beep. If you're stubborn or incapacitated, the car will turn on its flashers and slow to a stop.
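That escalation is, in effect, a small state machine keyed to how long the torque sensors on the steering wheel have gone without detecting the driver's hands. Here is a minimal sketch of the pattern in Python; the threshold values are invented for illustration, since Tesla's actual timings vary with speed and road conditions and are not public.

```python
from enum import Enum, auto

class Alert(Enum):
    NONE = auto()       # hands detected recently; no warning
    VISUAL = auto()     # warning on the dashboard
    AUDIBLE = auto()    # beep
    SAFE_STOP = auto()  # flashers on, slow to a stop

# Hypothetical thresholds in seconds; the real values are not published.
VISUAL_AFTER_S = 30.0
AUDIBLE_AFTER_S = 45.0
STOP_AFTER_S = 60.0

def escalate(hands_off_s: float) -> Alert:
    """Map continuous hands-off time to an escalating alert level."""
    if hands_off_s >= STOP_AFTER_S:
        return Alert.SAFE_STOP
    if hands_off_s >= AUDIBLE_AFTER_S:
        return Alert.AUDIBLE
    if hands_off_s >= VISUAL_AFTER_S:
        return Alert.VISUAL
    return Alert.NONE
```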
Based on data pulled from the wrecked car, Tesla says Huang should have had about five seconds and 150 meters of unobstructed view of the concrete barrier before the crash (150 meters in five seconds works out to about 30 meters per second, or roughly 67 mph). Huang's hands were not detected on the wheel for the six seconds prior to impact. Earlier in the drive, he had been given multiple visual warnings and one audible warning to put his hands back on the wheel.
The car's manual reminds Tesla drivers that Autopilot is a driver-assistance tool, not a replacement, and that they retain responsibility for driving safely. (The big center screen conveys the same message when you engage Autopilot for the first time.) But critics say the ease with which Tesla's system handles regular freeway driving can lull drivers into thinking it is more capable than it is, allowing them to become distracted or take their eyes off the road.
Systems like Autopilot have known weaknesses. Drivers need to be ready to grab the wheel if lane markings disappear or lanes split, which may have been a contributing factor in this crash. The manual also warns that the system may not see stationary objects, a shortcoming highlighted in January when a Tesla slammed into a stopped firetruck near Los Angeles. Such systems are designed to discard radar data about things that aren't moving, to prevent false alarms from every overhead gantry and street-side trash can.
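The trade-off is easy to see in miniature. The sketch below, with an assumed data structure and threshold (none of it Tesla's actual code), shows why filtering radar returns by relative velocity suppresses gantries and trash cans and, with them, a stopped firetruck: to the radar, all three are objects closing at exactly the car's own speed.

```python
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float            # distance to the detected object
    closing_speed_mps: float  # rate at which the gap is shrinking

def moving_obstacles(returns: list[RadarReturn],
                     ego_speed_mps: float,
                     tol_mps: float = 1.0) -> list[RadarReturn]:
    """Keep only returns that appear to move relative to the road.

    A stationary object closes on the car at exactly the car's own
    speed, so this filter drops overhead signs and roadside clutter,
    but it drops a stopped vehicle in the lane for the same reason.
    """
    return [r for r in returns
            if abs(r.closing_speed_mps - ego_speed_mps) > tol_mps]
```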
Autopilot was first enabled on Tesla's cars, via an over-the-air software update, in October 2015. The system combines radar-based adaptive cruise control with automatic steering that keeps the car within painted lane lines. The first person known to die while using Autopilot was Joshua Brown, whose Model S crashed into a truck that turned across his path in Florida in May 2016. Neither he nor the car's computers saw the white truck against the bright sky.
Federal investigators pored over the crash site and the vehicle logs, as they are now doing with this second fatality. The National Highway Traffic Safety Administration concluded that the system had operated as intended and wasn't defective, and that Tesla didn't need to recall any cars. The crash, in other words, was Brown's fault. The agency went further, saying that crash rates dropped 40 percent in Tesla cars equipped with the Autosteer feature.
The National Transportation Safety Board was more damning, saying Tesla should bear some of the blame for selling a system that is too easy to misuse.
After Brown's death, Tesla modified Autopilot to rely more on data from its radar, and less on the camera, to spot obstacles in the car's path. It also sent out a software update that sharply curtailed how long a driver can let go of the wheel, and introduced brighter, flashing warnings. That interval varies with speed and road conditions, but can still be a few minutes.
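One way to picture that policy is as a grace period that shrinks with speed and with the system's confidence in its lane tracking. The function below is purely illustrative; the constants and the formula are invented for the example, and Tesla has not published how the real interval is computed.

```python
def hands_off_limit_s(speed_mps: float, lane_confidence: float) -> float:
    """Return an allowed hands-off interval in seconds (illustrative).

    speed_mps:       current speed in meters per second
    lane_confidence: 0.0 (poor lane markings) to 1.0 (clear markings)
    """
    BASE_S = 180.0  # "a few minutes" in the easiest conditions
    MIN_S = 10.0    # never less than a short grace period
    # Scale down linearly with speed; at 36 m/s (about 80 mph) the
    # base interval is halved.
    speed_scale = max(0.2, 1.0 - speed_mps / 72.0)
    return max(MIN_S, BASE_S * speed_scale * lane_confidence)
```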
Autopilot was groundbreaking when Tesla introduced it, and Elon Musk promises his cars are capable of even more, from changing lanes all by themselves to full self-driving. Other luxury carmakers have introduced similar systems with varying restrictions and far more modest promises. Cadillac's Super Cruise, for example, uses an infrared camera to monitor the driver's head position (so it knows when the driver is looking at the road) instead of relying on torque sensors in the steering wheel.
The federal investigations into Huang's crash are ongoing and may not produce reports for several months; the NTSB typically takes 12 to 18 months to finalize and publish its findings. In the meantime, Tesla used its blog post to point out some extreme circumstances in this accident. The barrier that Huang hit was supposed to have a crash attenuator, which crumples to absorb some of the impact, but it had been crushed in a previous accident and not replaced, the company says. "We have never seen this level of damage to a Model X in any other crash," the Tesla Team blog post reads.

Coupled with Uber's fatal crash in Arizona, in which one of its self-driving cars hit and killed a pedestrian pushing a bike, this incident marks the beginning of what is likely to be a difficult time for the autonomous vehicle industry. Engineers are convinced that taking the easily distracted human out of the driving equation will cut down on the roughly 40,000 deaths each year on American roads. But right now, the systems aren't sophisticated enough to operate without human oversight, which is difficult to ensure. And that leaves everyone in a difficult middle ground, a no man's land with no obvious or immediate route out.

Submission:
- Q1: Case Summary
Should include a brief summary of the case background, chronology, and root cause.
- Q2: What good systems engineering practices were followed? Please provide bullets listing your responses. There is one preferred response, and you will be given credit if any one of your responses matches it.
Relate your argument to the lessons learned in the course. Provide at least the minimum amount of argument needed (the minimum varies for each case study).
- Q3: What good systems engineering practices were not followed? Please provide bullets listing your responses. There is one preferred response, and you will be given credit if any one of your responses matches it.
Relate your argument to the lessons learned in the course. Provide at least the minimum amount of argument needed (the minimum varies for each case study).
For Q2 and Q3 together, there is a long list of preferred responses. You will be given credit for each of your responses (up to four) that matches this list. You can give three responses to Q2 and one to Q3, two to each, or one to Q2 and three to Q3.
- Q4: What corrective actions were taken by the organization (if any)? Did Tesla try to eliminate the risk, reduce the probability of the risk occurring, protect drivers against the adverse outcome of the risk occurring, accept the risk, or transfer responsibility for the risk? If so, how? Please provide a list of bullets with each of your responses. There is one preferred response, and you will be given credit if any one of your responses matches it.
Corrective actions are measures taken after the event to avoid future failures.
- References
Must be given to verify your arguments.