
Question

As Elaine Herzberg walked her bicycle across a six-lane road in Tempe, Arizona, around 10 o'clock at night, she was fatally struck by a Volvo SUV, a prototype autonomous vehicle that Uber had modified to test its self-driving technology. A series of safety failures combined to cause this fatal accident. First, the sensors mounted on the car failed to spot Herzberg in time to slow the vehicle from its speed of 38 miles per hour. Second, according to police video cameras, the safety driver, a human sitting in the vehicle who was charged with monitoring the driving, did not have his hands on the steering wheel and was apparently distracted. Third, the car's brakes were never applied, either by the safety driver or by the car's system.

In another incident, a Tesla Model X SUV traveling on Highway 101 in Mountain View, California, slammed into a concrete highway lane divider and burst into flames, killing the driver, Wei Huang. Tesla reported that the autopilot system was engaged and had warned Huang of a potential collision, but Huang failed to take control of the vehicle. Although the self-driving technology in both vehicles was state-of-the-art at the time of the fatal accidents, in neither case was it sufficient to prevent tragedy. Some thought the problem was that the humans in the cars had simply been too slow to react to an unexpected event when they believed the car was in charge.

Many saw autonomous vehicles as a way to improve, not degrade, road safety. In 2020, more than 38,000 people died in car crashes in the United States. Clearly, humans were not perfect drivers. Self-driving cars, by contrast, did not get tired, frustrated, distracted, or drunk, as humans often did behind the wheel. A study by the Virginia Tech Transportation Institute analyzed more than 50 self-driving vehicles commissioned by Google, which had traveled about 1.3 million miles on roads in California and Texas. The Google fleet was involved in just 17 crashes over six years, and none of the incidents was the fault of the self-driving vehicles.

Others, however, saw negative consequences in a world of autonomous driving. Driverless vehicles could cost truckers, taxi drivers, and other driving professionals their jobs. Companies engaged in other forms of transportation, such as buses, trains, and airplanes, could see fewer customers, as people might choose to travel by car when they were not burdened with the driving.

Another issue raised by critics of autonomous automobiles was the possibility that a vehicle's computer systems might be hacked. As part of a planned experiment, hackers Charlie Miller and Chris Valasek were able to access the controls of a self-driving Jeep Cherokee remotely, instructing it to roll out of a parking lot and into a grassy ditch. The person in the driver's seat at the time, a journalist from Wired magazine, was not controlling the steering wheel or the pedals. Effectively safeguarding against such hacking can be challenging, since self-driving cars have far higher levels of connectivity than the human-driven vehicles currently on the road.

Will autonomous vehicles be widely adopted in our society? The answer is unclear. In a survey reported in The Washington Post, most Americans said they thought autonomous cars would be quite common within 15 years. Seventy-four percent, however, said they did not expect to own one, and more than 60 percent said they would not want to walk or ride a bicycle anywhere near one. A study conducted by the American Automobile Association (AAA) shortly after the Uber and Tesla accidents reported that 73 percent of Americans said they would not ride in an autonomous vehicle.

Yet, as autonomous vehicle technology gained momentum, governments stepped in to regulate. In 2017, the National Highway Traffic Safety Administration (NHTSA) released new federal guidelines for Automated Driving Systems (ADS). By 2021, 38 states had enacted legislation governing autonomous vehicles. Some states, such as Florida and Arizona, encouraged the safe development, testing, and operation of self-driving vehicles on their public roads, seeing this as an opportunity to encourage business development. Delaware established the Advisory Council on Connected and Autonomous Vehicles, tasked with developing recommendations for innovative tools and strategies to prepare Delaware's transportation network for these vehicles, an action also taken by numerous other states. While most people expect that fatal crashes involving autonomous vehicles will occur again in the future, whether self-driving cars impose an acceptable level of risk relative to their benefits to society remains a matter of ongoing debate.

In the case, the experiment that diverted a self-driving Jeep Cherokee into a grassy ditch illustrated which type of technology risk?

Multiple Choice

How poorly made parking brakes could fail
How easily an autonomous vehicle could be hacked
How easily GPS could disclose the vehicle's private location
How quickly personal data could be exposed through the vehicle's communication system

Step by Step Solution

There are 3 steps involved in it.

Step 1: The case describes a planned experiment in which hackers Charlie Miller and Chris Valasek remotely accessed the controls of a self-driving Jeep Cherokee and directed it out of a parking lot and into a grassy ditch, while the Wired journalist in the driver's seat touched neither the steering wheel nor the pedals.

Step 2: The incident involved no failure of parking brakes, no disclosure of the vehicle's location through GPS, and no exposure of personal data; the risk demonstrated was unauthorized remote access to the vehicle's computer systems.

Step 3: The experiment therefore illustrated how easily an autonomous vehicle could be hacked, a risk heightened by the high level of connectivity in self-driving cars.

