
CASE STUDY (2 compulsory questions × 7.5 marks = 15 Marks)

Read the case and answer the questions below, duly checking for plagiarism.

The First Self-Driving Fatality, and Why the Public Viewed It as an Omen

On a beautiful Saturday afternoon in May 2016, on a sunny stretch of road in northern Florida, Joshua Brown, a forty-year-old entrepreneur and technology enthusiast from northeastern Ohio, was sitting behind the wheel of his Tesla Model S sedan. He'd just spent a week with family at Disney World; they had all said goodbye that morning, and he was now driving to a business meeting for a company he had started five years earlier to help bring internet access to rural areas.

At about twenty minutes before 5:00 p.m., Brown's car was zipping along U.S. Highway 27A when a semi carrying blueberries in the opposite direction pulled into a left-turn lane and then crossed the road ahead of him. Reports suggest that the truck driver ought to have waited for Brown to pass, but there was still sufficient time for Brown to slow down.

The Tesla, which Brown had put into self-driving Autopilot mode, failed to register the white truck against the bright sky. Brown himself failed to take control and engage the brakes. The car crashed into the side of the truck-trailer at seventy-four miles an hour, then continued under it until hitting a utility pole, spinning, and finally coming to rest. Brown, investigators believe, was killed almost instantly upon the Tesla's impact with the truck.

Brown's death is the first known fatality in a car operating in self-driving mode, and it got a lot of attention in the technology and automobile worlds. Some media commentators and industry analysts had already faulted Tesla for including Autopilot in its cars because the technology was still in beta mode. Others had criticized the company for not doing more to ensure that drivers are actively in control of their vehicles while Autopilot is engaged. Less than a month before the accident, Elon Musk, Tesla's founder, had promoted a video Brown himself had made of a different Tesla Autopilot experience, wherein the car successfully noted and avoided a truck pulling ahead of it. Now, after the fatal accident, Musk found himself defending Autopilot as a lifesaving technology that, when used correctly, would reduce the overall number of vehicle fatalities.

Most experts agree. In more than 90 percent of conventional crashes, human error is to blame. According to some estimates, self-driving cars could save up to 1.5 million lives in the United States alone, and close to 50 million lives globally, over the next fifty years. Yet in an April 2018 poll, 50 percent of respondents said they believed autonomous cars were less safe than human drivers. After the Tesla crash, consumers were apoplectic. "This dangerous technology should be banned. Get it off the road. The public streets are not a place to experiment with unproven self-driving systems," wrote a San Franciscan commenting in a discussion forum. Clearly, people viewed Brown's death less as an anomaly than as a harbinger of things to come. If robots were going to take over our roads, they deserved serious scrutiny first. The National Transportation Safety Board, which is responsible for investigating airplane and train accidents, among other things, launched an inquiry.


The NTSB published its report in June 2017. Among the findings were that Brown had used the car's Autopilot mode on an inappropriate road; Tesla's manuals instructed that it be used only on highways where access is limited by entry and exit ramps, not where a truck might make a left turn across two lanes of oncoming traffic. Moreover, where Tesla stated that a fully attentive driver should oversee the car's actions even in Autopilot, Brown had been inattentive for at least thirty seconds before the crash. He may have grown complacent because he had successfully used Autopilot many times in the past and had become more comfortable with the feature. The report's authors included advice to car manufacturers: "Until automated vehicle systems mature, driver engagement remains integral to the automated driving system." It is the carmakers' responsibility, they said, to build systems that ensure drivers remain engaged.

This incident, like others involving interactions between people and AI technologies, raises a host of ethical and proto-legal questions:

  1. What moral obligations did the system's programmers have to prevent their creation from taking a human life?
  2. And who was responsible for Brown's death? The person in the driver's seat? The company testing the car's capabilities? The designers of the AI system, or even the manufacturers of its onboard sensory equipment?

