
Use 2 journal articles. Use APA in-text citations and a reference page, in standard APA format.

Provide two full pages of written work not including the cover page or reference page.

1. In what ways has technology made it more difficult for individuals to protect their privacy?

2. Do you believe an individual should have the right to be forgotten, that is, to remove information about themselves from the Internet? If so, should this right be limited, and if so, how?

3. How does public policy with respect to individual privacy differ in the United States and Europe, and what explains these differences?

4. Do you think Google should be responsible for modifying its search results in response to individual requests? If so, what criteria should it use in doing so? Are there limits to the resources the company should be expected to expend to comply with such requests?

5. If you were a Google executive, how would you balance the privacy rights of the individual with the public's interest to know and the right to distribute information?

"Google and The Right to Be Forgotten"

In 2009, Mario Costeja Gonzalez, a self-employed attorney living in a small town outside Madrid, Spain, casually "googled" himself and was startled by what came up on his computer screen. Prominently displayed in the search results was a brief legal notice that had appeared more than a decade earlier in a local newspaper, La Vanguardia, which listed property seized and being auctioned by a government agency for nonpayment of debts. Among the properties was a home jointly owned by Costeja and his wife.

Costeja immediately realized that this information could damage his reputation as an attorney. Equally troubling, the information was no longer factual. He had paid his debt nearly a decade earlier. Abanlex, Costeja's small law firm, depended on the Internet to gain much of its new business, which was often generated by a Google search. Potential clients might choose not to hire him, based on the old auction notice, he reflected. His mind then turned to the possible effects of this kind of information on other people's livelihoods. "There are people who cannot get a job because of content that is irrelevant," he thought.1 "I support freedom of expression and I do not defend censorship. [However, I decided] to fight for the right to request the deletion of data that violates the honor, dignity and reputation of individuals."2

The next week, Costeja wrote to La Vanguardia and requested that it remove the article about his debt notice, because the debt had been fully resolved a number of years earlier and reference to it now was therefore entirely irrelevant.3 In doing so, he was making use of his rights under Spain's strong data protection policies, which recognized the protection and integrity of personal data as a constitutional right under Section 18 of the nation's Data Protection Act.4 In response, the newspaper informed him that it had recently uploaded to the Internet all its past archives, dating back to 1881, to allow them to be searched by the public. It also noted that the auction notice had originally been publicly posted in order to secure as many bidders as possible. The newspaper refused Costeja's request, stating that the information was obtained from public records and had thus been published lawfully.5

To be sure, the real problem for Costeja was not that the notice had appeared in La Vanguardia's digital library, but that it had shown up in the results of the most widely used search engine in the world, Google, where potential clients might use it to judge his character.6 Following this reasoning, Costeja then wrote to Google Spain, the firm's Spanish affiliate, only to be told that the parent company, Google Inc., was the entity responsible for the development of search results.7 Costeja was taken aback by this development. "The resources Google has at their disposal aren't like those of any other citizens," he reflected.8 Costeja felt he would be at a disadvantage in a lawsuit against an industry giant like Google.

In March 2010, after his unsuccessful attempts with the newspaper and Google Spain, Costeja turned to Spain's Data Protection Agency (SDPA), the government agency responsible for enforcing the Data Protection Act. "Google in Spain asked me to address myself to its headquarters in the U.S., but I found it too far and difficult to launch a complaint in the U.S., so I went to the agency in Spain to ask for their assistance. They said I was right, and the case went to court," he explained.9 In a legal filing, Costeja requested, first, that the agency issue an administrative order requiring La Vanguardia either to remove or alter the pages in question (so that his personal data no longer appeared) or to use certain tools made available by search engines in order to shield the data from view. Second, he requested that the agency require that Google Spain or Google Inc. remove or conceal his personal data so that it no longer appeared in the search results and in the links to La Vanguardia. Costeja stated that his debt had been fully resolved.10

With these steps, a small-town Spanish lawyer had drawn one of the world's richest and best-known companies, Google, into a debate over the right to be forgotten.

Google, Inc.

Google Inc. was a technology company that built products and provided services to organize information. Founded in 1998 and headquartered in Mountain View, CA, Google's mission was to organize the world's information and make it universally accessible and useful. It employed more than 55,000 people and had revenues of $45 billion. The company also had 70 offices in more than 40 countries.

The company's main product, Google Search, provided information online in response to a user's search. Google's other well-known products provided additional services. For example, Google Now provided information to users when they needed it, and its Product Listing Ads offered product image, price, and merchant information. The company also provided AdWords, an auction-based advertising program, and AdSense, which enabled websites that were part of the Google network to deliver ads. Google Display was a display advertising network; DoubleClick Ad Exchange was a marketplace for trading display ad space; and YouTube offered video, interactive, and other ad formats.

In its core business, Google conducted searches in three stages: crawling and indexing, applying algorithms, and fighting spam.

Crawlers, programs that browsed the web to create an index of data, looked at web pages and followed links on those pages. They then moved from link to link and brought data about those web pages back to Google's servers. Google would then use this information to create an index to know exactly how to retrieve information for its users. Algorithms were the computer processes and formulas that took users' questions and turned them into answers. At the most basic level, Google's algorithms looked up the user's search terms in the index to find the most appropriate pages. For a typical query, thousands, if not millions, of web pages might have helpful information. Google's algorithms relied on more than 200 unique signals or "clues" that made it possible to guess what an individual was really looking for. These signals included the terms on websites, the freshness of content, the region, and the PageRank of the web page.11 Lastly, the company fought spam through a combination of computer algorithms and manual review. Spam sites attempted to game their way to the top of search results by repeating keywords, buying links that passed Google's PageRank process, or putting invisible text on the screen. Google scouted out and removed spam because it could make legitimate websites harder to find. While much of this process was automated, Google did maintain teams whose job was to review sites manually.12
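To make the crawl-index-rank process described above more concrete, below is a minimal, purely illustrative Python sketch. The Page fields, build_index, and search functions are invented for this example; they stand in for systems that, as the case notes, actually combine more than 200 signals. Only the general shape, looking query terms up in an index built by crawlers and then ordering the matches by a few "clues," reflects the case text.

    # Illustrative sketch only: a toy inverted index and a toy ranking function.
    # The field names and scoring below are assumptions, not Google's actual design.
    from collections import defaultdict
    from dataclasses import dataclass

    @dataclass
    class Page:
        url: str
        text: str
        freshness: float   # assumed signal: 1.0 = newly published, 0.0 = very old
        page_rank: float   # assumed signal: link-based importance score

    def build_index(pages):
        """Map each term to the URLs that contain it (what crawling feeds into)."""
        index = defaultdict(set)
        for page in pages:
            for term in page.text.lower().split():
                index[term].add(page.url)
        return index

    def search(query, pages, index):
        """Look the query terms up in the index, then order matches by simple 'clues'."""
        terms = query.lower().split()
        by_url = {p.url: p for p in pages}
        candidates = set()
        for term in terms:
            candidates |= index.get(term, set())

        def score(url):
            page = by_url[url]
            term_hits = sum(page.text.lower().split().count(t) for t in terms)
            # Toy combination of signals: term matches plus freshness plus link rank.
            return term_hits + page.freshness + page.page_rank

        return sorted(candidates, key=score, reverse=True)

    pages = [
        Page("example.com/auction-1998", "auction notice property seized for debts", 0.1, 0.4),
        Page("example.com/law-firm", "attorney legal services in madrid", 0.9, 0.7),
    ]
    index = build_index(pages)
    print(search("attorney madrid", pages, index))   # -> ['example.com/law-firm']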

Apart from this general policy, Google Inc. also removed content or features from its search results for legal reasons. For example, in the United States, the company would remove content with valid notification from the copyright holder under the Digital Millennium Copyright Act (DMCA), which was administered by the U.S. Copyright Office. The DMCA provided recourse for owners of copyrighted materials who believed that their rights under copyright law had been infringed upon on the Internet.14 Under the notice and takedown procedure of the law, a copyright owner could notify the service provider, such as Google, requesting that a website or portion of a website be removed or blocked. If, upon receiving proper notification, the service provider promptly did so, it would be exempt from monetary liability.

Google regularly received such requests from copyright holders and those that represented them, such as the Walt Disney Company and the Recording Industry Association of America. Google produced and made public a list of the domain portions of URLs that had been the subject of a request for removal, and noted which ones had been removed. As of July 2015, it had removed more than 600,000 URLs out of more than 2.4 million requests.15

Likewise, content on local versions of Google was also removed when required by national laws. For example, content that glorified the Nazi party was illegal in Germany, and content that insulted religion was illegal in India.16 The respective governments, via a court order or a routine request as described above, typically made these requests. Google reviewed these requests to determine if any content should be removed because it violated a specific country's law.

When Google removed content from search results for legal reasons, it first displayed a notification that the content had been removed and then reported the removal to www.chillingeffects.org, a website established by the Electronic Frontier Foundation and several law schools. This website, which later changed its name to lumendatabase.org, collected and analyzed legal complaints and requests for removal of a broad set of online materials. It was designed to help Internet users know their rights and understand the law. Researchers could use the data to study the prevalence of legal threats and the source of content removals. This database also allowed the public to search for specific takedown notifications.17

Google removed content quickly. Its average processing time across all copyright infringement removal requests submitted via its website was approximately 6 hours. Different factors influenced the processing time, including the method of delivery and language, among other factors.

The Right to Be Forgotten

The right to be forgotten can be understood as people's right to request that information be removed from the Internet or other repositories because it violated their privacy or was no longer relevant. This right assumed greater prominence in the digital era, when people began finding it increasingly difficult to escape information that had accumulated over many years, resulting in expressions such as "the net never forgets," "everything is in the cloud," "reputation bankruptcy," and "online reputation."18 According to Jeffrey Rosen, professor of law at George Washington University, the intellectual roots of the right to be forgotten could be found in French law, "which recognizes le droit à l'oubli, or the 'right of oblivion,' a right that allows a convicted criminal who has served his time and been rehabilitated to object to the publication of the facts of his conviction and incarceration."19

Although the right to be forgotten was rooted in expunging criminal records, the rise of the Internet had given the concept a new, more complex meaning. Search engines enabled users to access information on just about any topic with considerable ease. The ease with which information could be shared, stored, and retrieved through online search raised issues of both privacy and freedom of expression. On the one hand, when opening a bank account, joining a social networking website, or booking a flight online, a consumer would voluntarily disclose vital personal information such as name, address, and credit card numbers. Consumers were often unsure of what happened to their data and were concerned that it might fall into the wrong hands; that is, that their privacy would be violated.

On the other hand, by facilitating the retrieval of information, search engines enhanced individuals' freedom to receive and impart information. Any interference with search engine activities could therefore pose a threat to the effective enjoyment of these rights.20 As Van Alsenoy, a researcher at the Interdisciplinary Center for Law and Information Communication Technology, argued, "In a world where search engines are used as the main tool to find relevant content online, any governmental interference in the provisioning of these services presents a substantial risk that requires close scrutiny."21

Europe

Since the 1990s, both the European Union and its member states (such as Spain) had enacted laws that addressed the right to privacy and, by extension, the right to be forgotten. A fundamental right of individuals to protect their data was introduced in the EU's original data protection law, passed in 1995. Specifically, the European Data Protection Directive 95/46 defined the appropriate scope of national laws relating to personal data and the processing of those data. According to Article 3(1), Directive 95/46 applied "to the processing of personal data wholly or partly by automatic means, and to the processing otherwise than by automatic means of personal data which form part of a filing system or are intended to form part of a filing system."22 Article 2(b) of the EU Data Protection

official, and three privacy and freedom of speech experts (including one from the United Nations). Google's CEO and chief legal officer served as conveners. The committee's job was to provide recommendations to Google on how best to implement the EU court's ruling.

The majority recommendation of the advisory council, published on February 6, 2015, was that the right to be forgotten ruling should apply only within the 28 countries in the European Union.44 As a practical matter, this meant that Google was only required to apply removals to European domains, such as Google.fr or Google.co.uk, but not Google.com, even when accessed in Europe. Although over 95 percent of all queries originating in Europe used European domains, users could still access information that had been removed via the Google.com site.

The report also explained that once the information was removed, it was still available at the source site (e.g., the newspaper article about Costeja in La Vanguardia). Removal meant merely that its accessibility to the general public was reduced because searches for that information would not return a link to the source site. A person could still find the information, since only the link to the information had been removed, not the information itself.

The advisory council also recommended a set of criteria Google should use in assessing requests by individuals to "delist" their information (that is, to remove certain links in search results based on queries for that individual's name). How should the operator of the search engine best balance the privacy and data protection rights of the subject with the interest of the general public in having access to the information? The authors of the report felt that whether the data subject experienced harm from such accessibility to the information was relevant to this balancing test. Following this reasoning, they identified four primary criteria for evaluating delisting requests:

First, what was the data subject's role in public life? Did the individuals have a clear role in public life (CEOs, politicians, sports stars)? If so, this would weigh against delisting.

Second, what type of information was involved? Information that would normally be considered private (such as financial information, details of a person's sex life, or identification numbers) would weigh toward delisting. Information that would normally be considered to be in the public interest (such as data relevant to political discourse, citizen engagement, or governance) would normally weigh against delisting.

Third, what was the source of the information? Here, the report suggested that journalistic writing or government publications would normally not be delisted.

Finally, the report considered the effect of time, given that as circumstances change, the relevance of information might fade. Thus, the passage of time might favor delisting.

The advisory council also considered procedures and recommended that Google adopt an easily accessible and easy-to-understand form for data subjects to use in submitting their requests.

The recommendations of the advisory council were not unanimous. Jimmy Wales, the cofounder of Wikipedia and one of the eight group members, appended a dissenting comment to the report. "I completely oppose the legal situation in which a commercial company is forced to become the judge of our most fundamental rights of expression and privacy, without allowing any appropriate procedure for appeal by publishers whose work is being suppressed," Mr. Wales wrote. "The recommendations to Google contained in this report are deeply flawed due to the law itself being deeply flawed."45

44 "Limit 'Right to Be Forgotten' to Europe, Panel Tells Google," The New York Times, February 6, 2015.

45 Ibid.

