Question

elaborate an essay about this topic, please add references in APA.

Select Issues: Assessing Adverse Impact in Software, Algorithms, and Artificial Intelligence Used in Employment Selection Procedures Under Title VII of the Civil Rights Act of 1964

Employers now have a wide variety of algorithmic decision-making tools available to assist them in making employment decisions, including recruitment, hiring, retention, promotion, transfer, performance monitoring, demotion, dismissal, and referral. Employers increasingly utilize these tools in an attempt to save time and effort, increase objectivity, optimize employee performance, or decrease bias.

Many employers routinely monitor their more traditional decision-making procedures to determine whether these procedures cause disproportionately large negative effects on the basis of race, color, religion, sex, or national origin under Title VII of the Civil Rights Act of 1964 ("Title VII").[1] Employers may have questions about whether and how to monitor the newer algorithmic decision-making tools.

The Questions and Answers in this document address this and several closely related issues.

Title VII applies to all employment practices of covered employers, including recruitment, monitoring, transfer, and evaluation of employees, among others.

However, the scope of this document is limited to the assessment of whether an employer's "selection procedures" (the procedures it uses to make employment decisions such as hiring, promotion, and firing) have a disproportionately large negative effect on a basis that is prohibited by Title VII. As discussed below, this is often referred to as "disparate impact" or "adverse impact" under Title VII. This document does not address other stages of the Title VII disparate impact analysis, such as whether a tool is a valid measure of important job-related traits or characteristics. The document also does not address Title VII's prohibitions against intentional discrimination (called "disparate treatment") or the protections against discrimination afforded by other federal employment discrimination statutes.

The Equal Employment Opportunity Commission ("EEOC" or "Commission") enforces and provides leadership and guidance on the federal equal employment opportunity ("EEO") laws prohibiting discrimination on the basis of race, color, national origin, religion, and sex (including pregnancy, sexual orientation, and gender identity), disability, age (40 or older), and genetic information. This publication is part of the EEOC's ongoing effort to help ensure that the use of new technologies complies with federal EEO law by educating employers, employees, and other stakeholders about the application of these laws to the use of software and automated systems in employment decisions.[2] For related content regarding the Americans with Disabilities Act, see The Americans with Disabilities Act and the Use of Software, Algorithms, and Artificial Intelligence to Assess Job Applicants and Employees.

Background

As a starting point, this section explains the meaning of central terms used in this document: "software," "algorithm," and "artificial intelligence" ("AI"). It also explains how, when used in a workplace, these terms relate to each other and to basic Title VII principles.

Central Terms Regarding Automated Systems and AI

  • Software: Broadly, "software" refers to information technology programs or procedures that provide instructions to a computer on how to perform a given task or function. "Application software" (also known as an "application" or "app") is a type of software designed to perform or to help the user perform a specific task or tasks. The United States Access Board is the source of these definitions. Many different types of software and applications are used in employment, including automatic resume-screening software, hiring software, chatbot software for hiring and workflow, video interviewing software, analytics software, employee monitoring software, and worker management software.

  • Algorithm: Generally, an "algorithm" is a set of instructions that can be followed by a computer to accomplish some end. Human resources software and applications use algorithms to allow employers to process data to evaluate, rate, and make other decisions about job applicants and employees. Software or applications that include algorithmic decision-making tools are used at various stages of employment, including hiring, performance evaluation, promotion, and termination.
  • Artificial Intelligence ("AI"): Some employers and software vendors use AI when developing algorithms that help employers evaluate, rate, and make other decisions about job applicants and employees. While the public usage of this term is evolving, Congress defined "AI" to mean a "machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations or decisions influencing real or virtual environments." National Artificial Intelligence Initiative Act of 2020 at section 5002(3). In the employment context, using AI has typically meant that the developer relies partly on the computer's own analysis of data to determine which criteria to use when making decisions. AI may include machine learning, computer vision, natural language processing and understanding, intelligent decision support systems, and autonomous systems.

For a general discussion of AI, which includes machine learning, see National Institute of Standards and Technology Special Publication 1270.

Employers sometimes rely on different types of software that incorporate algorithmic decision-making at a number of stages of the employment process.

Examples include: resume scanners that prioritize applications using certain keywords; employee monitoring software that rates employees on the basis of their keystrokes or other factors; "virtual assistants" or "chatbots" that ask job candidates about their qualifications and reject those who do not meet pre-defined requirements; video interviewing software that evaluates candidates based on their facial expressions and speech patterns; and testing software that provides "job fit" scores for applicants or employees regarding their personalities, aptitudes, cognitive skills, or perceived "cultural fit" based on their performance on a game or on a more traditional test. Each of these types of software might include AI. In the remainder of this document, we use the term "algorithmic decision-making tool" broadly to refer to all these kinds of systems.

Title VII

Title VII generally prohibits employment discrimination based on race, color, religion, sex (including pregnancy, sexual orientation, and gender identity), or national origin.

  • Title VII generally prohibits intentional discrimination, or "disparate treatment," in employment, including employment tests that are "designed, intended or used to discriminate because of race, color, religion, sex or national origin."[3] Disparate treatment is not the focus of this technical assistance.
  • Title VII also generally prohibits employers from using neutral tests or selection procedures that have the effect of disproportionately excluding persons based on race, color, religion, sex, or national origin, if the tests or selection procedures are not "job related for the position in question and consistent with business necessity."[4] This is called "disparate impact" or "adverse impact" discrimination. Disparate impact cases typically involve the following questions:[5]
  • Does the employer use a particular employment practice that has a disparate impact on the basis of race, color, religion, sex, or national origin? For example, if an employer requires that all applicants pass a physical agility test, does the test disproportionately screen out women? This issue is the focus of this technical assistance.
  • If the selection procedure has a disparate impact based on race, color, religion, sex, or national origin, can the employer show that the selection procedure is job-related and consistent with business necessity? An employer can meet this standard by showing that it is necessary to the safe and efficient performance of the job. The selection procedure should therefore be associated with the skills needed to perform the job successfully. In contrast to a general measurement of applicants' or employees' skills, the selection procedure must evaluate an individual's skills as related to the particular job in question.
  • If the employer shows that the selection procedure is job-related and consistent with business necessity, is there a less discriminatory alternative available? For example, is another test available that would be comparably as effective in predicting job performance but would not disproportionately exclude people on the basis of their race, color, religion, sex, or national origin?
  • In 1978, the EEOC adopted the Uniform Guidelines on Employee Selection Procedures ("Guidelines") under Title VII.[6] These Guidelines provide guidance from the EEOC for employers about how to determine if their tests and selection procedures are lawful for purposes of Title VII disparate impact analysis.[7]
Questions and Answers

1. Could an employer's use of an algorithmic decision-making tool be a "selection procedure"?

Under the Guidelines, a "selection procedure" is any "measure, combination of measures, or procedure" if it is used as a basis for an employment decision.[8] As a result, the Guidelines would apply to algorithmic decision-making tools when they are used to make or inform decisions about whether to hire, promote, terminate, or take similar actions toward applicants or current employees.

2. Can employers assess their use of an algorithmic decision-making tool for adverse impact in the same way that they assess more traditional selection procedures for adverse impact?

As the Guidelines explain, employers can assess whether a selection procedure has an adverse impact on a particular protected group by checking whether use of the procedure causes a selection rate for individuals in the group that is "substantially" less than the selection rate for individuals in another group.[9]

If use of an algorithmic decision-making tool has an adverse impact on individuals of a particular race, color, religion, sex, or national origin, or on individuals with a particular combination of such characteristics (e.g., a combination of race and sex, such as for applicants who are Asian women), then use of the tool will violate Title VII unless the employer can show that such use is "job related and consistent with business necessity" pursuant to Title VII.[10]

3. Is an employer responsible under Title VII for its use of algorithmic decision-making tools even if the tools are designed or administered by another entity, such as a software vendor?

In many cases, yes. For example, if an employer administers a selection procedure, it may be responsible under Title VII if the procedure discriminates on a basis prohibited by Title VII, even if the test was developed by an outside vendor. In addition, employers may be held responsible for the actions of their agents, which may include entities such as software vendors, if the employer has given them authority to act on the employer's behalf.[11] This may include situations where an employer relies on the results of a selection procedure that an agent administers on its behalf.

Therefore, employers that are deciding whether to rely on a software vendor to develop or administer an algorithmic decision-making tool may want to ask the vendor, at a minimum, whether steps have been taken to evaluate whether use of the tool causes a substantially lower selection rate for individuals with a characteristic protected by Title VII. If the vendor states that the tool should be expected to result in a substantially lower selection rate for individuals of a particular race, color, religion, sex, or national origin, then the employer should consider whether use of the tool is job related and consistent with business necessity and whether there are alternatives that may meet the employer's needs and have less of a disparate impact. (See Question 7 for more information.) Further, if the vendor is incorrect about its own assessment and the tool does result in either disparate impact discrimination or disparate treatment discrimination, the employer could still be liable.

4. What is a "selection rate"?

"Selection rate" refers to the proportion of applicants or candidates who are hired, promoted, or otherwise selected. (12] The selection rate for a group of applicants or candidates is calculated by dividing the number of persons hired, promoted, or otherwise selected from the group by the total number of candidates in that group.

113] For example, suppose that 80 White individuals and 40 Black individuals take a

personality test that is scored using an algorithm as part of a job application, and 48 of the White applicants and 12 of the Black applicants advance to the next round of the selection process. Based on these results, the selection rate for Whites is 48/80 (equivalent to 60%), and the selection rate for Blacks is 12/40 (equivalent to 30%).
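
To make the arithmetic concrete, here is a minimal sketch in Python of the selection-rate calculation, using the same counts as the example above; the function and variable names are illustrative, not part of the Guidelines:

```python
# A minimal sketch of the selection-rate arithmetic described above.
# The counts mirror the example in the text; names are illustrative.

def selection_rate(selected: int, total: int) -> float:
    """Proportion of a group's candidates who advance: selected / total."""
    return selected / total

applicants = {"White": 80, "Black": 40}  # total candidates per group
advanced = {"White": 48, "Black": 12}    # candidates who advanced per group

for group, total in applicants.items():
    rate = selection_rate(advanced[group], total)
    print(f"{group}: {advanced[group]}/{total} = {rate:.0%}")
# Output:
#   White: 48/80 = 60%
#   Black: 12/40 = 30%
```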

5. What is the "four-fifths rule"?

The four-fifths rule, referenced in the Guidelines, is a general rule of thumb for determining whether the selection rate for one group is "substantially" different than the selection rate of another group. The rule states that one rate is substantially different than another if their ratio is less than four-fifths (or 80%).[14]

In the example above involving a personality test scored by an algorithm, the selection rate for Black applicants was 30% and the selection rate for White applicants was 60%. The ratio of the two rates is thus 30/60 (or 50%). Because 30/60 (or 50%) is lower than 4/5 (or 80%), the four-fifths rule says that the selection rate for Black applicants is substantially different than the selection rate for White applicants in this example, which could be evidence of discrimination against Black applicants.
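
The four-fifths comparison itself is a one-line ratio check. A minimal sketch continuing the same example follows; the helper name and the explicit threshold parameter are illustrative, while the 4/5 value comes from the Guidelines:

```python
# A minimal sketch of the four-fifths rule of thumb applied to the rates above.
# The 4/5 threshold comes from the Guidelines; the helper name is illustrative.

def substantially_different(rate_a: float, rate_b: float,
                            threshold: float = 4 / 5) -> bool:
    """True if the lower selection rate is less than four-fifths of the higher."""
    lower, higher = sorted((rate_a, rate_b))
    return lower / higher < threshold

white_rate, black_rate = 0.60, 0.30
print(f"ratio = {black_rate / white_rate:.0%}")          # ratio = 50%
print(substantially_different(white_rate, black_rate))   # True (50% < 80%)
```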

6. Does compliance with the four-fifths rule guarantee that a particular employment procedure does not have an adverse impact for purposes of Title VII?

The four-fifths rule is merely a rule of thumb.[15] As noted in the Guidelines themselves, the four-fifths rule may be inappropriate under certain circumstances. For example, smaller differences in selection rates may indicate adverse impact where a procedure is used to make a large number of selections,[16] or where an employer's actions have discouraged individuals from applying disproportionately on grounds of a Title VII-protected characteristic.[17] The four-fifths rule is a "practical and easy-to-administer" test that may be used to draw an initial inference that the selection rates for two groups may be substantially different, and to prompt employers to acquire additional information about the procedure in question.[18]

Courts have agreed that use of the four-fifths rule is not always appropriate, especially where it is not a reasonable substitute for a test of statistical significance.[19] As a result, the EEOC might not consider compliance with the rule sufficient to show that a particular selection procedure is lawful under Title VII when the procedure is challenged in a charge of discrimination.[20] (A "charge of discrimination" is a signed statement asserting that an employer, union, or labor organization is engaged in employment discrimination. It requests EEOC to take remedial action.)

For these reasons, employers that are deciding whether to rely on a vendor to develop or administer an algorithmic decision-making tool may want to ask the vendor specifically whether it relied on the four-fifths rule of thumb when determining whether use of the tool might have an adverse impact on the basis of a characteristic protected by Title VII, or whether it relied on a standard such as statistical significance that is often used by courts.
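
For illustration only, here is a hedged sketch of one generic statistical-significance technique, a two-proportion z-test, applied to the same 48/80 versus 12/40 example. This is a common method chosen for the sketch, not a test the EEOC or the Guidelines prescribe, and real adverse-impact analyses typically involve expert input:

```python
# A hedged sketch of one generic significance check: a two-proportion z-test.
# This is not an EEOC-prescribed standard; it only illustrates the kind of
# statistical test courts may prefer over the four-fifths rule of thumb.
import math

def two_proportion_z_test(x1: int, n1: int, x2: int, n2: int) -> tuple[float, float]:
    """Two-sided z-test for a difference between two selection rates.
    Returns (z statistic, p-value)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)  # pooled selection rate under the null
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF (via math.erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Same counts as the personality-test example: 48/80 vs. 12/40 advanced.
z, p = two_proportion_z_test(48, 80, 12, 40)
print(f"z = {z:.2f}, p = {p:.4f}")  # z ≈ 3.10, p ≈ 0.0019: significant at 0.05
```

In this example both standards flag the disparity, but with small samples or marginal differences the two can disagree, which is one reason it matters which standard a vendor relied on.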

7. If an employer discovers that the use of an algorithmic decision-making tool would have an adverse impact, may it adjust the tool, or decide to use a different tool, in order to reduce or eliminate that impact?

Generally, if an employer is in the process of developing a selection tool and discovers that use of the tool would have an adverse impact on individuals of a particular sex, race, or other group protected by Title VII, it can take steps to reduce the impact or select a different tool in order to avoid engaging in a practice that violates Title VII. One advantage of algorithmic decision-making tools is that the process of developing the tool may itself produce a variety of comparably effective alternative algorithms. Failure to adopt a less discriminatory algorithm that was considered during the development process therefore may give rise to liability.[21]

The EEOC encourages employers to conduct self-analyses on an ongoing basis to determine whether their employment practices have a disproportionately large negative effect on a basis prohibited under Title VII or treat protected groups differently. Generally, employers can proactively change the practice going forward.
