DCI Consulting Blog

DCI Staff Present at SIOP’S 26th Annual Conference Held in Chicago

Written by Former Contributors | Apr 26, 2011 6:43:00 PM

The 26th Annual Conference of the Society for Industrial and Organizational Psychology (SIOP) was held April 14-16, 2011 in Chicago, IL. Attendees and presenters from a variety of industries and backgrounds took part in the event. As is typically the case for the Annual Conference, many sessions covered current EEO issues, as well as important workplace themes such as assessment and selection, performance appraisal, and adverse impact measurement. DCI Consulting Group staff members were involved in a number of those presentations. Some conference presentation highlights are found below.

Recommendations of the Technical Advisory Committee on Adverse Impact Analysis

A respected panel of presenters, including David Cohen (Co-chair), Eric Dunleavy (Co-chair), and Mike Aamodt of DCI Consulting; Mary Baker, ERS Group; John Geier, Paul, Hastings, Janofsky & Walker LLP; Lorin Mueller, American Institutes for Research; Mickey Silberman, Jackson Lewis LLP; and Evan Sinar, DDI, discussed important recommendations of the 70-expert Technical Advisory Committee (TAC) on Adverse Impact Analysis created by the Center for Corporate Equality (CCE). The session was structured around 15 critical questions addressed by the TAC, with important implications for contemporary measurement of adverse impact. The panel members noted that the results and recommendations were presented in the aggregate, and that no single TAC member could be explicitly linked to a particular opinion, perspective, or recommendation: survey results were anonymous, and specific focus group participants and comments were confidential. Thus, individual TAC members may have disagreed with some of the recommendations made.

The discussion included, but was not limited to, the following topics:

    • Who is an “applicant” for the purposes of conducting an adverse impact analysis? Panel members noted three criteria for being considered an applicant: 1) showing interest in an open position, 2) properly following the organization’s rules for applying, and 3) meeting the basic qualifications. However, applicants who were not actually considered for a position, or who formally withdrew from the process, should generally not be included in the analysis.
    • Should organizations “guess” the gender or race of applicants who do not self-identify? The general TAC recommendation was that they should not. Although organizations are not legally required to backfill the race or gender of applicants who are hired, doing so is a reasonable practice.
    • Which statistical significance tests should be used? Panelists noted that it is important to consider the context in which the data are being analyzed: the statistical model that best mirrors the reality of the employment decision process should be used. However, it is often difficult to understand how employment decisions were made, and thus challenging to identify the most appropriate model (two common tests are sketched below).
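The panel did not prescribe a single test, but for readers unfamiliar with the mechanics, the sketch below shows, in Python with hypothetical applicant-flow numbers, two tests commonly applied to a single hiring decision: a pooled two-sample z-test on selection rates (the familiar “2 standard deviations” convention) and Fisher’s exact test. This is our illustration, not a TAC recommendation.

```python
# Hypothetical applicant-flow data for one job group (illustrative only).
from math import sqrt
from scipy.stats import fisher_exact, norm

men_applicants, men_hired = 200, 40      # selection rate = 0.20
women_applicants, women_hired = 100, 12  # selection rate = 0.12

# Pooled two-sample z-test for the difference in selection rates
# (the "2 standard deviations" convention often cited in EEO analyses).
p1 = men_hired / men_applicants
p2 = women_hired / women_applicants
p_pool = (men_hired + women_hired) / (men_applicants + women_applicants)
se = sqrt(p_pool * (1 - p_pool) * (1 / men_applicants + 1 / women_applicants))
z = (p1 - p2) / se
p_value_z = 2 * norm.sf(abs(z))  # two-tailed p-value

# Fisher's exact test on the same 2x2 table (hired vs. not hired).
table = [[men_hired, men_applicants - men_hired],
         [women_hired, women_applicants - women_hired]]
_, p_value_fisher = fisher_exact(table, alternative="two-sided")

print(f"z = {z:.2f}, p = {p_value_z:.3f}; Fisher exact p = {p_value_fisher:.3f}")
```

Which of these (or a different model entirely, such as one mirroring a multi-stage decision process) is appropriate depends on how the decisions were actually made, which was precisely the panel's point.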

Adverse Impact Analysis: Contemporary Perspectives and Practices

Presenters in this panel offered modern perspectives on adverse impact analysis, including: the statistical significance tests that currently dominate the EEO landscape and which groups to compare in such tests; practical significance tests (including an alternative to the 4/5ths rule); a comparison of three ‘multiple events’ tests for aggregated data; the effect of response distortion (faking) on adverse impact in assessments; and adverse impact-validity tradeoff scenarios.

Some interesting and important take-home points from the session included:

    • A review of some non-traditional EEO enforcement activity by federal agencies in recent years, in which Eric Dunleavy, DCI Consulting, noted that some recent OFCCP settlements placed emphasis on Non-Hispanics as the disadvantaged group. He emphasized the importance of proactive analyses that include both the traditional comparison (Total Minority vs. White) and comparisons using the individual group with the highest selection rate as the reference group for bottom-line impact.
    • An alternative to the 4/5ths rule, in which Seydahmet Ercan (with Frederick L. Oswald), Rice University, introduced a novel statistic, referred to as the lnadj rule, that adjusts the selection ratio to improve upon a statistic that reflects already transformed ratios (a sketch of the baseline 4/5ths computation follows this list).
    • A finding presented by Elizabeth Howard (with Scott Morris), IIT, that although multiple events tests provide an effective way to analyze data aggregated across samples, the outcome may differ depending on which test is used. The uncorrected Mantel-Haenszel test showed the best performance among the tests examined, with the highest power to detect a significant difference when one existed (this statistic is also sketched after the list).
    • A discussion by Phillip Mangos (with John Morrison), Kronos Inc., of study results indicating that high levels of focal group faking on employment assessments may reduce or reverse patterns of adverse impact.
    • A review of the adverse impact-test validity tradeoff by David B. Schmidt (with Alexander R. Schwall and Evan Sinar), DDI, accompanied by a discussion of the potential over-focus on adverse impact statistics, which may come at the expense of other important organizational goals (e.g., diversity).
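For context on the lnadj bullet above: the session summary does not spell out the formula itself, so what follows is only a minimal sketch, with hypothetical counts, of the classic 4/5ths (80%) rule computation that adjusted statistics of this kind aim to improve upon.

```python
# Hypothetical applicant-flow counts (illustrative only).
focal_applicants, focal_hired = 100, 12  # focal group selection rate = 0.12
ref_applicants, ref_hired = 200, 40      # reference group selection rate = 0.20

focal_rate = focal_hired / focal_applicants
ref_rate = ref_hired / ref_applicants

# Adverse impact (selection) ratio: the 4/5ths rule flags values below 0.80.
ai_ratio = focal_rate / ref_rate
print(f"AI ratio = {ai_ratio:.2f} -> {'flag' if ai_ratio < 0.8 else 'no flag'}")
```

One well-known weakness of the raw ratio is its instability in small samples, where a single hire can move the result across the 0.80 threshold; that is the kind of problem adjusted alternatives are designed to address.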

 
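Similarly, for readers unfamiliar with multiple events tests, the following is a minimal sketch of the uncorrected Mantel-Haenszel statistic for a set of 2x2 hiring tables (e.g., one per establishment or per year). The data are hypothetical, and this is our illustration rather than the presenters' code.

```python
# Uncorrected Mantel-Haenszel chi-square across hypothetical strata
# (e.g., one 2x2 hiring table per establishment or per year).
from scipy.stats import chi2

# Each stratum: (focal hired, focal not hired, reference hired, reference not hired)
strata = [(5, 45, 20, 80), (8, 72, 30, 90), (3, 37, 12, 48)]

num, var = 0.0, 0.0
for a, b, c, d in strata:
    n = a + b + c + d
    expected = (a + b) * (a + c) / n  # expected focal hires under no association
    num += a - expected
    var += (a + b) * (c + d) * (a + c) * (b + d) / (n**2 * (n - 1))

# "Uncorrected" = no 0.5 continuity correction applied to the numerator.
mh_chi2 = num**2 / var
p_value = chi2.sf(mh_chi2, df=1)
print(f"MH chi-square = {mh_chi2:.2f}, p = {p_value:.3f}")
```

The "uncorrected" label refers to omitting the 0.5 continuity correction from the numerator, the variant that performed best in the study described above.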

Overhauling Hiring Methodologies: Unproctored, Automated Assessment in Federal Hiring Reform

Jone Papinchock of the U.S. Office of Personnel Management, Ryan Shaemus O’Leary of Personnel Decisions Research Institutes, Laurie E. Wasko of HumRRO, Elaine D. Pulakos (Discussant) of Personnel Decisions Research Institutes, and Brian S. O’Leary (Discussant) of the U.S. Office of Personnel Management shared the development and implementation process for a nationwide testing program using unproctored, automated assessments, undertaken in response to a memorandum signed by President Obama in May 2010. The symposium discussion focused on the following:

    • The history of testing in the federal government (e.g., beginning with the pre-1883 Spoils System);
    • Nationwide testing program goals (e.g., decreasing applicant burden, improving the quality and speed of hiring, capturing assessment data for use and reuse, and allowing job seekers to take assessments anytime and anywhere);
    • Determination of the jobs to include in the initial testing program (e.g., 12 occupations that account for approximately 33% of hiring volume); and
    • The process of selecting the appropriate online, unproctored assessments.

 

The panel concluded with an overview of challenges and achievements encountered during the development process (e.g., controlling the size and scope of the project, securing buy-in, and successfully implementing the testing program for a handful of the 12 jobs).

OFCCP/Legal Defensibility Safeguards: Hit ‘em With Your Best Shot

Lilly Lin of Development Dimensions International led a symposium with David Schmidt of Development Dimensions International, Laura Mastrangelo Eigel of Frito-Lay North America, David Cohen of DCI Consulting Group, and Kevin Murphy of Pennsylvania State University on OFCCP trends and best practices in proactively dealing with regulatory agencies. The symposium focused on the following:

    • Emerging trends seen in OFCCP audits over the previous year, along with legal updates, including updates to the Americans with Disabilities Act Amendments Act (ADAAA) and proposed updates to the Vietnam Era Veterans' Readjustment Assistance Act (VEVRAA);
    • Best practices for dealing with regulatory agencies and for conducting job analyses and collecting assessment validity evidence; and
    • A case example of Frito-Lay’s effective proactive actions taken during a recent OFCCP audit.

Abolish the Uniform Guidelines

Another interesting session focused on the current state and use of the Uniform Guidelines on Employee Selection Procedures (UGESP). UGESP are the federal regulations that provide guidance on adverse impact measurement and, in cases where substantial adverse impact exists, on the validation research strategies that are acceptable for demonstrating job-relatedness and/or business necessity. The esteemed panel included (1) Dr. Mike McDaniel, Virginia Commonwealth University, (2) Dr. Jim Sharf, Employment Risk Advisors, (3) Dr. James Outtz, Outtz and Associates, (4) Dr. Art Gutman, Florida Institute of Technology, and (5) David Copus, Ogletree Deakins.

As many readers of this blog are aware, the Uniform Guidelines were written in 1978 and have not been revised since. Some of the panelists emphasized that the UGESP are (1) ‘stale’, (2) inconsistent with contemporary research findings on adverse impact and job relatedness, and (3) out of step with guidance from professional authorities like the AERA/APA/NCME Standards and the SIOP Principles. Given this perspective, these panelists suggested that the UGESP should be revised or even abolished.

Other panelists suggested that (1) the UGESP provide some very useful guidance, (2) we should not ‘throw the baby out with the bathwater’, (3) courts have essentially updated and corrected the UGESP in their interpretations and rulings, and (4) it would be extremely difficult to revise the UGESP and toss out 33 years of relevant case law without a clear understanding of what would replace them. Given this perspective, these panelists suggested that the UGESP can still be useful where they are applied and interpreted responsibly.

We suggest that readers interested in this topic also review a recent article by Mike McDaniel and colleagues at VCU, published in the journal Industrial and Organizational Psychology: Perspectives on Science and Practice and titled The Uniform Guidelines Are a Detriment to the Field of Personnel Selection. Responses to this article are due by May 23rd and will be published as a set later in the summer. We expect the response articles to be very diverse in perspective. Stay tuned.

On a closing note, some DCI staff members were formally recognized by SIOP at this year’s conference. First, Eric Dunleavy received the first SIOP Award for Distinguished Early Career Contributions-Practice, which is given to a recipient who has made distinguished contributions to the practice of I-O psychology within seven years of receiving a doctorate. Second, Art Gutman, who contributes to this blog and is a Professor at Florida Institute of Technology, was awarded SIOP Fellowship. Fellow status in SIOP is an honor granted through a nomination process: Society Fellows are distinguished industrial and organizational psychologists who have shown evidence of unusual and outstanding contributions or performance in I-O psychology through research, practice, teaching, administration, and/or professional service. Congratulations to Eric and Art.

by David Morgan, Keli Wilson, Jana Moberg, Amanda Shapiro & Eric Dunleavy, Ph.D., DCI Consulting Group