The 32nd Annual Conference for the Society of Industrial and Organizational Psychology (SIOP) was held April 26-29, 2017 in Orlando, Florida. This conference brings together members of the I/O community, both practitioners and academics, to discuss areas of research and practice and share information.
Many sessions covered topics of interest to the federal contractor community, including employment law, testing, diversity and inclusion, big data, and regulations for individuals with disabilities.
DCI Consulting Group staff members were well represented in a number of high profile SIOP presentations and also attended a variety of other sessions worth sharing. Notable session summaries and highlights can be found below; you may use the list below to navigate to a particular summary.
- Analytics has a Seat at the Table: Now What?
- Burden of Proof: Can I-Os and Employment Counsel Successfully Collaborate?
- Making Better Business Decisions? Risks and Rewards in Big Data
- Solving the Law Enforcement Staffing Crisis
- I/O's Role in Advancing HR in the Big Data Charge
- Optimizing Validity/Diversity Tradeoffs in Employee Selection
- O*NET Based Research: Leading Edge or Wasted Opportunity
- Leading the Charge: IGNITING Veteran–Workforce Integration Solutions
- Annual EEOC/OFCCP Practitioner Update
- Mentoring for Women in I/O: Career Changes, Interruptions, and Transitions
- Innovative Adverse Impact Analysis
- New Directions: Enhancing Diversity and Inclusion Research and Practice
- Novel Workplace Diversity Interventions: Field Experiments with Promising Results
- Caught on Video: Best Practices in One-Way Interviewing
- What is Machine Learning? Foundations and Introduction to Useful Methods
- Applicant Reactions During Selection: Overview and Prelude to a Review
- The Pre-conference Master’s Consortium: Advantages & Strategies for Building Your Business Acumen
- Industry Differences in Talent Acquisition
- "That Company is Great!" Best Practices for Improving Candidate Experience
- Physical Abilities Testing: Lessons Learned in Test Development and Validation
- Master Tutorial: R Shiny Apps in I/O
- Everything UGESP Forgot to Tell You About Content Validity
Analytics has a Seat at the Table: Now What?
A panel of I/O practitioners discussed considerations and challenges when building workforce analytics functions, including critical partnerships, cultural change, understanding stakeholders, and maintaining a business focus. These experts shared tips and tricks for implementing workforce analytics in their organizations, including:
- Build an understanding of the systems, stakeholders, and IT landscape in order to shape data inputs and pulls and improve data output
- If organizations are siloed, it’s important to make connections across groups to ensure you can make the case that workforce analytics are needed
- The role of the HR Business Partner is changing – analytics will be a part of their role if not already. Companies can provide skill-building workshops to ensure knowledge and skills are appropriate, and online statistics courses can also be utilized.
- If certain HR groups are opposed to utilizing analytics, advertise the benefits to each HR stakeholder group:
- Frontline HR: analytics can streamline work, provide better results, give better insight into workforce, candidates, etc.
- Strategic HR: analytics can help balance a culture of constant deliverables, define the relationship between leadership and analytics, and translate vision into action.
- HR leadership: analytics can build the business case, define ROI, provide a common language
- If rolling out a new process or program, use pilots to your advantage to make the business case, design research for short and long term results, and have two-way feedback loops with stakeholders. Leverage interviews to round out research and implementation – know your users and stay up-to-date on what’s important.
Burden of Proof: Can I-Os and Employment Counsel Successfully Collaborate?
A panel of labor attorneys and I/O psychologists, both practitioners and educators, spoke on the complexities of working on employment law issues and challenges in organizations. Listed below are several recommendations, mostly focused on considerations when developing, validating, or selecting an (off-the-shelf) test.
- Federal agencies are focused on the search for less adverse alternatives for tests. I/O panelists shared that fully addressing this search in the technical report has been a challenge, especially with vendors who offer only one or a few tests and have no alternatives available.
- Panelists cautioned about using automated resume screens and machine learning. Without perfect correlations there's a high likelihood that at least one poor resume will make it through, which can limit the defensibility of the process. The tools could also be weighting words like church, Africa, etc., which would create risk. Also, how do you validate a test that is constantly changing? A panelist from PepsiCo noted that they opted not to use these tools – they don't want to be the first in the courtroom.
- Panelists also cautioned about working with vendors who specialize in gaming assessments – these vendors often lack I/O or validation knowledge and skills.
- Panelists recommended that organizations get legal involved early when selecting a vendor and/or new test – involve them in the RFP process.
- It was also noted not to forget about international considerations, especially data privacy. It's recommended that the organization involve counsel from each country.
Making Better Business Decisions? Risks and Rewards in Big Data
This session was moderated by DCI’s Dr. Emilee Tison and highlighted both the risks and rewards of using big data techniques to inform employment decisions. As data analytic techniques continue to evolve and incorporate increasingly sophisticated methodologies, employers are cautioned against using such techniques with little to no transparency into how or why results are calculated. Although big data approaches in employment decision making can offer great benefits in terms of overall cost, time constraints, more positive candidate experiences, and better statistics, it is imperative for employers to also weigh these benefits against legal and practical considerations. Such considerations may include privacy/confidentiality concerns, as well as the increased potential for such techniques to lead to adverse impact against protected groups if variables are not fully validated or researched at the outset.
Big data techniques are only increasing in popularity and will continue to evolve at a rapid pace. Although these techniques can seem very appealing to an employer for informing decisions on the front end, they can prove very difficult to defend in court at a later date. For this reason, companies are advised to be cautious in implementing any new and improved techniques, as they will be asked to explain how the information being used is job-related. The major takeaway is for employers to ensure they have the right justification for what they are doing before incorporating any big data approach into business decisions.
Solving the Law Enforcement Staffing Crisis
DCI’s Dr. Michael Aamodt, together with representatives from both the U.S. Secret Service and U.S. Customs and Border Protection, led an open discussion on the challenges many law enforcement agencies face today in attracting qualified candidates and meeting demanding staffing needs. Oftentimes demand is high, but agencies struggle to fill positions as a result of small applicant pools and applicants who do not pass the background check stage of the process.
Discussions focused primarily on causes of the applicant shortage (e.g., working conditions, job location, and strict policies/requirements), strategies for assessment and recruitment, and changes that may be warranted with regard to the background check process. It was also suggested that agencies review their current policies and procedures and, where possible, update strict policies that appear to exclude otherwise qualified applicants (e.g., relaxing a strict tattoo or piercing policy).
I/O's Role in Advancing HR in the Big Data Charge
Panelists in this session included DCI’s Dr. Eric Dunleavy and others from diverse backgrounds in both applied research and analytics departments who discussed recommendations with regard to how the I/O community can advance the current state of human resources management.
As big data continues to become increasingly prevalent in the world of HR, I/O psychologists find themselves in a position to lead the big data charge and contribute their knowledge and expertise in this realm. Companies are tasked with balancing a great deal of risk associated with the use of big data techniques in the employment decision making process, and establishing meaningful and legally defensible models requires a lot of human touch and input. New tools may make compiling and analyzing big data simple, but the tools themselves don’t tell us why something occurred and what we should do based on those results. This fact represents an opportunity for I/O psychologists to assist HR professionals in turning results into actionable information.
Optimizing Validity/Diversity Tradeoffs in Employee Selection
The session entitled “Optimizing Validity/Diversity Tradeoffs in Employee Selection” included three presentations that discussed alternative ways to handle the tradeoff between selection procedure validity and adverse impact. Typically, high validity is associated with high adverse impact, and this session focused on ways to maximize validity while keeping adverse impact at a tolerable level. Methods included algorithms to identify biodata scoring systems that balance this tradeoff, Pareto-optimal methods for weighting the selection components in a composite, and methods to estimate the extent to which Pareto-optimal weights established in one sample generalize to other samples.
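To make this tradeoff concrete, here is a minimal, illustrative R sketch – not any presenter's actual method or data – showing how shifting weight between two hypothetical predictors in a composite changes both composite validity and the composite subgroup difference (d). All correlations and d values below are assumed for illustration only.

```r
# Hypothetical inputs: standardized predictors, validities, intercorrelation,
# and subgroup differences (d) -- all values assumed for illustration only.
r1y <- 0.50   # validity of predictor 1 (e.g., a cognitive test)
r2y <- 0.30   # validity of predictor 2 (e.g., a structured interview)
r12 <- 0.20   # correlation between the two predictors
d1  <- 1.00   # subgroup difference on predictor 1
d2  <- 0.20   # subgroup difference on predictor 2

composite_tradeoff <- function(w1) {
  w2      <- 1 - w1
  sd_comp <- sqrt(w1^2 + w2^2 + 2 * w1 * w2 * r12)  # SD of the weighted composite
  c(weight_on_predictor1 = w1,
    validity = (w1 * r1y + w2 * r2y) / sd_comp,     # composite criterion validity
    d        = (w1 * d1  + w2 * d2)  / sd_comp)     # composite subgroup difference
}

# Sweep the weight placed on the higher-validity, higher-impact predictor
round(t(sapply(seq(0, 1, by = 0.25), composite_tradeoff)), 2)
```

As the weight on the more valid predictor grows, composite validity rises but so does the composite d – exactly the tension that Pareto-optimal weighting methods are designed to manage.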
This session highlighted the challenges inherent in developing a selection system that is both highly predictive and free of subgroup differences. Art Gutman, the discussant for this session and a frequent contributor to DCI’s blog, commented on the approaches presented within the context of legal precedent. He noted that while these approaches may seem reasonable from an academic/research perspective, there may be sizeable hurdles to overcome in the courtroom. DCI will be on the lookout for the utilization of these approaches.
O*NET Based Research: Leading Edge or Wasted Opportunity
This symposium entitled “O*NET Based Research: Leading Edge or Wasted Opportunity?” showcased novel uses of O*NET data. A presentation by DCI’s Dr. Kayo Sady examined the importance of various personality characteristics in predicting salary across different industries. For example, he found that extraversion is highly valued in the tech industry. Dr. Sam Holland presented a tool that explored O*NET data from a network perspective to help job seekers find jobs that offer many potential career options. Using this tool, one can identify both the most advantageously situated jobs and the patterns of characteristics associated with those jobs.
Leading the Charge: IGNITING Veteran–Workforce Integration Solutions
A diverse panel consisting of representatives from academia, the SHRM Foundation, the military, consulting, and employer perspectives led a discussion on a specific point in a veteran’s transition into civilian work life. Challenges facing veteran transitions are broad in nature, and the panel discussed the lack of available data to assist with researching veteran outcomes as they transition to civilian life. Resources to help recruit and retain veterans have been published by the SHRM Foundation, and future research is underway to help understand retention challenges for veterans in organizations.
Annual EEOC/OFCCP Practitioner Update
DCI’s Mike Aamodt and Joanna Colosimo were joined by colleagues from Fortney Scott, LLC and Capital One to update the SIOP community on current EEOC and OFCCP enforcement trends and implications from the presidential election. The panel focused on current pay equity enforcement trends, strategic outreach and recruitment for protected groups, and selection issues from an EEO perspective. Best practice takeaways from the session highlighted the importance of collaborating with legal counsel, conducting proactive pay equity studies, and proactively monitoring the effectiveness of selection, recruitment and outreach programs on protected groups.
Mentoring for Women in I/O: Career Changes, Interruptions, and Transitions
In a moderated panel session, the presenters discussed issues for women in I/O that arise due to non-linear career trajectories. For example, job changes often are seen as resulting from indecision rather than strategy. Panelists and moderators were:
- Silvia Bonaccio - University of Ottawa
- Irini Kokkinou - SCAD
- Kea Kerich - Marriott International
- Alison L. O'Malley - Deere & Company World Headquarters
- Tatana M. Olson - United States Navy
- Kristen M. Shockley - University of Georgia
- Jane Wu - IBM
- Lynda Zugec - The Workforce Consultants
The primary focus of the session was on anecdotes of the panelists’ own career trajectories. Additionally, the panelists were asked to respond to the following questions:
- What factors led you to a non-linear career path?
- What challenges did you face in pursuing a non-linear career path? How did you handle these challenges?
- What opportunities resulted from your non-linear career path?
- What skills did you develop from implementing your career change(s), interruption(s), or transition(s)?
- What advice do you have for graduate students going on the job market or for more experienced professionals considering interrupting/changing careers?
In the final portion of this session, panelists met with groups of audience participants to discuss in more detail their experiences and advice.
Innovative Adverse Impact Analysis
This expert panel covered a variety of topics related to complex adverse impact analyses. The panelists shared innovative approaches to constructing analytics when responding to intricate personnel decision-making situations. Moderators of the panel were Scott B. Morris (Illinois Institute of Technology) and Eric M. Dunleavy (DCI Consulting Group). Panelist topics included the following:
- Donald R. Deere (Welch Consulting) discussed options for keeping analyses neutral in cases where multiple applicant records exist for the same candidate.
- Daniel C. Kuang (Biddle Consulting Group) discussed use of a composition analysis through binomial statistics to measure the difference of % selected versus % expected.
- Fred Oswald (Rice University) discussed alternative measures to the impact ratio, including the odds ratio, Phi, the absolute selection rate difference, Cohen’s h, and shortfall (see the illustrative sketch following this list).
- Richard F. Tonowski (U.S. Equal Employment Opportunity Commission) discussed the utility of measuring practical significance, sharing examples of when the addition of practical significance is critical to cases seen by the EEOC.
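As a concrete reference point, the sketch below computes several of the measures named above on a small, entirely hypothetical 2x2 selection table; the counts are invented and do not come from any panelist. It is a minimal base R illustration, not a substitute for a full adverse impact analysis.

```r
# Hypothetical selection counts, invented purely for illustration
sel_f <- 20; app_f <- 100   # females: selected / applicants
sel_m <- 40; app_m <- 100   # males: selected / applicants

p_f <- sel_f / app_f        # female selection rate (0.20)
p_m <- sel_m / app_m        # male selection rate (0.40)

impact_ratio <- p_f / p_m                                 # 4/5ths rule flags values below 0.80
rate_diff    <- p_m - p_f                                 # absolute selection rate difference
odds_ratio   <- (p_m / (1 - p_m)) / (p_f / (1 - p_f))     # odds ratio
cohens_h     <- 2 * asin(sqrt(p_m)) - 2 * asin(sqrt(p_f)) # Cohen's h (arcsine difference)

# Phi coefficient from the 2x2 table of group (male/female) by decision (selected/not)
n_ms <- sel_m; n_mn <- app_m - sel_m
n_fs <- sel_f; n_fn <- app_f - sel_f
phi <- (n_ms * n_fn - n_mn * n_fs) /
  sqrt((n_ms + n_mn) * (n_fs + n_fn) * (n_ms + n_fs) * (n_mn + n_fn))

# Shortfall: expected female selections at the overall selection rate minus observed
overall_rate <- (sel_f + sel_m) / (app_f + app_m)
shortfall    <- overall_rate * app_f - sel_f

# Composition-style check using binomial statistics: % of selections that are
# female versus the % expected from the applicant pool
binom.test(sel_f, sel_f + sel_m, p = app_f / (app_f + app_m), alternative = "less")
```

Each measure tells a slightly different story about the same data, which is why panelists emphasized looking beyond a single statistic such as the impact ratio.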
New Directions: Enhancing Diversity and Inclusion Research and Practice
In a thought-provoking, alternative session entitled “New Directions: Enhancing Diversity and Inclusion Research and Practice,” five scholars and practitioners took the stage to discuss the current state of diversity and inclusion research and how to better align research with practice. The following quotation, which was cited twice during the session, really resonated: “Despite a few new bells and whistles, courtesy of big data, companies are basically doubling down on the same diversity and inclusion approaches they’ve used since the 1960s.” (Frank Dobbin, Harvard University; Alexandra Kalev, Tel Aviv University). Dr. Alice Eagly, Northwestern University, argued that right now there is a large gap between what research shows and the generalizations that policy makers and practitioners are using. She challenged researchers and practitioners to be better, more honest stewards of diversity and inclusion knowledge so that “fake news”-esque generalizations propagated by advocacy groups, in particular, do not hinder forward progress in this field.
Up next was Julie Nugent, Vice President of Research at Catalyst, who led a discussion on what inclusion and exclusion feel like for employees in organizations. The Catalyst study she shared found that employees feel included when they are valued for their specific contributions (uniqueness) and are welcomed among their peers (belongingness). In contrast, feelings of exclusion arise when employees are devalued or dismissed for their unique characteristics, especially their gender, race/ethnicity, nationality, age, religion, and sexual orientation.
Last, Dr. Gabriela Burlacu, SAP SuccessFactors, wrapped up the session by explaining how each step of the employee life cycle and the personnel decisions made therein, from applying to being hired, paid, trained, promoted, and terminated, can be better tracked and managed by technology – technology that when used appropriately can mitigate the threat of unconscious bias. She spoke of “technology nudges,” such as defaulting bonuses and pay increases to absolute values rather than percentage increases based on base salary.
At DCI, mitigating unconscious bias through the creation and administration of structured personnel systems is something we assist our clients with every day, especially as it relates to EEO risk. We will continue to follow developments related to this type of technology, as well as other forms of “big data,” used to make personnel decisions and keep you posted with our recommendations.
Novel Workplace Diversity Interventions: Field Experiments with Promising Results
In this well-attended session entitled “Novel Workplace Diversity Interventions: Field Experiments with Promising Results,” five researchers and practitioners presented on the effectiveness of four field experiments in promoting positive diversity-related outcomes and improving diversity management in organizations. Dr. Alex Lindsey’s research focused on diversity interventions such as perspective taking (to produce empathy), goal setting (to increase internal motivation) and reflection (to produce guilt) and their effects on pro-diversity attitudes and behaviors. He found that the reflection intervention was most effective in increasing internal motivation and pro-diversity behaviors, but it also promoted anger and frustration a week later. Dr. Lindsey admitted future research (perhaps into more hybrid reflection and goal-setting activities) might be necessary to reach resistant diversity trainees in organizations.
In another example, Jose David and Carolyn Fotouhi from Merck presented on their company-wide women’s sponsorship program. Jose explained that Merck had recently come out with key healthcare products in oncology, HPV, and insomnia, but sales were lagging, so they decided to revamp their operating model. After some internal research, Merck found that females comprised less than one-third of incumbents in critical roles and slightly over one-third of incumbents in director-level roles, yet females make 90% of healthcare expenditure decisions and make up more than 50% of the patient base. Thus, Merck developed an advancement-focused women’s sponsorship program that allowed women protégés of all ranks to interface with women in leadership positions through one-on-one virtual sessions and networking circles. They found that women protégés experienced a higher rate of internal movement (9.5% more females in critical roles and 3.5% more females in director-level roles) and greater representation of females in succession planning slates. It will be exciting to see whether increases in female representation in critical and director-level roles translate into increased key product sales. Perhaps only time will tell.
Caught on Video: Best Practices in One-Way Interviewing
The “Caught on Video: Best Practices in One-Way Interviewing” session kicked off with a definition of one-way interviewing and how it differs from traditional two-way interviewing. In a nutshell, one-way interviewing is the practice of utilizing video recording to capture applicants’ responses to interview questions, which can then be scored at a later time.
One-way interviewing is still relatively new and not widespread in practice. Therefore, the panelists recommended some best practices based on their experiences including:
- Filming actual recruiters (as opposed to actors playing recruiters) asking the interview questions
- Creating a behavioral indicators checklist for recruiters to quantitatively and systematically rate applicants
- Developing questions through a rigorous process which may include a job analysis
- Continually rotating questions to prevent question-sharing among applicants
According to the panelists, initial reactions from applicants have been positive. For example, they like the flexibility of one-way interviewing. Recruiters also enjoy the method for its flexibility (some recruiters watch the videos while on the treadmill!) and like that the interviews have a clear scoring rubric.
What is Machine Learning? Foundations and Introduction to Useful Methods
The session entitled “What is Machine Learning? Foundations and Introduction to Useful Methods” was targeted at individuals with basic to intermediate understanding of machine learning. Supervised vs. unsupervised models of machine learning were discussed. As a best practice, the panelists recommended cross-validation to estimate R² shrinkage. Optimally, the data would be broken into a training, validation, and test set so that the researcher may develop, train, and then test the final model.
There are several important concerns that may impact machine learning models. For example, overfitting occurs when the model predicts data too well for the sample and does not generalize. Another concern with machine learning is the bias/variance tradeoff. This was likened to reliability/validity in that high reliability is akin to low variance and high validity is akin to low bias. Finally, the curse of dimensionality refers to the fact that more predictors require a bigger sample. As tempting as it may be to add many predictors, it’s prudent to keep in mind what your sample size is when building a machine learning model.
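For readers newer to these ideas, here is a minimal base R sketch of the training/validation/test approach described above, run on simulated data rather than anything the presenters showed; all variable names and values are invented for illustration. It fits a deliberately over-parameterized regression and shows how the optimistic in-sample R² shrinks on holdout data.

```r
# Simulated data: 10 candidate predictors, only two of which truly matter
set.seed(42)
n   <- 900
dat <- as.data.frame(matrix(rnorm(n * 10), ncol = 10))
names(dat) <- paste0("x", 1:10)
dat$y <- 0.4 * dat$x1 + 0.3 * dat$x2 + rnorm(n)

# Randomly assign rows to training, validation, and test sets
idx   <- sample(rep(c("train", "valid", "test"), each = n / 3))
train <- dat[idx == "train", ]
valid <- dat[idx == "valid", ]
test  <- dat[idx == "test", ]

# Fit a deliberately over-parameterized regression on the training data
fit <- lm(y ~ ., data = train)

# R-squared computed on new data
r2 <- function(model, newdata) cor(predict(model, newdata), newdata$y)^2

c(train = summary(fit)$r.squared,   # optimistic, in-sample R-squared
  valid = r2(fit, valid),           # used to compare or tune candidate models
  test  = r2(fit, test))            # final, honest estimate; the drop is shrinkage
```

The gap between the training and holdout R² values is the shrinkage the panelists warned about, and it grows as more unnecessary predictors are added relative to the sample size.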
Applicant Reactions During Selection: Overview and Prelude to a Review
With the rise in technology-based selection tools, the panelists in the “Applicant Reactions During Selection: Overview and Prelude to a Review” session claim that research on applicant reactions (ARs) has not kept pace with the advances in the types of selection tools used in practice. Research results suggest that applicants favor technology-based testing over traditional media (e.g., Potosky & Bobko, 2004). However, when it comes to face-to-face interactions (e.g., interviews), individuals still prefer they be in-person without a technology interface (Straus, Miles, & Levesque, 2001).
Another topic centered on why employers should be concerned with ARs. The argument is that because ARs are not directly tied to measurable performance on the job for hired employees, it may not matter to employers how applicants perceive the selection process. However, the panelists argue that ARs do matter. Further, AR research has been limited in that there are several moderating variables, including selection context variables (e.g., hiring expectations and selection ratio, job desirability), organizational context variables (e.g., organizational size), and individual-level variables (e.g., personality) that have not been examined but that may significantly impact ARs in differing ways.
The Pre-conference Master’s Consortium: Advantages & Strategies for Building Your Business Acumen
A pre-conference session on “Advantages & Strategies for Building Your Business Acumen” was presented by Keli Wilson with DCI Consulting Group. The purpose of this talk was to help I-O psychologists entering the workforce understand the following three concepts: (1) the bottom line; (2) business development; and (3) the request for proposal process.
Specifically for the bottom-line discussion, information was shared regarding how to communicate and tie organizational outcomes to savings (e.g., minimizing turnover, mitigating risks such as discrimination claims by understanding the mission of EEOC and OFCCP, and leveraging press releases of other companies to demonstrate purpose and savings for embracing I-O practices).
As I-O psychologists advance in their careers and understand the organization they work for or consult with, there may be opportunities to identify potential gaps and communicate opportunities for efficiencies and business growth. A high-level overview of what this process entails was covered during the session (e.g., conduct market research, recognize stakeholders, have a strategic vision, prepare and implement a business plan, market and create buy-in). The guidance provided was to hone presentation skills and learn how to pitch and sell ideas.
Finally, given I-O psychologists commonly go into consulting careers, there is a need to understand the request for proposal process in order to win and partner with clients on projects. An overview of the typical stages of a proposal process was shared with the graduating students (e.g., sales call, proposal, interview, question/answer, sales demo, negotiation of pricing, and signed contract). It was discussed that the scope of the project within the proposal should clearly state the identified problem and provide a proposed solution. Additionally, a tutorial on proposal pricing was provided in this session (e.g., pros/cons of hourly, daily, or project-based pricing).
In summary, students graduating with an I-O Master’s degree were provided with the opportunity to be exposed to the business aspects that may not be covered in I-O graduate programs.
Industry Differences in Talent Acquisition
This SIOP conference session was led by a panel of speakers from various organizations: Jenna C. Cox, IBM; Amanda Klabzub, IBM; Mary Amundson, Land O Lakes; Jennifer M. Dembowski, The Home Depot; Nicole Ennen, Google; Hailey A. Herleman, IBM; and Lisa Malley, DDI. The focus of the discussion was on the similarities and differences of talent acquisition across industries. The main similarity across industries is the need to attract and select the talent that supports business initiatives, but differences were noted in the scarcity of talent and the need to grow specific talent (e.g., through educational programs), as well as the geography in which the company operates (e.g., the talent mix in different markets). Also, low unemployment rates may be great for job candidates, but not for retailers, because they reduce the qualified pools. It was shared that manufacturing jobs that pay very well can be difficult to fill due to unappealing schedules, unpredictable work, and often dirty working conditions. Furthermore, it was communicated that agriculture is very hard to staff, particularly middle management. As for the tech industry, it was stated that you may need to source candidates more than in other industries because some of the most qualified and best people for the job already have jobs.
The panelists mentioned that a key differentiator in attracting talent is organizational culture and branding. In addition, the panelists touched on the candidate experience and how they strive to bring down the median days to selection. Some examples of how this was achieved were cutting out layers of approval, using selection tools, and trimming down the number of interviews as long as the smaller number of interviews was just as predictive. With regard to the candidate experience, companies want people to come back to them and want to understand what went well and what could be improved, which is gathered through a candidate experience survey.
Finally, a way to make your company more appealing to candidates seeking employment is to focus on the career site (e.g., highlighting social life at the company and organizational culture). The goal is to make the career site easy to use and to allow candidates to find the jobs they want to apply for. When it comes to the selection process, it was shared that having a hiring committee can help mitigate individual unconscious bias in the hiring process (e.g., the committee reviews all materials from the entire process along with feedback/scores from structured interviews).
"That Company is Great!" Best Practices for Improving Candidate Experience
This SIOP conference session was panel style with the following guest speakers: Brittany J. Marcus-Blank, University of Minnesota; Sarah A. Brock, Johnson & Johnson; Pamela Congemi, Medtronic; Jim Matchen, Target Corporation; Marina Pearce, Ford Motor Company; and Amy Powell Yost, Capital One. The topic of discussion was on how to create a positive candidate experience.
Taking action to improve candidate experiences not only helps to secure top talent, but it also benefits brand loyalty. The following are some of the practices the panelists shared for improving the candidate experience:
- setting candidate expectations (i.e., transparency of the selection process and communication of their status at each step);
- calling each declined candidate (i.e., receive a personal call from talent acquisition);
- being aware of the physical space in which the interviewees will spend time (e.g., have an identified space where candidates will not be distracted and will be impressed by what they can see of the company);
- picking the candidate up from the airport;
- planning a welcome session and/or tour of facility;
- assigning an onsite coordinator to welcome the candidate being interviewed;
- providing the candidate with information about the interviewer(s) beforehand;
- empowering recruiters to use discretion in sending gifts on behalf of the company (e.g., send a small gift to someone who just graduated with an MA degree or who just dropped out of the selection process due to a devastating life event);
- implementing a candidate reaction survey;
- monitoring Glassdoor and similar websites to discover feedback; and
- training employees on good behavior (e.g., actions that will be appreciated by the candidates, such as a recruiter remembering their name).
Physical Abilities Testing: Lessons Learned in Test Development and Validation
This panel, which included DCI’s Emilee Tison, discussed unique challenges associated with physical abilities testing. Panelists identified challenges encountered in the field and shared lessons learned in this area of work. Specifically, the presenters addressed the following topics:
- Test Development – how developing a physical abilities test is different from other types of selection tests
- Adverse Impact – typical assumptions of existing sub-group differences and methodologies to reduce adverse impact
- Validation – strategies typically used for physical abilities tests and how this differs from other types of selection tests
- Criteria for Validation – typical criteria used for criterion-related validation evidence and challenges faced during these analyses
Panelists cautioned organizations to ensure a full understanding of the physical requirements of the job as well as the types of physical abilities tests available for implementation. Physical abilities testing is not a ‘one-size-fits-all’ process; considering it as such increases risk of a mismatch between the physical requirements of the job and the test being implemented, and increases legal risks.
Master Tutorial: R Shiny Apps in I/O
In this session spearheaded by DCI’s Sam Holland, the use of R’s Shiny package was showcased to demonstrate its usefulness in sharing and visualizing analytic results. It allows R users with no web-development background to deploy web-ready applications that showcase results. After walking participants through the basic concepts and principles needed to leverage the package, the presenters demonstrated how quickly basic R scripts can be transformed into interactive dashboards.
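For a flavor of what this looks like, here is a minimal, self-contained Shiny sketch – not the presenters’ demo – that turns a few lines of R into an interactive histogram; the simulated scores and labels are invented for illustration.

```r
# Minimal Shiny app: an interactive histogram of simulated scores,
# with the number of bins controlled by a slider.
library(shiny)

scores <- rnorm(500, mean = 50, sd = 10)   # simulated data for illustration

ui <- fluidPage(
  titlePanel("Score Distribution"),
  sliderInput("bins", "Number of bins:", min = 5, max = 50, value = 20),
  plotOutput("hist")
)

server <- function(input, output) {
  output$hist <- renderPlot({
    hist(scores, breaks = input$bins,
         main = "Simulated scores", xlab = "Score")
  })
}

shinyApp(ui = ui, server = server)   # runs locally; deployable as a web app
```

Swapping the simulated vector for real analytic output is essentially all it takes to share results as an interactive dashboard rather than a static report.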
Everything UGESP Forgot to Tell You About Content Validity
This panel, moderated by Emilee Tison, Ph.D. (DCI Consulting Group), discussed the importance, usefulness, and practicality of content-oriented validation methodologies, which assess the extent to which the content of the selection procedure reflects important performance domains of the job. This methodology, however, is often criticized for having limited application and is questioned as to whether it increases the likelihood of actual prediction of job performance.
Each panel member spoke about a different topic and the role played by content-oriented validation methodologies:
- James Sharf, Ph.D. (Sharf and Associates) provided a history of the development of EEOC’s Uniform Guidelines on Employee Selection Procedures (UGESP). Of particular interest was the use of validity generalization and whether or not it was appropriate to use.
- Mike Aamodt, Ph.D. (DCI Consulting Group) discussed background checks and the various aspects to address when linking risk areas to specific tasks performed.
- Damian Stelly, Ph.D. (Flowserve Corporation) addressed using content validation as a potential alternative to criterion validation for personality assessments. The applicability will often depend on the specific context of the situation.
- Deborah Gebhardt, Ph.D. (HumRRO) discussed a number of important considerations when using content validation for physical ability tests. Some of these include: a reflection of essential job tasks and work behaviors, feasibility of the simulation, using only basic skills and not those learned on the job or in training, safety, ability to standardize test components, using a meaningful scoring metric, and the reliability of test components.
By Amanda Shapiro, Senior Consultant; Brittany Dian, Associate Consultant; Samantha Holland, Consultant; Joanna Colosimo, Director of EEO Compliance; Jana Garman, Senior Consultant; Jeff Henderson, Associate Consultant; Julia Walsh, Consultant; Keli Wilson, Senior Manager of EEO Compliance, D&I; Cliff Haimann, Consultant; Bryce Hansell, Associate Consultant; and Emilee Tison, Associate Principal Consultant, at DCI Consulting Group