Software Picks Up on Unconscious Bias Against Diversity
Unitive is a technology company that helps employers overcome unconscious bias and increase diversity in hiring. Spearheaded by Laura Mather, the company is in its pilot stage and hopes to land some “big customers” very soon.
Starting with job postings on recruiter sites like Monster or Indeed, the technology helps companies identify language that excludes one gender or the other, e.g. words like “competitive” and “dominant” that turn women off. Mather's technology gives employers and those writing the job postings both the information and the tools to make them gender neutral.
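For illustration only, a gender-coded-language check along these lines could be sketched in a few lines of Python. This is not Unitive's actual implementation; the word list and function below are hypothetical stand-ins for the much larger, researched lexicons a real tool would use.

```python
# Illustrative sketch only -- not Unitive's implementation.
# Flags words in a job posting that research associates with
# masculine-coded language, so a writer can consider alternatives.

import re

# Hypothetical word list; real tools draw on much larger, researched lexicons.
MASCULINE_CODED = {"competitive", "dominant", "aggressive", "rockstar", "ninja"}

def flag_gendered_language(posting: str) -> list[str]:
    """Return the masculine-coded words found in a job posting."""
    words = re.findall(r"[a-z]+", posting.lower())
    return sorted(set(words) & MASCULINE_CODED)

posting = "We want a dominant, competitive engineer to join our team."
print(flag_gendered_language(posting))  # ['competitive', 'dominant']
```

A real system would also suggest neutral alternatives for each flagged word, which is the “tools” half of what Mather describes.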
Mather states that, even without recognizing it, employers might be discouraging well-qualified candidates from applying for positions that interest them.
“The problem is the bias that is now occurring is unconscious. Fighting the behavior that’s caused by unconscious bias is a different fight than fighting the overt behavior of [the] Mad Men days.”
This is especially true, Mather seems to think, with respect to women and people of colour employed in STEM fields.
While an applicant's resumé is under consideration, Unitive does not reveal the applicant's gender or race; instead, it compartmentalizes components like education, work history, and hobbies so a hiring manager sees only one piece at a time, without the influence of the others. During the interview itself, a computer system or a personalized printed template of questions can prompt the interviewer to stay on track and focus on what matters most: the skill set of the interviewee.
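As a rough sketch of the compartmentalizing idea, a resumé could be split into sections that are shown to a reviewer one at a time, with identifying details withheld. The field names and structure here are assumptions for illustration, not Unitive's actual data model.

```python
# Illustrative sketch only -- not Unitive's data model.
# A resume is split into independent sections; a reviewer scores
# each section without seeing the others or the candidate's identity.

from dataclasses import dataclass, field

@dataclass
class Resume:
    name: str          # withheld from reviewers
    education: str
    work_history: str
    hobbies: str

@dataclass
class BlindReview:
    scores: dict = field(default_factory=dict)

    def review_section(self, section_name: str, content: str, score: int) -> None:
        # The reviewer sees only this section's content, never the whole resume.
        print(f"Reviewing {section_name}: {content}")
        self.scores[section_name] = score

resume = Resume(
    name="A. Candidate",
    education="B.Sc. Computer Science",
    work_history="5 years backend development",
    hobbies="Robotics club mentor",
)

review = BlindReview()
review.review_section("education", resume.education, score=4)
review.review_section("work_history", resume.work_history, score=5)
print(review.scores)
```

The point of this design is that each section is judged on its own merits before the scores are combined, so impressions of one section cannot colour the others.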
Especially since it has recently been argued that an Ivy League education is overrated, it seems clear that employers ought to focus less on where a candidate graduated and more on what the candidate can do and which skills they bring to the table. Mather's technology helps them do this, which is a step in the right direction, but it isn't foolproof just yet.
There still seems to be room for human error and bias in how an interviewer or employer weighs certain components or sections of the resumé. In addition, women and people of colour can still be laid off even after they land the job they want and have the skills to keep it. Hence, the next item on Mather's agenda is analytics on the hiring process: finding ways to root out bias in performance reviews, talent development, and promotions.
The question remains: will eliminating all bias, or at least most of it, leave the work environment feeling somewhat robotic? Employers could also end up too scared to voice an opinion about an employee who belongs to a minority group.
Another possible downside to Mather's technology is that employers might miss the impression a whole resumé makes when they evaluate it in chunks. It remains to be seen what effect this new pilot program will have on both interview and evaluation processes, but one thing is clear: it is a promising step toward encouraging employers to attract and evaluate each individual by skill set rather than by skin colour or gender.