Konica Minolta is beta testing the program, which has the potential to make hiring more equitable.
Every new technology promises to make some difficult, expensive, or tedious part of life a little easier, and Copilot, a machine learning feature in the popular recruiting and applicant management software iCIMS, is no exception. As any hiring manager knows, recruiting can often be all three. With generative AI tools like ChatGPT and Midjourney reaching mainstream awareness, and with corporations exploring applications for large language models (LLMs) behind the scenes, many companies are asking whether AI can solve their hiring woes.
To get some insight into this question, we talked to Christy Harper, director of talent acquisition at Konica Minolta Business Solutions U.S.A. and a member of the company’s AI steering committee. Konica Minolta is currently part of a beta program testing out Copilot, so she has a peek into the future of AI-powered talent hunts.
While Harper admits that the potential of generative AI in hiring can be “exciting and terrifying at the same time,” her experience with the beta program has been positive. “Talent acquisition is all about enabling the business to have the talent it needs to accomplish its goals,” said Harper. “Anything that helps us do that better, more effectively, more fairly is a positive for me.”
Inside Copilot
One of the tasks Copilot is designed to help with is evaluating the job descriptions Konica Minolta posts to online job boards such as LinkedIn or Indeed. This helps the recruiting team ensure that those listings don’t hamper diversity efforts by inadvertently using alienating language that may make job seekers from underrepresented backgrounds feel unwelcome. It is subtle, exacting work that deals in linguistic nuance and requires mastering a whole glossary of terms that may, in often counterintuitive ways, communicate unintended messages.
“When used properly, the ways to make it fairer are really interesting,” said Harper. “It allows us to review gender bias or ethnic bias in that job description by the words that are chosen to describe the job.”
In as little as thirty seconds, Copilot can suggest revisions to a job description that avoid terms with subconsciously gendered qualities, such as swapping the potentially masculine-sounding “drive for results” for a more neutral phrase like “lead teams to success.” It can also help HR managers tune a job description to clearly communicate the career stage it’s best suited for, audit it for cultural bias, and even evaluate other public-facing company writing to ensure that everything a job seeker sees communicates inclusive company values.
While other bias analysis tools have been on the market for some time, Harper likes having one integrated into her team’s Applicant Tracking System (ATS) workflow, with no need to copy and paste text into an external tool. “Being able to do that on the fly in a way that opens up your funnel for talent is really, really exciting,” she said.
At the same time, some HR applications of generative AI give Harper pause. Much of the public discourse about increased automation in hiring focuses on résumé parsing and on ATS software that filters out applicants, preventing their résumés from ever being seen by human eyes.
“Where I believe AI can do a really good job for a talent acquisition team is to enable them to make better, faster decisions, not necessarily to make the decisions for them,” said Harper. “AI review of résumés gets into a gray area. I don’t believe we are at a maturity level with AI as a tool to effectively use it in a way that we understand what it’s actually doing and being able to counteract any biases that it’s amplifying.”
Often, when marketing hypes AI as a tool with transformative potential in the workplace, proponents are eager to imagine computers taking over the soft skills and judgment calls humans are best at, replacing talent instead of enhancing it.
“Recruiting is art and science, and when you try to science it down to the lowest level, you lose the art,” said Harper. One example she cites is understanding transferable skills, especially across industries. A job applicant’s work experience is more than a set of keywords, after all. A savvy recruiter will understand that someone with customer service experience may have honed the kind of people skills that could make them excel at sales, or that someone with mastery of several programming languages may be adept at picking up another.
“The tools that we have available today, having AI understand what those transferable skills are between industries, between types of jobs, and then being able to extract that from a résumé and accurately provide recruiters with results would be really difficult today,” said Harper. “You would actually narrow your funnel of candidates more and more as it learned who got hired, instead of broadening the pipeline.”
While technology evolves rapidly, and plenty of investment is going into pushing generative AI to its full potential, Harper sees a bright line around the responsible use of automation in hiring today. “I am not comfortable with AI replacing human decision making, only augmenting productivity and effectiveness.”
Despite concerns about automated résumé filtering and the borderline paranoia workers can express online about bots blocking their job applications, Harper sees little evidence of this in practice.
“In many cases, there is an overestimation of the amount of automation used. I won’t say there aren’t companies that are very sophisticated in their algorithms, but for 80% or 90% of the companies out there, the most we’re doing is prioritizing which candidates we look at based on how résumés match up with the requirements on the job description. If anything, the challenge is more that there’s not enough recruiters to look at every résumé than the desire not to look at every résumé,” said Harper.
What about the job seeker’s side? Plenty of websites and apps promise to polish résumés, write surefire cover letters, churn out emails, and even supply responses in chats with recruiters. While these tools market themselves as ways to level the playing field, Harper is of two minds about their usefulness.
“Part of me says, hey, everyone should use the tools they have available to them to communicate in the best way possible,” she said. “In some ways, a gen AI application is like a calculator for words. But when do you introduce calculators to kids? When they understand how math works. If you get the wrong answer, that’s not the calculator’s fault.”
As with any tool, the key is to understand what problems you’re hoping to solve with it. “Sometimes that will be AI, and sometimes it won’t,” said Harper. “It’s not a blanket cure-all for everything. This is really a technology in its infancy.”
Ultimately, it comes down to understanding AI and taking responsibility for what it produces. “People need to be reminded that they’re responsible for what the AI writes for them,” said Harper. “If AI writes a résumé for you and the things on it aren’t true, that’s your responsibility to make sure that résumé is truly reflective of your experience. It’s what you put your name on.”