Executives who use AI for hiring must 'be vigilant' about reducing bias — or risk landing in hot water, says a labor attorney

New York City enacted a law requiring employers to audit recruitment platforms, which lean heavily on AI tools, for potential hiring bias. Getty Images/xavierarnau
  • New laws require executives to take responsibility for biases in the AI systems they use for hiring.

  • Employers need to stay up to date with the latest legal developments.

  • This article is part of "CXO AI Playbook" — straight talk from business leaders on how they're testing and using AI.

Employment issues have long been a focus for Amanda Blair, an associate at the large national labor and employment law firm Fisher Phillips in New York City. Recently, she's also had to become an expert on artificial intelligence, data, and analytics.

The move was inspired by a New York City law, enacted in 2021 and in effect since 2023, that requires employers to perform an independent bias audit before using automated employment-decision tools. These tools incorporate AI, algorithms, and other automation technology to screen and evaluate applicants; the law also requires employers to notify job candidates and employees that the technology is being used.

While AI isn't new, it's becoming more consumer-facing, Blair said.

"It's new for workers. It's new for employers," she added. "That interests me — a new area to explore where the law is going and how it's going to be used in the workplace."

More companies are using AI and automation in hiring and recruitment, which may lead to bias and discrimination.

Business Insider spoke with Blair about the technology and emerging antibias laws and regulations.

The following has been edited for clarity and length.

What should executives know about bias as they incorporate AI hiring tools?

We have laws on the books that address discrimination. Just because you're using an algorithm or automated tool won't keep you from having to abide by those laws. Companies need to know where their data is coming from, what their tool does, and why they're using it.

Do you have data that will actually get you the outcome that you're seeking? Is the tool assisting you in getting the best person for the position without discrimination? What's the source of the tool's training data? For example, if your tool was created using a population of 100 white men, it's not going to be the right one to hire in a city with a majority-minority population.

Amanda Blair is an associate attorney at Fisher Phillips. Courtesy of Fisher Phillips

You also need oversight. We're not at that point where people are just running processes with AI and not looking at them. Hopefully, it stays that way. As people become more comfortable with these tools, that's a concern. You must be vigilant about how you use the technology.

What would you like to see future AI-related antibias laws include?

We have to get everyone up to speed. Not everyone is a mathematician or engineer or knows how large language models work. The individuals using these tools need to know what they're doing so they're not violating any laws.

There need to be clear definitions; some are still too technical. So what is an automated employment-decision tool? What is artificial intelligence? Clarity in any rules, guidance, and FAQs is key because I think that's going to be one of the biggest barriers to enforcement. Ignorance of the law is not an excuse, but here, a lot of people are ignorant.

How can companies stay on top of the evolving legal landscape of AI in hiring?

AI is not one-size-fits-all. The biggest challenge is relying on a tool that doesn't fit your business. You should address any gaps in understanding of what your tool is doing and why you're using it. Start having conversations to make sure your team understands the role of AI and its implications.

Stay up to date at the city level and with your state legislature. You don't want to be caught off guard if a new law passes. Talk to legal counsel. Develop relationships with vendors who can serve as independent auditors, which New York City's law requires. That requirement may vary by state.

AI is having an impact. Being able to assess the amount of data that we have in society is already revolutionary in what it's going to do for some companies and workers. However, that affects your business. Start to prepare and have those conversations so you're ahead of the game.

