You don’t need to answer that question. You do have a right to privacy. Trouble is, New York City and California are transforming their privacy concerns (not yours, employer) into new laws that take effect on January 1, 2023.
The new laws concern Automated Employment Decision Tools (AEDTs).
And even though the “good news” is that total confusion over the laws and the “high volume of public comments” have led to a delay in enforcement – April 15, 2023 for New York City and July 1, 2023 for California – legal eagles are advising employers to get prepared now. Inviting a couple of lawyers round for Christmas dinner is something to consider.
No? OK – let’s look into it.
Automated Employment Decision Tools – and You
For a non-comprehensive list of tools in use by employers, we’ll point out:
- Resume scanners using specific content to prioritize job applications.
- Video tech used to analyze candidate characteristics or mannerisms.
- Software used to monitor employees to analyze performance.
- “Job Fit” tech generally if used in the evaluation of a candidate/employee.
Under the new law in New York City, requirements will include:
- Implementing a “bias audit” of AEDTs within one year prior to use.
- Informing candidates or employees about AEDT use for hiring or promotion.
- Informing them about the job qualifications and characteristics the AI tech will use.
“Violations of the provisions of the bill would be subject to a civil penalty.” — NYC
Well, that part is pretty clear, even if they did neglect to add “Happy New Year!”. Touted fines for NYC are $500 for a first violation and up to $1,500 for each subsequent violation.
Just to be perfectly unclear on what AEDTs are, according to the NYC law, here’s a quote:
“The term “automated employment decision tool” (or “AEDT”) is broadly defined as “any computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence, that issues simplified output, including a score, classification, or recommendation, that is used to substantially assist or replace discretionary decision making for making employment decisions that impact natural persons.”
In California, Attorney General Rob Bonta was thinking along these lines in November:
- A business commits an unfair business act or practice if it uses artificial intelligence or other automated decision-making tools in such a way as to have a disproportionate, adverse impact on or causes disproportionate, adverse treatment of a consumer or a class of consumers on the basis of protected characteristics.
- It can be an unfair business act or practice for a developer to sell or a business to utilize AIA without testing, auditing, monitoring, disclosures, and transparency. Transparency measures could include disseminating data and source code for independent review and testing, and disseminating the results of internal and independent audits. It is an unfair business act for an entity to refuse transparency, audit, or monitoring measures.
Try to bear in mind that he’s only trying to help.
Burdens of Sourcing, Hiring, Retention – And Help!
As an employer or hiring team, your motive for using AEDTs is to improve the process of identifying, interviewing, and ultimately hiring the best-fit candidates for open positions, with a view to enhancing productivity, retention, and growth.
Increasingly easy online job applications mean a potentially massive response to advertised positions, with possibly many more fake-it-’till-you-make-it hopefuls than highly qualified experts in the relevant field. What to do? Enter Applicant Tracking Systems and any other automated help that becomes available to support analysis, assessment, and hiring the best.
And now, as you eagerly endeavor to optimize these complex AI assistants to achieve optimal outcomes, you find yourself needing legal assistance just to start pushing the buttons.
With that in mind, related or similar local, state and even federal mandates are looming over the new year and our collective festive cheer, so getting on top of this one may be the first step to being capable of protecting yourself from the next one(s).
Unfortunately, getting on top of this one could be an uphill battle, even for the lawyered-up. That “high volume of public comments” mentioned above includes complaints about how vague many of the key terms actually are, and how many questions have been left unanswered.
So the cynical could think that the delay in enforcement may be more for the people writing those “key terms” to figure out what they actually mean than it is for the employers who must start abiding by them as soon as humanly possible.
In September, the Department of Consumer and Worker Protection (DCWP) proffered what it calls “Proposed Rules” to help employers get to grips with how to comply with the new law, which is a nice gesture during the season of giving (and taking). Apparently, they’re still not finalized, but they have been stated in this article already and can be found in greater depth here.
What’s the Point?
The point of all this is essentially to assess potential bias against anybody in a protected category: race, ethnicity, sex, for example. Those selected to move forward or given a classification by an AEDT would be analyzed with a view to identifying bias or the lack of it. In New York City, the following questions can tentatively be answered here (next step: ask your lawyer):
The audit would require an “independent auditor” – a person or group not involved in the development or use of the AEDT being audited. Potentially, this would allow consultants/contractors to be brought in, and could mean legitimately using an in-house compliance team, if it is independent in the way mandated.
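To make the “comparative results” idea concrete, here is a minimal sketch of the kind of selection-rate and impact-ratio comparison an auditor might report for a tool’s outcomes across categories. The group names, counts, and the 0.8 benchmark (the familiar “four-fifths” rule of thumb from US employment-selection guidance) are illustrative assumptions, not a statement of what the NYC law itself requires – your auditor and lawyer define the real methodology.

```python
# Illustrative sketch of a bias-audit comparison: selection rates per
# category and each category's "impact ratio" relative to the
# highest-selected category. All names and numbers are hypothetical.

def selection_rates(applicants, selected):
    """Selection rate per category: number selected / number of applicants."""
    return {cat: selected[cat] / applicants[cat] for cat in applicants}

def impact_ratios(rates):
    """Each category's selection rate divided by the highest category's rate."""
    top = max(rates.values())
    return {cat: rate / top for cat, rate in rates.items()}

# Hypothetical hiring-funnel data for three demographic categories.
applicants = {"Group A": 400, "Group B": 250, "Group C": 150}
selected   = {"Group A": 80,  "Group B": 35,  "Group C": 30}

rates = selection_rates(applicants, selected)
ratios = impact_ratios(rates)

for cat, ratio in ratios.items():
    # 0.8 is an illustrative review threshold, not a legal standard.
    flag = "review" if ratio < 0.8 else "ok"
    print(f"{cat}: rate={rates[cat]:.2f}, impact ratio={ratio:.2f} ({flag})")
```

In this made-up example, Group B’s impact ratio falls below the illustrative benchmark and would be flagged for closer review – which is roughly the kind of disparity a bias audit exists to surface.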
Informing Candidates and/or Employees?
Giving notice would or could entail posting the notice of AEDT use on the careers or jobs section (as applicable in each instance) of the company website, in the job posting, or in an email to candidates or employees.
Plan of attack?
- Don’t limit your evaluations to your own AEDTs: study vendors used by your company and AEDT tech used on your behalf, as legal responsibility could be open to question, probably depending on a slew of variables.
- Figure out the “Independent Auditor” question, conduct an audit of your AEDT(s), and look into:
- Related data collected and why
- Full analysis of protected groups and comparative results
- Methods and goals of data analysis
- Criteria in place to decide success
- Transparency of these and related processes
- Give notice as described and provide alternatives/opt-out
- Find your own alternative if negative outcomes are identified
In October, the Biden administration published its Blueprint for an AI Bill of Rights. Here’s a quote to get you in the festive mood:
“Among the great challenges posed to democracy today is the use of technology, data, and automated systems in ways that threaten the rights of the American public. Too often, these tools are used to limit our opportunities and prevent our access to critical resources or services. These problems are well documented. In America and around the world, systems supposed to help with patient care have proven unsafe, ineffective, or biased. Algorithms used in hiring and credit decisions have been found to reflect and reproduce existing unwanted inequities or embed new harmful bias and discrimination. Unchecked social media data collection has been used to threaten people’s opportunities, undermine their privacy, or pervasively track their activity—often without their knowledge or consent.”
After all the cynicism prevalent in 2022, doesn’t that remind you of the final scene of A Christmas Carol?