Navigating the Fair Employment and Housing Act (FEHA) and Automated Decision System (ADS) regulations.
California is leading the way in regulating the use of Automated Decision Systems (ADS) in employment.
Under proposed modifications to the Fair Employment and Housing Act (FEHA) regulations, employers can be held liable if their AI tools produce a "discriminatory impact," even when that impact is unintentional.
At a minimum, employers must:
- Assess their risk level under California's automated decision-making rules.
- Confirm that their use of AI in hiring aligns with California's strict anti-discrimination and privacy standards.
| Responsibility | Expert Hire | Employer |
|---|---|---|
| Provide bias-tested algorithms | ✓ | |
| Structured data export for audits (see sketch below) | ✓ | |
| 4-year data retention capability | ✓ | |
| Final hiring decision | | ✓ |
| Conducting workplace-specific bias audits | | ✓ |
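To make the export and retention rows above concrete, here is a minimal sketch of what one audit-ready record could look like. The `AuditRecord` dataclass and its field names are illustrative assumptions, not Expert Hire's actual export schema.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timedelta
import json

# Hypothetical shape of one audit-export record; field names are
# illustrative assumptions, not Expert Hire's actual schema.
@dataclass
class AuditRecord:
    candidate_id: str      # pseudonymous identifier
    requisition_id: str    # the role the candidate applied for
    stage: str             # e.g. "resume_screen", "interview", "offer"
    criteria_version: str  # which selection-criteria set was applied
    outcome: str           # "advanced" or "rejected"
    decided_at: str        # ISO-8601 timestamp of the decision
    retain_until: str      # decision date + 4 years, per proposed FEHA rules

decided = datetime(2024, 6, 1, 12, 0, 0)
record = AuditRecord(
    candidate_id="cand-00417",
    requisition_id="req-2024-083",
    stage="resume_screen",
    criteria_version="criteria-v3.2",
    outcome="advanced",
    decided_at=decided.isoformat(),
    # Approximate 4 years as 1,461 days (includes one leap day).
    retain_until=(decided + timedelta(days=1461)).isoformat(),
)
print(json.dumps(asdict(record), indent=2))
```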
We proactively engineered our platform to meet the high standards of FEHA and the CPPA's upcoming automated decision-making rules.
All selection criteria and data outputs are stored in a format ready for internal or external bias audits.
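As one example of such an audit, a common first check is the EEOC's "four-fifths rule": each group's selection rate should be at least 80% of the highest group's rate. The sketch below assumes outcome data in the shape of the export records above; the function name, group labels, and sample data are hypothetical, and California regulators may expect more rigorous statistical testing than this single heuristic.

```python
from collections import Counter

def impact_ratios(outcomes):
    """Compute each group's selection rate relative to the best group.

    `outcomes` is an iterable of (group, selected) pairs, where
    `selected` is True if the candidate advanced.
    Assumes at least one candidate was selected overall.
    """
    applied = Counter(group for group, _ in outcomes)
    selected = Counter(group for group, sel in outcomes if sel)
    rates = {g: selected[g] / applied[g] for g in applied}
    best = max(rates.values())
    # Values below 0.8 flag potential adverse impact under the 4/5 rule.
    return {g: rate / best for g, rate in rates.items()}

# Illustrative data only.
sample = [("A", True), ("A", True), ("A", False),
          ("B", True), ("B", False), ("B", False), ("B", False)]
for group, ratio in impact_ratios(sample).items():
    flag = "REVIEW" if ratio < 0.8 else "ok"
    print(f"group {group}: impact ratio {ratio:.2f} ({flag})")
```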
Our system facilitates human decision-making rather than replacing it. This "augmentation" approach generally carries lower regulatory risk under California law than fully automated decisions.
**Does FEHA apply to small businesses?**
FEHA generally applies to employers with 5 or more employees; there is no broad small-business exemption from its discrimination provisions.

**If a vendor's AI tool discriminates, who is liable?**
Under the proposed regulations, employers are liable for the tools they use. You cannot completely outsource liability to a vendor.

**Do these rules apply to remote employees or out-of-state employers?**
Yes, if the employee is based in California or if the hiring decision is made by a California entity and affects California residents.
Use Expert Hire's transparent tools to stay ahead of California's rigorous employment laws.
Create Free Account