AI Hiring Tools Are Skipping Over Experienced Workers
Workers with long careers are finding it hard to land job interviews when employers rely on automated screening systems.
Derek Mobley, a job seeker who holds a doctorate in veterinary medicine and a law degree, sued a popular hiring platform, alleging that its software rejected him for being too old. He says his applications were turned away within minutes, as if no human had ever looked at them.
A similar problem hit another veteran job seeker, who had spent 60 years in medicine, communications and politics. He applied to hundreds of positions screened by artificial-intelligence tools, hoping his background would match roles such as medical writer or editor. Yet he never received a single interview.
A career coach explained that the software read only a short, recent slice of his résumé: anything more than ten years old was ignored, and his unusual mix of degrees led the system to flag him as a mismatch. The coach advised him to remove his veterinary experience, delete dates, and even change his appearance. The approach seemed to win interviews but never led to a suitable job.
When the coach's advice failed, the veteran turned to part-time work that matched his actual skills. He applied for a position caring for large animals at a zoo, but the automated system rejected him again because it could not find the required keywords in his résumé. The rejection arrived at midnight, a clear sign that no human reviewer had seen his application.
These stories show how AI screening can unintentionally discriminate against experienced candidates with unconventional backgrounds. Lawmakers are weighing rules that would require companies to explain how their hiring algorithms make decisions and to ensure they do not unfairly filter out qualified people. The goal is to protect workers, renters, patients and borrowers from the hidden biases of automated systems.