Policy

When an Algorithm May Have Decided a Doctor's Future

A Dartmouth medical student's investigation into AI screening tools raises urgent questions about opacity and disability disclosure in automated hiring.

AI Screening May Have Blocked a Qualified Doctor

A medical student with an Ivy League pedigree, articles in JAMA and other flagship peer-reviewed journals, and uniformly strong recommendation letters received zero residency interview offers — and suspected an automated screening tool was responsible. According to Wired AI, Chad Markey, a 33-year-old Dartmouth medical student, devoted six months to teaching himself Python and conducting his own forensic investigation after watching peers with comparable credentials collect invitations while his applications drew only rejections.

The MSPE’s Double-Edged Language

Medical residency runs on a single annual matching cycle — miss it and a physician’s career stalls by a full year. That compressed timeline made Markey’s situation acutely high-stakes. According to Wired AI, the prime suspect was standardized language in his Medical Student Performance Evaluation, which noted three distinct absences spanning roughly 22 months and an extended academic year attributed to “personal reasons.” Wired AI reports those absences reflected a 2021 diagnosis of ankylosing spondylitis, an autoimmune spinal disease — a legally protected medical condition that the MSPE’s clinical phrasing may have converted into an algorithmic penalty. Wired AI additionally reports that certain programs had adopted a no-cost AI screening tool found to be surfacing incorrect grade data for some candidates, introducing a second failure point in the same pipeline.
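
Wired AI's reporting does not describe the screening tool's internals, so the following is only an illustrative sketch under that caveat: a hypothetical keyword-based filter (the names `naive_mspe_score` and `FLAGGED_PHRASES` are invented here) showing how clinical MSPE phrasing like "personal reasons" could mechanically become a score penalty, regardless of the cause behind it.

```python
# Hypothetical illustration only: the actual screening tool's logic is not
# public. This sketch shows how a naive keyword-based filter could convert
# neutral MSPE phrasing about absences into a score penalty.

import re

# Phrases a simplistic screener might treat as "negative signals", even when
# they describe legally protected medical circumstances.
FLAGGED_PHRASES = [
    r"personal reasons",
    r"leave of absence",
    r"extended academic year",
    r"absence",
]

def naive_mspe_score(mspe_text: str, base_score: float = 1.0,
                     penalty: float = 0.15) -> float:
    """Return a screening score reduced for each flagged phrase found.

    A real tool would be far more complex, but the failure mode is the
    same: the wording of a disclosure, not its cause, drives the penalty.
    """
    score = base_score
    for pattern in FLAGGED_PHRASES:
        hits = len(re.findall(pattern, mspe_text, flags=re.IGNORECASE))
        score -= penalty * hits
    return max(score, 0.0)

mspe = ("The student took an extended academic year for personal reasons "
        "and had three absences between 2021 and 2023.")
print(naive_mspe_score(mspe))  # 0.55: penalized despite strong credentials
```

Nothing about the reason for an absence enters the computation, which is exactly the hazard the MSPE's standardized language creates.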

One Student’s Audit of a Black Box

Markey’s months of self-taught forensics expose a structural asymmetry: programs wielded algorithmic gatekeeping power that candidates could contest only from the outside, with no formal access to the underlying decision logic. His case points to a concern growing across competitive professional hiring — that AI tools calibrated on historical acceptance patterns may impose implicit penalties on non-linear career paths, even when those detours trace to protected health conditions rather than performance deficits.
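
The article does not detail Markey's methodology. One standard way an outsider can probe a black-box screener is paired perturbation testing, sketched below under that assumption: submit two texts identical except for the disclosure phrasing and compare the scores. Every name here (`stand_in_screener`, `perturbation_audit`) is hypothetical, and a real audit would query the actual tool rather than the toy scorer used as a stand-in.

```python
# Hypothetical sketch of a black-box perturbation audit: an outside
# investigator, with no access to the screener's internals, compares
# otherwise-identical texts that differ only in disability-related phrasing.

import re
from typing import Callable

def stand_in_screener(text: str) -> float:
    """Toy black box that penalizes absence-related phrasing (illustration only)."""
    penalties = len(re.findall(
        r"personal reasons|leave of absence|extended academic year",
        text, flags=re.IGNORECASE))
    return max(1.0 - 0.15 * penalties, 0.0)

def perturbation_audit(score_fn: Callable[[str], float], template: str,
                       neutral: str, disclosure: str) -> float:
    """Return the score gap attributable to the disclosure phrasing alone."""
    baseline = score_fn(template.format(detail=neutral))
    treated = score_fn(template.format(detail=disclosure))
    return baseline - treated

template = ("Strong academic record. {detail} "
            "Recommended without reservation.")
gap = perturbation_audit(
    stand_in_screener,
    template,
    neutral="Completed all rotations on schedule.",
    disclosure="Took an extended academic year for personal reasons.",
)
print(f"Score gap from disclosure phrasing alone: {gap:.2f}")  # 0.30
```

A nonzero gap between two otherwise-identical documents is the kind of evidence an applicant can gather entirely from the outside, which is what makes perturbation testing a natural technique when the decision logic itself is off-limits.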

Why This Matters

The residency match is among the most consequential hiring systems in American professional life, placing thousands of new physicians in a single coordinated annual event. Markey’s case stands out because he documented his suspicions methodically, producing one of the few external audits of AI screening in a domain where programs face no obligation to disclose automated decision-making. As unaudited AI adoption accelerates across high-stakes hiring, his investigation crystallizes the central regulatory gap: there is currently no authority positioned to audit these tools before a qualified candidate’s career is quietly sidelined.

Frequently Asked Questions

What is the MSPE and why does its language matter for AI screening?

The Medical Student Performance Evaluation is a standardized summary of a student's academic performance prepared by their medical school; AI tools scanning it for negative signals may treat disability-related phrasing as a red flag, even when the underlying circumstances are legally protected.

Are AI screening tools in medical residency currently regulated?

As of 2026, no federal standard governs AI use in the residency matching process, leaving programs free to adopt unaudited tools with little transparency to applicants.

#AI hiring #algorithmic bias #healthcare #disability #medical residency