Every recruiter has a story about the candidate who bombed the phone screen but turned out to be brilliant. Or the one who interviewed beautifully and was a disaster on the job. The phone screen, despite decades of industry use, has a fundamental flaw: it is a human measuring another human under highly variable conditions. The interviewer might be tired, distracted, in a hurry, or unconsciously pattern-matching against a previous hire.
The data backs this up. Research consistently shows that unstructured interviews (which include the majority of phone screens) have a predictive validity of roughly 0.20, where 1.0 would mean interview performance perfectly predicts job performance. Companies spend billions of dollars and thousands of recruiter-hours on a process that is only marginally better than flipping a coin.
The three core problems with phone screens
1. Scheduling friction eliminates candidates before they're even assessed
Before a phone screen can happen, both sides have to agree on a time. In practice, this means three to five emails over two to four days, a 20–30% no-show rate, and a calendar coordination overhead that scales linearly with hiring volume. For a company running 50 active roles, scheduling alone can consume more than 10 recruiter hours per week.
Worse, the candidates most likely to drop off during this friction window are often the ones with the most options. Top candidates are already employed, time-poor, and responsive to the fastest process. If your competitor can move faster, you lose them before the conversation even starts.
2. Interviewer inconsistency makes comparison impossible
Even when a recruiter uses a structured set of questions, the delivery varies. Tone, follow-up probing, time spent per question, and how answers are interpreted all differ from interviewer to interviewer, and from one phone call to the next. When you're comparing 20 candidates across five different recruiters over three weeks, you're not comparing apples to apples. You're comparing how each recruiter felt on a Tuesday afternoon.
3. Recency and halo bias distort ranking
Human memory is reconstructive, not reproductive. By the time a recruiter writes up notes on candidate #12 in a busy week, their recollection of candidate #3 is already degraded and distorted. The halo effect — where one strong quality colours the overall impression — is well-documented and extremely difficult to eliminate even with training.
What AI interviews do differently
AI-conducted interviews — like those Brydg runs autonomously on Google Meet — address all three problems structurally, not just procedurally.
- No scheduling required: the AI is available 24 hours a day, 7 days a week. A candidate can complete their interview at 11pm on a Sunday if that's when they're available.
- Identical experience for every candidate: the same questions, the same tone, the same time per section, every time. There is no 'good day' or 'bad day' for the interviewer.
- Scored against a consistent rubric: every response is evaluated against the same criteria, producing a numerical score that is directly comparable across hundreds of candidates.
- Transcribed and summarised automatically: hiring managers review a structured summary and score, not raw notes from a phone call they weren't on.
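The consistency argument above is easiest to see in miniature. The sketch below shows what rubric-based scoring looks like in principle: every candidate is rated on the same criteria with the same weights, so the resulting numbers can be ranked directly. The criteria names, weights, and 0–5 rating scale are illustrative assumptions, not Brydg's actual rubric.

```python
# Illustrative sketch of rubric-based scoring. The criteria and weights
# here are hypothetical examples, not a real vendor's rubric.

RUBRIC = {
    "relevance": 0.40,      # how directly the answer addresses the question
    "depth": 0.35,          # specificity and concrete examples
    "communication": 0.25,  # clarity and structure of the response
}

def score_candidate(ratings: dict[str, float]) -> float:
    """Weighted average of per-criterion ratings, each on a 0-5 scale."""
    return round(sum(RUBRIC[c] * ratings[c] for c in RUBRIC), 2)

# Two candidates rated on identical criteria by the same process.
candidates = {
    "A": {"relevance": 4, "depth": 3, "communication": 5},
    "B": {"relevance": 5, "depth": 4, "communication": 3},
}

# Because the rubric never varies, scores are directly comparable
# across any number of candidates, interviewers, or weeks.
ranked = sorted(candidates, key=lambda c: score_candidate(candidates[c]),
                reverse=True)
```

The point is not the arithmetic, which is trivial, but the invariant: a human interviewer effectively re-weights the rubric on every call, while a fixed scoring function applies the same weights to candidate #3 and candidate #300.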
In Brydg's early pilots, AI-interviewed candidates moved from application to shortlist in an average of 26 hours — compared to an industry average of 8 days for the same stage.
The objection: 'Candidates won't like talking to AI'
This is the most common pushback, and the data tells a different story. In a 2025 survey of job seekers who had completed AI interviews, 67% rated the experience as 'fair' or 'very fair' — a higher fairness rating than they gave to human phone screens. The reasons cited most often were the absence of awkward small talk, no sense of being judged on appearance or voice, and a feeling that the questions were consistent and merit-based.
Candidates who felt they performed poorly were actually more likely to rate an AI interview as fair — because they understood the rubric. With human interviews, poor performance often leaves candidates wondering whether they were judged on something they couldn't control.
The bottom line
Phone screens are not going away entirely — there are conversations that genuinely require human judgment and human connection. But as a first-pass screening mechanism for volume hiring, the phone screen is an expensive, inconsistent, bias-prone tool. AI interviews deliver more signal per candidate, at lower cost, faster, and more fairly. The companies that adopt them first will have a structural recruiting advantage that compounds over time.