AI with Michal

Candidate ghosting metrics

Candidate ghosting metrics track the rate, stage, and pattern of candidates who stop responding during the hiring process without formally withdrawing, helping TA teams identify where pipeline integrity breaks down and what conditions predict disengagement.

Michal Juhas · Last reviewed May 9, 2026

What are candidate ghosting metrics?

Candidate ghosting metrics measure the rate and pattern of candidates who stop responding during a hiring process without formally withdrawing. Instead of a vague sense that candidates are disappearing, a ghost rate by stage gives TA teams a number to investigate and a named owner to hold accountable.

The basic calculation is simple: what percentage of candidates in a given stage never reply within a defined window? The harder work is agreeing on the window, cleaning the disposition codes in the ATS, and separating true ghosting from legitimate slow movers who are still considering. Once that hygiene is in place, a ghost rate becomes a diagnostic tool: high early-stage ghosting points to slow first contact; high assessment-stage ghosting points to task friction; high offer-stage ghosting points to speed or experience problems in the preceding steps.
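The per-stage diagnostic above can be sketched in a few lines. This is a minimal sketch assuming a flat export of (stage, disposition) pairs; the stage names and disposition codes are illustrative, not a real ATS schema.

```python
from collections import Counter

# Hypothetical disposition records exported from an ATS: one entry per
# candidate per stage, holding the final disposition code for that stage.
records = [
    ("screen", "ghosted"), ("screen", "advanced"), ("screen", "ghosted"),
    ("screen", "declined"),
    ("assess", "ghosted"), ("assess", "advanced"), ("assess", "advanced"),
    ("offer", "accepted"), ("offer", "accepted"),
]

entered = Counter(stage for stage, _ in records)
ghosted = Counter(stage for stage, code in records if code == "ghosted")

# Ghost rate per stage: silent exits divided by everyone who entered.
rates = {stage: ghosted[stage] / entered[stage] for stage in entered}
bottleneck = max(rates, key=rates.get)

for stage, rate in rates.items():
    print(f"{stage}: {rate:.0%}")
print("bottleneck:", bottleneck)
```

With clean disposition codes, the highest-rate stage is the bottleneck to investigate first; everything else in this entry builds on that single ratio.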

Teams that track ghosting by stage consistently find one bottleneck that accounts for most of the problem. Fixing that one stage often moves the overall ghost rate more than any sourcing change.

Illustration: candidates fading out of a pipeline at an amber-flagged assess stage, a per-stage ghost rate bar chart with the bottleneck highlighted, a weekly ghost rate trend strip with a downward improvement arrow, and an at-risk alert card prompting proactive outreach before the silence threshold is crossed

In practice

  • A TA lead pulls a three-month ATS export, filters for candidates who were advanced to phone screen but never booked a slot, and finds a 42 percent ghosting rate at that stage. On investigation, the median time from application to first-contact email is six days. The fix is a 24-hour first-contact SLA, not a new sourcing tool.
  • Vendors label the same pattern differently: pipeline dropout, candidate disengagement, no-show rate, silent withdrawal. Whatever the label, the underlying question is the same: at what stage, and how often?
  • A recruiter managing a high-volume customer service req sets a five-business-day ghost threshold and codes every expired candidate consistently. After eight weeks she has enough data to show that Monday morning screen invites ghost at twice the rate of Thursday invites.
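The weekday comparison in the last example can be reproduced from any export that has an invite date and a ghosted flag. A minimal sketch with hypothetical rows; the field shapes are assumptions, not a real ATS schema.

```python
from datetime import date
from collections import defaultdict

DAY = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"]

# Hypothetical export rows: (invite_date, ghosted). 2026-03-02 is a
# Monday, 2026-03-05 a Thursday.
invites = [
    (date(2026, 3, 2), True),  (date(2026, 3, 2), True),
    (date(2026, 3, 2), False),
    (date(2026, 3, 5), False), (date(2026, 3, 5), True),
    (date(2026, 3, 5), False), (date(2026, 3, 5), False),
]

sent = defaultdict(int)
ghosts = defaultdict(int)
for d, ghosted in invites:
    day = DAY[d.weekday()]
    sent[day] += 1
    ghosts[day] += ghosted          # bool counts as 0 or 1

rate_by_day = {day: ghosts[day] / sent[day] for day in sent}
for day, rate in rate_by_day.items():
    print(f"{day}: {rate:.0%}")
```

Eight weeks of consistently coded invites is usually enough for this cut to stop being noise.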

Quick read, then how hiring teams use it

This is for recruiters, sourcers, TA leads, and HR business partners who need shared vocabulary for pipeline reviews, hiring manager syncs, and vendor conversations. Skim the first section for a fast shared picture. Use the second when you are setting up tracking, pulling reports, or deciding what to fix first.

Plain-language summary

  • What it means for you: Ghosting metrics turn "candidates just disappear" from a feeling into a number attached to a specific stage, so your conversation with a hiring manager starts from data rather than a shrug.
  • How you would use it: Pick the stage where you feel the most silent drop-off. Count how many candidates you chased twice with no reply over the last 90 days. Divide by how many entered that stage. That's your starting ghost rate.
  • How to get started: Clean your ATS disposition codes first. If "ghosted" is not a disposition option, add it. Without consistent codes, your ghost rate includes pipeline lag and looks misleading.
  • When it is a good time: When time-to-fill is climbing and the team cannot agree on where the pipeline is breaking. Ghosting by stage replaces anecdote with data.

When you are running live reqs and tools

  • What it means for you: A ghost rate is only as reliable as your ATS hygiene. When recruiters batch-update on Fridays or delay logging rejections, candidates linger in active states and artificially inflate the rate. Audit how often stages actually move before trusting the numbers in a leadership report.
  • When it is a good time: After at least 60 days of clean disposition coding and at least one named owner per stage. Setting targets before owners are agreed creates a metric nobody acts on.
  • How to use it: Connect ATS stage timestamps and disposition exports to a weekly summary. Cross-reference with sourcing funnel metrics to separate ghosting from explicit drop-off, and with time in stage reporting to see whether slow stages correlate with higher ghost rates.
  • How to get started: Start with two stages: recruiter outreach response and post-screen follow-up. Set a five-business-day ghost threshold. Track for four weeks before expanding. Compare ghost rates by source channel to identify which channels deliver candidates who engage versus those who disappear.
  • What to watch for: International candidates who move more slowly for legitimate reasons, passive candidates who need more time to consider, and roles where the screening task is long enough to warrant a 10-day window. A single ghost threshold applied uniformly across all role types produces misleading comparisons.
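The five-business-day threshold in the steps above is a calendar question, not just a count. A minimal sketch of the threshold check, assuming only a last-reply timestamp per candidate; the function names are illustrative.

```python
from datetime import date, timedelta

def business_days_since(last_touch: date, today: date) -> int:
    """Count Mon-Fri days elapsed after last_touch, up to and including today."""
    days = 0
    d = last_touch
    while d < today:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday=0 .. Friday=4
            days += 1
    return days

GHOST_THRESHOLD = 5  # five business days of silence, per the starting rule above

def is_past_threshold(last_reply: date, today: date) -> bool:
    return business_days_since(last_reply, today) > GHOST_THRESHOLD

# Friday 2026-05-01 to Monday 2026-05-11 spans six business days of silence.
print(is_past_threshold(date(2026, 5, 1), date(2026, 5, 11)))
```

Roles that warrant a 10-day window, as noted above, would simply use a different threshold per role type rather than one global constant.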

Where we talk about this

On AI with Michal live sessions, candidate ghosting comes up in both the AI in recruiting and sourcing automation tracks. Sourcing automation sessions cover how to wire ATS disposition exports into a weekly ghost rate summary; AI in recruiting sessions connect ghosting patterns to hiring manager communication cadence and candidate experience decisions. If you want the full room discussion on what ghost rates actually tell you versus what teams assume they mean, start at Workshops and bring your current ATS reporting setup.


Ghosting metrics versus related pipeline measures

| Metric | What it tracks | Limitation |
| --- | --- | --- |
| Ghosting rate by stage | Candidates who go silent without withdrawing | Requires consistent ATS disposition codes to be accurate |
| Sourcing funnel metrics | Stage-to-stage conversion, including all exits | Does not distinguish ghosting from explicit declines or rejections |
| Offer decline analysis | Candidates who explicitly say no at offer | Misses the silent exits before the offer stage |

Frequently asked questions

What are candidate ghosting metrics?
Candidate ghosting metrics measure how often candidates stop responding at each stage of the hiring pipeline without formally withdrawing. The core metric is a ghosting rate: the percentage of active candidates in a stage who never reply to a follow-up within a defined window, often five to seven business days. Supporting cuts include ghost rate by source channel, role type, seniority band, and hiring manager. Unlike sourcing funnel metrics, which track stage-to-stage conversion, ghosting metrics isolate silent drop-offs from decisions, rejections, and explicit withdrawals. Separating the two gives TA teams an accurate picture of where candidates disengage versus where the team declines them.
How do you calculate a ghosting rate?
Ghosting rate for a stage = candidates with no response after N business days divided by candidates who received outreach or an invitation in that stage, expressed as a percentage. The hardest part is agreeing on N. Two days is too short for a phone screen invite; 10 days is too generous before a final-round slot. Most teams start with five business days and adjust after reviewing a 90-day baseline. The numerator requires consistent disposition codes in the ATS: a candidate left in an active state when they went quiet should be coded as ghosted, not left open. Without clean codes, the rate includes pipeline lag and looks worse than it is.
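Since the hardest part is agreeing on N, one way to ground that decision is to replay a baseline of actual response lags and see what fraction of eventual responders each window would have miscoded as ghosts. A minimal sketch; the lag values are hypothetical, not a benchmark.

```python
# Hypothetical response lags (business days until a real reply) from a
# 90-day baseline. At window N, any responder slower than N would have
# been flagged as a ghost in error.
reply_lags = [1, 1, 2, 2, 3, 3, 4, 5, 6, 8]

false_flag_rate = {}
for n in (2, 5, 10):
    false_flag_rate[n] = sum(lag > n for lag in reply_lags) / len(reply_lags)
    print(f"N={n}: {false_flag_rate[n]:.0%} of eventual responders flagged")
```

Picking the smallest N with an acceptable false-flag rate keeps the metric sensitive without punishing legitimate slow movers.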
Which stages typically show the highest ghosting rates?
Offer-stage ghosting gets the most attention because it is painful, but in-process ghosting between submission and first contact is often higher by volume. In high-volume roles, ghosting rates of 30 to 50 percent at the application-to-screen stage are common, particularly where time-to-first-contact exceeds three days. After the recruiter screen, ghosting typically drops because candidates have invested more time. It spikes again at take-home assessment stages, where a task feels high-effort before any commitment. Understanding which stage has your highest rate tells you whether the fix is speed (earlier stages) or expectation-setting and friction reduction (later stages). Cross-reference with time in stage reporting to see whether slow stages correlate with ghosting peaks.
How can AI help detect or predict candidate ghosting?
AI can surface which candidates in your current pipeline match the profile of past ghosts: long time since last touchpoint, source channel with a historically high ghost rate, role type where assessments have caused drop-off before. A lightweight approach feeds your ATS stage log and disposition history into a weekly model run that outputs a ranked list of at-risk candidates. The recruiter then decides which get proactive outreach. The risk: AI-generated ghost predictions based on thin data surface false positives for passive or international candidates who move more slowly for legitimate reasons. Treat the output as a triage signal, not a verdict. Talent acquisition metrics cover the measurement framework that gives AI enough signal to work with.
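The "triage signal, not a verdict" framing above can be illustrated without any model at all: a transparent rule-based score over the same three signals. The weights, field names, and stage priors below are illustrative assumptions, not a recommended configuration.

```python
# Hypothetical pipeline snapshot; field names are assumptions for
# illustration, not a real ATS schema.
candidates = [
    {"name": "A", "days_since_touch": 7, "channel_ghost_rate": 0.45, "stage": "assess"},
    {"name": "B", "days_since_touch": 2, "channel_ghost_rate": 0.10, "stage": "screen"},
    {"name": "C", "days_since_touch": 5, "channel_ghost_rate": 0.30, "stage": "assess"},
]

# Assumed per-stage risk priors, e.g. from historical ghost rates by stage.
STAGE_RISK = {"screen": 0.2, "assess": 0.5, "offer": 0.3}

def risk_score(c):
    # Higher score = longer silence, riskier source channel, riskier stage.
    return (0.1 * c["days_since_touch"]
            + 0.5 * c["channel_ghost_rate"]
            + STAGE_RISK.get(c["stage"], 0.0))

ranked = sorted(candidates, key=risk_score, reverse=True)
for c in ranked:
    print(c["name"], round(risk_score(c), 2))
```

A trained model would replace the hand-set weights, but the output contract is the same: a ranked list the recruiter triages, not an automated decision.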
What causes candidate ghosting at the offer stage?
Offer-stage ghosting is rarely about the offer itself. The most common causes are a competing offer that arrived first because the process was too slow, a candidate who accepted another role during the debrief-to-offer gap, a compensation number delivered without any prior anchoring, and a candidate who felt disrespected at some point and did not want a confrontation. Tracking offer ghosting alongside offer decline analysis and time in stage reporting usually reveals whether the problem is speed (fix the SLA between debrief and offer send) or candidate experience (fix process quality from submission onward). A verbal offer conversation before the written letter prevents most offer ghosts because it creates a two-way commitment moment.
What do GDPR and data retention rules say about ghosting data?
Tracking which candidates ghosted and at what stage is standard ATS disposition data covered by your candidate retention policy and Data Processing Agreement. The risk grows if you share ghosting scores with third-party AI tools outside your DPA, retain individual-level ghost flags beyond your application retention window, or create a blocklist that prevents future applications based on past ghosting. Aggregate reporting (ghost rate by stage, by month, by source) carries minimal personal data risk and is the safer format for cross-team sharing. If you use AI to predict future ghosting risk from historical individual records, confirm lawful basis before wiring the pipeline. GDPR does not prevent ghosting analytics; it requires proportionality in what you retain and who sees it.
Where can I learn to track and reduce candidate ghosting?
The AI in recruiting and sourcing automation tracks at AI with Michal workshops cover how to build a simple ghosting dashboard from ATS disposition exports, set stage-level benchmarks, and wire an alert before a candidate goes fully dark. The talent acquisition metrics and time in stage reporting terms explain the measurement foundation you need before a ghost rate has useful context. For ongoing calibration with a practitioner group that has tried it live, membership office hours cover real ATS configurations and what teams have found works. The Starting with AI: the foundations in recruiting course covers the data hygiene prerequisites before ghosting metrics give you reliable signal.
