AI with Michal

Sourcing pass-through rate

The percentage of sourced candidates who advance from the initial sourcing stage to the next active hiring step, typically a recruiter phone screen or hiring manager submission. It measures how well sourcing criteria match actual hiring needs.

Michal Juhas · Last reviewed May 9, 2026

What is sourcing pass-through rate?

Sourcing pass-through rate is the percentage of candidate profiles a sourcer works that advance to the next active hiring step, typically a recruiter phone screen or hiring manager submission. It sits at the conversion point between sourcing activity and real pipeline contribution.

A team might source 80 profiles against a req in a week, advance 12 to recruiter screens, and log a 15 percent pass-through rate. That number reflects targeting accuracy, brief quality, and the calibration between sourcer and hiring manager, not just how many messages went out.
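The arithmetic is simple enough to sketch. A minimal example using the numbers above (the function name and guard against an empty batch are illustrative, not from any specific ATS or tool):

```python
def pass_through_rate(profiles_worked: int, advanced: int) -> float:
    """Pass-through rate = advanced / worked, as a percentage."""
    if profiles_worked == 0:
        return 0.0  # no profiles worked yet, nothing to measure
    return 100 * advanced / profiles_worked

# The example above: 80 profiles worked, 12 advanced to recruiter screens.
rate = pass_through_rate(80, 12)
print(f"{rate:.1f}%")  # 15.0%
```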

The metric matters more as AI tools scale sourcing volume. When a sourcer can work ten times the profiles with AI assistance, raw profile count stops carrying signal. Pass-through rate is what separates productive scale from volume theater.

Illustration: sourcing pass-through rate as a filter gate narrowing a pool of sourced candidate profiles into an advanced subset entering the hiring pipeline, with a hiring manager calibration arrow looping back to sharpen the filter criteria

In practice

  • A sourcer reviews 60 AI-suggested profiles against a software engineering req brief. She shortlists 10 for recruiter review and 7 advance to a phone screen, giving her an 11.7 percent pass-through rate for that batch. She tracks this weekly against her rolling 12 percent target and flags when it drops two weeks in a row.
  • After a hiring manager rejects 8 of 10 submitted profiles in the first two weeks, the TA lead pulls pass-through data and finds the sourcing brief missed a critical requirement. The team adds the missing criteria to the intake template before the next batch is worked.
  • A sourcing automation playbook includes a weekly alert: if pass-through drops below 8 percent, it triggers a 20-minute calibration debrief with the hiring manager before the sourcer continues the search.
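The alert in the last bullet can be sketched as a small check over weekly batch data. Everything here is hypothetical (field names, req IDs, and the 8 percent floor mirror the playbook described above, not a real tool's schema):

```python
FLOOR = 8.0  # percent; the calibration-debrief trigger from the playbook

def weekly_alerts(batches: list[dict]) -> list[str]:
    """Return req IDs whose pass-through rate fell below the floor."""
    flagged = []
    for b in batches:
        rate = 100 * b["advanced"] / b["worked"] if b["worked"] else 0.0
        if rate < FLOOR:
            flagged.append(b["req"])
    return flagged

batches = [
    {"req": "ENG-101", "worked": 60, "advanced": 7},  # 11.7%, above floor
    {"req": "ENG-204", "worked": 90, "advanced": 5},  # 5.6%, triggers debrief
]
print(weekly_alerts(batches))  # ['ENG-204']
```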

Quick read, then how hiring teams use it

This is for sourcers, TA leads, and TA ops practitioners who need a shared metric in pipeline reviews, ICP calibration calls, and sourcing automation debriefs. Skim the first section for a fast shared picture. Use the second when configuring dashboards or setting alert thresholds for AI-assisted sourcing.

Plain-language summary

  • What it means for you: Sourcing pass-through rate answers "of the profiles I worked this week, how many actually became real conversations?" It tells you whether sourcing effort is producing pipeline or producing activity that fades at the first review.
  • How you would use it: Pick one stage boundary (most teams use sourced-to-screened) and track it weekly by req family. When it drops two consecutive weeks, investigate criteria alignment before adding more outreach volume.
  • How to get started: Pull the last month of sourcing activity and count how many profiles advanced to the next stage. Divide by total profiles worked. If you do not have that data, that is the first fix: log every profile touched and its outcome in the ATS.
  • When it is a good time: Before increasing AI-assisted sourcing volume, and after every major change to sourcing criteria, so you have a baseline to compare against.
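The getting-started step above, sketched against a hypothetical activity log (one row per profile touched; the outcome labels are illustrative, not a standard ATS vocabulary):

```python
# Last month's sourcing log: one row per profile touched, with its outcome.
rows = [
    {"profile": "p1", "outcome": "screened"},
    {"profile": "p2", "outcome": "no_response"},
    {"profile": "p3", "outcome": "rejected_at_review"},
    {"profile": "p4", "outcome": "screened"},
]

# Sourced-to-screened: profiles that advanced divided by profiles worked.
advanced = sum(1 for r in rows if r["outcome"] == "screened")
rate = 100 * advanced / len(rows)
print(f"{rate:.1f}%")  # 50.0%
```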

When you are running live reqs and tools

  • What it means for you: At scale, pass-through rate is how you catch calibration failures before they waste a sourcer week. A 5 percent pass-through on an AI-heavy sourcing run usually means the targeting criteria or the ICP input to the AI is off, not that the sourcer is underperforming.
  • When it is a good time: After every ICP update, after adding a new sourcing channel, or after switching AI sourcing tools. Pass-through movement tells you which change mattered and in which direction.
  • How to use it: Set a floor threshold in your weekly sourcing review (for example, under 10 percent sourced-to-screened triggers a calibration call). Log which ICP version and which tool variant each batch used so you can trace drops to a specific change.
  • How to get started: Standardize stage labeling in your ATS so sourced profiles are distinguishable from inbound applicants. Without that separation, pass-through rate for outbound sourcing is contaminated by inbound conversion and the signal disappears.
  • What to watch for: Pass-through rate and response rate moving in opposite directions. High response rate with low pass-through means candidates are replying but not qualifying: either the outreach is too broad or the brief is unclear. Both signals appear in the same sourcing funnel metrics view if you track them together.
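The divergence check in the last bullet can be made explicit. A sketch with assumed thresholds (the 30 percent response baseline and 10 percent pass-through floor are placeholders; set your own from trend data):

```python
def diagnose(response_rate: float, pass_through: float,
             resp_ok: float = 30.0, pt_floor: float = 10.0) -> str:
    """Flag the response-rate vs pass-through divergence described above."""
    if response_rate >= resp_ok and pass_through < pt_floor:
        return "replying but not qualifying: tighten outreach or the brief"
    if response_rate < resp_ok and pass_through < pt_floor:
        return "targeting problem: revisit ICP and channel mix"
    return "within thresholds"

# High response, low pass-through: candidates reply but do not qualify.
print(diagnose(42.0, 6.5))
```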

Where we talk about this

On AI with Michal live sessions, sourcing pass-through rate comes up in sourcing automation blocks as the core feedback signal that keeps AI-assisted volume honest. Teams walk through how to calculate it from ATS exports and outreach tool data, and how to wire it into a calibration cadence with hiring managers so criteria improve in real time. Join the full room conversation at Workshops.

Around the web (opinions and rabbit holes)

Third-party creators move fast. Treat these as starting points, not endorsements, and double-check anything before you wire candidate data.


Pass-through rate versus related sourcing metrics

Metric | What it measures | When it warns you
Response rate | Outreach landing rate | Messaging or targeting problem
Pass-through rate | Profile-to-pipeline conversion | Calibration or ICP quality problem
Screened-to-submitted | Sourcer-to-HM conversion | Brief alignment or sourcer skill gap
Source-to-offer | End-to-end channel value | Strategic channel mix needs review

Frequently asked questions

What is sourcing pass-through rate and how is it calculated?
Sourcing pass-through rate is the percentage of candidates who advance from a specific sourcing stage to the next active step in the hiring pipeline. The most common version is sourced-to-screened: divide the number of phone screens completed by the total profiles worked in a period. If your team sourced 80 profiles and 12 advanced to a screen, pass-through rate is 15 percent. Some teams also measure it at the sourced-to-response or sourced-to-qualified level, depending on which handoff is the key conversion for their process. Agree on one consistent definition before tracking, or the number means different things across sourcers and reqs and calibration becomes impossible.
How does AI change which sourcing pass-through rates matter most?
When AI sourcing tools increase the profiles a sourcer can work in a day, pass-through rate separates real pipeline contribution from volume theater. A sourcer who works 200 AI-suggested profiles in a week but advances only two to a screen has a 1 percent pass-through rate: the AI saved time on outreach mechanics but the targeting model is off. Pair AI sourcing tools with weekly pass-through review so you can tell whether the tool is surfacing stronger fits or just more volume. Low pass-through after an ICP change is often the fastest signal that new criteria are broken, well before enough data accumulates in sourcing funnel metrics to make the problem visible.
What is a healthy sourcing pass-through rate benchmark?
Benchmarks vary widely by role complexity, hiring velocity, and how your team defines the stage boundary. For direct outbound sourcing in technical roles, sourced-to-screened rates of 10 to 25 percent are common; for executive or niche roles, 5 to 15 percent is typical because criteria are tighter and the pool smaller. Comparing against published industry numbers is less useful than tracking your own trend. If sourced-to-screened was 18 percent last quarter and is now 9 percent with the same sourcer and req type, that drop is the story. Bring actual pass-through data to sourcing automation workshops to calibrate against peers running similar roles, not averages from surveys of mixed seniority and function.
Why does sourcing pass-through rate drop when outreach volume increases?
The most common reason is ICP criteria loosening under pressure to fill pipeline quickly. More profiles are worked, but more are marginal fits, so fewer pass. A second cause is calibration lag: the hiring manager updates requirements after seeing early screens but nobody updates the sourcing brief. A third is candidate data enrichment errors scaling with volume, so profiles that look strong in a sourcing tool reveal gaps when a recruiter reviews them closely. Diagnose by splitting pass-through by sourcer, by channel, and by req before attributing the drop to sourcer skill. Brief quality and data accuracy fail first more often than sourcer approach, and fixing them is faster than retraining.
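The diagnostic split described above, aggregate pass-through by one dimension at a time before attributing a drop to sourcer skill, can be sketched as a small group-by (row fields and values are hypothetical):

```python
from collections import defaultdict

def split_pass_through(rows: list[dict], key: str) -> dict:
    """Pass-through rate per value of one dimension (sourcer, channel, req)."""
    worked = defaultdict(int)
    advanced = defaultdict(int)
    for r in rows:
        worked[r[key]] += 1
        advanced[r[key]] += 1 if r["advanced"] else 0
    return {k: round(100 * advanced[k] / worked[k], 1) for k in worked}

rows = [
    {"sourcer": "A", "channel": "linkedin", "advanced": True},
    {"sourcer": "A", "channel": "linkedin", "advanced": False},
    {"sourcer": "B", "channel": "github",   "advanced": False},
    {"sourcer": "B", "channel": "github",   "advanced": False},
]
print(split_pass_through(rows, "sourcer"))  # {'A': 50.0, 'B': 0.0}
```

Run the same split with `key="channel"` or `key="req"` to see which dimension explains the drop.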
How do hiring manager feedback loops improve sourcing pass-through rate?
The fastest improvement comes from debriefing with the hiring manager after every five to ten profiles they review: which two or three came closest to the brief and why did the others miss? That input tightens criteria immediately. Without this loop, sourcers fly blind, especially on new roles where stated and actual preferences diverge. Log calibration notes in your ATS or a shared debrief doc so the pattern is visible when you rotate sourcers. A scorecard makes feedback structured rather than impressionistic. In sourcing automation workshops, teams rebuild this loop with a shared intake template so calibration is documented from the start of a req, not retrofitted after ten profiles go nowhere.
How does sourcing pass-through rate connect to time to fill?
Low pass-through rate is a hidden driver of long time to fill that often gets misdiagnosed as insufficient pipeline volume. If the team contacts 500 people in a month but only 12 advance to screens, the funnel has a targeting or calibration problem, and adding more contacts will not fix it. Pass-through rate tells you whether sourcing effort is converting to active pipeline. When it is low, check brief quality and hiring manager feedback latency before asking the team to send more messages. High pass-through with low total screens means source more; low pass-through with high volume means recalibrate criteria. Both diagnoses need different remedies, and only the metric tells you which one you have.
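The two remedies named above map to a simple decision rule. A sketch with illustrative thresholds (the 10 percent floor and 10-screen target are placeholders for your own baselines):

```python
def remedy(pass_through: float, screens: int,
           pt_floor: float = 10.0, screens_target: int = 10) -> str:
    """High pass-through, low screens -> source more; low pass-through -> recalibrate."""
    if pass_through >= pt_floor and screens < screens_target:
        return "source more: targeting works, volume is low"
    if pass_through < pt_floor:
        return "recalibrate criteria before adding volume"
    return "funnel healthy"

print(remedy(18.0, 4))   # healthy targeting, thin volume
print(remedy(2.4, 12))   # volume is there, criteria are off
```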
Where can sourcers practice measuring and improving pass-through rate?
Join a sourcing automation workshop where teams build live funnel reports from ATS exports and outreach tool logs, including sourced-to-screened calculations and calibration loop design. The Starting with AI: the foundations in recruiting course connects these metrics to prompt governance and AI targeting so sourcers understand how tooling choices affect their numbers. Bring an export of recent sourcing activity and an ATS report showing which profiles advanced; the group will identify whether your pass-through issue is brief quality, channel selection, or AI targeting drift. After the session, assign one person to own the pass-through definition so numbers mean the same thing when TA leads and sourcers review the same dashboard.
