AI with Michal

Standard operating procedures for AI recruiting

Step-by-step procedures that tell recruiting teams how to use AI tools at each hiring stage: which prompt template to run, what to review before output leaves the system, how to log AI involvement, and when a human decision is required.

Michal Juhas · Last reviewed May 5, 2026

What are standard operating procedures for AI recruiting?

An SOP for AI recruiting is a step-by-step procedure that tells your team exactly how to use an AI tool at a specific hiring stage: which model to run, which prompt template to use, what a reviewer must check before output leaves the system, and how to log the run. It is not a policy statement or a vendor guide. It is the operational document that makes AI-assisted work repeatable, auditable, and recoverable when something goes wrong.

The gap most teams experience is not a lack of AI tools. It is the absence of any shared agreement on how those tools are used. Without an SOP, AI use varies recruiter to recruiter: different prompts, different review habits, different logging or no logging at all. That variability becomes a GDPR risk, a candidate experience inconsistency, and a training problem when a new team member joins.

Illustration: AI recruiting SOP showing a procedure document feeding a prompt template, AI output passing through a human review gate, and a log chip recording the run before the result reaches the ATS record

In practice

  • A TA ops manager reviewing a data protection audit discovers that three recruiters used three different AI tools for the same outreach campaign with no logs showing which version ran. That is the problem an SOP prevents.
  • A sourcer onboarding a new teammate explains the outreach process: we use this prompt template from the library, review the name and role claim before sending, and drop the run ID into the ATS note field. That is an SOP working in practice.
  • A recruiting lead tells their team that using AI is not defined until the SOP covers which tool, which prompt, who reviews, and where the log goes. Anything less is a habit, not a procedure.

Quick read, then how hiring teams use it

This is for recruiters, TA partners, and HR leaders who need a shared definition of AI recruiting SOPs that holds in compliance reviews, tool onboarding, and process audits. Skim the first section for a fast shared picture; use the second when deciding what to document for your live stack.

Plain-language summary

  • What it means for you: An SOP tells every team member exactly how to use a specific AI tool at a specific hiring step: which prompt, who reviews it, what to check, and where to record that it ran.
  • How you would use it: Pick one AI-assisted task you already run every week (outreach drafting is the most common) and write the five steps as a one-page document: who runs the prompt, which template, what the reviewer checks, where the log goes, and who to escalate to when output fails.
  • How to get started: Pull your three most recent AI-assisted outreach sequences. List what you actually did: tool, prompt, review step (if any), log location. The gaps between what you did and what you would want to repeat every time are the SOP sections to write first.
  • When it is a good time: When a new recruiter joins and AI habits vary across the team, when a compliance review asks how candidate data was processed, or when a tool update breaks an existing prompt and the team needs a decision process.

When you are running live reqs and tools

  • What it means for you: SOPs translate tool adoption into reproducible process. Without a written procedure, AI use on live reqs depends on who is on shift and what they remember from the last training session. That is how outreach quality drifts and GDPR gaps accumulate across a quarter.
  • When it is a good time: When an AI tool moves from pilot to standard use across more than one recruiter, when a candidate or DPA asks for documentation of AI involvement, or when a model version changes and you need to decide whether the current procedure still applies.
  • How to use it: Wire the SOP into your actual tools: link the approved prompt template from the prompt library directly in the SOP document, add a required note field in the ATS for AI run logs, and set the review step as a non-optional gate in your workflow automation before any message can be sent.
  • How to get started: Draft the SOP for your highest-volume AI task in one page. Review it with your DPO or legal contact before it goes live. Run it on the next five live reqs and log every deviation. Deviations are the places to clarify the procedure, not to blame the recruiter.
  • What to watch for: SOPs that live only in a shared document become irrelevant within a quarter. Attach the review cycle to something that already happens, such as a monthly TA ops call or a quarterly business review. Log model versions and prompt template versions in the SOP itself, not just the tool name, so a six-month-old run can be reconstructed when needed.
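The logging advice above (record tool, model version, and prompt template version per run, not just the tool name) can be sketched as a structured log entry. This is a minimal illustration, not a real ATS integration; all field names are assumptions chosen for the example.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class AIRunLog:
    """One logged AI-assisted run, attached to an ATS note on the req."""
    tool: str                 # e.g. the drafting tool's name
    model_version: str        # pin the model version, not just the vendor
    prompt_template_id: str   # links back to the prompt library
    req_id: str               # which requisition the run belongs to
    run_date: date
    reviewed_by: str = ""     # empty until the review gate is passed

    def audit_gaps(self) -> list[str]:
        """Return the fields a compliance review would find missing."""
        gaps = []
        if not self.model_version:
            gaps.append("model_version")
        if not self.prompt_template_id:
            gaps.append("prompt_template_id")
        if not self.reviewed_by:
            gaps.append("reviewed_by")
        return gaps
```

A run logged without reviewer sign-off surfaces immediately: `AIRunLog("DraftTool", "v2", "outreach-01", "REQ-1042", date.today()).audit_gaps()` returns `["reviewed_by"]`, which is exactly the gap a six-month-old reconstruction would otherwise miss.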

Where we talk about this

On AI with Michal live sessions, SOPs for AI recruiting come up across both the AI in recruiting and sourcing automation tracks: AI in recruiting covers how to define review gates and log AI involvement at each funnel stage, and sourcing automation covers the data-handling and compliance procedures that prevent automation from creating GDPR risk. If you want the full room conversation with real stack questions, start at Workshops and bring your current tool list and one recent req where process broke down.

Around the web (opinions and rabbit holes)

Third-party creators move fast. Treat these as starting points, not endorsements, and double-check anything before you wire candidate data.

YouTube

  • Search for "AI recruiting process documentation" and "TA ops standard operating procedures" on YouTube. Content from TA operations specialists and HR technology practitioners typically covers procedure design in more grounded terms than vendor demos.
  • "GDPR recruiting compliance" and "data protection hiring process" return academic and practitioner content that connects directly to the logging and retention sections of an AI recruiting SOP.

Reddit

  • r/recruiting and r/humanresources have ongoing threads on process documentation, AI tool governance, and how to tell new recruiters what to do when habits vary across the team.
  • r/RecruitmentAgencies has practical discussion on tool standardization across multi-recruiter environments, directly relevant to SOP design for agency and RPO teams.

Quora

  • Search "how to document AI recruiting process" and "recruitment SOP template" for a range of practitioner perspectives across startup and enterprise. Cross-check any specific advice against your own stack and data protection obligations before adopting it.

SOP versus ad hoc AI use

Dimension | With SOP | Ad hoc AI use
Prompt consistency | Shared approved template per task | Varies recruiter to recruiter
Review gate | Named reviewer, defined checklist | Optional or skipped under pressure
AI run logging | Logged per req (tool, version, date) | Not tracked
GDPR audit readiness | Documentation available on request | Requires memory reconstruction
Onboarding new team member | Follow the SOP | Shadow existing habits


Frequently asked questions

What should an SOP for AI-assisted outreach actually contain?
At minimum: which tool and model version to use, the approved prompt template or a link to the prompt library, what the reviewer must check before sending (candidate name, job title, factual claims), how to log the AI run (tool, date, req ID), and what to do when output is wrong or uncertain. Outreach SOPs that skip logging make GDPR audits painful months later when a candidate asks how their data was used. In cohort sessions, teams that document these five components consistently can onboard a new recruiter to AI-assisted workflows in under two hours rather than relying on word-of-mouth habits that drift.
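The five components listed above can be expressed as a one-page document skeleton. The sketch below shows one possible shape; section names mirror the components in the answer, and every value is a placeholder assumption rather than a recommended tool or template.

```python
# Illustrative one-page SOP skeleton for AI-assisted outreach.
# All values are placeholders; only the five section names matter.
OUTREACH_SOP = {
    "tool": {"name": "<drafting tool>", "model_version": "<pin the version>"},
    "prompt": {"template_id": "<library template id>", "library_link": "<prompt library URL>"},
    "review_checklist": ["candidate name", "job title", "factual claims"],
    "logging": {"fields": ["tool", "date", "req_id"], "location": "ATS note field"},
    "escalation": {"contact": "<TA ops owner>", "trigger": "wrong or uncertain output"},
}

def missing_sections(sop: dict) -> list[str]:
    """Flag any of the five required sections that are absent from a draft SOP."""
    required = ["tool", "prompt", "review_checklist", "logging", "escalation"]
    return [section for section in required if section not in sop]
```

Running `missing_sections` against a draft is a quick completeness check: a document that only names the tool comes back with four gaps, which are the sections to write first.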
How do SOPs prevent AI hallucinations from reaching candidates?
By building a mandatory review step into the procedure, not as optional advice. A well-written SOP names the reviewer (the recruiter who owns the req), lists what to check (name, role title, company, specific claims about the candidate), and sets a hard rule: the AI draft does not leave the drafting tool until the reviewer signs off. In practice, teams add a reviewed-by note field in the CRM or ATS before the message thread unlocks. Without that procedural gate, hallucination risk shifts from theoretically possible to certain under deadline pressure. See human-in-the-loop for the full gate design pattern.
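The hard rule described above (the AI draft does not leave the drafting tool until the reviewer signs off) can be sketched as a procedural gate. This is a toy illustration of the pattern, not any real CRM or ATS API; the function and field names are assumptions.

```python
class UnreviewedDraftError(Exception):
    """Raised when an AI draft tries to leave the drafting tool without sign-off."""

def send_outreach(draft: dict) -> str:
    # Hard gate: the reviewed_by field must be set before the message thread unlocks.
    if not draft.get("reviewed_by"):
        raise UnreviewedDraftError(
            f"Draft for req {draft.get('req_id', '?')} has no reviewer sign-off"
        )
    return f"sent: req {draft['req_id']} (reviewed by {draft['reviewed_by']})"
```

The point of the pattern is that skipping review is not a quiet default: under deadline pressure the unreviewed path fails loudly instead of sending.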
Who owns an AI recruiting SOP: TA ops, legal, or the recruiter?
All three, with different scopes. TA ops or a senior recruiter drafts and maintains the procedure: they know the tools, prompt templates, and workflow edge cases. Legal or the DPO reviews data handling, retention, and GDPR sections, especially where candidate data flows through a third-party AI tool. The individual recruiter owns execution: follow the SOP, log AI use, and flag when the procedure breaks on a live req. If no one owns the maintenance cycle (quarterly at minimum), the SOP goes stale within six months as tools update and prompts change. Assign each section an explicit owner and a review date before the document goes live.
How often should AI recruiting SOPs be reviewed and updated?
At minimum, quarterly. AI tools update model versions, vendors change APIs, and regulators issue new guidance on automated decision-making faster than most document cycles assume. A practical schedule: a brief check when a tool updates (does the output format still match the SOP instructions?) and a full review every quarter to test approved prompts against current model behavior. Log what changed and why, not just the date. Teams that attach the SOP review to an existing monthly TA ops meeting maintain the rhythm better than those who schedule it separately. Any req that surfaces an AI error is also a trigger for an unscheduled review of the relevant procedure step.
What is the difference between an SOP and a recruiting prompt library?
A prompt library is a curated collection of tested templates organized by task. An SOP is the procedure that tells you when to use which prompt, what to do with the output, and how to handle errors. One answers what to type; the other answers the full workflow. You need both: a library without an SOP lets anyone use any prompt in any context, creating compliance and consistency gaps. An SOP without a library forces each recruiter to write prompts from scratch, reintroducing the variability the SOP was meant to remove. In practice, the SOP references the library by task (outreach draft, screening summary, intake notes) and names the approved template for each use case.
How do SOPs help with GDPR and audit readiness?
When a data protection authority or a candidate asks how their data was processed, the SOP is the audit trail. A well-maintained SOP names the tool, the data inputs (profile fields, resume text, job description), the output type (draft, summary, score), the retention period, and the review step that happened before any output left the system. Without that documentation, answering what AI was used and what data was processed requires reconstructing decisions from memory, which rarely holds in a compliance review. SOPs also help demonstrate that candidate data enrichment and screening steps matched the lawful basis logged at first contact. Write the GDPR section before the tool goes live, not after the audit letter arrives.
Where do teams start when building their first AI recruiting SOP?
Start with the highest-volume AI-assisted task, usually outreach drafting or resume screening summary, and document only what actually happens today, not what should ideally happen. A one-page procedure covering tool, prompt source, review checklist, log location, and escalation contact outperforms a twenty-page policy no one reads. Bring the draft to a workshop session and test it against real edge cases: what happens when the model returns empty output? What if a candidate profile has no public presence? Those gaps surface faster in a room than in isolation. The Starting with AI: the foundations in recruiting course covers the prompt habits and review patterns an SOP should formalize. Membership office hours help compare SOPs across teams running the same tools.
