Day 12: Screen CVs Against a Job Description
The Concept
The average recruiter spends between six and ten seconds reviewing a CV before making an initial pass-or-fail decision. That figure has been reproduced in research studies across different contexts and countries, and it will be familiar to anyone who has sat on a hiring panel during a busy recruitment cycle. When you have eighty applications for a role and a shortlist to produce by Thursday, the cognitive shortcuts that allow you to process volume quickly are also the shortcuts that introduce bias.
The problem is not that recruiters are careless or prejudiced. It is that the CV screening task, as typically structured, is designed in a way that makes bias almost inevitable. When the evaluation criteria are implicit rather than explicit, when there is no agreed weighting for different qualifications, and when the assessment happens in a single fast pass, pattern-matching to previous successful candidates — who may themselves reflect historical hiring biases — is the brain's default mode. The result is not a fair meritocracy. It is a reproduction of whatever selection patterns existed before.
Why Consistency in Early Screening Matters
The shortlisting decision is the most consequential point in the recruitment process for equity and inclusion. At every subsequent stage, you are working with a pool that was shaped by this initial filter. If that filter systematically excludes certain profiles — candidates with career breaks, candidates whose names trigger unconscious associations, candidates who came through non-traditional paths — the diversity problem that manifests at the offer stage actually originated at application screening, and it cannot be fixed later in the process.
Consistency does not mean looking for identical candidates. It means that every candidate's application is evaluated against the same criteria, with the same weightings, based on the same evidence standard. A candidate who has done the required type of work in a smaller organisation should score the same as a candidate who did the same work at a brand-name company, if the criterion is the work itself rather than the context. Building the criteria explicitly, before you see any CVs, forces that clarity.
How AI Applies a Structured Rubric Rather Than Pattern-Matching
The core value of AI in CV screening is not speed — it is consistency. A human reviewer applying a scoring framework will still drift over the course of eighty applications: the criteria that felt important at application ten may be weighted differently by application seventy, especially if the reviewer has been doing other work in between. AI applies the same rubric to the first CV and the last CV with the same degree of attention.
AI also makes the evaluation explicit and auditable. When a score is accompanied by a specific statement of evidence — "three years of direct line management experience evidenced by roles at Company A and Company B; criterion scored 2 out of 2" — you can review and challenge that assessment in a way you cannot when the decision was a gut response to a fast scan. This creates a defensible record of the screening process, which matters both for candidates who request feedback and for any future audit of your hiring practices.
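That kind of evidence-backed score can be kept as a structured record rather than a gut call. A minimal sketch in Python of what such a record might look like; the criterion name, candidate ID, and 0-2 scale here are illustrative assumptions borrowed from the prompt below, not a prescribed framework:

```python
from dataclasses import dataclass, field

@dataclass
class CriterionScore:
    """One criterion's score plus the evidence that justifies it."""
    criterion: str
    score: int       # 0-2 scale, as suggested in the prompt of the day
    max_score: int
    evidence: str    # the specific CV evidence behind the score

@dataclass
class ScreeningRecord:
    """A defensible, auditable record of one candidate's screening."""
    candidate_id: str                    # anonymised ID, not a name
    scores: list[CriterionScore] = field(default_factory=list)

    def total(self) -> int:
        return sum(s.score for s in self.scores)

    def maximum(self) -> int:
        return sum(s.max_score for s in self.scores)

    def audit_lines(self) -> list[str]:
        """One human-readable audit line per criterion scored."""
        return [
            f"{s.criterion}: {s.score}/{s.max_score} ({s.evidence})"
            for s in self.scores
        ]

# Example using the line-management criterion from the text above
record = ScreeningRecord(candidate_id="C-014")
record.scores.append(CriterionScore(
    criterion="Direct line management",
    score=2, max_score=2,
    evidence="three years evidenced by roles at Company A and Company B",
))
print(f"{record.candidate_id}: {record.total()}/{record.maximum()}")
```

Because every score carries its evidence, the `audit_lines` output is exactly the kind of reviewable record that supports candidate feedback requests and later audits.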
Where Human Judgment Still Needs to Lead
There are two areas where AI screening requires active human oversight. The first is the criteria themselves. AI will apply whatever framework you give it — including criteria that are discriminatory or legally problematic. The prompt used in this lesson asks the AI to flag potentially biased criteria before scoring begins, but the human must make the final judgment about what the role actually requires, and that judgment needs to be tested against employment law and your own equality monitoring data.
The second area is non-linear career paths. A candidate who spent three years running a small business, or who has a career break for caring responsibilities, or who moved from a completely different sector, may score lower on a rubric designed around conventional career progression while being an excellent hire. AI can flag these profiles for closer review rather than automatic exclusion — but the decision about whether they represent genuine risk or genuine opportunity is a human call that requires context AI does not have.
Prompt of the day
Copy this into your AI tool and replace any bracketed placeholders.
Prompt
You are a structured recruitment specialist. I need to screen a set of CVs against a job description in a consistent, criteria-based way that reduces the influence of unconscious bias. Please help me build a scoring framework and apply it.

Here is the job description:

[PASTE THE FULL JOB DESCRIPTION HERE]

Step 1 — Before I share any CVs, review the job description and identify:
- The five to seven criteria that are most predictive of success in this role (separate must-have criteria from desirable criteria)
- Any criteria in the JD that are vague or potentially biased (e.g. 'culture fit', 'polished presentation', 'strong academic background') and suggest more objective alternatives
- A simple scoring framework (e.g. 0-2 per criterion) I can apply consistently to each CV

Step 2 — Once you have built the framework, I will share the CVs one at a time. For each CV:
- Apply the scoring framework and give a total score out of the maximum
- Note the specific evidence from the CV for each criterion score
- Flag any career patterns that are unusual but may deserve a closer look rather than automatic disqualification (e.g. career breaks, non-linear paths, sector changes)
- Give a one-sentence overall assessment: 'Recommend for interview', 'Possible — recommend phone screen first', or 'Does not meet minimum criteria' with a brief reason

Step 3 — After all CVs are scored, provide a ranked shortlist with a one-line summary for each candidate and flag any patterns worth noting across the pool.
Your 15-minute task
Pick a live role you are currently recruiting for and paste the full job description into the prompt. Run Step 1 first and review the scoring framework before you share any CVs. This step matters: if the criteria are wrong or biased, everything downstream will be too. Once you are satisfied with the framework, add CVs one at a time. Do not share names in the initial scoring pass if you want to reduce name-based bias; you can map names back to scores at the shortlist stage.
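Stripping names before the scoring pass can be done mechanically before you paste each CV in. A rough sketch of one way to do it, assuming you keep a simple mapping from anonymised IDs back to names until the shortlist stage; the candidate name and CV text are hypothetical:

```python
import re

def anonymise(cv_text: str, candidate_name: str, candidate_id: str) -> str:
    """Replace the candidate's full name, and each part of it,
    with an anonymised ID before the CV is scored."""
    redacted = cv_text
    # Redact the full name first, then first/last names on their own
    for name in [candidate_name] + candidate_name.split():
        redacted = re.sub(re.escape(name), candidate_id, redacted,
                          flags=re.IGNORECASE)
    return redacted

# Kept separately and only consulted at the shortlist stage
mapping = {"C-007": "Jane Smith"}
cv = "Jane Smith. Jane has led a team of five at Company A."
print(anonymise(cv, mapping["C-007"], "C-007"))
# prints: C-007. C-007 has led a team of five at Company A.
```

A plain string substitution like this will miss names embedded in email addresses or file headers, so treat it as a first pass rather than a guarantee of anonymity.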
Expected win
A structured scoring framework specific to your role, with must-have and desirable criteria separated, and a scored, evidence-based assessment of each CV you share — resulting in a defensible shortlist with a consistent rationale for every include and exclude decision.
Power user tip
After you have your shortlist, send this follow-up prompt: 'Based on the scoring framework and the profiles of the candidates I am taking forward, write five structured interview questions that would probe the areas where each candidate's evidence was thin or unclear. For each question, note which criterion it is testing and what a strong answer would include.' This turns your screening output directly into interview preparation and ensures that the interview is covering the gaps the CV left open, rather than covering the same ground the application already addressed.