What we’re seeing in the market – both here in the UK and more widely – is a rapid escalation on both sides of the recruitment equation. Clients have for years used low-touch, tech-enabled filters to cut through the volume of applications they receive (despite widespread negative feedback from candidates and mixed hiring outcomes).
Meanwhile, candidates are using AI tools like ChatGPT to help write CVs, cover letters and responses to application questions. Looked at from their perspective, it makes sense: job specs are often vague, competition is high, and most large employers are already using AI to triage applications. In that context, why wouldn’t someone use every tool available to present themselves better?
As recruiters, we’ve started to see more applications that feel overly polished or oddly generic – well-formatted but lacking a voice. Some are easy to spot; others only become apparent at interview. Video-call interviews in particular are becoming less and less useful as a screening medium – we’ve seen plenty that have clearly been co-piloted by AI, with candidates being fed well-formed responses (after only a minor delay!).
And while live AI coaching in interviews is clearly a problem, an AI-polished application feels to me like a realistic picture of how a lot of work will look in the future. The generation joining the workforce now will absolutely use generative AI tools to support key parts of their roles, so demonstrating that usage during the application process shouldn’t, I would argue, be a reason to screen them out.
The real issue, to my mind, isn’t that candidates are using AI. It’s that many hiring processes still rely too heavily on early-stage filters – automated scoring, keyword matching, or undercooked first-round screening. When those systems are too rigid, there’s a danger of missing good people simply because their CV was helped along by a machine. That’s not cheating – it’s a reflection of a market where candidates are trying to cut through the noise. And let’s not forget: plenty of the job ads they’re responding to were themselves likely written with the help of ChatGPT, or (more old-school than that) lifted from a generic JD template.
AI might create the application content, but it can’t provide the human context. For us as industry experts, the value we add is in knowing the person behind the CV – contextual knowledge of a candidate often built over years and many conversations. That means understanding their motivations, probing their decision-making, and offering a proper, human view on how they’ll fit into a team or a leadership structure.
The FT article notes that some employers have started issuing blanket bans on AI-generated applications. I can see the intent, but the reality is that you can’t police how someone drafts a CV. What you can do is structure your hiring process to dig deeper. That likely means more time, energy and effort – more thoughtful screening calls, live case studies, or simply reading between the lines. Because ultimately, recruitment is a people business. It’s about judgment, not just data. AI can support that judgment – but it can’t replace it.
So yes, the hiring arms race is real. But it doesn’t need to be messy.