Did you know that much of what makes up a nuclear explosion has its basis in centuries-old physics?
Take Newton’s Third Law of Motion: “For every action, there is an equal and opposite reaction.”
The sudden expansion of superheated gases leads to a propulsive shockwave.
Terrifying stuff with apocalyptic consequences.
It’s no wonder nuclear weapons underpin so much international diplomacy as a deterrent - if one side uses nuclear weapons, we all lose through mutually assured destruction.
While this might be a worst-case example of humans negotiating an international landscape, it’s a similar psychology to what we see in more mundane interactions, such as the ‘them and us’ of recruitment.
I’ve always said that ‘candidate resentment’ should be a strong lesson in how we can improve our recruitment - given that candidates’ reactions to negative experiences inform how they might respond to the experiences we create.
And we’re starting to see the same scenario in the ‘AI’ arms race.
I put that in speech marks because it remains smarter automation rather than AGI, although the explosion in such AI services is akin to what may yet prove to be the singularity of AGI.
There must be hundreds of AI solutions on the market for recruitment at this point, as well as ancillary technology that might be adopted into it.
My Samsung has just updated to One UI 7.0, offering on-the-fly voice call translation (presumably Star Trek style), transcription and all that jazz.
Our giddiness to implement the shiny seems entirely sensible.
Who wouldn’t want AI that can do everything for us?
AI agents to take away the grunt work.
AI interviews, so we can assess applications at scale.
AI voice calls so we can bash hundreds of potential candidates around the head, trying to gain their interest.
Scale, scale, scale, it’s marvellous, isn’t it?
Except - did we forget to ask the candidates what they think?
In the horrendous perfect storms that make up many employment markets, let’s not forget the millions of job seekers who ‘just want a chance’.
The same job seekers who have been sold the piffle of ‘93% ATS compliance’ and ‘AI is rejecting you’ by dubious career coaches.
Ironically, as despicable as this hyperbole is in its sale of fear and hope, this is the technology recruitment seems to be aspiring to.
There’s a market of millions of job seekers who can access democratised AI cheaply, to improve their odds with the same shiny scale, scale, scale:
AI deep fakes, so they don’t have to deal with AI interviews.
AI applications to take away the grunt work, so we can apply at scale.
And how do you solve a problem like volumes of unqualified applications?
Well, AI seems like a logical solution.
Consider how you react to mass personalised email at the moment.
How do you think people will respond to mass automated voice calls? Democratised technology that everyone will have access to.
If we scale, the only competitive response might be to scale more, rather than doing the complete opposite and focusing on individual human experiences.
If you’re proposing to access the most passive of candidates through automation, what do you think their natural response might be? What about an AI agent that deals with the white noise for them?
If AI is the only answer, then it has two dangers:
to be negated by equal and opposite AI
to create such a problem with spam that laws will need to be implemented to mitigate it - look at how GDPR deals with mass automated email
And if those are two realistic outcomes, ultimately benefiting only the few and not the many, is that really what we want from technology?
Where’s the evidence, I hear you ask?
My LinkedIn feed paints a gloomy picture.
Hollywood’s actors’ strike paints a gloomy picture.
As does Elton John, trying to protect the music industry.
I like original films and music. And you?
Wyatt’s Law states that a sane approach in a mass market is to “reciprocate the level of investment you experience.” (👋)
It’s quite Newtonian, but rather than dealing with physics, it relates to behaviour and finding a path through the noise.
It’s a question of intention, too - if you want committed applications from qualified candidates, should you reciprocate the level of investment you want from them?
Because if you’re leaning on AI, you have to expect they’ll treat you much the same.
Kaboom.
Anyhoo, next time, I’ll look at ethics.
Thanks for reading,
Regards,
Greg