GCHQ, the UK government's communications intelligence agency, is seeking neurodiverse candidates to strengthen its teams, the organisation's Director revealed this week.
Speaking at an event for the dyslexia charity Made By Dyslexia, GCHQ Director Jeremy Fleming said that "with the right mix of minds, anything is possible."
Mr Fleming described a working environment where neurologically atypical staff can collaborate – and often provide unique perspectives on problems and tasks. The programme aims to establish a new type of intelligence – in a very literal sense.
"I have everyone from the country's best mathematicians, some of the most talented engineers and hopefully some of the best analysts," Mr Fleming said. "But I also have people who are keeping the show on the road, who are making the machines work, who are making sure we are giving our best every day – and I can see dyslexics in every bit of the business."
“We specifically – in some of our campaigns around analytical skills – are looking for people with that sort of neural difference”.
The agency – which celebrated its centenary in 2019 – is perhaps most famous for its wartime codebreaking at Bletchley Park. Its move to embrace neurodiversity in the digital age demonstrates its willingness to solve problems differently and build stronger teams in the workplace.
The key to ending unconscious bias
Neurodiverse candidates typically report negative experiences in the hiring process: up to 52 per cent say they have experienced discrimination. This is despite estimates that dyslexics make up around ten per cent of the global population.
Yet when high-profile employers like GCHQ champion workplace diversity, it sets a new standard of priorities for the wider industry.
And if diversity is the motivation to end unconscious bias in hiring, then recruitment software may be the tool that helps us achieve it.
In this article for the Financial Times, one recruitment software firm argues that technology can assist in eradicating hiring bias.
In the article, the hiring software experts suggest that personal and background data can be filtered out of preliminary screening. In theory, this allows recruiters to read CVs without projecting unconscious biases onto their decisions.
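The screening approach described above – sometimes called "blind" screening – can be sketched in a few lines of code. This is an illustrative example only; the field names and the list of what counts as "personal" data are assumptions, not taken from any real recruitment product.

```python
# A minimal sketch of blind screening: strip personal and background
# fields from a candidate record before it reaches a human reviewer.
# The field names below are hypothetical, chosen for illustration.

PERSONAL_FIELDS = {"name", "age", "gender", "photo", "address", "school"}

def redact(candidate: dict) -> dict:
    """Return a copy of the record with personal fields removed,
    leaving only job-relevant information for preliminary screening."""
    return {k: v for k, v in candidate.items() if k not in PERSONAL_FIELDS}

candidate = {
    "name": "A. Example",
    "school": "Example College",
    "skills": ["python", "analysis"],
    "years_experience": 5,
}
print(redact(candidate))  # {'skills': ['python', 'analysis'], 'years_experience': 5}
```

The idea is simply that the reviewer never sees the redacted fields, so those fields cannot feed an unconscious bias – though, as the Amazon case below shows, this alone is not enough when an algorithm does the screening.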
However, we have previously seen how in-house vacancy management software made bias worse at Amazon. There, the process failed because the recruitment AI had been trained on the company's historical hiring records: while human reviewers could no longer see details of candidates' backgrounds, the algorithm could still infer them. The system deduced from prior company records that candidates who had attended all-female colleges were less likely to be hired. In effect, the AI detected the existing company bias and reinforced it.
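The failure mode described above can be illustrated with a toy model. This sketch is not Amazon's actual system; the data and scoring method are invented to show the mechanism: gender is never given to the model, but a correlated field ("college") acts as a proxy, so a model trained on biased history simply memorises that bias.

```python
# An illustrative sketch (not any real system) of proxy bias: a model
# trained on historical hiring decisions reproduces past discrimination
# even though the protected attribute itself was never in the data.

from collections import defaultdict

# Hypothetical past decisions, reflecting a biased hiring history.
history = [
    {"college": "All-Female College", "hired": False},
    {"college": "All-Female College", "hired": False},
    {"college": "Mixed College", "hired": True},
    {"college": "Mixed College", "hired": True},
]

def train(records):
    """Score each college by its historical hire rate."""
    hires, totals = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r["college"]] += 1
        hires[r["college"]] += r["hired"]
    return {c: hires[c] / totals[c] for c in totals}

scores = train(history)
# The "model" has simply memorised the old bias:
print(scores)  # {'All-Female College': 0.0, 'Mixed College': 1.0}
```

Any candidate from the all-female college now scores zero – the historical bias has been laundered through an apparently neutral field, which is exactly why removing the obvious personal data is not sufficient on its own.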
The case shows that removing prejudice from the workplace is not just a matter of choosing the right type of solution. It is also vital that businesses understand their hiring software tools and learn how to use them to make fair, effective decisions.