Neurotech could discriminate against workers, ICO warns
The Information Commissioner’s Office has warned that brain monitoring tech is in danger of discriminating against people with neurodiverse conditions and is working towards guidance on neurotech devices.
The UK data watchdog predicts that the use of technology to monitor neurodata – information that comes directly from the brain and nervous system – will become widespread over the next decade.
“To many, the idea of neurotechnology conjures up images of science fiction films, but this technology is real and developing rapidly,” said Stephen Almond, the ICO’s executive director of regulatory risk.
Neurotech is already being used in heavily regulated fields such as healthcare, where it can treat complex physical and mental illnesses.
The Dutchman Gert-Jan Oskam, for instance, who was paralysed in a cycling accident, is now able to walk again thanks to electronic implants in his brain.
However, as the tech develops across multiple sectors – for personal wellbeing, education, training, sports and marketing, for instance – the ICO is concerned that there is a risk of inherent bias and inaccurate data being embedded in neurotechnology.
The regulator claims that this may affect people with neurodiverse conditions (such as Asperger’s, autism and ADHD).
It suggests that discrimination in neurotechnology could occur ‘where models are developed that contain bias, leading to inaccurate data and assumptions about people and communities’.
If devices are not trialled and assessed on a wide variety of people, or if databases are trained on ‘neuro-normative’ patterns, the ICO warned, risks of inaccurate data emerge.
“Neurotechnology collects intimate personal information that people are often not aware of, including emotions and complex behaviours. The consequences could be dire if these technologies are developed or deployed inappropriately,” said Almond.
“We want to see everyone in society benefit from this technology. It’s important for organisations to act now to avoid the real danger of discrimination,” he added.
According to the ICO, the use of neurotech in the workplace could lead to unfair treatment if specific neuro patterns or information come to be seen as undesirable due to ingrained bias.
The ICO said that it was developing guidance for firms to follow, along with its views on emerging bias risks, which will include sector-specific examples.
The regulator has also published a new report, ICO Tech futures: neurotechnology, detailing possible future avenues of development for neurotechnology in the workplace and in employee hiring.