UK MPs call for new legislation on workplace AI
With Artificial Intelligence (AI) being used to monitor and control workers with ‘little accountability or transparency’, the All-Party Parliamentary Group (APPG) on the Future of Work is calling for the creation of an Accountability for Algorithms Act (AAA).
“The AAA offers an overarching, principle-driven framework for governing and regulating AI in response to the fast-changing developments in workplace technology we have explored throughout our inquiry,” said the APPG in its report The new frontier: artificial intelligence at work, published this week.
“It incorporates updates to our existing regimes for regulation, unites them and fills their gaps, while enabling additional sector-based rules to be developed over time. The AAA would establish: a clear direction to ensure AI puts people first, governance mechanisms to reaffirm human agency, and drive excellence in innovation to meet the most pressing needs faced by working people across the country.”
In response to increasing public concern about AI and surveillance in the workplace, which became more pronounced as the COVID-19 pandemic forced people to work from home, the cross-party group of MPs and peers conducted its inquiry between May and July 2021.
The report said: “AI offers invaluable opportunities to create new work and improve the quality of work if it is designed and deployed with this as an objective.” It added, however, that this potential “is not currently being materialised.”
“Instead, a growing body of evidence points to significant negative impacts on the conditions and quality of work across the country. Pervasive monitoring and target-setting technologies, in particular, are associated with pronounced negative impacts on mental and physical wellbeing as workers experience the extreme pressure of constant, real-time micro-management and automated assessment.”
The report stressed that the main anxiety for workers around AI-powered monitoring is the “pronounced sense of unfairness and lack of agency” around the automated decisions that are made about them.
“Workers do not understand how personal, and potentially sensitive, information is used to make decisions about the work that they do, and there is a marked absence of available routes to challenge or seek redress,” it said. “Low levels of trust in the ability of AI technologies to make or support decisions about work and workers follow from this.”
According to the report, there are even lower levels of trust in how developers and users of algorithmic systems are held accountable for how they are using the technology.
David Davis MP, Conservative chair of the APPG, stated: “Our inquiry reveals how AI technologies have spread beyond the gig economy to control what, who and how work is done. It is clear that, if not properly regulated, algorithmic systems can have harmful effects on health and prosperity.”
Labour MP Clive Lewis said: “Our report shows why and how government must bring forward robust proposals for AI regulation. There are marked gaps in regulation at an individual and corporate level that are damaging people and communities right across the country.”