The UK’s House of Commons voted on 29 November against recommitting the UK’s Data Protection and Digital Information Bill following the recent introduction of government-backed amendments, moving the bill forward to the report stage.
Introduced in the House of Commons on 8 March, the bill is intended to update and simplify the UK’s existing data protection framework, the UK GDPR, with a view to reducing burdens on businesses and researchers.
Michelle Donelan, Secretary of State for Science, Innovation and Technology, said the bill emerged from “a detailed codesign process” with industry, business, privacy and consumer groups, and is expected to boost the economy by £4.7bn ($5.9bn) over the next decade.
Indeed, data-driven trade reportedly accounted for 85% of the UK’s total service exports in 2021, contributing an estimated £259bn to the economy.
However, reactions to the bill have been polarised: it has been warmly welcomed by tech lobby groups while sparking outrage among civil rights campaigners.
Particular fury has been directed at some of the bill’s late amendments, whose eleventh-hour introduction was called a “total abuse of parliamentary process” by prominent civil liberties campaigner Silkie Carlo. One such amendment was the “bank spying clause”, the provisions of which would force banks to surveil the accounts of welfare and state pension recipients in the interests of fraud detection.
Enhancing commercial R&D or exploitation of personal data?
In written evidence, TechUK, the membership organisation championing the UK’s tech sector, welcomed the bill, saying that it would provide greater clarity and legal certainty for organisations as to how and when they can use user data in commercial and scientific R&D – for example, in medical trials, training artificial intelligence (AI) algorithms and product development.
Medtronic, a provider of medical technology solutions, also welcomed the bill’s Clause 2, which would amend the definition of scientific research to include data uses “carried out as a commercial or non-commercial activity” or in any way that “could be reasonably described as scientific”.
“We support the clarification that scientific research has a broad meaning to include critical partners in the research ecosystem such as privately funded and industry-led research,” it said.
However, the Open Rights Group worries Clause 2 could allow for the commercial exploitation of personal data under the guise of scientific research. By overextending the definition of “scientific research”, the clause risks undermining public trust in legitimate research activities.
Clause 9, they said, would also “reduce transparency over personal data uses for research purposes, such as by exempting researchers from providing information to large numbers of individuals or when personal data have been collected a long time ago.”
Lucy Purdon, Senior Tech Policy Fellow at Mozilla, said the changes will “remove incentives for companies to respect privacy”, disempowering consumers while empowering the ad tech and data broker industry.
She said: “Introducing ‘direct marketing’ as a ‘recognised legitimate interest’ as grounds for lawful processing and removing the balancing exercise in the interest of the data subject is out of step with efforts to rein in invasive online tracking.”
Clause 6 also amends the wording of the GDPR’s Article 5 so that the requirements on purpose limitation will be loosened, assisting commercial and non-commercial organisations involved in research to re-use personal data obtained from third parties.
The implication of this rewording, Stephen Cragg KC wrote in a legal opinion for Defend Digital Me, is that, if the controller obtained the data from another controller, it will not need to consider the purposes for which that other controller may have originally collected the data.
Automated decision-making by artificial intelligences?
The bill also reframes Article 22 of the GDPR, which is one of the few checks the UK has on the deployment of automated decision-making (ADM). Specifically, the bill removes the general prohibition on ADM undertaken without any meaningful human involvement, replacing it with safeguards only when a “significant decision” is made by automated means.
TechUK welcomed this distinction between “low-risk ADMs”, such as service personalisation, and “high-risk ADMs”, such as mortgage reviews and technologies assisting with hiring and employment.
Rights groups, however, expressed concern about this endorsement of decisions taken without human oversight, even where this decision-making does not have legal or “similarly significant” effects on individuals.
“We are concerned that the proposed changes do not offer sufficient safeguards to protect individuals from unfair or discriminatory outcomes of automated decision-making,” the Equality and Human Rights Commission said. “Because the data used to help the AI-based tools to make decisions may contain existing biases, individuals with particular protected characteristics may be unfairly impacted by automated decision-making.
“For example, an AI system used to monitor employee productivity may make automated decisions that do not take account of the legal requirement to make reasonable adjustments in respect of an individual’s disability, resulting in a person being penalised for not meeting productivity requirements.”
Indeed, “automated decision-making is never neutral,” Algorithm Watch explains. ADM systems are only as good as the data they are trained on, and where that data is unfair or biased, machine learning propagates and amplifies those biases.
Mozilla’s Lucy Purdon also expressed concerns that the reframing of Article 22 prioritises the right for organisations to innovate at all costs above the rights of ordinary citizens. “It shifts the burden onto users to not only know when automated decision-making is involved but also the impact it has on their lives, what data has been used about them and the avenues to challenge decisions,” she said.
Boon for tech companies, privacy nightmare for consumers?
The new legislation will undoubtedly prove a boon for tech companies and advertisers. The latter will benefit from the introduction of “direct marketing” as a “recognised legitimate interest” as grounds for lawful data processing, while changes to GDPR articles on the processing of data for scientific research will strengthen companies engaged in data-powered R&D, such as the development of AI models.
Provisions for more far-reaching Smart Data schemes have also been welcomed by the fintech industry and open banking networks such as TrueLayer, which say such schemes have the potential to “spur huge amounts of innovation, competition and economic growth”.
For consumers, however, the landscape looks significantly bleaker.
Civil liberties group Big Brother Watch says the bill risks severely diluting individual rights while empowering big government (and indeed Big Tech). “The bill,” the group said, “must be majorly revised in the course of its passage through parliament or revoked in order to protect the individual and collective privacy rights of the British public, safeguard the rule of law, and uphold key rights to equality and non-discrimination.”