
Digital information bill could grant UK Government extraordinary financial surveillance powers ahead of decision on whether a CBDC will be introduced



Big Brother Watch has been running a campaign called “No Spycoin” to alert the public to the risks of the UK implementing a central bank digital currency (“CBDC”). 


Last year, Big Brother Watch, a civil liberties campaign group fighting for a free future, published a detailed report examining the impact on citizens of CBDC projects in various countries. None of the projects passed the test of preserving privacy and prohibiting surveillance.


Although the Bank of England and the Government stated in January that no decision has been made to pursue a CBDC, also called a digital pound, the same announcement proclaimed that “work will continue during the design phase exploring its feasibility and potential design choices.”  To do this work, the Digital Pound Taskforce was created three years ago to work on the digital pound and related CBDC policy.  Despite the claim that no decision has been made, it seems they are indeed pursuing a CBDC.


Meanwhile, a bill is moving through Parliament which could grant the Government extraordinary new financial surveillance powers.  Is it just a coincidence?


Seven Key Issues Relating to CBDCs


Recently, Big Brother Watch made a short video to explain the serious problems that implementing a digital pound would create for those who live in the UK.


“A UK CBDC would have a major impact on this country,” they said and identified seven key issues.


1. A CBDC is a solution in search of a problem.


2. Privacy intrusion – Generalised surveillance of CBDC transactions would be inevitable given the context of the current legal landscape, particularly counter-terror law, anti-money laundering law and investigatory powers law.


3. Programmable money and financial control – The potential to program the public’s personal finances or welfare payments could lead to financial control, an invasion of privacy, potentially a breach of the right to protection of property and, depending on the limitations set, could pose a serious threat to a range of other fundamental rights – from freedom of expression to freedom of assembly and protection from discrimination.


4. Digital ID, CBDC and discrimination – It is nigh on impossible to issue a UK CBDC without a comprehensive digital identity system. Combining digital identity and CBDCs poses a serious risk of surveillance, security breaches, hacking/identity theft, and discrimination.


5. Data exploitation – The consultation points out that providers can use personal data to “develop marketing activities” and “tailor products and services”.


Exploiting personal data in this way would endorse mass surveillance and exploitation of the public’s sensitive personal data, further shrinking the private sphere in a growing digital panopticon.


6. Security risks – A centralised CBDC system would create a huge platform of population data and, as such, become a “critical piece of national infrastructure”. This would provide hostile state and non-state actors with a large target to focus cyberattacks on.


Combining digital identity and CBDC poses a serious risk of security breaches and hacking/identity theft and a successful breach could put the entire public at risk.


7.  Undemocratic – The decision to develop a UK CBDC should not be made by the Bank of England and HM Treasury alone – yet the pilot planned for 2025 seems to minimise parliamentary involvement.


The Government has now committed to introducing primary legislation with a vote in both Houses of Parliament before launching a digital pound.



Big Brother Watch: CBDCs | 7 things you NEED to know, 18 March 2024 (5 mins)

Last year, Big Brother Watch published their report ‘CBDC – a privacy-eroding pound? Lessons from international central bank digital currency pilots for the UK’, which explored CBDC projects in various countries and their severe impact on privacy, surveillance and financial exclusion.


The report analysed CBDC projects in Nigeria, Jamaica, Israel, Uruguay, Sweden, the EU and China and found:


  • None of the CBDC projects investigated in our report have been designed, trialled, or launched in a way which properly preserves privacy.

  • None of the CBDC pilots we looked at have been successful. Many have had a low uptake.

  • CBDCs are not a solution to the complex causes of financial exclusion and can make this problem worse.


Further reading: Central Bank Digital Currencies are Spycoins, The Exposé, 16 January 2024




Meanwhile, the Data Protection and Digital Information Bill, which could grant the Government extraordinary new financial surveillance powers, is making its way through Parliament.


The Bill was introduced to Parliament on 18 July 2022 following the publication of the Government’s response to the ‘Data: a New Direction’ consultation.  The consultation was part of the UK’s National Data Strategy.


Amendments to the largely under-scrutinised bill were announced in November 2023, giving banks powers to monitor the accounts of private citizens under the premise of searching for fraud and error in the welfare system. Banks would then be compelled to flag people who meet unspecified criteria to the Government.



The Bill has been passed by the House of Commons and is at the Committee stage in the House of Lords.  The latest Committee debate held on 27 March 2024 highlighted a particular concern regarding Clause 14 of the Bill titled ‘Automated Decision Making’.


The UK General Data Protection Regulation (“GDPR”) treats a solely automated decision as one without “meaningful human involvement” and the public is protected from being subject to solely automated decision-making where the decision has a legal or “similarly significant effect.”


Clause 14(1) inserts a new Article into the UK GDPR, which allows the Secretary of State for Science, Innovation & Technology to make regulations that deem a decision to have involved “meaningful human involvement,” even if there was no active review by a human decision-maker. The new Article also allows the Secretary of State to add or remove any of the listed safeguards for automated decision-making.


Lord Clement-Jones, who wants Clause 14 to be deleted from the Bill entirely, said: “If the Government wish to amend or remove safeguards on automated decision-making, that should also be specified in the Bill and not left to delegated legislation.”


Delegated legislation, also known as secondary legislation, is a form of law created by ministers or other bodies under powers given to them by an Act of Parliament, also known as primary legislation.


Baroness Kidron agreed.  “Clause 14 removes the right not to be subject to an automated decision and replaces that right with inadequate safeguards,” she said. “The fact that the Secretary of State has delegated powers to change the safeguards at will … [means that UK citizens] have lost the right not to be subject to an automated decision.”


She said that the Government was knowingly diminishing people’s rights. “The fact that the Government have left some guardrails for special category data is in itself an indication that they know they are downgrading UK data rights,” she said.


Baroness Kidron noted there was a “steady drumbeat of resistance” against the Government’s plans to change data rights to benefit the commercial interests of technology companies at the expense of children.


Baroness Harding gave an example of the risk that automated decision-making poses to employment.  She then highlighted the harm Clause 14 would cause to children. “We know that automated decision-making can do irreparable harm to children,” she said.  “If we are unable to persuade the Government to remove Clause 14, it is essential that the Bill is explicit that the Secretary of State does not have the power to reduce data protection for children.”


The Committee agreed on an amendment to insert a new clause after Clause 14 which puts a legislative obligation on public bodies using algorithmic tools that have a significant influence on a decision-making process to publish reports under the Algorithmic Transparency Recording Standard (“ATRS”). 


The ATRS was launched in November 2021 as part of the National Data Strategy to help public sector organisations provide clear information about the algorithmic tools that they use, how they operate and why they are using them.


The Committee also proposed another new clause that puts the onus on public sector actors to ensure safety and prevent harm, rather than waiting for harm to occur and putting the burden on citizens to challenge it.


This second new clause imposes a proactive statutory duty to ensure that “automated decision systems … are responsible and minimise harm to individuals and society at large.”  It includes duties to be proportionate, to give effect to people’s human rights and freedoms, and to safeguard democracy and the rule of law. It applies to all “automated decision systems” – to partly automated decisions, as well as those that are entirely automated, and systems in which multiple automated decision processes take place.


Although the House of Lords Committee is quite rightly showing concern about the risks automated decision-making poses to employment and children, the concerns reach far further and have a far larger impact on all of our daily lives.


Unless it has been debated elsewhere, it’s worrying that there seems to be no concern in either the House of Commons or the House of Lords that Clause 14 could be used in conjunction with the surveillance of our financial transactions and bank accounts.


The National Data Strategy, from which the Bill stems, aims to develop a data economy while “ensuring public trust in data use.”  Allowing automated decision-making, and granting the Secretary of State the power to further curtail our rights, particularly in light of the looming implementation of programmable CBDCs, makes it appear as if the Government is not making any effort to gain our trust in its use of our data.


On the contrary, we could suspect that the Data Protection and Digital Information Bill is being moved through Parliament in preparation for a CBDC, despite the Bank of England stating that “we haven’t made a decision” on whether a digital pound will be introduced and that a “decision of whether to build a digital pound will be made around the middle of the decade at the earliest.”  In other words, the legislation is being readied first; once it is in place, the Government and the Bank of England will announce that they have decided to introduce the digital pound.
