AI Regulation Bill in the UK: Stakeholders Eager for Policy Clarity

TL;DR
This blog discusses the Artificial Intelligence (Regulation) Bill, which was recently introduced in the House of Lords. The Bill proposes statutory guidelines for AI governance and establishes a central oversight Authority. The article discusses how the Bill requires businesses using AI to appoint designated AI Officers, raising concerns about the compliance burden. It also emphasises that the Bill needs to provide details on the proportional implementation of restrictions on AI, based on an evaluation of risks and benefits. It concludes by cautioning against legislative ventures undertaken without stakeholder engagement, which create regulatory uncertainty and may undermine confidence in the UK's AI governance efforts.

The Artificial Intelligence (Regulation) Bill was introduced in the UK House of Lords on 22nd November 2023[1]. The Bill was sponsored by Lord Chris Holmes, a Member of the House of Lords from the Conservative Party. The Bill takes an omnibus approach, seeking to regulate Artificial Intelligence (AI) for safety, transparency, fairness, accountability, contestability and compliance with anti-discrimination, data protection, privacy and intellectual property legislation. This omnibus approach focuses on central oversight and marks a departure from the UK's previous position on AI governance, set out in the 2023 White Paper titled "A pro-innovation approach to AI", which emphasised a non-statutory approach to AI regulation. The non-statutory approach refrained from enacting any legislation to apply governing principles to AI, and instead leveraged regulators' domain-specific expertise to customise the principles to the context in which AI is used. Given that the member who introduced the Bill hails from the ruling party, its introduction in the UK Parliament points to some confusion within the UK government on how to deal with this subject. Consequently, it may be useful to unbundle some of the Bill's key features, even though private members' bills rarely translate into legislation.

First, the Bill establishes an AI Authority tasked with overseeing and harmonising the sectoral regulators' approaches to AI governance. The establishment of an AI Authority takes a page from the EU AI Act, which establishes an EU-level body, the Artificial Intelligence Office, for the harmonised implementation of that Act. However, establishing a new oversight authority does not address the apprehension, noted in the House of Commons Committee's interim report on the governance of artificial intelligence, that existing regulators may require additional resources and powers to govern AI deployment within their sectors effectively.

Second, the Bill notes that any restrictions on AI must be proportionate to the benefits, the nature of the service or product, the nature of the risk, the cost of implementing the compliance burden and its impact on enhancing the UK's international competitiveness. However, merely stating the need for proportionality is insufficient. It is necessary to outline how compliance requirements would vary based on an AI system's use case and its specific risks or benefits.

Third, the Bill requires any business that develops, deploys or uses AI to appoint a designated AI Officer. These AI Officers would be tasked with ensuring the safe, ethical, unbiased and non-discriminatory use of AI, based on unbiased data. An obligation to appoint additional officers adds to the compliance burden on businesses, disproportionately affecting start-ups and small enterprises. Additional compliance requirements raise barriers to entry, thereby stifling innovation. It would be inadvisable to impose such an onerous burden on businesses in the absence of any consultation or evidence demonstrating the efficacy of specialised compliance officers for AI.

Finally, the Bill requires the submission of records of all third-party data and intellectual property used in AI training (Section 5). It mandates that such data and IP must be collected only with informed consent and in compliance with IP obligations. Unfortunately, this provision is futile, given the lack of clarity on whether training AI on copyrighted works fits within the limited exceptions to copyright infringement. One of the exceptions allowed by the Copyright, Designs and Patents Act 1988 is the creation of temporary copies of copyrighted literary works, where the copying itself has no independent economic significance but is necessary for a technological process that uses the work for lawful purposes. There is yet to be a judicial decision settling the application of this exception to AI systems that create only temporary copies during the training process.[2] Another exception relevant to AI training is the text and data mining (TDM) exception, which allows the copying of copyrighted material during the automated analysis of legally accessed material, provided that it is for non-commercial research. The UK had proposed extending the TDM exception beyond non-commercial research to all purposes, which would have permitted AI training on copyrighted material for commercial uses as well. However, this proposal was dropped owing to criticism from the creative community. To address this confusion over the IP obligations surrounding AI training, the Intellectual Property Office is expected to release a Code of Practice[3] shedding light on copyright licensing for AI training. Pending such clarity, the Bill's requirement to comply with IP and copyright obligations while training AI does not add anything meaningful to balance the interests of rights holders and businesses working with AI.

The Bill attempts to regulate AI through legislation, while the previous 2023 White Paper favoured flexible, non-statutory intervention. These divergent attempts at AI regulation emerging from within the government create regulatory uncertainty and confusion. For AI-related businesses and innovation to thrive in the UK, it is imperative to resolve the differences of opinion within the government and offer clear policies.

[1] https://bills.parliament.uk/bills/3519

[2] https://www.cliffordchance.com/expertise/services/intellectual-property/global-ip-updates/2023/q2/large-language-doddle-generative-ai-and-uk-copyright-law-explained.html

[3] https://www.gov.uk/guidance/the-governments-code-of-practice-on-copyright-and-ai