February 25, 2024

2023 Compliance Unwrapped: Hits, Misses, and What's Coming Up in 2024

Written by
The Naq Team

Last year, we anticipated a bustling period for data protection and information security legislation, and we weren't disappointed. The UK edged closer to finalising its new take on the GDPR, the Data Protection and Digital Information Bill, established a new data bridge with the US, and saw growing adoption of the UK Government's Cyber Assessment Framework across the NHS and wider public sector.

Across the Channel, the EU wasn't idle either, with significant legislative developments like the European Data Act, the Data Governance Act, and an agreement on the Draft Cyber Resilience Act. And, of course, the headline-grabber of every compliance round-up this year: the rapid advancements and regulatory attention around AI.

As we kick off 2024, we'll review the key developments in data protection, security, and information governance that shaped 2023: what we saw coming, what took us by surprise, and another shot at predicting what's coming up in 2024.

Data Protection and Digital Information Bill (No.2):

Last year, the UK moved closer to finalising the Data Protection and Digital Information Bill (No.2), which is currently under consideration in the House of Lords. This Bill is the UK's attempt to diverge from the GDPR, aiming for a more UK-centric approach to data protection. However, it's not without its drama.

On the one hand, you've got organisations and MPs ready to bid farewell to the stringent compliance requirements of the GDPR. On the other hand, data privacy advocates have raised concerns, suggesting that the Bill will significantly depart from the protections afforded by the GDPR and weaken individuals' right to privacy.

So, what does this mean for businesses? Well, there are a couple of changes and definitions that, when you look at them really, really closely, shift away from the GDPR.

Under the DPDI, organisations will no longer be required to have a DPO but must instead appoint a Senior Responsible Individual (SRI) to oversee and take responsibility for high-risk data processing activities. Similarly, Data Protection Impact Assessments (DPIAs) and Records of Processing Activities (RoPAs) will only need to be conducted if high-risk data processing is likely. The ICO has yet to publish what will count as "high-risk" processing under the new DPDI.

But what about compliance? Will businesses with interests in the UK and the EU need to comply with both? Well, in March 2023, legislators and the ICO stated that compliance with the GDPR would still be sufficient to meet the UK's data privacy requirements, even after the introduction of this new Act, suggesting that organisations with business interests both in and outside the UK won't find themselves juggling two sets of rules. Organisations with strictly UK-centric activities, meanwhile, must determine which framework, existing GDPR standards or the new UK-specific regulations, best aligns with their operational needs.

EU Cyber Resilience Act

As 2023 drew to a close, the EU Parliament and Council reached an agreement on the Cyber Resilience Act (CRA), which aims to establish a set of mandatory cyber security requirements for all "products with digital elements" (PDEs) across the EU, including hardware and software.

Manufacturers of PDEs will be required to adopt a "security by design" approach, integrating cyber security into every stage of their product's lifecycle, from concept to decommissioning. Manufacturers will also be required to provide ongoing support and security updates for their products, undergo conformity assessments, and report unpatched vulnerabilities to the relevant authorities within 24 hours of discovery, a requirement that has been hotly criticised by cyber security specialists.

While these specialists recognise the Cyber Resilience Act as a positive step towards ensuring the security of digital products and software throughout the European Union, some concerns have been raised about the additional bureaucratic hindrances this may introduce to the vulnerability resolution process and the legislative burden it will place on smaller businesses. Additionally, there is a worry about the potential creation of government-held databases containing open security vulnerabilities across the region, which is bound to become an attractive target for cybercriminals.

Importers and distributors of PDEs will also be subject to strict due diligence obligations to ensure the products align with the standards set out by the CRA or be liable to a fine.

The finer details of the Cyber Resilience Act, including its full text and the comprehensive list of hardware and software that falls under its purview, are set to be unveiled early this year. The list is expected to include critical software such as antivirus solutions and hardware such as IoT devices, toys, and wearable devices. As we'd expect from the EU, the fine for non-compliance will be a steep one: up to €15 million or 2.5% of an organisation's global turnover.

The CRA is expected to receive formal approval in early 2024, with the vulnerability reporting requirements scheduled to come into force in late 2025. The remaining security requirements outlined in the legislation will be phased in over the next few years.

Meta Fined for Transferring Personal Data to the US

In January of 2023, we wrote about the Irish Data Protection Authority publishing a draft decision that would make it impossible for Meta to transfer the personal data of European citizens to the US, throwing a big spanner in the works for Meta's operations in Europe.

Of course, Meta's operations continued, albeit not without consequences. In a landmark judgement in 2023, the Irish Data Protection Commission (DPC) determined that Meta's practice of transferring EU citizen data to the US violated Article 46(1) of the GDPR. The DPC mandated that Meta must amend its data processing operations to conform with GDPR, including ceasing any unlawful data processing and data storage activities within the US.

The fallout from this ruling was substantial for Meta. The company was hit with a record fine of €1.2 billion, marking one of the most significant penalties for GDPR non-compliance. Meta has contested this ruling, arguing that the standard contractual clauses (SCCs) – the legal frameworks that facilitate data transfers from the EU to the US – are widely used by thousands of US companies with interests in Europe and that through this ruling, Meta has been unfairly singled out for non-compliance. 

UK and EU agree on data bridge with the US

Staying on the topic of US data transfers, in July 2023, the European Commission opened a new chapter in data transfer relations with the US by adopting the EU-US Data Privacy Framework (DPF), maintaining seamless data transfers between the two regions.

Replacing the previously invalidated EU-US Privacy Shield, the DPF lays out a framework to ensure EU data processed by US organisations is afforded a level of protection similar to that provided under the GDPR. Under the framework, US organisations that handle EU data must certify their adherence to the principles detailed in the DPF.

However, with the UK no longer part of the EU, the DPF did not cover data transfers between the UK and the US. To address this gap, the UK introduced its own counterpart in October 2023: the UK-US Data Bridge. This extension mirrors the EU's Data Privacy Framework, affirming that the US provides an adequate level of data protection for certain types of transfers under the UK GDPR and the Data Protection Act. Just like in the EU, US organisations must certify their compliance with the DPF and the Data Bridge.

While currently effective, these adequacy decisions have not been without their controversies. One notable issue revolves around the language used in both the EU-US Data Privacy Framework and the UK-US Data Bridge, particularly how they diverge from GDPR and UK GDPR standards regarding sensitive data.

Under the GDPR and UK GDPR, 'special category data' is accorded additional protections, recognising the severe implications a breach could have on an individual's rights and freedoms. However, within the UK-US Data Bridge, the criteria defining sensitive or special category data vary. This raises concerns that the stringent protections for such data under the UK GDPR might only partially apply once the data crosses over the Atlantic.

Moreover, both the GDPR and UK GDPR contain specific safeguards for individuals against decisions based solely on automated processing where those decisions have a legal or similarly significant impact. This crucial aspect of the GDPR and UK GDPR is less thoroughly addressed in the DPF and the UK-US Data Bridge. The absence of these safeguards in the new frameworks has already sparked legal challenges. Although the European Court of Justice recently dismissed a challenge to the DPF, it likely won't be the end of the controversy around EU/UK-US data transfers.

AI governance: What's happening, who's doing it, and who's doing it right?

What else would we finish with if it wasn't AI? Last year, the UK government set out the framework for the UK's approach to regulating AI. Emphasising flexibility and sector-specific requirements, the UK government did not establish a single, centralised AI regulation body. Instead, it is delegating the responsibility to existing, sector-specific regulatory bodies such as the ICO, the Competition and Markets Authority, the Health and Safety Executive and the Human Rights Commission, among others. This approach aims to allow for AI regulations to be tailored to the unique demands and risks of each sector. 

Along with this decision, 2023 saw a wave of reports, whitepapers and guidelines on AI. Notable contributions included the NCSC's guidelines on secure AI development, a report on the impact of AI on market development and an updated report on the risks associated with AI technology. The vast amount of AI governance information published by the UK government over the year highlights the challenge government bodies face in effectively legislating this rapidly evolving technology and the need for clearer, more concrete guidance.

In contrast, the EU has taken a more centralised approach with the provisional agreement on the AI Act (AIA) reached in December 2023. This groundbreaking legislation categorises AI systems based on risk level, focusing particularly on high-risk systems that could significantly impact health, safety, fundamental rights, the environment, and democracy.

Under the AIA, a comprehensive rights assessment is required for high-risk AI systems before they enter the market. This means developers of AI-powered platforms processing sensitive data, like those in healthcare, must conduct thorough rights assessments to ensure compliance before launching their products. Under the draft legislation, specific use cases have also been deemed unacceptable, banning the scraping of facial images from the internet to create facial recognition databases, social scoring, and emotion recognition in the workplace and educational institutions. This EU-wide legislation aims to create a uniform standard for AI regulation, striking a balance between innovation and the protection of fundamental rights.

So, unified framework vs sector-specific regulation, which will be more effective? As with most things in life, the optimal approach will likely lie somewhere in the middle, a model combining strong legislative oversight, the legal powers to manage non-compliance and the flexibility of a sector-specific focus.

AI regulation is still very much in its infancy, and regulations written before 2023 are now being re-evaluated due to the rapid development of AI tools such as ChatGPT, Clearview AI, Bard and LLaMA. While it is likely that during 2024 we will see the formalisation of these regulatory frameworks and more and more countries adopting AI regulations, policymakers will need to juggle three very delicate challenges: keeping pace with technological evolution, safeguarding the public interest and promoting innovation, a task we don't envy one bit.

Upcoming Data Protection Regulations:

The Digital Services Act: This regulation aims to "ensure a safe, predictable and trusted online environment" by requiring large online platforms to remove illegal online material and content that doesn't meet the standards outlined within the Act. Additionally, the legislation aims to bring in additional protections around consumer rights and greater accountability around data protection for digital intermediary services such as e-commerce platforms, search engines and content-sharing platforms. 

The Online Safety Bill: This regulation aims to "make the UK the safest place in the world to be online" by requiring large online platforms to remove illegal and harmful online material. And no, you're not reading the same paragraph twice. Much like the EU's Digital Services Act, the Online Safety Bill aims to moderate both illegal and "legal but harmful" content online, placing the responsibility on search engines, social media and user-generated content platforms to quickly remove or prevent this type of content from being published in the first place.

Unsurprisingly, both pieces of legislation have faced an enormous backlash, particularly in the UK, where the Act includes powers to require messaging services like WhatsApp, Signal and iMessage to examine the content of encrypted messages for illicit material. The EU's Digital Services Act has faced a similar backlash, with data privacy advocates stating that the legislation violates the EU's protections around freedom of expression by allowing the government and organisations like Meta and Google to dictate what can or cannot be said on their platforms.

UK organisations with EU interests will face additional compliance requirements:

In the UK, the introduction of the aforementioned Data Protection and Digital Information Bill (No.2) will surely make waves this year; however, compared to the EU's strides in data legislation, the UK seems to be playing catch-up.

Although not set to come into force until 2025, the EU's Data Act and the Digital Operational Resilience Act (DORA) will have an impact that reaches far beyond the physical borders of the EU, introducing additional security, information governance and data protection requirements for organisations with interests in the region. As always, we'll be sure to keep you updated on the impact this will have on your organisation.

And there we have it. A lengthy but much-needed wrap-up of some of the key developments in the world of security and information governance legislation over the last 12 months. 

Take the complexity out of managing growing compliance regulations

As we've explored in our blog, the world of compliance is evolving rapidly, but that's just the tip of the iceberg. The upcoming year will introduce an array of sector-specific compliance requirements that add another layer to the already complex landscape. Organisations will find themselves navigating not just a growing list of general information governance, cyber security and data protection requirements but also grappling with additional, more nuanced regulations specific to their sectors. This is especially true for industries like healthcare, where the stakes for compliance are exceptionally high, and the regulations are correspondingly intricate.

Naq's automated compliance platform was built to simplify your organisation's compliance management, cutting down the time and effort needed to meet the diverse regulatory and compliance requirements demanded by your customers and your sector. Whether it's aligning with GDPR, ISO 27001, SOC2, or adhering to industry-specific frameworks like NHS DSPT and HIPAA, Naq's comprehensive platform is engineered to enable organisations to seamlessly achieve, manage, and continuously maintain their compliance posture, even as their businesses grow and regulations change.

Find out why hundreds of organisations use Naq to achieve, manage and maintain the compliance frameworks they need to grow. Book your 15-minute Naq demo.