Bytesize Legal Updates | Fieldfisher

Bytesize Legal Update - The UK's Online Safety Act

September 28, 2023 | Fieldfisher | Season 1, Episode 6

After a long series of delays, the UK's Online Safety Act (OSA) has finally finished its journey through Parliament, and just awaits Royal Assent to become law.
Like Europe's Digital Services Act (DSA), the UK law aims to tackle the challenge of online harms and protect vulnerable groups online, but varies significantly in both its scope of application and range of imposed obligations.

Now in (almost) its final form – who does the Act apply to? When can we expect it to take effect? And what do businesses need to do to make sure they comply?

In our latest Bytesize Legal Update, Fieldfisher's James Russell and Imogen Boffin distill the key takeaways and outline the next steps for businesses.

Transcript

0:00:04 - Imogen
Hello, I'm Imogen, and I'm James, and we're both Tech and Data Specialists from Fieldfisher in Silicon Valley. Today, we are talking about the UK's new Online Safety Act and the key details you need to know. At long last, this revolutionary piece of legislation, which we've been promised will overhaul the protections for vulnerable groups online, has finally finished its journey through Parliament. But, in its final form, who does the Act apply to? When can we expect it to take effect, and what do businesses need to do to make sure they comply? So, James, I feel like we've been hearing about this forever, but for those of our listeners hearing about it for the first time, what is the Online Safety Bill and why has it taken so long to be enacted?

0:01:02 - James
To start off, it might be worth just flagging that we're recording this before final Royal Assent has been given for the Act, and no date has been set for that as yet, so the timelines we're going to be talking through might need to move around a bit, but the hope is that we're almost at the end of a very long road. So, as the name suggests, the objective of the Online Safety Bill was to grapple with the challenges posed by harmful activity and content online, particularly when it comes to vulnerable groups like children. With rising concern about the use of social media platforms to promote and facilitate extremism and the spread of misinformation, as we saw during the 2017 UK general election, as well as concerns around childhood bullying and access to harmful content online – as, of course, we saw most tragically with the death of Molly Russell in 2017 – there was a broad degree of consensus that something needed to be done. But the Bill has gone through a bit of a saga since it was first conceived of in the Online Harms White Paper, all the way back in 2019. At the time, it was something of a trailblazing initiative and it had the lofty ambition of making the UK the safest place in the world to be online. Since 2019, though, it's fallen foul of a series of delays, which means it's already long since been overtaken by some of the other examples we've seen worldwide.

So there's Australia's Online Safety Act, which came into force on the 23rd of January 2022, and, most importantly, the EU's Digital Services Act, or the DSA, which came into force in November last year and became applicable, at least for the biggest services, just earlier this year. A lot of this delay has been caused by deadlock around some of the key challenging topics of debate. First amongst these was the definition of "legal but harmful" content, specifically where that related to adults. This language has now been dropped from the Act in favour of what we're now calling user empowerment provisions, although in many ways they still target the same kind of content. Secondly, there was the obligation on platforms under the so-called spy clause, which required companies to use accredited technologies to scan for terrorist content and CSAM – child sexual abuse material – in a manner which was, according to at least the proponents of end-to-end encryption, fundamentally antithetical to such systems. This remains in the Act, despite some last-minute assurances from the government that these provisions aren't going to be enforced until the technologies can be implemented in a privacy-protective manner, but it's important to remember that they are still in there.

Thirdly, we have the perennial question of government paternalism and the potential threats to free speech, and I think, at least for this office, we can say that we have no doubt that those debates will continue long after the Act starts to be enforced. However, I think it's fair to say that, even if it's no longer trailblazing, the ambition of the Act still remains. So if you thought the DSA was burdensome, I'm afraid we've got some bad news for you. Just looking at the two side by side on the table, the Digital Services Act is around 100 pages, and the Online Safety Act is almost three times that length, with a considerably more intimidating list of obligations imposed on all of those businesses that fall in scope.

0:04:19 - Imogen
But there's some good news there at least, right? The Online Safety Act doesn't quite have the same broad scope of application.

0:04:25 - James
Yeah, that is true. So the Digital Services Act, as you might know, applies pretty broadly to all intermediary services that are offered to users – businesses and individuals – who are located or established in the EU. That includes internet service providers, domain registrars, cloud and web hosting services, social networks and so on. By contrast, the UK's Online Safety Act is somewhat narrower. Broadly speaking, it applies to two main categories of regulated services: first, user-to-user services; second, search services.

There is then a third category, which sometimes gets missed out because it's sort of an extra wrinkle on this: providers of pornographic content. You probably know who you are if you're in that category, but it's worth just mentioning that it's not just the two – the third category is there as well. At the government's last estimate, the total number of in-scope services in the UK alone was expected to be as high as 25,000 different companies. But it's important to remember that this legislation will also have extraterritorial effect. So if your service has links to the UK – meaning that it's capable of being used by UK individuals, or there are reasonable grounds to believe that there's a material risk to UK individuals from use of your service – then you are going to be caught in scope.

0:05:47 - Imogen
But presumably it isn't everyone who is subject to the same level of scrutiny as the Googles and Amazons of this world, is it?

0:05:53 - James
No, thankfully not. Like the DSA, the OSA – the Online Safety Act – takes a tiered approach to its compliance obligations. Under the DSA, the largest VLOPs and VLOSEs – the very large online platforms and very large online search engines – have the largest number of obligations to comply with, while the smallest platforms are exempt from many, although notably not all, of the obligations. The OSA similarly imposes a tiered categorization approach. So, starting at the bottom of the ladder, all in-scope providers have to make illegal content risk assessments, they need to take or use proportionate measures to mitigate the risks of illegal content, they need to implement appropriate reporting and complaints procedures, and they need to balance these with freedom of expression and privacy concerns.

Moving up our ladder, if, in addition to that, your service is likely to be accessed by children, you also have to conduct children's risk assessments and you need to take or use proportionate measures to protect children online. We then get into the numbered categories. On top of this, if you're a Category 2B provider, you also have to submit additional annual transparency reports. If you're a Category 2A provider, you also have to use proportionate systems and processes to prevent fraudulent advertising, and you have to make details of your risk assessments public. And then, at the very top, if you're a Category 1 service, you need to do some of that empowering of individuals that we talked about at the start. So you need to empower adults to increase their control over exposure to harmful content, and that includes things like content that would encourage suicidal behaviors or eating disorders, or which might be considered abusive, even if it's legal. On top of that, Category 1 services also need to protect content of democratic importance, which includes things like news publisher or journalistic content.

0:07:55 - Imogen
Well, okay, that's quite a lot. So how do you know which category you fall in? 

0:08:00 - James
Well, unfortunately, this is another area where we're going to have to wait for OFCOM to provide more details.

So, apart from the very few provisions which do come into force on the day that the Act is enacted – and again, we're recording this before Royal Assent, so we're hoping that'll be any day now – the vast majority of provisions will only come into force according to the terms of secondary legislation which is yet to come.

That's still got to be prepared by the Secretary of State, advised by OFCOM. Some of that secondary legislation is going to add really critical detail to obligations which are, I think it's fair to say, drafted pretty broadly. For example, we're expecting the first code for illegal harms to provide detail on what a suitable and sufficient illegal content risk assessment looks like and what, practically, that is going to entail. Critically, the codes will also set out the thresholds for categorization that determine whether you fall into the different numbered categories we talked about – whether you're a Category 1, 2A or 2B service. For now, all we know is that those thresholds are going to involve some consideration of the number of users and the functionalities of your service, but also this rather broad category of any other characteristic that the Secretary of State considers relevant.

0:09:15 - Imogen
Oh, okay, but then where does that leave businesses? What do they need to do now to prepare their compliance plan if they don't know what categorization they will fall under? 

0:09:25 - James
Yeah. So OFCOM have made it clear that they're going to be taking a staged approach to their preparation of this additional secondary legislation, as set out in their Roadmap to Regulation from July 2022, which was updated recently, in July of this year. OFCOM plan to publish their codes and guidance in essentially three phases. The first of these is going to focus on illegal harms and the duties that surround that. It was originally planned to be published a hundred days after commencement of the law, but now, with the July update, is expected to come shortly after commencement – essentially, they've had the extra time to prepare, so OFCOM are slightly ahead of the game. We're anticipating that this could be sometime around November or December this year, but potentially as early as mid-October. The second phase then focuses on child safety and pornography which, again going by our rough calculations based on the time periods, we're expecting could be before the end of six months from commencement, so maybe around spring next year. And then, finally, the third phase focuses on those elevated duties for the numbered categories. So, if you're in Category 2B, 2A or 1, those responsibilities around things like preventing fraudulent advertising, transparency reports and user empowerment tools could begin as early as those we mentioned in the first phase, so as early as mid-October. But what we certainly know is that all of this, and all the required secondary legislation, is expected to be finished within one year of commencement of the Act, so it should all be done around October 2024.

So, to come back to your question, what does that mean businesses need to do? Well, we would suggest that businesses should probably follow that staged, tiered approach that OFCOM are suggesting, starting out by determining, first of all, whether you are caught within the scope of the provisions at all and, therefore, broadly what obligations are going to apply to you – without even getting into the detail of how to comply, just: are you caught at all? Once you've figured that out, we would say that step two is preparing a gap analysis, where you identify what you're doing already and to what extent you might already comply with the provisions, and then begin to prioritize, based on feasibility and logistics, what's missing and what you're going to need to put in place.

Then, as further details of the codes of practice prepared by OFCOM are circulated and we get more details on that secondary legislation, the plan can be tweaked and updated to fit the more concrete details we get from OFCOM, and of course we'll be back to provide more updates as those become available. So, as we mentioned earlier, OFCOM have updated their roadmap and it's possible that they might do so again. The details of the timing might change, but hopefully that gives you at least a rough idea of what's coming down the track.

0:12:13 - Imogen
Thanks, James. So, in terms of the key takeaways then: after all the delays, the Online Safety Act is finally here, but there are still a lot of details to be finalized. Businesses should therefore begin by identifying whether they fall in scope – are you a user-to-user service or are you a search service? – and what key obligations are likely to apply. Businesses can then analyze their existing processes to determine what changes will be needed. According to the existing guidance, the obligations could be quite onerous and go beyond the requirements for illegal content moderation under Europe's DSA, but, crucially, businesses will need to review and update their plans as more details in the codes and guidance from OFCOM are released and the secondary legislation is finalized.

0:12:55 - James
Yeah, exactly. So the law has passed, the regime is potentially more burdensome, but the details are unclear. This is definitely one that you're going to want to watch, and we know it might seem a little overwhelming at first, but we think that OFCOM's phased approach should make things a little easier to break down and hopefully make the beast a little bit more bite-size.

0:13:16 - Imogen
Oh, James, you're an absolute cheeseball, but thank you for joining us on this latest episode of Fieldfisher's Bytesize Legal Podcast, your source for concise updates on the key legal developments in technology and data protection law. If you have any questions about today's update, don't hesitate to reach out to us, and if you found it useful, do make sure to give us a like or a review on your podcatcher of choice. Thanks for taking the time to listen and we'll see you next time.

0:13:40 - James
See you next time. Thanks, everybody.