DCMS Committee Finds Online Safety Bill Gets the Balance Wrong

A new cross-party report from the UK’s DCMS Select Committee has warned that the Government’s new Online Safety Bill (OSB), which will task Ofcom with tackling “harmful” content online, gets the balance wrong: it “neither protects freedom of expression nor is it clear nor robust enough to tackle illegal and harmful online content.”

At present, much of the online content you see is governed by a self-regulatory approach, which has often struggled to keep pace with rapid online change. Examples of “harmful” content abound, such as propaganda from the ISIS terrorist group, child abuse imagery, state-sponsored propaganda from hostile countries, online bullying, racism and the spread of COVID-19 conspiracy theories. Some of this is already illegal.

The big social media firms (Facebook, Twitter etc.) do usually catch up with such problems, but they’re often perceived to be too slow or unwilling to act unless pressured to do so, while other websites seem to exist solely to promote the worst of humanity. The OSB appears set to pressure all sites, from big social media firms to small blogs, to act or face the consequences (e.g. being blocked by ISPs, huge fines and making some people liable for what others may say on their website).

The OSB is the Government’s response. But striking the right balance between freedom of speech and outright censorship may be difficult, which is what happens when you attempt to police the common and highly subjective public expression of negative human thought. Faced with such a heavy risk of liability, most sites are likely to become overzealous when filtering user-generated content, or to prevent people from speaking at all.


Suffice to say, balancing all of this against complex issues of context (e.g. people joking about blowing up a city in a video game vs actual terrorists), satire or parody, the application of such rules to non-UK based sites, political speech and the risk from overzealous automated filtering systems – which are only economically viable for the biggest sites, since manually moderating all content on major platforms would be impossible – is going to be a nightmare to get right.

Like it or not, this is not just about policing online platforms, but about policing what YOU all do and say online. Granted, some people go to extremes, but those often cross over into more clearly defined aspects of illegality, while what is being talked about in the OSB extends to the murkily ambiguous area of “lawful but still harmful.”

Findings of the DCMS Committee Report

The new report warns that the draft OSB would fail to prevent the sharing of some of the most “insidious” images of child abuse and violence against women and girls, while at the same time failing to protect freedom of expression.

The report proposes several amendments to the definition and scope of harms covered by the regime that would bring the Bill into line with the UK’s obligations to freedom of expression under international human rights law.

On top of that, it also recommends that the Government proactively address types of content that are “technically legal”, such as insidious parts of child abuse sequences like breadcrumbing, and types of online violence against women and girls like tech-enabled ‘nudifying’ of women and deepfake pornography, by bringing them into scope either through primary legislation or as types of harmful content covered by the duties of care.

Moreover, it found that the current provisions that provide Ofcom with a suite of powers and users with redress are “similarly unclear and impractical … we urge the Government to provide greater clarity within the Bill on how and when these powers should be used to ensure they are both practical and proportionate.”

As usual, the committee concludes with a long list of recommendations (see below), which helps to provide some context for their conclusions. The recommendations include various things, such as the need for greater consideration of context when assessing whether content is “harmful”, as well as the need to categorise whether something is “harmful” to adults or only to children.

However, at the end of the day, there’s no escaping the fact that the OSB is an immensely complex piece of legislation, which perhaps reflects the fact that it is trying to police an immensely complex problem. At the same time it’s difficult to see how Ofcom could ever realistically hope to find the resources to adequately police the whole internet.

Suffice to say that many people in the industry are concerned that, in practice, big parts of the new bill may prove unworkable in the real world, or at least not workable without causing significant harm (e.g. too many legitimate pieces of content being removed and Ofcom being swamped with requests).
