Last week the Government announced plans to put Ofcom in charge of regulating the internet in the UK. The move came as part of its response to a consultation on the Online Harms White Paper, details of which were unveiled by The Guardian newspaper in April 2019.
Until now, much of the online world has operated through a system of self-regulation, which many have criticised for letting harmful content appear online. Under the new plans, websites will be subject to a new law, enforced by Ofcom via a Code of Practice.
Under the proposals Ofcom will not have the power to remove posts from the internet. It will, however, require internet companies that allow User Generated Content (UGC) to be published on their websites to publish clear statements outlining what content is and is not acceptable, and to ensure those standards are enforced.
But it is not just media companies, such as newspaper and magazine websites and social media platforms, that will be held to account. Under the proposals ANY online business that enables the sharing of UGC will be required to publish annual transparency reports explaining how it is meeting these standards.
The standards will cover two types of content: content that is illegal (such as terrorist and child sexual abuse material) and content that is harmful. The focus for the former is on swift removal (or preventing it from being posted at all); for the latter, businesses must either remove it or make it explicitly clear to users that such content may be found on the site.
The re-introduction of age verification for certain websites also forms part of the proposals. The government will publish a full response to the consultation setting out further details of Ofcom’s potential enforcement powers; in the meantime, you can see the initial response here.