

Online Harms bill: Warning over ‘unacceptable’ delay


Image copyright: Getty Images

Lord Puttnam accused the government of "losing" the Online Harms Bill

The chairman of the Lords Democracy and Digital Technologies Committee has warned that the government's Online Harms Bill could be delayed for years.

Lord Puttnam said the Online Harms Bill might not come into effect until 2023 or 2024, after a government minister said she could not commit to bringing it before Parliament next year.

"I think we've been had," he said.

The government, however, said the legislation would be introduced "as soon as possible".

The Online Harms Bill was proposed last year amid a wave of political action after the story of 14-year-old Molly Russell, who killed herself after viewing online images of self-harm, came to light.

It is seen as a potential tool to hold websites accountable if they fail to deal with harmful content online – but it is still at the proposal, or "White Paper", stage.

The Department for Digital, Culture, Media and Sport (DCMS) said the legislation would be ready during this parliamentary session.

But the Lords committee report said that DCMS Minister Caroline Dinenage would not commit to bringing a bill to parliament before the end of 2021, prompting fears of a long delay.

In her evidence to the committee in May, she had warned that the Covid-19 pandemic had caused delays.

But speaking to BBC Radio 4's Today programme, Lord Puttnam said, "It's over."

  • Ofcom predicts big fines for harmful online content
  • More powers for Ofcom to police social media companies

"Here is a project that the government has said is very important – and it is – yet they have somehow managed to lose it."

The government originally introduced the idea of online regulation in 2017, publishing the White Paper 18 months later, and a full response is not expected until the end of this year.

Lord Puttnam said a potential 2024 date for it to take effect would be "seven years from conception – in the world of technology, that's two lifetimes".


By Angus Crawford

Molly Russell's death seemed to galvanise the online harms debate.

At just 14 years old, she took her own life after seeing a relentless stream of negative material on Instagram. Within days of her father Ian's decision to speak publicly about what happened, government ministers were calling for a "purge" of social media.

The heads of the technology companies were summoned, dressed down and warned that they could be held personally responsible for harmful content. New legislation was demanded, drafted and put out to consultation.

And 18 months on, we are left with a warning that the new law may have to wait until 2024.


After Molly Russell took her own life, her family discovered distressing material about suicide on her Instagram account

Some campaigners despaired at the prospect of real change. But two notable things happened this summer.

First came the #StopHateForProfit campaign. Advertisers began to pile pressure on Facebook and Instagram, the company's stock price fell 8% in one day, and Mark Zuckerberg promised to act.

And here in the UK, reported only sparingly, the Age Appropriate Design Code has been laid before Parliament. It requires online services to give children's data the highest level of protection. This includes preventing the automatic recommendation of harmful content to young people.

Stricter regulation is coming to the sector, but it will be far from a smooth process.

Lord Puttnam was speaking after the launch of his committee's latest report on the collapse of trust in the digital age.

In a statement, the committee said that democracy itself is threatened by a "pandemic" of online misinformation, which can be an "existential threat" to our way of life.

The committee said the threat of online misinformation had become even clearer in recent months during the coronavirus pandemic.

Among the report's 45 recommendations was that the social media regulator – expected to be the current broadcasting regulator, Ofcom – should hold platforms responsible for content they recommend to large numbers of people, once that reach exceeds a certain threshold.

It also recommended that companies which repeatedly fail to comply should be blocked at the ISP level and fined up to 4% of their global turnover, and that political advertising should be held to stricter standards.

Ofcom's new chief executive has warned that heavy fines would be part of its plans if it were appointed regulator.

The DCMS said: "Since the beginning of the pandemic, specialized government units have been working around the clock to identify and refute false information about the coronavirus.

"We are also working closely with social media platforms to help them remove incorrect claims about the virus that could endanger people's health."

