The Guardian view on regulating pornography: a £1m fine doesn’t prove the Online Safety Act is working | Editorial

There is increasing awareness of the harm caused by online pornography. Last month, the government bowed to pressure from campaigners and promised to make depictions of strangulation illegal. That most children have seen such material is deeply disturbing, all the more so given evidence that watching “choking” makes people – mostly men – more likely to do it in real life. This week, the Guardian investigated the distressing effects of deepfake pornography in schools, and interviewed the women behind the successful campaign to criminalise the non-consensual creation of deepfake intimate images.

Ofcom’s announcement that it has fined the Belize-based pornography company AVS Group £1m therefore seems appropriate. The regulator’s online safety director, Oliver Griffiths, told BBC radio that the “tide was changing” now that enforcement powers under the Online Safety Act have come into force. Age-verification checks on AVS websites, introduced to protect children, were not deemed effective enough. If the company does not pay, Mr Griffiths said, steps could be taken to block its sites.

One thing that everyone involved in online regulation agrees on is that technology is advancing at an incredible rate. The danger is that society, and the systems we use to manage risks, cannot keep up. While it is good to see Ofcom taking action, it is sobering that 90 other companies – 83 of which run pornography sites – are also under investigation, and further fines may follow. Liz Kendall, the technology secretary, warned last month that the regulator risked losing public trust if it did not speed up implementation of the Online Safety Act and address emerging threats.

Regulation and enforcement in the online sector is complex. When Ofcom fined the controversial forum 4chan £20,000, the companies behind it and another forum, Kiwi Farms, filed a legal case in the US, asking a court to rule that Britain’s online safety laws and codes of practice do not apply to them.

Such challenges make Ofcom’s job difficult. But it should not be cowed by them. Campaigners such as Ian Russell, whose daughter Molly took her own life after viewing harmful material, are right to highlight the moral imperative to make the internet safer for children. They want a new “duty of candour” towards public authorities imposed on tech companies, and for the regulator to be more proactive and less reactive.

All this is made even more urgent by emerging concerns about agentic AI and chatbots, which have been accused in several US lawsuits of acting as “suicide coaches”. If Ms Kendall believes there are shortcomings in current online safety laws, as she said on Wednesday, she must close them. The crossbench peer and online safety campaigner Beeban Kidron has tabled an amendment to the government’s Crime and Policing Bill which would achieve this. Ministers should not delay, as happened with deepfake images.

Online safety is not the only area in which Ofcom has been accused of damaging inaction. It has also appeared reluctant to tackle racism and misinformation about climate policy on GB News. Regulatory oversight of news and other media has always been important. But in our age of pocket computers, shape-shifting technology and political polarisation, questions about Ofcom’s performance – particularly in relation to children – have arguably never been more pressing.
