Monday, November 18, 2024

Ofcom vows to ‘come down hard’ on social media firms that fail to protect children online


Her comments came as Ofcom, the online regulator, set out the timetable for companies to comply with the new Online Safety Act. It follows a call by Peter Kyle, the Secretary of State for Science, Innovation and Technology, for Ofcom to implement the legislation “as soon as possible”.

The regulator said that in December it will publish the “first edition” of its codes and guidance setting out how companies should protect users from illegal harms including child sex abuse, terrorism, assisting suicide, harassment, hate offences and revenge porn. Firms will have three months to conduct risk assessments.

“It’s definitely not just a paper exercise,” said Dame Melanie. “We are very clear that the first step that any responsible company needs to take is to actually assess risks that they’ve never assessed before.”

She added that companies needed to be “honest and transparent” about what their “services are actually exposing their users to”.

“If we don’t think they’ve done that job well enough, we can take enforcement action, simply against that failure,” she added.

Tech firms have already made changes

Dame Melanie said changes could include allowing people to take themselves out of group chats, without anyone else being able to see they had left.

“Young people should be able to take themselves out of group chats that they know are toxic for them, without everybody being able to see, and that’s one of the things that we are going to be expecting to see change from social media and messaging services,” she said.

Ofcom said it had already secured better protections for users of UK-based video-sharing platforms, with OnlyFans and other adult sites introducing age verification.

It said BitChute had also improved its content moderation and user reporting, and Twitch had introduced measures to stop children seeing harmful videos.

Meta and Snapchat had already made changes that Ofcom proposed in its illegal harms consultation to protect children from grooming. These included Instagram, Facebook and Snapchat introducing measures to help prevent children being contacted by strangers, and Instagram’s ‘Teen Accounts’, which limit who can contact teens and what they can see.
