
Social media has been given a ‘last chance’ to crack down on illegal posts.


Getty Images: A boy sits on the floor looking at his smartphone.

Online platforms must start checking whether their services expose users to illegal content by March 16, 2025 or face financial penalties, as the Online Safety Act (OSA) comes into force.

Ofcom, the regulator enforcing the UK’s internet safety law, on Monday published its final regulations on how firms should deal with illegal content online.

Platforms have three months to carry out risk assessments identifying potential harms on their services, or face fines of up to 10% of their global turnover.

Ofcom chief executive Dame Melanie Dawes told BBC News it was the “last chance” for the industry to make changes.

“If they don’t start seriously changing the way they operate their services, I think there’s going to be more and more calls for things like child bans on social media,” she said.

“I am now asking the industry to step up, and if they don’t they will hear from us with enforcement action from March.”

But critics say the OSA fails to address many of the harms to children.

Andy Burrows, head of the Molly Rose Foundation, said the organization was “surprised and disappointed” by the lack of specific, targeted measures requiring platforms to tackle suicide and self-harm material.

“Strong regulation is the best way to deal with illegal content, but it is not acceptable for the regulator to take a gradualist approach to immediate threats to life,” he said.

Under Ofcom’s codes, platforms will need to identify if, where and how illegal content may appear on their services and how they will prevent it from reaching users.

According to the OSA, this includes child sexual abuse material (CSAM), controlling or coercive behavior, extreme sexual violence, promoting or facilitating suicide and material involving self-harm.

Ofcom launched a consultation on its illegal content codes and guidance in November 2023.

It says it has now “strengthened” its guidance for tech firms in several sectors.

This includes clearer requirements for the removal of intimate image abuse content, along with guidance to help firms identify and remove material involving the coercion of women.

Ofcom Codes

Some of the child protection measures required by Ofcom’s codes include ensuring that social media platforms stop suggesting children’s accounts as friends, as well as warning children about the dangers of sharing personal information.

Some platforms must also use a technology called hash matching to detect CSAM – a requirement that now applies to smaller file hosting and storage sites.

Hash matching is where media is given a unique digital signature that can be checked against signatures of known content – in this case, a database of known CSAM.
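As a rough illustration of the general pattern (not any platform’s actual system), the Python sketch below computes a signature for an uploaded file and checks it against a hypothetical known-content database. Real deployments rely on perceptual hashes such as Microsoft’s PhotoDNA, which still match after an image is resized or re-encoded, rather than the plain SHA-256 digest used here; the database contents and function names are illustrative assumptions.

```python
import hashlib

# Hypothetical database of signatures for known content.
# Real systems use perceptual hashes (e.g. PhotoDNA) supplied by
# child-safety organisations, not plain SHA-256 digests.
KNOWN_HASHES = {
    # SHA-256 digest of the bytes b"test", used here only for the demo.
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def compute_signature(data: bytes) -> str:
    """Derive a unique digital signature for an uploaded file."""
    return hashlib.sha256(data).hexdigest()

def matches_known_content(data: bytes) -> bool:
    """Check an upload's signature against the known-content database."""
    return compute_signature(data) in KNOWN_HASHES

# Example: screen an upload before it is stored or shared.
if matches_known_content(b"test"):
    print("Match found: block the upload and flag it for review.")
else:
    print("No match: signature not in the known database.")
```

The appeal of this design is that platforms never need to store or inspect the illegal material itself: they compare compact signatures against a database maintained by a trusted third party.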

Many big tech firms have already introduced safety measures for teenage users, including controls that give parents more oversight of their social media activity and precautionary settings to address risks to minors.

For example, on Facebook, Instagram and Snapchat, users under the age of 18 cannot be searched or messaged by accounts they do not follow.

In October, Instagram also started blocking some screenshots in direct messages to combat attempts at sexual exploitation – which experts warn are on the rise, often targeting young people.

‘Snail’s pace’

Concerns have been raised throughout the OSA’s journey over its rules applying to a large number of different online services, with campaigners often warning about the privacy implications of platform age verification requirements.

And parents of children who died after being exposed to illegal or harmful content have criticized Ofcom for moving at a “snail’s pace”.

The regulator’s illegal content codes will need to be approved by Parliament before they come into full force on March 17.

But platforms are being told now, on the assumption that the codes will pass through Parliament without issue, that they must take steps to prevent users from accessing illegal content by that date.


