Tech firms need to ‘act now’ to protect children from online harm, says Ofcom

Children as young as primary school age are being exposed to violent content online, a study by Ofcom has found.

The regulator spoke to 247 children and found that they had commonly encountered violent 18+ gaming content, verbal discrimination and fighting. All of the children reported seeing violent content online, mostly via social media, video-sharing and messaging sites and apps, with “many” saying this had happened before they had reached the platforms’ minimum age restrictions.

Sharing videos of local school and street fights has also become normalised for many of the children: some shared them out of a desire to build online status among their peers and followers, while others did so to avoid being labelled “different” for not taking part. Some children also mentioned seeing extreme graphic violence, including gang-related content, albeit much less frequently.

Ofcom said the teenage boys it spoke to were the most likely to actively seek out and share violent content, often motivated by a desire to “fit in” and gain popularity, due to the high levels of engagement such content attracted. Some 10- to 14-year-olds described feeling pressure not only to watch violent content, but to find it “funny”, fearing isolation from their peers if they did not.

Older children were less likely to share violent content, though this appeared to be because they had become more desensitised to it. Most children said they unintentionally encountered violent material through large group chats, posts from strangers on their newsfeeds, or through systems they referred to as “the algorithm”.

Many felt they had no control over the violent content suggested to them and no way to prevent it, which could leave them feeling upset, anxious, guilty and even fearful, particularly as they rarely reported it.

A second study for Ofcom, by Ipsos UK and social research agency Tonic, revealed that children and young people who had encountered content relating to suicide, self-harm and eating disorders characterised it as “prolific” on social media, saying that such frequent exposure contributed to a “collective normalisation and often desensitisation” to the gravity of the issues.

Again, children and young people tended to initially see the content unintentionally through personalised recommendations on their social media feeds. Some of those who had lived experience of the issues experienced a worsening of their symptoms after first viewing this content online, while others discovered previously unknown self-harm techniques.

According to a new study by the National Centre for Social Research and City University, young children are regularly being bullied online, with a detrimental effect on their emotional wellbeing as well as their mental and physical health.

Children said that direct messages and comments were the main routes through which bullies could reach them, and some were targeted in group chats they had not wanted to join. Ofcom noted that children often do not trust platforms’ reporting systems to help them when they run into trouble online.

They worried that dwelling on harmful content in order to report it could result in more of the same being recommended to them, leading most to scroll past it instead.

Some children said that when they did report content, they received only an automated message and feared their anonymity would not be protected. Gill Whitehead, Ofcom’s online safety group director, said: “Children should not feel that seriously harmful content including material depicting violence or promoting self-injury is an inevitable or unavoidable part of their lives online.”

She added: “Today’s research sends a powerful message to tech firms that now is the time to act so they’re ready to meet their child protection duties under new online safety laws. Later this spring, we’ll consult on how we expect the industry to make sure that children can enjoy an age-appropriate, safer online experience.”
