  • Almost half of British teenagers exposed to harmful content on social media
  • Two in three don’t report posts that they believe shouldn’t be allowed
  • Three in four people believe social media companies should do more to protect children from harmful content

Almost half (46%) of teenagers aged 13-17 who use social media have seen posts that they believe should not be allowed, new research published today by the Chartered Institute of Marketing (CIM) has revealed.

Following reports that more than four people a week in the North East are being referred for treatment for hypochondria, with experts blaming social media for rising levels of anxiety, the survey of over 2,500 adults and teenagers shows that 95% of young people aged 13-17 have a social media account. The most popular platform is YouTube (79%), followed by Instagram (73%), Snapchat (66%) and Facebook (45%).

Despite many children coming across potentially harmful posts on social media platforms, very few are doing anything about them. Almost two thirds (62%) of teenagers who have seen content they believe should not be allowed say they rarely or never report these posts; only 7% say they always do.

Seeing these posts does appear to discourage some children from engaging on social media: close to half (44%) agree that it would put them off joining discussions and conversations online. But very few are prepared to give up their accounts: two in three (66%) said that seeing posts that should not be allowed would not make them want to delete their account, while more than half (52%) said it would not put them off signing up for an account in the first place.

The survey put similar questions to adults and found that almost half (44%) of those who had seen harmful content on social media rarely or never report it, while only one in five (20%) say they always do.

In the North East, 29% of adults say they have seen harmful content online in the last six months, with 33% rarely or never reporting it.

Who is responsible?

When it comes to protecting children under the age of 18 from harmful or inappropriate content on social media, the public places responsibility on parents and social media companies.

Three quarters of people over 18 say it is the responsibility of parents/guardians (76%) and social media companies (74%) to protect children on social media.

However, most people strongly believe that social media companies should be removing harmful content from their platforms.

  • Who’s responsible? More than eight in ten (83%) said that social media companies have a responsibility to monitor for harmful content on social media. Many also felt there was a role for individuals themselves (57%) and for government (49%).
  • Who pays? When it came to paying for dealing with harmful content on social media, a clear majority of the public felt this was the responsibility of social media companies: 67% of adults said the cost of monitoring and regulating harmful content should be borne by the social media companies themselves, compared with only 14% who said government was responsible.

Revenue from marketing and advertising is the main source of income for most social media companies, and the Chartered Institute of Marketing believes more must be done to protect users on social media if UK businesses are to continue spending their marketing budgets reaching customers through these platforms.

Charlie Nettle, North East Chair at CIM, said: “When new regulations take effect, social media companies will have a legal responsibility to do something about inappropriate content once it has been reported. But we don’t believe we should wait for the regulations; this is something that can happen now.

“Our research shows that we could make a huge difference if we all hit the report button when we see something that shouldn’t be on social media.

“It is alarming that so many children have seen inappropriate posts on social media and failed to report them. Moreover, while more adults do report harmful content, it is concerning that only one in five always do so.

“That is why we are calling on government to fund a social marketing campaign to educate people about how to report harmful content, and the importance of doing so whenever you see it.”

The research also demonstrates the prevalence and impact of harmful content being seen by adults on social media:

  • Harmful content: Three in ten (29%) adults said that in the last six months they had seen content that could be damaging if seen by children, could encourage illegal activity, or could be considered abusive or offensive. Only one in five (21%) said that they had not seen harmful content, while a third (32%) were not sure or could not recall.
  • Who’s seeing it? Younger adults are much more likely to recall seeing harmful content than older generations: 46% of 18-24 year olds say they had seen it in the last six months, compared with only 16% of those aged 55 and over. Those most active on social media are also the most likely to have seen it: among those active on all three of the most popular platforms (Facebook, Instagram and Twitter), 44% say they had seen harmful content in the past six months.
  • Stifling debate: Three quarters of people who use social media (74%) say that the presence of abusive or offensive content can put them off engaging in discussions on social media, while more than half (52%) agree that it would make them consider deleting their account.