YouTube’s child protection mechanism is breaking down, according to some of the company’s volunteer watchdogs.
There’s a constant anxiety that those seeking to abuse or groom young children will use social media to reach them – and YouTube is aware of the problem. The video-sharing site has a special network of volunteers, called Trusted Flaggers, who help identify worrisome posts and comments on the network.
But now members of YouTube’s Trusted Flagger programme have told BBC Trending that the company has a huge backlog of reports, some months old, and that the company responds to only a small fraction of complaints from the public about child endangerment and suspected child grooming.
One volunteer says that he made more than 9,000 reports in December 2016 – and that none have been processed by the company.
A small group of Trusted Flaggers also grew suspicious about the effectiveness of YouTube’s public abuse reporting page. Over a recent 60-day period, they used it to flag up hundreds of accounts which potentially violated the site’s guidelines.
However, out of a total of 526 reports, they received only 15 responses, and the volunteers say the exercise is emblematic of a wider lack of child protection on the site.
Sexually explicit comments
The reports were made against accounts which leave potentially objectionable comments, mostly on videos made by young teenagers and children.
The videos themselves, according to the Trusted Flaggers, are not pornographic in nature and do not contain nudity; Trending has seen examples. Many are innocent videos of young people emulating their favourite YouTube stars by performing make-up tutorials and filming their “morning rituals”, or exercising, or just goofing around with friends.
The comments below the videos, however, are often sexually explicit. Some encourage the young YouTubers to make videos with fewer or no clothes, talk about their bodies, or simply make graphic sexual references. Some ask children to move to private chat apps or other, less public means of communication.
Trending previously reported on inappropriate comments on similar videos which sparked rumours of a huge “paedophile ring” operating on the site.
Those allegations of a large organisation or “ring”, spread by a few popular YouTube stars, were backed up by scant evidence. But the persistence of sexualised comments on videos made by young people has troubled several Trusted Flaggers who contacted BBC Trending and who believe the popular video-sharing site isn’t doing enough about potential sex offenders using the site.
YouTube’s Trusted Flagger programme began in 2012, and comprises groups – including law enforcement agencies and child protection charities – along with concerned individuals, some of whom work in the tech industry.
The volunteers aren’t paid by YouTube, but do receive some perks such as invitations to Trusted Flagger meet-ups.
They are given a tool which allows them to report multiple videos, comments or accounts at one time for concerns ranging from child exploitation to violent extremism.
YouTube employees then review the complaints, and the company says reports of violations by Trusted Flaggers are accurate more than 90% of the time.
Despite that headline hit rate, of the 15 responses received by Trusted Flaggers testing the public reporting mechanism, just seven (47%) resulted in action being taken.
Given the volume of the reports they file on a regular basis, the Trusted Flaggers who spoke to Trending estimated that there are thousands of potential predators using YouTube to contact young people.
Trusted Flagger insider
One of the Trusted Flaggers, who requested to remain anonymous so as not to jeopardise his volunteer role, told BBC Trending that the lack of response shows “there is no reliable way for a concerned parent, child in danger, or anyone else to reliably report and get action on a predatory channel.”
The volunteer, who joined the Trusted Flagger programme in 2014, said that the time it took for YouTube to take action on his reports has steadily increased over the time he’s been involved in the programme.
“It’s been an on-going issue since I joined, with the average report I send directly to staff taking three months to be reviewed. Over the last year, it has been significantly worse and as a result of this I still have reports outstanding from last year,” he says.
For example, the volunteer says, he is still awaiting responses on more than 9,000 complaints he made in December 2016.
“They [YouTube] have systematically failed to allocate the necessary resources, technology and labour to even do the minimum of reviewing reports of child predators in an adequate timeframe,” he says. “There also seems to be an overall lack of understanding regarding predatory activity on the platform and that thousands of children are being targeted and manipulated.
“YouTube has inadvertently become a portal of access to children for paedophiles around the world,” he says. “In the long term, YouTube needs to change their stance from being reactive to proactive.”
Staying safe online
The NSPCC has a series of guidelines about keeping children safe online
They promote the acronym TEAM: Talk about staying safe online; Explore the online world together; Agree rules about what’s OK and what’s not; and Manage your family’s settings and controls.
There are even more resources on the BBC Stay Safe site.
In March, YouTube told Trending that the company has a “zero-tolerance policy for sexual content involving minors. Engaging in any type of activity that sexualizes minors – including leaving inappropriate comments – will immediately result in an account termination.”
But the volunteer who spoke to BBC Trending says that the policy “doesn’t translate into action”, and although reports submitted by the Trusted Flaggers are eventually acted upon, the same cannot be said of reports submitted by the public.
“When or if our reports eventually get reviewed, YouTube generally apply their policy correctly and terminate the reported channels,” he says. “I believe the same cannot be said for public reports.”
The volunteer says that in recent weeks, there has been an effort by YouTube paid staff to clear some of the backlog of child endangerment reports submitted by Trusted Flaggers, but that there is “still an enormous backlog in reports and YouTube seem to be doing the bare minimum to fix these issues.”
YouTube declined to give an interview. In a statement, a company spokesperson told Trending: “YouTube strictly prohibits content that sexually exploits minors… We encourage all users, including Trusted Flaggers, to continue flagging videos or comments so we can take action.”
Will Gardner, chief executive of internet safety charity Childnet International, says the video-sharing site should take heed of the Trusted Flaggers’ concerns.
“I would hope that YouTube take this very seriously as the social media environment does rely on the effectiveness of the reporting system and user confidence in the reporting system,” he says.
Gardner says communication is the key to keeping children safe online.
“It sounds like a bit of a cliche, but take an interest in what your child is doing online just as you do with what your child is doing offline,” he says. “You need to keep your personal information safe and you need to recognise that arranging to meet someone that you’ve only met online is dangerous.
“You have to think about the reliability of the information that you see online – not everything is true that you see and not everyone that you speak to is reliable or trustworthy.”
In a statement, the UK Home Office told Trending: “It is vital that both Government and industry work together to tackle the issue of internet safety… But we all still need to do more and it is important that internet companies also take their responsibilities in this area very seriously.”