Children can’t live-stream on YouTube unless accompanied by an adult
Silhouettes of mobile device users are seen next to a screen projection of the YouTube logo in this picture illustration taken Mar 28, 2018. (Photo: Reuters/Dado Ruvic)
The Google-owned video platform, in a blog post on Monday (Jun 3), also said it is limiting recommendations of videos that depict "minors in risky situations".
The updated policies come after YouTube in February announced that it would disable the ability to leave comments on nearly all videos featuring children.
The announcement also comes in the wake of a New York Times report on Monday, citing research that YouTube's recommendation system has been suggesting videos of "prepubescent, partially clothed children" to users who had watched sexually themed content.
According to YouTube, it has applied new restrictions to its algorithm-based recommendations system, curbing recommendations of videos featuring minors; the change affects "tens of millions of videos".
However, it will continue recommending many videos with children because an all-out ban would hurt creators who rely on the recommendation engine to generate views, according to YouTube.
YouTube has faced criticism for years over its inability to stop the spread of inappropriate content and behaviour involving minors on the platform. Earlier this year, a scandal involving sexually coded comments left by child predators on certain videos led several big brands to suspend their advertising.
On the live-streaming front, YouTube said it is banning live-streamed broadcasts by children "unless they are clearly accompanied by an adult."
YouTube said it has rolled out new artificial-intelligence classifiers in recent months for live video to "find and remove more of this content". Channels that run afoul of the updated policy may lose their ability to live stream.
"Responsibility is our number one priority, and chief among our areas of focus is protecting minors and families," YouTube said in the blog post. "With this update, we'll be able to better identify videos that may put minors at risk and apply our protections" across a bigger segment of videos.
According to YouTube, the "vast majority" of videos featuring minors on the service "do not violate our policies and are innocently posted".
Still, there's a large number of videos that do: In the first quarter of 2019, YouTube said, it removed more than 800,000 videos for violations of our child-safety policies (claiming the majority of those were deleted before they had 10 views).
YouTube also noted that it works with law-enforcement agencies to investigate crimes against children. It said reports sent to the National Center for Missing and Exploited Children prompted more than 6,000 such investigations in the past two years.