A big priority for YouTube right now involves tagging conservative videos as “hate speech” and pushing them to the bottom of users’ search results. But the Google-owned visual content empire apparently sees nothing wrong with algorithmically steering its pedophile contingent towards videos of young children.
It was none other than The New York Times (NYT), believe it or not, that recently exposed YouTube’s automated video recommendation system for populating pedophile accounts with children’s videos. Not only that, but YouTube’s system is further categorizing some of these videos as “sexually themed content,” when in fact they’re just innocent representations of kids being kids.
“YouTube has curated the videos from across its archives, at times plucking out the otherwise innocuous home movies of unwitting families,” NYT columnists Max Fisher and Amanda Taub wrote in an eye-opening piece about what YouTube’s algorithms are doing these days. “The result was a catalog of videos that experts say sexualizes children.”
According to Jonas Kaiser, a researcher at Harvard University’s Berkman Klein Center for Internet and Society, YouTube’s algorithm is not only tagging children’s content as “sexual,” it’s also driving pedophiles who otherwise wouldn’t even know it’s there to view it.
“That’s the scary thing,” Kaiser is quoted as saying, referring to the children’s channels that YouTube is directly connecting to the user accounts of pedophiles. Kaiser added in a statement that the accuracy with which YouTube is now connecting pedophiles to children’s channels is “disturbingly on point.”
For more news about the authoritarian censorship agenda of Google and YouTube, be sure to check out Censorship.news.
As Fisher and Taub also point out in their article, the recommendations that YouTube makes to its pedophile users appear to become progressively more provocative, starting out with videos of children playing, perhaps, and later moving toward videos of children in swimsuits.
What’s worse, pedophile YouTube users who tend to watch “erotic” content are increasingly steered toward younger and younger versions of it, all with YouTube’s approval.
“So a user who watches erotic videos might be recommended videos of women who become conspicuously younger, and then women who pose provocatively in children’s clothes,” Fisher and Taub explain.
“Eventually, some users might be presented with videos of girls as young as 5 or 6 wearing bathing suits, or getting dressed or doing a split.”
It’s important to point out here that the videos of children that YouTube recommends are generally innocent, at least to normal people who aren’t perverts. It’s the fact that these videos are being recommended specifically to pedophiles that indicates YouTube should be focusing its efforts on fixing this problem, rather than targeting conservatives for “hate speech.”
“On its own, each video might be perfectly innocent, a home movie, say, made by a child,” Fisher and Taub explain. “Any revealing frames are fleeting and appear accidental. But, grouped together, their shared features become unmistakable.”
In one instance, the mother of a 10-year-old girl from Brazil was shocked to learn that an otherwise innocent video that her daughter and her friend uploaded to YouTube, which showed them swimming in a backyard pool, had been captured by YouTube’s recommendation system and presented to pedophiles.
A YouTube spokeswoman has since come out to claim that “protecting kids is at the top of [YouTube’s] list.” However, YouTube has yet to actually turn off its recommendation system for children’s videos, instead focusing its time and effort on demonetizing video content that in any way challenges the LGBTQ agenda, for instance.
“Google has struggled with pedophilia on YouTube for years, and in December 2017, the company claimed it would hire ‘thousands’ of human moderators to combat the problem,” Breitbart News reports.
Sources for this article include: