Up Next: More Woke Videos for Your Kids
YouTube provides a glimpse into its recommendation system
"Having spoken to LGBTQ creators and YouTube employees, I understand just how important it is that teens and students be able to view [LGBT content]." - Susan Wojcicki, CEO at YouTube.
The above quote comes to us not from a secret meeting at Google headquarters or a semi-private conversation between YouTube and a handful of creators, but from YouTube's own public blog. In June of 2017, Wojcicki wrote a post entitled, "A Message on Pride and LGBTQ Initiatives," which was intended to reassure the LGBT community that YouTube's "Restricted Mode" - a relatively unknown feature used in libraries, schools, and other public institutions to control the content viewers see - could not be used to provide viewers with an LGBT-free experience. She writes, "the voices of our LGBTQ community have been key to pushing society in a more understanding and tolerant direction."
As an aside: Did you realize you were being pushed?
That post stemmed from an incident earlier that year. In March 2017, in response to outrage from LGBT content creators that their videos were being withheld from viewers in Restricted Mode, Johanna Wright, Vice President of Product Management at YouTube, did damage control in a post of her own.
"Over the last several months, and most definitely over the last few days from LGBTQ and other communities, we’ve gotten lots of questions around what Restricted Mode is and how it works. We understand that this has been confusing and upsetting, and many of you have raised concerns about Restricted Mode and your content being unfairly impacted. The bottom line is that this feature isn’t working the way it should. We’re sorry and we’re going to fix it."
Later in the post, she writes:
"Today, about 1.5 percent of YouTube’s daily views come from people who have Restricted Mode turned on. But we know this isn’t about numbers; it’s about the principle of anyone having access to important content and different points of view."
Take that in, in the context of Big Tech's recent censorship campaign against dissenting voices speaking out on topics ranging from election fraud to COVID response to vaccine mandates.
In the piece, Wright apologized to the LGBT community and assured them that, "thanks to your feedback, we’ve manually reviewed the example videos mentioned above and made sure they’re now available in Restricted Mode -- we’ll also be using this input to better train our systems. It will take time to fully audit our technology and roll out new changes, so please bear with us."
I want to zero in on the phrase, "to better train our systems." I have written in previous posts that the algorithms dictating the content we see in our feeds are developed and maintained by highly paid engineers who live and work almost exclusively in the bluest areas of the country. Their post-Christian values infuse everything they do.
So when a new mother in Provo, Utah, exhausted from the crying, the diaper changing, and the guilt that comes with the job, puts her child in front of YouTube to get a few minutes reprieve, what is her child going to see?
Any parent of young children can tell you that screens work. They calm the child, distract him, and gain his attention in a way few other things can. A mother trapped in a waiting room or a restaurant with a fussy baby now has a new weapon at her disposal: her phone, typically with YouTube at the ready. But while the calming effect of YouTube is nice, every single video her child watches was selected for him not by his mother, but by a secret algorithm from San Mateo County, CA, which gave 78% of its vote to Biden in 2020.
So it was with a sense of anticipation that I read a recent post on the official YouTube blog called, "On YouTube's recommendation system: A deeper look into how YouTube's recommendation system works," written by Cristos Goodrow, VP of Engineering at YouTube, who writes:
"Recommendations are seen as a mysterious black box. We want these systems to be publicly understood, so let me explain how they work, how they’ve evolved, and why we’ve made delivering responsible recommendations our top priority."
Would his post shed some light on the hidden algorithm that we the 74 million trust to entertain and distract our children?
Goodrow defines YouTube's recommendation system as a mechanism that "connect[s] billions of people around the world to content that uniquely inspires, teaches, and entertains," and reminds us that the two places where the recommendations appear are the "Up Next" panel and our own unique YouTube homepages.
He starts with an example featuring what the recommendation system found for his daughter. "For my oldest daughter, it was finding laughter and community with the Vlogbrothers." And later, "A few years ago, our system recommended videos from Tyler Oakley to my oldest daughter, because that’s who many of the people who watched Vlogbrothers also watched at the time. She ended up becoming a big fan, so much so that we later took her to see him at a meet-up."
Tyler Oakley is an LGBT content creator. Not a gay man with a channel that covers fashion, gardening, or some unrelated topic, but a creator who makes LGBT-specific content. Among his most popular videos are, "The 'Boyfriend' Tag," "Watching Strange Porn," and "Twin Twinks Learn Gay Slang." To folks like Goodrow, this is perfectly normal. The algorithm connected his daughter to Tyler Oakley, made her a fan, and there is nothing to see here.
The system began in 2008 by placing popular videos on one big "Trending" page, an approach Goodrow claims was unpopular. "Today, our system sorts through billions of videos to recommend content tailored to your specific interests." It does this by using a computer science technique called "machine learning," which attempts to improve an algorithm by feeding it massive amounts of (our) data. "It's constantly evolving, learning every day from over 80 billion pieces of information we call signals. That's why providing more transparency isn't as simple as listing a formula for recommendations, but involves understanding all the data that feeds into our system."
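The "people who watched X also watched Y" logic Goodrow describes can be sketched, in heavily simplified form, as co-occurrence counting. To be clear, YouTube's actual system is not public and uses billions of signals; the watch histories, channel names, and function below are invented purely to illustrate the basic idea:

```python
# Toy sketch of co-occurrence-based recommendation ("watched X also watched Y").
# This is NOT YouTube's real algorithm; the data here is entirely made up.
from collections import Counter

# Hypothetical watch histories: each set is one viewer's watched channels.
histories = [
    {"Vlogbrothers", "Tyler Oakley"},
    {"Vlogbrothers", "Tyler Oakley", "CrashCourse"},
    {"Vlogbrothers", "Tyler Oakley"},
    {"Tyler Oakley"},
]

def recommend(channel, histories, top_n=1):
    """Rank other channels by how often they co-occur with `channel`."""
    co_counts = Counter()
    for watched in histories:
        if channel in watched:
            # Count every other channel this viewer also watched.
            co_counts.update(watched - {channel})
    return [c for c, _ in co_counts.most_common(top_n)]

print(recommend("Vlogbrothers", histories))  # ['Tyler Oakley']
```

With this invented data, a fan of Vlogbrothers gets Tyler Oakley recommended simply because the two channels' audiences overlap most, exactly the dynamic Goodrow described with his daughter. Note that nothing in the mechanism weighs what the recommended content actually is; overlap alone drives the suggestion.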
Calls for algorithm transparency or algorithm auditing will likely face this excuse. See, it's too complicated, there are too many signals, and there is too much data for us to even approximate why your child sees the videos he sees. Even if we could produce a report for every decision our system made, you're probably not smart enough to interpret it anyway. The post features a graphic called, "Recommended for You: Key Moments in YouTube's Recommendation System," and under 2017 there is the result of the incident I mentioned earlier: "Began evaluating machine learning that powers our system for fairness across protected groups."
But our children are not one of their protected groups. The values of the post-Christian New Faith do not allow families to have a YouTube experience free of LGBT, CRT, feminist, or climate change content. These ideologies cannot be criticized and they cannot be blocked. Once we hit the play button on YouTube, we don't know how their conception of "fairness across protected groups" will manifest itself in the content our children see. The engineers behind YouTube believe that a "push" is necessary for a more understanding and tolerant society. But we know their conception of understanding and tolerance is based upon a worldview in which those ideals cannot be achieved without destroying the systems of oppression (i.e. capitalism, patriarchy, white supremacy, heteronormativity, et al.) that they blame for all of society's problems.
In an upcoming post I'll provide a way to stick your kids in front of a screen without the danger of recommendations provided by Marxist algorithms.
Don't forget to like this post, share it with your friends, and bookmark my censorship-resistant backup site in case I get deplatformed (Tor network access required).