YouTube to Alter Recommendations to Filter Videos with “Questionable” Content


YouTube announced in a January 25 blog post that it will begin to reduce recommendations of content that might — by its own standards — “misinform viewers in harmful ways.” Last year, the Google-owned video-sharing service began adding “fact checks” to videos that, in its opinion, challenged “well-established historical and scientific topics that have often been subject to misinformation online.”

You know, well-established scientific and historical “facts” such as man-made climate change. Videos such as this one, in which The New American foreign correspondent Alex Newman makes an unnamed climatologist look foolish at the UN COP24 conference in Katowice, Poland, now carry beneath the player a Wikipedia explanation of what climate change is, in Wikipedia’s opinion.

YouTube announced the change after a BuzzFeed report chronicled how quickly viewers could be led down a “rabbit hole” of conspiracy theories by YouTube’s current recommendation algorithms.

(Question: After the few weeks BuzzFeed just had, how can YouTube be persuaded to take its advice on anything?)

“We’ll continue that work this year,” announced YouTube officials, “including taking a look at how we can reduce the spread of content that comes close to — but doesn’t quite cross the line of — violating our Community Guidelines. To that end, we’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways — such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat or making blatantly false claims about historical events like 9/11.”

YouTube was created in 2005 as a simple way to share videos on the internet and immediately became an enormous success, with internet giant Google acquiring the company in 2006. It has since morphed into a media giant, with over one billion hours of video watched every day. Content creators can earn large amounts of money through its AdSense partner program, which pays creators for ads shown during their videos.

But videos that don’t meet YouTube’s Community Guidelines for one reason or another can quickly be “demonetized,” earning their creators nothing. And YouTube and parent company Google hold all the cards in deciding what runs afoul of their standards.

In the past year, YouTube has shown a blatantly leftist bias in the videos it considers “harmful” on its platform. In April, YouTube began banning videos that promoted the sale or ownership of guns and gun accessories. In August, the video-sharing service joined other tech giants such as Facebook, Instagram, and Apple in banning popular radio host Alex Jones and his InfoWars program from their platforms. In December, it joined Facebook and Instagram, among others, in banning CRTV host Gavin McInnes over a “Community Guidelines” infraction.

YouTube’s blog post went on. “This change relies on a combination of machine learning and real people. We work with human evaluators and experts from all over the United States to help train the machine learning systems that generate recommendations. These are trained using public guidelines and provide critical input on the quality of a video.”

But among the “human evaluators and experts” that YouTube relies upon is the left-wing hate group the Southern Poverty Law Center (SPLC). The SPLC is notorious for demonizing people who simply disagree with its left-wing/socialist philosophy. If it is among the so-called “experts” YouTube is consulting, the video-sharing service is going to take on an even more leftist appearance than it already has.

YouTube will be testing its new recommendations system in the United States before rolling it out worldwide. While the company insists that the changes will affect only a small portion of videos — “less than one percent of the content on YouTube” — the announcement makes one wonder what subjects besides flat-earthers and 9/11 conspiracy videos will be affected.

Will videos that challenge the man-made global warming hypothesis be affected? They were in the last update of YouTube’s policies. What about videos that challenge the constitutionality of events in Washington? Or videos that question the morality of abortion? Illegal migration? Globalism?

While YouTube insists that videos with what it considers questionable content will still be available, how long will it be until it simply decides that no one ever needs to see such videos?

This is yet another case of “soft censorship” of unapproved ideas from a company with a history of “shadow banning” conservative and constitutional thought.
