
Some people analyzed YouTube recommendations a few years ago.

One analysis charted the number of views required to trend for various channels. Mainstream channels consistently needed far fewer views to trend; PBS, for example, needed only around 10,000.
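
For the curious, the gist of that analysis is something like the sketch below. The channel names, numbers, and data layout are all made up; this only shows the shape of the method, not their actual data.

```python
# Hypothetical sketch: estimate the lowest view count at which each channel's
# videos appeared on the Trending tab, from scraped (channel, views, trended)
# rows. All data here is invented for illustration.
from collections import defaultdict

rows = [
    ("PBS", 10_000, True),
    ("PBS", 8_000, False),
    ("IndependentChannel", 900_000, False),
    ("IndependentChannel", 2_400_000, True),
]

min_trending_views = defaultdict(lambda: float("inf"))
for channel, views, trended in rows:
    if trended:
        min_trending_views[channel] = min(min_trending_views[channel], views)

for channel, threshold in sorted(min_trending_views.items(), key=lambda kv: kv[1]):
    print(f"{channel}: trended with as few as {threshold:,} views")
```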

Another compared the "similar video" recommendations by political lean. The amount of left-wing content was substantially larger, and the algorithm also recommended right-leaning videos to viewers of left/center channels less often than the reverse.
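
Roughly, that comparison boils down to labeling channels by lean, scraping the "similar video" edges, and counting the cross-lean flows. The labels and edges below are invented placeholders, not the study's data.

```python
# Toy version of the cross-lean recommendation count. Everything here is a
# placeholder; a real analysis would scrape thousands of recommendation edges.
from collections import Counter

leaning = {"A": "left", "B": "left", "C": "right", "D": "center"}

# (channel of the watched video, channel of the recommended video)
edges = [("A", "B"), ("A", "A"), ("C", "A"), ("C", "B"), ("D", "A"), ("B", "A")]

flows = Counter((leaning[src], leaning[dst]) for src, dst in edges)
for (src, dst), n in sorted(flows.items()):
    print(f"{src:>6} -> {dst:<6}: {n} recommendations")
```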

You also have the banning of certain channels or search terms. Steven Crowder noticed this, since you couldn't even find one of his videos in search unless you typed the complete title of the video AND his name.

"Authoritative sources" are also almost definitely going to fill the first page of many search results if its related to politics. Same goes for google search.

They even removed the public display of dislike counts for political reasons.

There was also the "van life" trend, which started being recommended out of nowhere, beginning with YouTube automatically adding some random girl doing it to people's subscription lists when her channel had only one or two videos posted.

What relies on reporting is shadow banning (excluding Reddit, which substantially automates this). Banning people for certain actions also relies on reports, but the rules (or their interpretations) primarily reflect the site's and its employees' political biases. Handling "hate speech" differently depending on the targeted race is staff interpretation of the rules, while banning people for misgendering is a rule created under a political bias.
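
The shadow ban mechanic itself is trivial to implement, which is part of why it's hard to detect. A toy version (not any site's actual code) looks like:

```python
# Toy shadow ban: the banned user still sees their own posts, so nothing looks
# wrong from their side, but nobody else sees them.
shadow_banned = {"user123"}

def visible_posts(posts, viewer):
    return [p for p in posts
            if p["author"] not in shadow_banned or p["author"] == viewer]

posts = [{"author": "user123", "text": "hello"},
         {"author": "alice", "text": "hi"}]

print(visible_posts(posts, "alice"))    # only alice's post shows up
print(visible_posts(posts, "user123"))  # both show up; the ban is invisible
```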

Companies are also working on and improving AI to do the reporting themselves, such as YouTube automatically creating transcripts and using them (or titles) to automatically demonetize videos. The breadth of what they target with this tech will only increase, especially as governments add their own specialized requirements. Truth Social, the Twitter alternative, even uses the same censorship tech Twitter does.
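
At its simplest, that transcript screening reduces to a keyword pass like the one below. The term list is a placeholder, and the real classifiers are certainly more involved than substring matching; this is just to show where auto-generated transcripts fit in.

```python
# Simplified transcript screening: scan the auto-generated transcript (or the
# title) for terms on an advertiser-unfriendly list and flag the video for
# demonetization review. FLAGGED_TERMS is a made-up placeholder list.
FLAGGED_TERMS = {"controversy", "graphic"}

def flag_video(title: str, transcript: str) -> bool:
    text = f"{title} {transcript}".lower()
    return any(term in text for term in FLAGGED_TERMS)

print(flag_video("Daily news", "today's graphic footage shows ..."))  # True
print(flag_video("Cooking tips", "add the onions and stir"))          # False
```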

2 years ago
1 score