Short Take: The Youtube Purge

As the Texas Tornado replied, “NOBODY COULD HAVE PREDICTED THAT CALLS FOR CENSORSHIP MIGHT BACKFIRE,” because, of course, he did, I did, many people did. It’s not that we’re prescient, though we may be, but that it was so obvious there was essentially no chance censorship wouldn’t result in Youtube overshooting even its own mark.

YouTube’s campaign against hateful and racist videos is claiming some unintended victims: researchers and advocates working to expose racist hatemongers.

A video published by the Southern Poverty Law Center was among those taken down after the company announced plans Wednesday to remove more videos and channels that advocate white supremacy.

Putting aside the irony that the “unintended victim” was a video published by the SPLC, the nature of the video was the opposite of what the purge was intended to remove.

The civil rights advocacy group received an email notification early Thursday that a video of journalist Max Blumenthal interviewing prominent British Holocaust denier David Irving was removed from the SPLC’s YouTube channel.

How does Youtube justify its heavy hand?

“We know that this might be disappointing, but it’s important to us that YouTube is a safe place for all. If content breaks our rules, we remove it,” YouTube said in the email.

The woke response is to blame the algos. After all, algos have no feelings, so they can’t be offended.

“It indicates that they have not refined well enough the difference between someone who is exploring issues of racism and hatred and someone who’s promoting it,” Beirich said.

Perhaps this makes sense for some, but then, what if the use of language is itself as hurtful, as violent, as so many seem to believe? Why would these words be less violent when used for “exploring” rather than “promoting,” if words or ideas are inherently evil? When the argument is focused on nonsensical harms and conflated pain, should it matter whether the bullet is fired from the SPLC’s gun rather than some snarkmonger’s?

To accept this notion would be to admit the flaw of the underlying concept: that “hate speech” defies definition, or that the purported violence of which they complain is a lie, a pretext to rid society of words when used in ways they don’t like by people they don’t like, words that are otherwise not violence at all, and helpful rather than harmful.

Jessica J. González, vice president of strategy at the media advocacy organization Free Press, said it’s important for tech companies to rely on human moderators as opposed to algorithms to train staff in cultural competency and to ensure their appeal processes are simple, transparent and rapid.

If human moderation were possible (it’s not, but go with it for Jessica’s sake), would that cure the algos’ deficiencies?

González’s organization helped develop a set of suggested content moderation policies. She said the suggested policies were informed by the experiences of people whose posts have been taken down on Twitter and Facebook for calling out racism.

Her “suggested content moderation policies” avoid such algo-driven problems as keywords by eliminating all meaning whatsoever.

MODEL POLICY
Users may not use these services to engage in hateful activities or use these services to facilitate hateful activities engaged in elsewhere, whether online or offline.

Well, that certainly clears things up, and put in the hands of human mods, couldn’t possibly result in all manner of mischief.

For those simpletons who fail to grasp, or just don’t care about, the eradication of expression that even they find harmless, if not actually helpful, as a cost of eliminating “hate speech” so as to make websites like Youtube a “safe space,” the problems remain obvious and flagrant. Not only is there no way to sufficiently define “hate” as to produce a workable solution, but one person’s hate is another person’s love. What if the expression at hand related to Israeli/Palestinian relations? Which side wins the hate battle, dykes notwithstanding?

Wait, can I say “dyke”? Certainly lesbians can say “dyke,” and in fact specifically call their march the “Dyke March,” but I am not a lesbian. If I use their word, will I be purged?

20 thoughts on “Short Take: The Youtube Purge”

  1. Pedantic Grammar Police

    In East Germany the Stasi paid a substantial portion of the citizenry to do their dirty work. Now our “education” system churns out useful idiots who do the dirty work of our rulers for free. The government doesn’t have to impose censorship; the snowflakes beg our corporate overlords for it. And they will never learn. Censorship does work, it does! We just have to get better at it.

  2. paleo

    “Jessica J. González, vice president of strategy at the media advocacy organization Free Press”

    Is there anybody anywhere in the public realm that has even a smidge of self-awareness these days?

  3. Guitardave

    IMO, an algo programmed by snowflakes is no different than human snowflakes. They both are stupid.
    Example: I was writing a description for a vid I posted and I used, in a self-deprecating way, the phrase “angry old bastard.” YT would not ‘save the changes’…I had not run into this problem before, and I’m not too smart with computer stuff, but I had a thought with all this YT censorship crap going on and changed ‘bastard’ to ‘ass’, and what do you know…changes saved.
    Say goodbye to Zappa vids…his anti-censorship infused remains are probably spinning in his grave so fast he’s boring a hole thru the earth as we speak.

          1. Guitardave

            I guess osmosis is to blame…no one taught me to write, but I have read a bit of good writing…(and I’m here…hmm)

            The quote I was stealing part of is;
            “Blah blah….is like a dead fish on the beach in the moonlight, it both stinks and shines.”
            unknown

            1. Hunting Guy

              John Randolph on Edward Livingstone

              “He was a man of splendid abilities but utterly corrupt. Like rotten mackerel by moonlight, he shines and stinks”

      1. Pedantic Grammar Police

        Sorry I’m a bit slow. I’m the guy who, while everyone else is laughing, says “But what happened back at the ranch?”

  4. D-Poll

    If it’s any consolation, human moderation is possible and both Google and Facebook, among other major companies, use it. In fact it’s very possible, even likely, that this was done by a human. They farm it out to massive cubicle installations in places like Indonesia where people have to process ludicrous numbers of requests a minute (in languages they don’t speak) because humans are still cheaper in the third world. “The algorithm” is mostly just an urban legend these days.

        1. B. McLeod

          As the Clancy Brothers used to say, “It starts out slowly, and then goes rapidly downhill, like a good night of drinking.”

