Blame The Algorithms And Their Masters

We love algorithms. We love crowdsourcing. We love the democracy of information on the internet. Except when we don’t.

In the crucial early hours after the Las Vegas mass shooting, it happened again: Hoaxes, completely unverified rumors, failed witch hunts, and blatant falsehoods spread across the internet.

But they did not do so by themselves: They used the infrastructure that Google and Facebook and YouTube have built to achieve wide distribution. These companies are the most powerful information gatekeepers that the world has ever known, and yet they refuse to take responsibility for their active role in damaging the quality of information reaching the public.

It seems that a group from 4Chan managed to get itself “amplified” by the algorithm because algorithms don’t differentiate based on feelings. At least it wasn’t reddit. Remember, as bad as it is, it can always get worse.

BuzzFeed’s Ryan Broderick found that Google’s “top stories” results surfaced 4chan forum posts about a man that right-wing amateur sleuths had incorrectly identified as the Las Vegas shooter.

4chan is a known source not just of racism, but hoaxes and deliberate misinformation. In any list a human might make of sites to exclude from being labeled as “news,” 4chan would be near the very top.

Whether that’s true depends on whom you ask, but even assuming, arguendo, that Alexis Madrigal is correct, does that change how algorithms work? Should it? Are we all in on algorithms until we don’t like their results, in which case we demand some human finger poke them to give us the results we prefer?

The truth is that machines need many examples to learn from. That’s something we know from all the current artificial-intelligence research. They’re not good at “one-shot” learning. But humans are very good at dealing with new and unexpected situations. Why are there not more humans inside Google who are tasked with basic information filtering? How can this not be part of the system, given that we know the machines will struggle with rare, breaking-news situations?

The “machines,” of course, learn from the billions of sources presented by the internet. That includes the crazy conspiracy sites, the lulz sites, the serious sites and the sites that align with Madrigal’s beliefs. So which ones should the “machine” take seriously? If the machine works, it would ascertain which sites meet its criteria for authoritativeness, relevance and freshness.
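
For the non-coders, here is roughly the shape of such criteria: a toy scoring function with invented weights, invented field names and invented numbers. To be painfully clear, this is not Google’s code, and nobody outside the Googleplex gets to see that; it’s a sketch of the mechanism, assuming a simple weighted sum.

```python
# A toy ranking sketch. Every weight, field and number here is invented
# for illustration; real search ranking is vastly more complicated.
from dataclasses import dataclass

@dataclass
class Source:
    name: str
    authoritativeness: float  # 0.0-1.0: how trusted the site is
    relevance: float          # 0.0-1.0: how well it matches the query
    freshness: float          # 0.0-1.0: how recently it published

def score(s: Source) -> float:
    """Combine the three criteria into one number (hypothetical weights)."""
    return 0.5 * s.authoritativeness + 0.3 * s.relevance + 0.2 * s.freshness

sources = [
    Source("serious-news-site.com", 0.9, 0.7, 0.6),
    Source("lulz-forum.net", 0.1, 0.8, 0.9),
]
for s in sorted(sources, key=score, reverse=True):
    print(f"{s.name}: {score(s):.2f}")  # serious site wins, 0.78 to 0.47
```

When the machine works, the serious site wins. Keep those made-up weights in mind; they’ll matter again below.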

So why 4Chan? The algorithm doesn’t lie. It just does what it does, what it’s programmed to do. So Madrigal blames the Masters of the Algorithm.

It’s no longer good enough to note that something was algorithmically surfaced and then replaced. It’s no longer good enough to shrug off (“briefly,” “for a small number of queries”) the problems in the system simply because it has computers in the decision loop.

After I followed up with Google, they sent a more detailed response, which I cannot directly quote, but can describe. It was primarily an attempt to minimize the mistake Google had made, while acknowledging that they had made a mistake.

The mistake, apparently, was that they didn’t intervene when the algorithm did its job in a way that Madrigal found wrong. There is an obvious programming solution to the problem. Whenever the algorithm would otherwise “amplify” 4Chan, replace it with Rachel Maddow.
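
Reduced to code, the “solution” is a one-liner, which is rather the point. A tongue-in-cheek sketch, obviously hypothetical:

```python
# Satirical sketch of the "obvious" fix. Not a real system, which is the point.
def amplify(result: str) -> str:
    """Return whatever the algorithm surfaced, unless we don't like it."""
    if "4chan" in result.lower():
        return "Rachel Maddow"  # the human finger, poking
    return result

print(amplify("4chan.org thread naming the shooter"))  # -> Rachel Maddow
print(amplify("reuters.com live coverage"))            # -> unchanged
```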

But the point isn’t whether the algorithm prefers left or right, good or evil. The point is that we love us some algorithms until we don’t.

As news consumers, we can say this: It does not have to be like this. Imagine a newspaper posting unverified rumors about a shooter from a bunch of readers who had been known to perpetuate hoaxes. There would be hell to pay—and for good reason. The standards of journalism are a set of tools for helping to make sense of chaotic situations, in which bad and good information about an event coexist. These technology companies need to borrow our tools—and hire the people to execute on the principles—or stop saying that they care about the quality of information that they deliver to people.

Imagine a newspaper posting fake news, except when it’s the fake stuff we like. But that never happens.

There’s no hiding behind algorithms anymore. The problems cannot be minimized. The machines have shown they are not up to the task of dealing with rare, breaking news events, and it is unlikely that they will be in the near future. More humans must be added to the decision-making process, and the sooner the better.

Hiding behind algorithms? These are our truths, our saviors, removing human emotion and bias from the equation and force-feeding us the empiricism we believe will solve all our problems. We hide behind algorithms all the time when they give us the results we want. “Hey, don’t blame me, the algorithm says you suck.” But when they spit out results that we don’t like, we “hide” behind them?

The characterization has a point, that algorithms are only as good as the criteria programmers ask of them. Once programmed, they do the voodoo they do so well, whether they work out the way we want them to or not. They may work great under some circumstances and really suck under others, but don’t blame the algorithms.

That means the Masters of the Algorithms must be at fault, the Googles, Facebooks, YouTubes, who create them, send them scampering across the internets to do the dirty, and return with whatever they’ve been told to return. Perhaps the criteria were wrong, sloppy, inadequate, and that’s why 4Chan won the internet for a brief and shining moment. Or perhaps 4Chan gamed the algorithms, which is what an entire industry of SEO marketeers exists to do.
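
To see how the gaming works, go back to the toy scoring function above. In the first minutes after a shooting, the authoritative outlets haven’t published yet, so their relevance and freshness signals sit near zero, while a burst of fresh, keyword-dense forum posts maxes out the other two criteria. The numbers are invented, as before; only the mechanism matters.

```python
# Hypothetical breaking-news scenario, same toy weights as the sketch above.
def score(auth: float, rel: float, fresh: float) -> float:
    return 0.5 * auth + 0.3 * rel + 0.2 * fresh

newsroom = score(auth=0.9, rel=0.1, fresh=0.1)  # 0.45 + 0.03 + 0.02 = 0.50
forum    = score(auth=0.1, rel=0.9, fresh=1.0)  # 0.05 + 0.27 + 0.20 = 0.52

print(f"established newsroom: {newsroom:.2f}")  # 0.50
print(f"fresh forum thread:   {forum:.2f}")     # 0.52
# The fresh-but-unreliable thread wins, 0.52 to 0.50, without anyone
# breaking the rules: the criteria just weren't built for this moment.
```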

But once we cry for a hybrid system, algorithms overseen by human masters, who then take what empiricism provides and superimpose their ideology of what is true and what is false, what is good and what is evil, then we’ve undermined the point of algorithms, the god of empiricism. There is no such thing as loving empiricism until it returns the results you don’t like, in which case you alter empirical results to align with your feelz.

All across the information landscape, looking for news about the shooting within the dominant platforms delivered horrifying results. “Managing breaking news is an extremely difficult problem but it’s incredible that asking the search box of *every major platform* returns raw toxic sewage,” wrote John Herrman, who covers the platforms for The New York Times.

Maybe the “toxic sewage” is right sometimes. Maybe not. But just because you think the results are horrifying doesn’t make them wrong.

17 thoughts on “Blame The Algorithms And Their Masters”

  1. Dan

    Part of the answer has to be readers applying some critical reasoning of their own–at a minimum, if a story sounds really fantastic, look for one or two other sources of independent confirmation before passing it along. In the absence of that, even having the news hand-delivered from God himself won’t solve the problem of stupid people believing stupid things.

    Sadly, even major media outlets (the ones who have those “standards of journalism”, those “tools” that prevent publication of “unverified rumors” from sources “known to perpetuate hoaxes”) do a crap job of fact-checking, and do things like cite tweets from @DPRK News Service as authentic. Physician, heal thyself.

  2. B. McLeod

I don’t care all that much for algorithms, one way or the other. They should find a way to make the computers work on folk songs.

    1. SHG Post author

      This might be an opportune time to toss a Judy Collins video into the mix, but I won’t because you don’t deserve both sides now.

  3. Jake

    It’s not the algorithm’s fault that some people are stupid assholes who prefer bias confirmation to journalism.

    This all got me thinking…Who stands to gain if the Almighty Goog can be convinced that we should go back to the old system where a small cadre of humanz are the gatekeepers of information?

  4. MonitorsMost

    Scott,
    Very disappointed you went with the New York Times version of this article as opposed to the Los Angeles Times version. Had you gone with the Los Angeles Times version, you could have quoted this beautiful nugget:

    “‘This is the same as yelling fire in a crowded theater,’ Gabriel Kahn, a professor at the USC Annenberg School for Communication and Journalism, said of Google’s and Facebook’s response. ‘This isn’t about free speech.'”

      1. KP

Hang on, don’t you only elect them to complain, threaten and abuse them afterwards..? ..and blame them for global warming too!
