While I usually find lawprof Frank Pasquale to be one of the more rational, less emotional, minds in academia, he took me by surprise in a Concurring Opinions post he titled “Platform Responsibility.”
Internet platforms are starting to recognize the moral duties they owe their users. Consider, for example, this story about Baidu, China’s leading search engine:
Wei Zexi’s parents borrowed money and sought an experimental treatment at a military hospital in Beijing they found using Baidu search. The treatment failed, and Wei died less than two months later. As the story spread, scathing attacks on the company multiplied, first across Chinese social networks and then in traditional media.
After an investigation, Chinese officials told Baidu to change the way it displays search results, saying they are not clearly labeled, lack objectivity and heavily favor advertisers. Baidu said it would implement the changes recommended by regulators, and change its algorithm to rank results based on credibility. In addition, the company has set aside 1 billion yuan ($153 million) to compensate victims of fraudulent marketing information.
I wish I could include this story in the Chinese translation of The Black Box Society. On a similar note, Google this week announced it would no longer run ads from payday lenders. Now it’s time for Facebook to step up to the plate, and institute new procedures to ensure more transparency and accountability.
The problem isn’t the concept of “platform responsibility” itself; the indiscriminate use of private internet platforms to do harm, whether for money or simply a lack of caring, is indeed a terrible danger. It can make us smarter. It can make us stupider. It can kill.
But Frank characterizes the problem as one of “moral duties.” That’s shocking, coming from someone who is usually far more rational than that. Who made Google our arbiter of morality? And Frank now wants Zuckerberg to decide morality for us?
There are two related problems reflected in this deep dive down the rabbit hole. The first is that an otherwise intelligent academic sees the issue as one of morality, which is not something to be imposed from on high, but something that each of us is entitled, indeed obliged, to decide for ourselves. This is tantamount to religion, whether it’s the imposition of some Puritan ethic that dictates how much skin an angry god wants women to reveal, or perhaps the Asian ideal of the Kama Sutra, which allows positions other than the one preferred by Christian missionaries for quickie procreation.
The point is: who the hell decided what morality means for the rest of us?
The second prong of the problem is that Frank suggests that the new Overlords of Morality are corporations whose purpose is to make money. No one can blame Google or Facebook, or any of the other internet Gods, for doing whatever it is they want to do, whether to make a buck off us or to reflect whatever corporate sense of propriety they determine fits their interests. They’re private enterprises, and fully entitled to take the money of payday lenders, a scummy bunch for sure, or not. And if Facebook wants to silence speech it doesn’t like, or silence speech to keep its customers happy, that’s fully within its rights.
But this isn’t about morality, and this isn’t about some duty of the private corporations that own the biggest pieces of real estate on the internet to dictate morality to the groundlings.
Part of what’s missing in Frank’s effort to look to big corporations for our daily dose of morality is our own responsibility to use this tool well. People like Wei Zexi’s parents endured a tragic result, apparently because they relied on Baidu, the Chinese version of Google. Yes, a terrible outcome. And Baidu’s credibility as a source of information should be challenged and questioned because of how it does business.
Does this mean that the public, using the internet, has no responsibility to refrain from believing every bit of idiocy that appears online? We’re awash in crap on the internet, and, before any wag notes the obvious, every website, including SJ, must be scrutinized rather than embraced as some great truth simply because it appears on a computer screen.
What’s the alternative? Do we want Google to decide for us what websites reflect morality in its eyes, and refuse to acknowledge the existence of websites that reflect political views with which it doesn’t agree? Are we calling on Zuckerberg to limit Facebook to those thoughts and feelings that his view of morality allows?
Like Frank, I share a sense of responsibility for what appears here, and vet its content to eliminate much of the insanity that commenters offer. I refuse to allow comments that propose violence, or “explain” law that is dangerously wrong. I do this because of my sense of responsibility. Whether there is any “morality” involved is another matter.
But to suggest that there is a moral duty on the part of corporate players to save us from ideas that could be harmful, unpleasant or inconsistent with our notion of morality is dangerously misguided. Yet Frank doesn’t end there, as he recognizes the control over our world that these “dominant platforms” exert:
At their best, platforms recognize that such sovereignty comes with responsibilities as well as rights. The power to rank is the power to make certain public impressions permanent, and others fleeting. As platforms gain commercial, political, and cultural influence, they are the new kingmakers. The question now is whether the state (and companies themselves) will make these processes more comprehensible, fair, transparent, and open to critical analysis by the publics they affect.
When massive platforms combine the functions of conduits, content providers, and data brokers, analogies from old free expression cases quickly fall apart. Too many discussions of social networks and speech are nevertheless moored in murky doctrinal categories, reifications, and inapt historical analogies that do more to obscure than reveal the true stakes of disputes. It is time to think beyond the old categories and to develop a new way of balancing dominant platforms’ rights and responsibilities. Sometimes, that will require the translation of old principles of media regulation (like rules against stealth marketing and unfair/deceptive acts and practices) to new contexts. In other cases, litigation will be needed to stop dominant platforms from abusing their power online. Platforms should also acknowledge their de facto role as public forums, quasi-judicial law interpreters, and fiduciaries, even if they resist taking on all the de jure responsibilities such roles imply for older entities.
The tl;dr is that Google and Facebook already control our hearts and minds, not to mention our economy, because they dictate the information we obtain from the internet. Frank is obviously right about that. But then, that can turn either left or right (not in the political sense, but in the directional sense). By looking to platform responsibility based upon Google’s morality, politics, and sensibilities, Frank imbues these entities, which already exercise quasi-governmental mind control over society, with a religious control over us as well.
If we don’t want American government to become a theocracy, why would we want Facebook to be our morality watchdog? Responsible? Sure. But what that means isn’t morality, unless Frank wants Zuckerberg coming down from Mount Sinai with the Ten Commandments of Facebook.
We seem to be moving toward what might be regarded as new-age sumptuary laws now focused on restraining the luxury or extravagance of critical thinking. Also, I am reminded that I need to pick up and reread a copy of Neil Postman’s “Amusing Ourselves to Death: Public Discourse in the Age of Show Business” (CAD $17.00 for the trade paperback edition at my local Chapters — a good deal).
I assume there’s a worthy point being made here, but I frankly have no clue what it is.
Articulate Spam ??
Could be. If so, it’s damn good. At least it’s spam for a good book.
Why the brain damage on this story? Chinese government trying to control the internet as usual.
Baidu is just an example, as it morphed immediately to Google. Bad info on the internet kills someone, story at 11.
Bad Info on the Internet !!?????!! … OMG !!
(& ok to be honest, I once googled you.. & they said you were asshole.. opinions may vary, but on some days I do agree..)
“They”? Uh huh. If no one on the internet thinks you’re an asshole, you don’t exist.
“Lack objectivity”. Marxist Party-speak.
My understanding of Pasquale’s Balkin post is that he wants the government to impose some sort of social-justicey corporate-responsibility regimen on tech giants (if he doesn’t like payday lender ads, they must go!), along with plenty of regulation for what they get to show and how.
If there’s anything in there about voluntary moral standards, I didn’t see it. Pasquale says he wants the state to make these companies operate in a “fair” way & push for “accountability.”
He does go to some lengths to frame “legislating morality” as “transparency,” but my Newspeak dictionary might just be out of date…
His argument that dominant internet sources are the “new town square” makes a lot of sense. Whether to impose some quasi-government duty upon them is one possible solution, even though it conflicts with our capitalist system. But when that duty is informed by “morality,” it takes a dangerous turn.
If Facebook is the new town square, wouldn’t that imply that they have a responsibility to NOT censor, consistent with free speech values?
That would certainly be a consequence of assuming quasi-governmental responsibility.
It doesn’t surprise me too much that some fool actually took a search engine’s ranking as an authoritative statement on the effectiveness of experimental cancer treatments. Nor does it surprise me very much that his family would blame the search engine for this, especially when the alternative would be to blame their loved one, or even the Chinese military that ran the hospital. What absolutely boggles my mind, however, is the number of ostensibly thoughtful and informed commentators on the Internet who take their condemnation seriously, as if it were a real issue of corporate ethics, rather than the “man-gets-eaten-trying-to-feed-bears” story that it really is.