One of the difficult questions in publishing something on the interwebz is what to include, what not to include, whether to name names, link links, include pics or let something go. As Avvo’s Mark Britton pointed out to me a few years ago during a lunch (he paid) in Florida, do I really want to call someone out here and cause them tsuris, if not real harm?
It’s a matter of editorial discretion, as Nancy Leong notes.
Over the weekend, an online magazine made a very poor editorial choice. A writer for the magazine wrote a piece about the proper use of the term “bro.” The piece included the sentence: “And I just don’t think the diminutive label of ‘bro’ should be [used] to describe more insidious sexism, let alone violent aggression like rape threats.” The words “rape threats” were hyperlinked to a single tweet by a female journalist.* The tweet was addressed directly to another person on Twitter, and in it the journalist had used a variant of the word “bro” while briefly alluding to rape threats she had received. (For non-Twitter-users: when a tweet begins with the “@” symbol and the username of another person on Twitter, only the sender and recipient of the tweet, and any people who happen to follow both users, will see the tweet in their timelines. Other people can then find the tweet, which is technically public, but doing so requires a specific search.)
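For readers who want the visibility rule spelled out, here is a minimal sketch of the timeline behavior described in the parenthetical above. This is an illustration of the rule as stated, not Twitter’s actual implementation; the function name and the `follows` mapping are hypothetical.

```python
# Illustrative sketch (not Twitter's real code) of the rule described above:
# a tweet that begins with "@" appears in a viewer's timeline only if the
# viewer follows BOTH the sender and the recipient. Any tweet remains
# technically public and findable via a specific search; this models
# timeline visibility only.

def appears_in_timeline(tweet_text, sender, recipient, viewer, follows):
    """Return True if `viewer` sees the tweet in their own timeline.

    `follows` maps each user to the set of accounts that user follows.
    """
    if viewer in (sender, recipient):
        return True  # parties to the conversation always see it
    following = follows.get(viewer, set())
    if not tweet_text.startswith("@"):
        # An open tweet: visible to anyone who follows the sender.
        return sender in following
    # An @-reply: visible only to those who follow both parties.
    return sender in following and recipient in following

follows = {
    "alice": {"journalist", "editor"},  # follows both parties
    "bob": {"journalist"},              # follows only the sender
}
print(appears_in_timeline("@editor bro, really?", "journalist", "editor", "alice", follows))  # True
print(appears_in_timeline("@editor bro, really?", "journalist", "editor", "bob", follows))    # False
```

The point the sketch makes concrete: “technically public” and “shows up in your timeline” are two different things, which is exactly the distinction the privacy argument leans on.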
Putting aside the question of whether the “proper use of the term ‘bro’” is worthy of a serious magazine article (it’s not, but who cares?), this is a prelude to Nancy’s point: while it may not be a free speech issue, was it an abuse of editorial discretion to link to a twit that, though relevant to the issue, involved rape threats that weren’t directed at the author of the article, didn’t involve a specific individual in the article, and was an @ twit?
To some degree, this is a privacy issue, even though the tweet was technically public.
Decisions about how we share information, and with whom, often reflect a complex calculus that relies on “obscurity” — the difficulty with which information can be found — rather than either absolute privacy or absolute publicity. They argue that the law should recognize this reality.
Twitter changes the binary calculus of internet sharing. There is the open twit, without any @ at the start. There is the private message twit. And there is the @ twit, which can be seen by anyone who follows both twitterers or who cares to look. Nancy calls it “technically public,” which is a lot like saying someone is technically pregnant. It’s public. If people want their twits to be private, they can protect their account. Or they have the freedom not to twit at all, if they prefer it not be known.
People aren’t entitled to create their personal rules of what others may do with public information. The law already recognizes this reality: publicly accessible twits are public, so if you don’t want something you twit to be public, then don’t twit publicly. This is painfully reminiscent of the Teri Buhl claim, and it doesn’t fly.
But that’s not really the point. The underlying question is whether anything publicly available online ought to be broadcast, in the exercise of editorial discretion.
Privacy issues aside, however, the most important issue here is one of editorial discretion. My view is that editors should not draw attention to material about people’s private lives — even online material that falls in the category of “theoretically accessible” — when the material published contributes little or nothing to discourse, is of little or no public concern, and will cause harm to the person involved.
Nancy focuses on the content aspect, that this twit involved rape threats.
A woman’s brief mention of her own rape threats — and the way that she chooses to talk about them — is not a matter of broad public concern. And the harm created by drawing greater attention to the threats is potentially devastating, as the journalist’s experience has already revealed.
This too detracts from the core issue. First, everything on Twitter is “brief,” so let’s leave the hype out of it. Second, whether a mention is of sufficiently “broad public concern” is at the discretion of the person using the twit, not the person who originally published it publicly. Nancy conflates a personal desire to speak publicly with a right to prevent that speech from being broadcast any further than its original use. That’s not the speaker’s choice, and it doesn’t change whether the twit is by a woman about rape threats, a man about some physical inadequacy, or a monkey about his love of bananas.
But what of the “harm created by drawing greater attention”? That’s something the person who twitted should have considered in the first place. No one forces us to speak publicly. If there is something that one would prefer remain private, then all it takes is to shut up. You can’t have it both ways, where you get to engage in public speech and then complain that you’ve been harmed by your own words.
These editorial best practices are particularly important given what Soraya Chemaly has described as the “digital safety gap” — the fact that women and other targeted groups (non-white people; LGBT people; those with disabilities, etc.) receive disproportionate amounts of threats and harassment when they choose to be active online. The extent and severity of this harassment can hardly be overstated…
If one accepts the notion that there should be special rules for special folks, where they get to do as they please while being entitled to singular insulation for their conduct because they are vulnerable or fragile, then this works. But if we all play by the same rules, as in equality, then this offers no comfort.
So is an author legally allowed to dig up technically public information about someone else, and is an editor legally allowed to approve the piece, and is a website legally allowed to publish it? Sure. But that’s not the point. The point is that just because something is legal doesn’t mean it’s the ethical thing to do.
And finally, the critical point arrives. While I wouldn’t use the word ethical, as I fail to see any ethical issue involved, I would use the word “appropriate.” Just because we have a soapbox that allows us to be assholes doesn’t mean we have to be, or should be. We can do harm to others by broadcasting information that was shared on a limited platform, that was never meant to be spread far and wide or used as the exemplar for greater points.
There are times that editorial discretion is used for exactly that purpose. That’s the risk one takes when engaging in public speech. And there are times when we have neither need nor intent to do a person harm, and can exercise the discretion to shut up, even if the person whose speech is at issue didn’t.
But editorial discretion is exercised by the republisher and is not subject to the approval of the original speaker, and if others think it was exercised poorly, then they, like Nancy, can write more words to castigate the decision. That’s how speech works.