Genderless Google AI Meets “Free The Nipple”

Google’s gone woke, at least a little bit.

In an email to developers on Thursday morning, seen by Business Insider, Google said it would no longer use “gendered labels” for its image tags. Instead, it will tag any images of people with “non-gendered” labels such as “person.”

Google said it had made the change because it was not possible to infer someone’s gender solely from their appearance. It also cited its own ethical rules on AI, stating that gendering photos could exacerbate unfair bias.
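For the developers in the audience, the change surfaces in the Cloud Vision API, the Google service that tags images with labels. A minimal sketch of what a call looks like, assuming the standard Python client library; the filename and the printed labels are illustrative, not Google's actual output:

    # pip install google-cloud-vision
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    # Read a local photo of a person (hypothetical file).
    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    # Ask Vision for labels describing the image.
    response = client.label_detection(image=image)
    for label in response.label_annotations:
        print(label.description, round(label.score, 2))

    # Before the change, a photo like this might have come back with
    # labels such as "Man" or "Woman"; under the new policy the API
    # returns non-gendered labels like "Person" instead.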

To say that it’s “not possible” seems an absurd exaggeration. In most cases, it’s not only possible but obvious. That there may be some cases too close to call doesn’t compel a claim of impossibility; that’s entirely Google’s choice, and one better explained by “exacerbate unfair bias,” which sounds nice to anyone with a fine-tuned ear for jargon but does little to explain what sort of unfair bias might be exacerbated by stating the obvious.

On the other hand, this shift finds easier justification in the woke agenda.

Frederike Kaltheuner, a tech policy fellow at Mozilla with expertise on AI bias, told Business Insider that the update was “very positive.”

She said in an email: “Anytime you automatically classify people, whether that’s their gender, or their sexual orientation, you need to decide on which categories you use in the first place — and this comes with lots of assumptions.

“Classifying people as male or female assumes that gender is binary. Anyone who doesn’t fit it will automatically be misclassified and misgendered. So this is about more than just bias — a person’s gender cannot be inferred by appearance. Any AI system that tried to do that will inevitably misgender people.”

Certainly Google doesn’t want to misgender anyone. After all, it could get canceled, which is hardly a sound business plan, and nobody at Google wants to hurt anyone’s feelings by allowing facts to intrude on their ideology.

But what consequences follow this enlightened change?

Google invited affected developers to comment on its discussion forums. Only one developer had commented at the time of writing, and complained the change was down to “political correctness.”

“I don’t think political correctness has room in APIs,” the person wrote. “If I can 99% of the times identify if someone is a man or woman, then so can the algorithm. You don’t want to do it? Companies will go to other services.”

Consider devs working in targeted advertising, no longer able to distinguish who gets the ads for penis pumps from who gets the ads for breast pumps. Okay, you don’t care, because AI advertising is already a hated nightmare. What about images of your kids? If you post a fabulous beach pic of your son romping in the surf, how can a genderless AI distinguish it from child porn of some young girl whose chest is exposed? Your son isn’t likely to wear a bikini top when he swims, but his bare breasts aren’t male breasts; they’re people breasts, no different from any other people breasts.

Before, the answer would have been social convention, even if you found the norm not to your liking. Whether you want to “free the nipple” is one question, but it’s not on the table at the moment, and not your choice in any event, as this is Google’s gig, so you don’t get a vote. With a genderless AI, it will be either both or neither, and given the neo-Victorian sensibility toward femininity promoted by such trusted prudential minds as revenge porn princess Mary Anne Franks, the consequences seem clear.

Google notes in its own AI principles that algorithms and datasets can reinforce bias: “We will seek to avoid unjust impacts on people, particularly those related to sensitive characteristics such as race, ethnicity, gender, nationality, income, sexual orientation, ability, and political or religious belief.”

Who doesn’t trust Google, as our gateway Overlord to the interwebs, to distinguish the “unjust impacts” from, you know, facts?

10 thoughts on “Genderless Google AI Meets ‘Free The Nipple’”

  1. Pedantic Grammar Police

    “Who doesn’t trust Google, as our gateway Overlord to the interwebs, to distinguish the “unjust impacts” from, you know, facts?”

    White cis-hetero shitlords, obviously. If you don’t know that, then you are one.

  2. John Barleycorn

Have a seat, esteemed one, I have some news for you: Google don’t-need-no “image tags” to “identify” what they are selling.

The “unjust impacts” are in the user agreement, the facts are clear, and the only question that remains is descriptive. Are the “users” a commodity or a product, or perhaps both? Does that make all Google users non-binary, in transition, or bi-sexual?

    Best get your head around that and concentrate. I get it…. everyone is “attached” to the genitalia between their ears and legs, but it will not be too long now before the carbon in people will become economically secondary to their data.

Binary gender is cool and all, but it is past time to strap in…. By any AI standard, the minimal viability scope math says gender is irrelevant in the end game. You will not be the consumer of AI; you are being sold by AI. What is between the legs and ears doesn’t really matter all that much, even if you keep it behind a paywall.

Easy money! The news is in the streets….

P.S. I am thinking “gender neutral” SJ AI Zine Tab stickers for the streetlight poles but gender specific for the bus stop shelters. It is the mailbox stickers that are gonna be the hard part.

  3. Dan

    “Classifying people as male or female assumes that gender is binary.”

    That would be because it is. Humanity has known it for, minimally, several millennia. The survival of the species depends on this truth. But somehow, in the last decade or two, we’ve begun to pretend that this self-evident truth isn’t actually the case.

  4. Casey Bell

    So I’m not supposed to make an assumption about the gender of a person in a photo. Okay, why stop there? Is it not just as inappropriate for me to assume that it is a person and not a mannequin or a computer-generated image or a small tree wearing a dress? If people were not permitted to make assumptions, the world would grind to a halt very quickly.
