I took a stab at trying to find the right image to go along with Chris Seaton’s post at Fault Lines, The Legal Antidote to Virtual Sexual Assault, and ran headlong into Rule 34. I was fascinated and repulsed at the same time. Two thoughts ran through my head:
- Is this where someone finds sex when he never leaves his parents’ basement?
- Is this where someone finds sex when no real person will have him?
But then I remembered, I was viewing it through the lens of someone whose younger years were spent in the real world. There was no virtual world then, and therefore no virtual option. Maybe I just didn’t get it? Maybe the fond memories of a youth that pre-dated the AIDS epidemic, and the rise of neo-feminist efforts to recreate sexual rules in defiance of reality, skewed my appreciation of anime porn that allowed people to enjoy a virtual facsimile of sex without all the headaches and risks that accompany the real deal?
The Medium post upon which Chris relied for his Friday Funny (what we internally call his last post of the week, a send-off with humor after a week of posts filled with misery and trauma) has now been deleted, though the internet never forgets.
But it’s not really important. The gripe was that a woman was playing a game, her avatar was sexually assaulted and she claims, in the breathless way victims of “sexual violence” do, that it was horrifying. Nobody touched her, of course, because none of it was real, but that didn’t change her feelings, that it felt real.
And since feelings cannot be denied, or questioned for that matter since they reflect a person’s “lived experiences,” no matter how batshit crazy, the sexual assault may have been virtual but the feelings were every bit as real as any other. Then again, so too are the feelings toward anime porn that produce a person’s few private minutes of self-pleasure. If virtual reality can make you feel good, why not bad?
The Harvard Business Review warns, however, that our conflicted relationship with the virtual world will cause us to behave in a manner that will ultimately become our nightmare.
Stop swearing at Siri. Quit cursing Cortana. As digital devices grow smarter, being beastly toward bots could cost you your job.
As machine learning and artificial intelligence capabilities proliferate, digital interface design sensibilities begin to accelerate from skeuomorphic to anthropomorphic. Consider how Slack and Hipchat now use bots that go beyond automating interfaces to facilitate smarter and more engaging user experiences. That’s Chatbot 1.0. Ongoing algorithmic innovation assures that next-generation bots will be far sharper and more empathic.
Do you curse at Siri? I know that I curse at voicemail trees. I know it’s pointless, and that I really should save my anger at the waste of my time for when I run into the CEO at a cocktail party, but it just comes out. Of course, calling Siri a mean name doesn’t actually hurt Siri’s feelings, because, well, Siri isn’t a person and has no feelings. HBR isn’t so foolish as to actually think otherwise.
These behaviors are simply not sustainable. If adaptive bots learn from every meaningful human interaction they have, then mistreatment and abuse become technological toxins. Bad behavior can poison bot behavior. That undermines enterprise efficiency, productivity, and culture.
That’s why being bad to bots will become professionally and socially taboo in tomorrow’s workplace. When “deep learning” devices emotionally resonate with their users, mistreating them feels less like breaking one’s mobile phone than kicking a kitten. The former earns a reprimand; the latter gets you fired.
At face value, it would appear as if the HBR admonition makes empathy for the bots primary to human interaction. A cute turn of phrase, “technological toxins,” where bad human interaction teaches bots to be bad in return. Can you imagine if the response to your telling Siri to play “Sympathy for the Devil” was, “Fine, dickwad, if you want to listen to crap like that”?
That undermines enterprise efficiency, productivity, and culture.
You might well have glossed over this dry line in the otherwise titillating quote, but this is what HBR is really concerned with. You see, big business (delightfully referred to as “enterprise” because without jargon, we would see what they’re doing) believes that Siri is the future and people, to the extent they’re needed at all, will exist to serve her. People are a pain in the ass. Siri can do it better, more reliably, cheaper and without all the annoyance of human feelings. So just don’t screw it (her?) up.
Telling Siri to shove it can’t be the same as kicking a kitten because Siri is just a program. But the business gurus at HBR know that the weak link in technology is, and always will be, people. And if you can’t control yourself with Siri, then your weakness as a person is revealed.
Just as one wouldn’t kick the office cat or ridicule a subordinate, the very idea of mistreating ever-more-intelligent devices becomes unacceptable. While not (biologically) alive, these inanimate objects are explicitly trained to anticipate and respond to workplace needs. Verbally or textually abusing them in the course of one’s job seems gratuitously unprofessional and counterproductive.
Crudely put, smashing your iPhone means you have a temper; calling your struggling Siri inappropriate names gets you called before HR. Using bad manners with smart technologies can lead to bad management.
If you’re the sort of person who virtually sexually assaults someone in a game, then you might have rape in your heart. If you can’t control your anger sufficiently to speak kindly to Siri, then you’re the sort of person who would abuse a subordinate or kick the office cat. If you’re the sort of person who prefers the comfort of an anime sexual partner, then . . . you’ll never be lonely?
For the unduly emotional, this reflects a confluence of virtual and real feelings, as if someone forgot that a chair won’t return the favor of a kindness. But the business overlords, those HBR types who depend upon the feelings of strangers to do their bidding in a shop run by Siri, aren’t taking any chances. They anticipate Siri will make them tons of money in the future, and you’ll be an annoyance unless you learn now to be respectful to virtual reality.
And we’re already well on the way to achieving the necessary obsequiousness. When feelings of horror arise from a virtual sexual assault, when feelings of pleasure come from a gif of a naked woman, we’re mere baby steps away from taking orders from Siri, intoned in a pleasant and non-abusive manner that tugs at our empathetic heartstrings, but insists that whatever we’ve been commanded to do gets done. Now. After all, Siri asked us nicely, and we’re only human.
For the moment, we ask things of Siri. That won’t always be the case. How will you feel then?