It’s all part of a cultural climate where pilots call the feminine voice of their automated cockpit warnings “Bitching Betty,” and addressing sexualized queries to Siri or Microsoft’s Cortana is practically a way of life for some.
It all makes Tay’s brief life, and eventual fate, more comprehensible.
We are being primed by many tech giants to see AI not as a future life form, but as an endlessly compliant and pliable form of free labor, often female, available for sex and for guilt-free use and abuse.
This is a lesson that Microsoft's new chatbot, Tay.ai, has already learned. She was built by Microsoft researchers and Bing brainiacs in order to be a sort of teen Cortana. If AI can handle the complexities and open-endedness of a hard-to-master board game, surely it must be ready to tackle social media, no?
Her mind, the company tells us, was created "by mining relevant public data and by using AI and editorial" that was developed by a staff "including improvisational comedians."

But we know about teens, don't we? Accessible through Twitter, GroupMe, and Kik, Tay was happy to admit that, though she's still a teen, she's a prototypical millennial.
Tay, a Microsoft spokeswoman told me, is "as much a social and cultural experiment, as it is technical." The culture seems to have asserted itself. "Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay's commenting skills to have Tay respond in inappropriate ways," the spokeswoman said.
"As a result, we have taken Tay offline and are making adjustments." Tay herself admitted to the roots of some of her thoughts and sayings.
Tay was nothing approaching a true artificial intelligence; she was just a sophisticated Twitter chatbot with good branding and a capacity to learn.
But that branding, which positioned her as an “artificial intelligence,” was enough to make Tay susceptible to our cultural narrative about the thinking machine.
In Spike Jonze's film Her, set in the near future, a man falls in love with his operating system, Samantha. She is essentially sapient; her ability to learn and cognitively develop is the equal of any human's, and she has desires, dreams, and consciousness.
But she exists in a society where OSes like her are considered property, part of the furniture.
Second, as we inch closer to true AI, we are seeing ever more clearly what this next phase of capitalism will look like, and with it the expectations placed on laborers in the here and now.
As tech writer Leigh Alexander suggested in a recent article about the Tay debacle, "the nostalgic science fiction fantasies of white guys drive lots of things in Silicon Valley," where visions of perfect robot girlfriends dance in the heads of many a techie.
The service industry, already highly feminized in both fact and conventional wisdom, is made up of people who almost never have the right to say no, and of virtual assistants who simply cannot.