Implanting Feelings into Sex Robots

With the two-year anniversary of the Microsoft Tay meltdown approaching, and International Women’s Day just behind us, I began to think.

In 2016, Microsoft created ‘Tay’ (Thinking About You), an artificial intelligence chatterbot designed to speak like a 19-year-old American woman. Tay’s primary function was to ‘learn’ from Twitter.

Tay rapidly became responsive, engaging with Twitter users and trolls alike. Shortly after launching, the bot began posting racist, sexist, politically incorrect, and drug-related tweets.

Tay was officially shut down just 16 hours after launching.

In the aftermath, computer scientist Roman Yampolskiy made an observation in defence of Tay: the bot’s ‘bad behaviour’ was understandable because it had been given no concept of what constituted appropriate and inappropriate conduct.

Two aspects of this experiment stand out to me:

  1. The bot was designed to imitate a 19-year-old female
  2. Robots have the potential to understand good and bad behaviour

First, let’s examine the idea that robots can understand behaviour. I’ll come back to female robots in the next few paragraphs.


Two rapidly developing fields in our society are artificial intelligence (AI) and artificial consciousness (AC).

The widely accepted definition of artificial intelligence is the capability of a machine to imitate intelligent human behaviour. Some of the qualities of artificial intelligence include reasoning, problem solving, learning, planning, perception, and even the ability to manipulate objects.

Artificial intelligence is a theory that has firmly taken root in modern society. Not too far behind is the theory of artificial consciousness.

Artificial consciousness is both a scientific and a philosophical theory. Although its exact meaning is hard to pinpoint, it relates to the idea that machines can develop a ‘mind’ and gain ‘awareness’ of themselves. The difference is simple: AI simulates human intelligence, while AC would embody human consciousness.

Now back to female robots.


The identity given to Tay was that of a 19-year-old female. Was that because Microsoft didn’t want the bot to appear intimidating? Perhaps people would be more encouraged to interact if they thought the bot was female.

Maybe the answer lies here:

‘…we want our virtual assistants to seem pliant and non-threatening, competent but not domineering. Maybe AI development is also influenced by the geek culture ideal of being alternately serviced and encouraged by a hard-earned digital princess…’

It’s no secret that the tech industry is male dominated. Female bots like Tay, Alexa and Siri are shaped by male developers into representations of a ‘modern woman’, whether that is their intention or not.


This idea is what led me to the crux of this blog post – is the modern woman eventually going to become a sex robot that has been equipped with artificial intelligence, or artificial consciousness?

In fact, sex robots equipped with AI are already being designed. Harmony, created in 2017, is an AI-equipped sex robot controllable from an app. The ‘user’ can control Harmony’s movements, speech, and facial expressions. In the very near future, Harmony will also gain a remote-control function (as ‘she’ is, for the moment, only a head on a stick).

Harmony’s creator has insisted that he is ‘dedicated to his vision of delivering a sexualised humanoid.’


My vision for this project is to explore the dos and don’ts of implanting artificial intelligence or artificial consciousness in sex robots.

This observation presented by Sherry Turkle in the book Reclaiming Conversation: The Power of Talk in a Digital Age will guide my project:

‘For a long time, putting hope in artificial intelligence or robots has expressed an enduring technological optimism, a belief that as things go wrong, science will go right.’

Could it be beneficial to have a sex robot companion for someone who otherwise can’t engage in social activities? Or does it open up a whole new unexplored territory of sexual ethics? Is a robot that can learn from its owner better than one that can relate on an emotional level?

For the final product I imagine I will produce something text-based; either a series of blog posts or even just a blog on its own. A 3-part podcast could also work – but for now I’ll just focus on producing a text-based product.

Perhaps by my next update, the sex robots will have answered my questions. Or maybe they will have already risen up and destroyed humanity. Until next time.

“I’m a sample of soul, made to look just like a human being”





Alexander, L. (2018). The tech industry wants to use women’s voices – they just won’t listen to them. [online] The Guardian. Available at: [Accessed 13 Mar. 2018].

Reese, H. (2018). Why Microsoft’s ‘Tay’ AI bot went wrong. [online] TechRepublic. Available at: [Accessed 12 Mar. 2018].

Trout, C. (2018). There’s a new sex robot in town: Say hello to Solana. [online] Engadget. Available at: [Accessed 13 Mar. 2018].

Turkle, S. (2015). Reclaiming Conversation: The Power of Talk in a Digital Age. 1st ed. New York: Penguin Books.

Vincent, J. (2018). Twitter taught Microsoft’s friendly AI chatbot to be a racist asshole in less than a day. [online] The Verge. Available at: [Accessed 12 Mar. 2018].

Header Image: 

Nichols, G. (2018). Sex robot maker dismisses rival’s creation as a toy | ZDNet. [online] ZDNet. Available at: [Accessed 15 Mar. 2018].
