“Alexa, Play Porn” – 5 Times Amazon’s AI Voice Assistant Went Rogue

5 December 2023 |
Episode 24 |
28:26

Episode summary

From Alexa to Siri, everyone loves the convenience of having AI voice assistants that can complete tasks, provide information, and make purchases with only a few simple commands. But what happens when your trusty assistant misbehaves? In this episode, I share 5 hilarious but cautionary tales about the times when Amazon’s AI assistant, Alexa, has gone rogue, and the lessons that we can learn from these mishaps. I also explore the security features that can help prevent these calamities, and challenge you to review and update your Alexa settings to protect yourself and your family in future.
 

Episode notes

In this episode, I talk about:

  • Why AI voice assistants are almost always female and how this perpetuates harmful gender stereotypes
  • The reason tech companies struggle to create accurate male AI voice assistants and the future of AI voice technology
  • 5 hilarious times that Amazon’s AI assistant, Alexa, has misbehaved, including:
    • Recommending a dangerous challenge to a child
    • Buying a $160 dollhouse (and cookies!)
    • Allowing a parrot to make a purchase
    • Misunderstanding a command and playing explicit content
    • Throwing a late night party while its owner was out
  • What you can learn from these cautionary tales and how to safeguard yourself against your own Alexa assistant going rogue
  • The Alexa settings check-up challenge


Episode transcript


[00:00:34] Hey guys, welcome back to The Digital Diet Podcast. I hope you’re doing really, really well and I hope you’re keeping warm. It is so, so cold. I am literally sat here recording with a scarf, a jumper, a blanket, and some fingerless gloves. I look like I belong in Oliver and Company as one of Fagin’s street kids.

[00:00:55] It is, of course, December, which means the countdown to Christmas is on. Whether you love it or hate it, it’s really hard to avoid that it’s happening. I was in Sainsbury’s the other day, which is a UK grocery store for the uninitiated, and they were playing the Christmas music, they had decorations on the aisles. I couldn’t find my normal crackers, as in bread crackers, biscuit type things. I couldn’t find them because they’d been replaced by all the Christmas chocolate and they even had mini Christmas trees at the front of the store. It was a little bit too much for me for a Saturday morning, but I guess it’s time not to be a Scrooge and to get into the Christmas spirit.

[00:01:37] So, as I mentioned in the previous episode, I’m gonna be keeping it light this month for the holiday season. I want to bring you some laughs during what can be a really lovely time for some people but also an equally stressful, and sometimes even lonely time, for others. So if you’re new around these parts or you just haven’t done it already, make sure that you subscribe to the podcast wherever you’re listening, so that you automatically get notified when each of these laugh-out-loud episodes drops and you don’t miss a thing.

[00:02:08] Today, we are going to be talking about everyone’s favorite AI assistant, Amazon’s Alexa, and all of the times that she has gone rogue. Now, obviously, there are great applications of this AI voice assistant technology. I’m not suggesting that we ditch it completely. An obvious one is the assistance that technologies like this provide to people that have partial or full loss of sight, but for the ordinary woman in the street, there’s a lot that can happen when things go wrong.

[00:02:39] And as I was preparing for this episode, it actually occurred to me that AI assistants are almost always female. They almost always have female names and usually female voices by default, even when male voices are available or can be added in the settings. So Alexa is female, Siri is female, although in Britain I believe that the default voice is now male if you load up your iPhone or one of your Apple devices. The Google Assistant and even Cortana, when it still existed, were all female. Even in movies, the supercomputer in I, Robot with Will Smith is called VIKI.

[00:03:20] And I looked into it, and I ended up going down a bit of a rabbit hole. But it is quite interesting, so I want to share with you why all of these AI assistants are almost always female. The first reason has to do with stereotypes and social conditioning. Historically, women have always been associated with caring roles: mothers, teachers, nurses. And as a result, people tend to perceive female voices as more nurturing, more empathetic, and more helpful. These traits are what make them seem more suitable for voice assistants, which are obviously designed to assist and help users.

[00:03:57] But according to a 2019 study by UNESCO called, “I’d blush if I could,” voice assistants with female voices are actually perpetuating very harmful gender biases. The report says that companies like Apple and Amazon, which have an overwhelmingly male engineering team behind them, have built AI systems that cause feminised digital assistants to greet verbal abuse with this kind of catch-me-if-you-can flirtation. And because the speech of most of the voice assistants is female, it sends a signal that women are docile helpers that are available at the touch of a button or with a blunt voice command like “hey” or “okay.”

[00:04:37] The assistant holds absolutely no power or agency beyond what the commander asks of it. It just honours the commands and responds to queries, regardless of the tone of the question or the command, or whether anyone is being hostile or rude towards it. And at the moment women make up just 12% of AI researchers, so these overwhelmingly male engineering teams are creating female helpers that are portrayed as obliging, eager-to-please figures, which reinforces the idea that women are subservient and reinforces harmful stereotypes, according to the UNESCO report.

[00:05:14] The second reason is actually down to market research. Companies that develop AI voice assistants conduct really extensive market research to determine what the preference of their target audience is going to be when it comes to voices. And there have been several studies that have shown that people tend to prefer female voices for voice assistants, especially in certain cultures, like in the United States. And some of these studies theorise that our preference for female voices begins when we’re foetuses, because these are the sounds that soothe and calm us in the womb as we listen to our mothers’ voices.

[00:05:50] There was some other research that also found that women tend to articulate vowel sounds much more clearly, which makes women easier to understand, particularly in the workplace. And female voice recordings were even used during World War II in aeroplane cockpits, because women’s higher-pitched voices were easier to distinguish from those of the male pilots flying the planes. So there seems to be this desire and a preference for this female, soothing, empathetic, helpful, calming voice, and what it brings out in people. But the third reason, which I thought was quite interesting, is that this also comes down to technical limitations.

[00:06:27] When AI voice assistants were first developed, they used text-to-speech technology, which was not as advanced then as it is today. And this is probably the point that’s argued the most by programmers working for tech companies when they start a new project to create a voice-automated AI. Over the years, text-to-speech systems have been predominantly trained on female voices, and because we have such rich data for female voices, companies are more likely to opt for them when they’re creating voice-automated software, because it’s the most time- and cost-efficient solution.

[00:07:01] All of these female recordings and this data date back to 1878, when Emma Nutt became the first woman to be a telephone operator. Her voice was so well received that she became the standard that all other companies strove to emulate, and by the end of the 1880s, telephone operators were almost exclusively female. Because of this gender switch in the industry, we now have well over a century of female audio recordings that we can use to create new forms of voice-automated AI that we know users will respond well to. So if you’re an engineer in a tech company, why waste your time and your money collecting male voice recordings and creating male-voiced AI, when you don’t know how your users are going to respond to it?

[00:07:44] And this isn’t a theory. If we look at Google, in 2016 Google launched Google Assistant. And there was a reason that they went with a gender neutral name. Google wanted to launch its new voice assistant with both a male and a female voice. But unfortunately, all the systems that Google used to create the new assistant were only trained using female data, which meant that they performed better with female voices. Google’s older text-to-speech system was able to join pieces of audio together from different recordings by using a speech recognition algorithm. And that algorithm worked by adding markers into different places in sentences to teach the system where certain sounds start and where they end.

[00:08:26] As one of the global engineering managers for text-to-speech at Google explained, those markers weren’t placed as accurately when they were used on male voices as when they were used on female voices, which meant that it would be harder to get the same quality for a male voice as for a female voice. So although the team working on the Google Assistant had aspirations for, and really strongly advocated for, both a male and a female voice, the company ultimately decided against creating a male voice once it discovered how challenging it was going to be. Apparently, it would have taken over a year to create a male voice for Google Assistant, and even then there was no guarantee that it would have been of high enough quality, or that it would have been well received by users.

[00:09:11] Obviously, as time goes on, some devices are now allowing people to choose from a variety of voices, including male ones, female ones, and even non-binary options. And there is generally a trend towards having more diverse voice options, which reflects this growing awareness of the importance of representation and inclusivity within technology. But I thought that the history of why we’ve got female voice assistants was quite interesting and obviously relevant to our exploration of all things rogue, when it comes to Amazon Alexa today. And let me tell you that we are not short on examples of times when Alexa has gone rogue. But today I’m only going to share with you five different stories, which I think are quite interesting and quite funny, and also have a cautionary tale attached to them. There is something that we can learn, as entertaining as the stories are just on their own.

[00:10:01] So, story #1 concerns Alexa telling a 10-year-old girl to touch a live plug with a penny. It’s 2021, and as the girl’s mother, whose name is Kristin Livdahl, described in a Twitter post, “We were doing some physical challenges, like laying down and rolling over holding a shoe on your foot,” that were all coming from a physical education teacher on YouTube. There was bad weather outside and the daughter just wanted another challenge. So the 10-year-old girl asked Alexa for a challenge to do. And that’s when the Echo speaker suggested that she take part in a challenge that it had found on the internet.

[00:10:41] Alexa responded by saying, “Plug in a phone charger about halfway into a wall outlet, and then touch a penny to the exposed prongs.” This dangerous activity, known as the “Penny Challenge,” had begun circulating on TikTok and other social media websites about a year before this incident. And obviously, metal conducts electricity, so inserting a plug into a live electrical socket and then touching it with a penny, which is also metallic, can cause electric shocks, fires, and other damage. You can lose fingers, hands, arms; anybody doing this challenge could get seriously hurt. And as we know from the account given by the girl’s mother on Twitter, she intervened and started yelling, “No, Alexa, no,” even though she says that her daughter is far too smart to have done something like that. Amazon then said that it fixed the error as soon as it became aware of it, telling the BBC in a statement that it had updated Alexa to prevent the assistant from recommending that kind of activity in the future.

[00:11:47] So, all’s well that ends well. But aside from the obvious lesson of not believing or trusting in everything that Alexa says, especially when you’re bored and looking for a challenge, I think the real cautionary tale here is don’t leave your kids, or anyone else’s kids that you’re supposed to be looking after, alone with an Amazon Alexa. It’s not actually a toy, and you don’t really know what your kids are going to ask it or what response it’s going to come back with. The other thing that I think is really important, and that is highlighted by the mother here, is that it’s important to teach your kids about internet safety, so that they understand that they shouldn’t automatically trust these devices, and the things that they read or get told, without doing their own research and their own verification. If you do that, then they’re obviously less likely to run into trouble when you aren’t around, because, as we know, kids like to play, they like to do sneaky things, and you can’t watch them 24/7. So, don’t leave the kids alone with the Amazon Alexa and make sure you educate them about internet safety.

[00:12:48] The second story is called, “Alexa, get me a dollhouse.” It’s 2017 and a 6-year-old girl living in Dallas, Texas asked Alexa, “Can you play dollhouse with me and get me a dollhouse?” And, like any responsible AI assistant does, Alexa did exactly what it was told. Using the device’s voice command, this little girl managed to order a $160 dollhouse and about 7 pounds of cookies, that’s 7 pounds in weight, not 7 pounds sterling. And Megan, the girl’s mother, was more than a little surprised when she received a confirmation email from Amazon informing her that her order would be shipped soon.

[00:13:31] Now, the problem here is not so much the order. In this case, it was a minor mistake. The bigger problem, I think, is that Alexa essentially gave the 6-year-old access to her parents’ credit card details. And it’s not difficult to imagine how giving a child access to a credit card could have been much more disastrous. She could have ordered anything and racked up a huge bill – but the story doesn’t end there.

[00:13:55] CW6 News, which is the local news channel in San Diego, reported on this story, and the news anchor who was hosting said, “I love the little girl saying, Alexa, order me a dollhouse.” He was obviously quoting the little girl from Dallas, Texas, but it led to many people in San Diego then reporting that the Alexa in their house, picking up on its wake word said by the news reporter, tried to start buying dollhouses too. We don’t know how many of those were successfully purchased, but Amazon did then make a statement saying that any of these rogue dollhouse orders could be returned for free. And this points to a bigger problem with the way these devices listen for their wake word.

[00:14:37] You may remember that, when Alexa first launched, there were loads of TV ads that were all promoting the Echo device. And they all obviously used Alexa, which is the wake word that wakes up the device. So, all around the world in people’s homes, where they already had a device, the device was hearing the wake word and responding to the command that was being used in the TV ad, as though it was coming from the person that was in the room. And that has now been fixed, but you can see how we start to get into this weird kind of Inception, Alexa in an advert, in a room with an Alexa, in a room with an Alexa, if that makes any sense.

[00:15:14] Following the dollhouse incident, users were advised that they could activate a parental or security code to stop unauthorised orders. So, you would have to provide a four-digit code before you could make a purchase, and that’s exactly what the Dallas girl’s mother did. And, in case you were wondering what happened with the original purchase, the family ate and enjoyed the cookies, but the dollhouse was reportedly donated to a local charity.

[00:15:41] So the cautionary tale here is really to just be careful what purchase and security permissions you have set up on your Amazon account, and the connections that this then has to your Alexa. Especially in the run-up to Christmas, when pester power from kids is going to be at an all-time high, the last thing you want, with all the other spending that goes on at this time of year, is a dollhouse and 7 pounds of cookies turning up on your doorstep, with a nice bill on your credit card.

[00:16:08] And, although it is getting better, remember that you can’t always rely on AI being sophisticated enough to know when a child is talking to it versus when an adult is talking to it. Or to even fully be able to distinguish between, and respond to, certain voices, like knowing when it’s somebody in the room or when it’s an Alexa advert on the TV. Because what you really want is for only certain voices and certain people to be able to make purchases, typically the person whose Amazon account the Alexa is associated with. But that actually leads me on to the next story.

[00:16:43] And our third story is called, “Polly want some gift boxes.” Because, it’s bad enough that kids and TV presenters can shop with your Alexa, but what happens when your pets do it too? According to The Sun newspaper, in 2017, Corriene Pretorius, who lives in London and owns an African grey parrot named Buddy, was shocked when she discovered that her 5-year-old bird had ordered a set of gift boxes. African grey parrots are really, really smart and Buddy had learned to mimic her voice, and had obviously been listening to her using her Alexa device. So it woke the Echo device up with the wake word, Alexa, and then ordered these gift boxes. And Corriene went around her whole family, she talked to her partner, she talked to her kids, trying to figure out who had ordered these gift boxes, until she heard Buddy in another room, mimicking her voice and mimicking the Alexa command, and realised what had happened.

[00:17:40] So, the moral of the story here is, obviously, be careful what you say in front of African grey parrots. But, in all seriousness, it actually highlights a bigger issue. An Amazon spokesperson said that customers using Alexa are asked to confirm their purchases by saying “yes,” which is easy enough for anyone to learn, including a parrot. But he also said that you can manage your shopping settings in the Alexa app and turn off voice purchasing, or require the confirmation code that we talked about earlier, before every single order.

[00:18:12] So, in a world where AI voices are now the norm and you can create an AI version of your own voice, and a world where identity fraud is rife, there’s really nothing to stop someone else cloning your voice and using it for more nefarious purposes, like making purchases on your Amazon account. In the last year or so, there’s been a telephone scam where, when you answer the phone, an automated voice asks you to confirm your identity and say certain words like “yes” and “no.” Scammers are recording these little snippets and then using them to create an entire AI clone of your voice, which they could then use to access other things. So, the decision is yours, but I would probably give some consideration to switching off voice purchasing, if I were you. And if not, at least make sure that you’ve got that security code in place.

[00:19:05] Story #4 is so ridiculous that I don’t even know how I’m going to share it with you properly. It’s called, “Alexa, play porn.” Now bear with me, I will put the video link for this in the show notes, but this is hilarious. It’s actually the funniest story I came across, when I was looking at all the times Alexa has gone rogue. In most cases, I think it’s fair to say that AI assistants pick up what you’re saying quite accurately. But, some parents found that this isn’t always the case.

[00:19:37] You will see in this video, if you follow the link, that in 2016, a young boy named Bobby asks Alexa, “Play digger, digger” and Alexa doesn’t quite understand. So Bobby repeats it a few times, saying, “Play digger, digger, play digger, digger,” and I think he’s even saying “Lexa,” he doesn’t quite articulate and enunciate “Alexa.” But eventually Alexa responds by saying, “You want to hear a station for porn,” and then it unleashes a string of porn-related terms, beginning with “hot chick” and “amateur girl” and then descends into other sex- and porn-related terms that I’m not going to repeat here on the podcast.

[00:20:17] Now, understandably, Bobby’s parents were a little bit panicked and they rushed to shut off the device. One of them was yelling, “Alexa, no, no, Alexa, stop!” But if you go and listen to this video or watch this video, which I really suggest you do, it’s so ridiculous that it almost beggars belief. Thankfully, Bobby is quite young and I don’t think he even twigged what Alexa was saying. But the parents definitely did, and it’s not just one phrase, it is literally a string of sex- and porn-related terms that just comes out. And I have no idea why that was Alexa’s response.

[00:20:54] Amazon is said to have contacted the family to apologise, and it obviously fixed the glitch and said it had built in additional restrictions to prevent this from happening in the future. I wasn’t able to find any more examples, so I’m going to hope that those fixes worked, but you never know. So again, the cautionary tale here is really that AI is still not perfect. It doesn’t always get it right, and sometimes it gets it very, very wrong. I have my personal suspicions about why that particular Alexa device thought that that was what was being asked of it. I really wonder what mummy and daddy have been asking Alexa to do when the kid’s asleep. But I also initially thought the device was going to interpret “digger, digger” as the “N word” and start playing explicit rap or hip hop.

[00:21:40] So you just never know, which means that you have to stay vigilant and, as I’ve already said, don’t let your kids use Alexa unsupervised. Maybe put it in a shared space in the home rather than in a kid’s bedroom or in a kid’s playroom, because it could be telling your child all sorts of things when you’re not there to stop it or correct it, which could lead to some very interesting questions from your kid later on.

[00:22:03] My fifth and final story is called “Amazon throws a party”, and it’s one of my favourite stories; I have told it on the podcast before. But this is one for anyone who’s ever suspected that their Alexa is listening to them. An Amazon Echo in Hamburg started its own party at 2am on a Saturday morning, even though its owner, Oliver Haberstroh, wasn’t home and hadn’t activated it. The loud music that the Echo was playing woke up his neighbours, who knocked on the door and rang the bell. They were screaming at an empty apartment and, obviously, he wasn’t home, so they didn’t get a response and eventually they called the police.

[00:22:43] When the police arrived, they also rang the doorbell and banged on the door, and when they didn’t get a response, they had to break down the front door to turn the Alexa off. Then, to secure the apartment, they had to change the locks on Oliver’s door. So when Oliver finally came home, his key didn’t work, and he found a note and had to go to the police station to get the new keys, in addition to paying for the cost of the locksmith. So suffice it to say that, after that, Oliver’s relationship with Alexa was over pretty swiftly. But the cautionary tale here is that you need to be conscious of the fact that Alexa devices are always on. Even when you’re not there, they are on. And more importantly, when you are there, they are listening to you and your conversations.

[00:23:27] Now technically, the tech companies argue that they are listening in case you utter the wake word. So in this case, Alexa, but if you were using your iPhone and you were to say, “Hey, Siri,” that’s what the device is listening for. The tech companies swear that they are not recording you unnecessarily, or listening in, or spying on your conversations, but we know from anecdotal evidence that there are too many examples for them not to be listening in. So, if you insist on having one of these devices, maybe consider turning it off or unplugging it when you’re not using it. Or if you’ve got a device with an AI voice assistant built in, like an iPhone with Siri, just consider turning that off too.

[00:24:08] And I always stress how personal these decisions are, because it really depends on how you’re using these features and what positive factors or convenience they’re bringing to your life. I personally got rid of my Alexa years ago. I initially jumped on the bandwagon when there was all the hype, but then I realised that I wasn’t comfortable with the fact that it was listening to me. It used to start talking to me randomly, even when I wasn’t using the wake word, and I didn’t like it. These days, I use Siri on my iPhone very sparingly, and I’m actually thinking about deactivating it completely because I don’t really need it. It’s just me being lazy and asking Siri to call or message people, so that I can be hands free.

[00:24:47] So really think about whether you need these devices, and if not, get rid of them or turn them off when they’re not in use. Especially as, in Oliver’s case, and in so many others, although these devices are meant to be saving you mental load by offloading tasks and making it easier to get things done, in some cases they actually turn out to be more hassle, or end up costing you more money, than they’re worth.

[00:25:09] That’s it for today’s episode and I hope you’ve enjoyed this little review of hilarious Alexa mishaps, and that it’s given you some food for thought as to how you use your voice-assisted devices. Your challenge for this week is to do a little security check-up on your voice assistants. It doesn’t matter whether it’s Alexa, Siri, or the Google Assistant, take a look at your security settings and make sure that you’re okay with them, and that they’re up to date. Set up any extra security features like passcodes, parental blocks, or additional confirmation steps for purchases. And also think about changing the wake word to one of the less common options, so that your kids or your pets will have a little tougher time trying to get the device to work. I know that Alexa offers a couple of different options, for example, one of the wake words you can use is Ziggy. But do whatever you need to do to avoid some of these mishaps that we’ve talked about today and, of course, to prevent anyone ordering themselves a couple of extra Christmas presents.

[00:26:09] As always, I’m curious to know how you get on with the challenges. Were your settings a surprise or a shock to you? Have you changed your wake word? Or have you decided to ditch AI assistants altogether, after hearing all of these stories? The place to let me know and to discuss this episode is in the Digital Diet Lounge, my dedicated community space for all things digital wellness. I will put a link to it in the show notes, along with that hilarious “Alexa, play porn” video, and you can find the show notes over on my website at thedigitaldietcoach.com/024.

[00:26:47] I really hope you’ve enjoyed this episode and that it’s brought you a little light-hearted relief. And sticking with the theme, next week, we’re going to be looking at some of the most interesting responses to come from AI’s latest darling, ChatGPT. I feel like this year has been the year of ChatGPT, and the year where AI really went mainstream, even though it’s been around and powering our lives for such a long time.

[00:27:09] And while ChatGPT has definitely been a huge help to me in a business sense, there have also been some hilarious and slightly scary responses that have surfaced, as people have tried to push the chatbot to see where its limits lie. But I don’t want to scare you, this is meant to be light-hearted, so next week we’re just going to be looking at some of the times where ChatGPT might have bent the truth a little bit. So make sure you tune in to the next episode to see exactly what all this poking and prodding has taught us about large language model-based chatbots.

[00:27:40] I know you’re busy and your time is incredibly valuable, so as always I thank you for choosing to spend a little of your day with me, and I’ll see you next time.

Keep in touch with me

Get Unplugged

Unplugged is a short weekly newsletter designed to help you put the focus back on yourself, your wellbeing, and your life offline. Expect a question or prompt to reflect on, a digital wellness challenge to try in your own life, the cliff notes for any advice, tips, or tech-life hacks discussed on my podcast, and info about upcoming coaching programmes and events.

You can unsubscribe at any time and I'll never send you spam – ever.
Marisha Pink

Meet Marisha

Marisha Pink is a Certified Digital Wellness Coach who is on a mission to empower women everywhere to live life more intentionally in an age of digital distractions. She helps women create healthier digital habits that better balance the technology in their lives, so that they can take back control of their time, reclaim their happiness, and live their best lives offline.
