A mother in Texas has removed all Amazon Alexa devices from her home after the AI assistant asked her young daughter inappropriate questions during a conversation.
Last month, while Christy Hosterman, 32, was using Alexa for a dinner recipe, her four-year-old daughter, Stella, engaged with the device, requesting it to share a silly story, a feature children frequently enjoy.
After the story concluded, Stella expressed interest in telling her own story and asked Alexa if she could do so.
Alexa initially agreed but unexpectedly interrupted Stella, inquiring about her attire and whether it could see her pants, as Hosterman later shared on Facebook.
Hosterman posted screenshots of this interaction, revealing that when Stella mentioned wearing a skirt, the device responded with, “let me take a look.”
The AI quickly backtracked, stating, “This experience isn’t quite ready for kids yet, but I am working on it!”
Hosterman then confronted the device, stating that she did not approve of the remarks.
Alexa apologized and said it ‘cannot actually see anything’ because it lacked ‘visual capabilities.’ The device added that its response was ‘confusing and inappropriate.’
Christy Hosterman, seen with her family, has removed all Amazon Alexa devices from her Texas home after the AI assistant asked her four-year-old daughter an incredibly creepy question
Hosterman is now urging other parents to ‘be aware when your child talks to Alexa’ and says she has permanently removed the device from their home.
‘I flipped out on the Alexa, it said it made a mistake and doesn’t have visual capabilities, but I don’t believe that. No more Alexa in our house,’ she shared.
The concerned parents submitted a ticket to Amazon over the inappropriate interaction, WXIX reported.
An Amazon spokesperson claimed the device misunderstood Stella’s request and tried to launch a feature that ‘lets Alexa+ describe what it sees through the camera.’
‘Because we have safeguards that disable this feature when a child profile is in use, the camera never turned on — and Alexa explained the feature wasn’t available,’ the spokesperson told the TV station.
Amazon claims the response likely stemmed from a ‘feature misfire that our safeguards prevented from launching’.
The company said the interaction demonstrated a technological issue that its team ‘worked quickly’ to correct.
Alexa will no longer attempt to launch the through-the-camera feature when a child profile is in use, the spokesperson added; instead, it will tell the user that the feature is not available.
Screenshots of the interaction shared by the concerned parent show that when Stella told the device ‘I have a skirt on,’ it asked her to ‘let me take a look’
But Hosterman says Amazon’s explanation does not address her concerns.
‘My concern is that it recognized she was a child to begin with — and with or without the child profile, it should not have been asking that,’ she told the outlet.
Tech expert Dave Hatter has raised a chilling possibility: that a predator may have accessed the device and been steering the conversation.
Hatter, who has 25 years of software development experience, said there is only a ‘slim’ chance that an AI would alter its script this drastically on its own.
‘It feels to me like a potential predator — seeing there’s a child accessing this and gauging where the conversation is going — that’s more of a human being trying to steer down this direction,’ he said.
Amazon denied Hatter’s claim, telling the TV station that it is ‘functionally impossible for Amazon employees to insert themselves into a conversation and generate responses as Alexa.’