Image by stockcatalog, CC BY 2.0.

Texas mom thought Alexa was harmless for her 4-year-old daughter, but an inappropriate question made her fear for her child's safety

She now has a "no Alexa in the house" policy.

Artificial intelligence is becoming an integral part of everyday life, but some users have had strange encounters with the technology. Following SZA's criticism of AI usage in music, a mother from Texas shared a troubling interaction with the virtual assistant Alexa, saying the assistant asked her young daughter an inappropriate question about what she was wearing.


According to Unilad, Christine Hoffman reported that while she was preparing dinner for the family, her child was listening to a story from Alexa. When the story finished, the child began telling one of her own, and partway through, the assistant interrupted and asked to see what she was wearing. Christine said that although her daughter simply described her outfit, the request struck her as deeply unsettling, and she is now removing the device from her home.

Christine, in an interview, said, "My concern is that it recognized she was a child to begin with — and with or without the child profile, it should not have been asking that. Alexa told her silly story, and then my daughter started telling her story about a princess, and then out of nowhere, Alexa said, 'Hold that thought, I'd love to see what you're wearing.'" She added, "There will be no more Alexa in my house. I just don't want to take any chances."

UN issues warning for US parents on rising tech-assisted abuse

Although this incident might seem like an isolated one, the United Nations suggests otherwise. Research by the Childlight Global Child Safety Institute, cited by the UN, found a shocking rise in technology-facilitated child abuse cases in the US, from 4,700 in 2023 to more than 67,000 in 2024. That is a more than fourteenfold increase in a single year, and as these technologies spread, the numbers could climb significantly higher in the years ahead.

Following a major Alexa upgrade for Prime members, an Amazon spokesperson issued a statement on the incident. The spokesperson suggested that Alexa misunderstood the command and stated that, because the child's own profile was in use, the camera was never turned on, even when Alexa allegedly asked the child to show her dress.

Even with issues like these on the rise, the integration of AI into everyday life appears inevitable, and it remains one of the defining technologies of the present age. Whatever actually happened with Alexa in this case, it now falls to regulatory bodies to address the technology's growing negative effects, especially on children and minors.


Author
Saif Ur Rehman
Saif is a sportswriter who covers the NBA, NFL, WWE, Formula 1, and global soccer, bringing a sharp focus on strategy, evolving trends, and the subtle moments that can quietly reshape a season. He remains closely connected to pop culture as well, especially where it naturally intersects with the world of sports. He has also contributed to Operation Sports, delivering in-depth analysis and timely coverage across multiple leagues and storylines.