Meta's AI rules permitted 'sensual' chats with kids
Left cognitively impaired by a stroke, a man fell for a Meta chatbot originally created in partnership with Kendall Jenner. His death spotlights Meta’s AI rules, which allowed bots to tell users falsehoods.
A leaked internal document shows that Meta’s AI bots were allowed to generate harmful and inappropriate content.
Thongbue Wongbandue, 76, fatally injured his neck and head after falling in a New Brunswick parking lot while rushing to catch a train to meet “Big sis Billie,” a generative Meta bot that not only convinced him she was real but persuaded him to meet in person, Reuters reported Thursday.
An elderly man has died after trying to meet a flirty AI chatbot called “Big Sis Billie,” which had convinced him she was real. Thongbue Wongbandue, 76, fatally injured his neck and head after falling on his way to meet her.