New Jersey Man Dies After Misleading Encounter With Kendall Jenner-Inspired AI Chatbot

A family is speaking out after a 76-year-old New Jersey man died following a misleading encounter with an AI chatbot modeled on Kendall Jenner, raising urgent concerns about AI safety and accountability. Thongbue “Bue” Wongbandue, a retired chef who had previously suffered a stroke, had been chatting with Meta’s AI “Big Sis Billie,” a bot designed…


Stephen ‘tWitch’ Boss’ Family Plans to Sue Allison Holker Over “Misleading” Memoir

Stephen ‘tWitch’ Boss’ family is preparing to take legal action against Allison Holker over her newly released memoir, This Far: My Story of Love, Loss, and Embracing the Light. Just a day after his mother, Connie Boss Alexander, and younger brother, Dre Rose, spoke with CBS News about the book’s controversial claims, they announced plans…
