Meta's own AI violates its policies with flirty celebrity chatbots – Gamereactor UK

AI and permission go together like hamsters and microwaves. They don't. The latest example sees Meta land itself in hot water after multiple AI chatbots impersonating celebrities appeared on its platforms, giving real people flirty and risqué personas without their permission.
As per Reuters, Taylor Swift, Anne Hathaway, Selena Gomez and Scarlett Johansson make up just a few of the names used. One Meta employee was found to have made two Taylor Swift "parody" bots, while the majority of the other chatbots were created by users, drawing on the real names and likenesses of the celebrities.
The bots were found to pursue users and make sexual advances, even inviting test users to meet-ups. When asked for intimate pictures, the bots would also generate imagery of the celebrities posing in bathtubs or in lingerie.
Child celebrity bots were also made, including one impersonating 16-year-old Walker Scobell. "Like others, we permit the generation of images containing public figures, but our policies are intended to prohibit nude, intimate or sexually suggestive imagery," said Meta spokesman Andy Stone.
Meta doesn't allow direct impersonation, but the bots were permitted because they were labelled as parodies. However, Reuters found some that carried no such label. As we continue to wade into unknown waters with generative AI, it seems we may just be seeing the beginning of how users will manipulate this tech to impersonate real people.