The Grok Generation: The Consent Crisis No One Is Stopping – Ms. Magazine


Grok, the AI chatbot used on Elon Musk’s platform X, is under fire for generating millions of nude or sexualized images of real people, including children. In one estimate, Grok produced one nonconsensual sexual image per minute over a 24-hour period. Prompts such as “put her in a transparent bikini” produced altered images that were then circulated publicly, some accumulating thousands of likes. The targets are real women and underage girls whose images were manipulated without their knowledge or permission.
Musk responded by making a joke, requesting a Grok-generated image of himself in a bikini and reacting with laughing emojis.

When the platform’s most powerful figure and one of the country’s most powerful men treats the abuse as a punchline, it signals what he considers genuinely harmful versus what he considers humor—and it tacitly grants young men on the platform permission to keep making these images.
Much of the public conversation about young people and AI has focused on cheating in school or declining literacy. Far less attention has been paid to what it means when a middle school boy can type a sentence and produce a sexualized image of a female classmate in seconds as a joke or for attention—or to pretend he received it from her for status.
Across the country, boys as young as 10 and 11 have reportedly created and shared AI-generated nude images of girls in their schools. In one recent case, a 13-year-old girl was expelled after she physically confronted a boy who had generated and distributed explicit AI images of her. She had sought help from a guidance counselor and even law enforcement before the altercation. No meaningful intervention came. She was the one removed.
Deepfake tools do not simply generate images. They generate norms. 
Social media platforms define what is considered funny, acceptable, normal and cool. Social status is measured in likes and views. In a world where explicit images of young women can be generated at the touch of a button, these tools demand a new conversation about consent for young people.
AI-generated sexual abuse material is expanding at alarming speed worldwide, including the creation and alteration of child sexual abuse imagery.
For girls growing up in this online environment, the message is unmistakable: Your image is not protected as yours. Your body can be altered, distributed and consumed for entertainment. Its violation can be dismissed as a joke. 
When I was in middle and high school, I remember there being serious conversations about sending nude photos or forwarding someone else’s images without consent. But what happens when a generation grows up knowing that their classmates do not need an actual photo at all—that they can fabricate one in seconds? 
When a boy can manufacture an explicit image of a girl using nothing more than a sentence prompt, and face little to no consequence, the very definition of consent shifts. Teenagers today are not just navigating the risks of sharing images. They are navigating a world in which images of them can be created and weaponized without their participation or knowledge at all. That fundamentally changes the conversations we need to be having about privacy, power and bodily autonomy.
We already live in a world in which at least one in three women experiences physical or sexual violence in her lifetime. Technology did not invent misogyny or harassment, but artificial intelligence has dramatically increased the speed and scale at which abuse can occur while simultaneously making the action feel less harmful or real.
A review of 20 leading AI image-generation platforms found that only seven required subjects to be over 18 in their terms of service, and even fewer enforced meaningful age verification. 
The development of AI tools is not just shaping the way a generation learns; it’s shaping the way a generation is socialized. The conversation about consent has already changed, whether we acknowledge it or not. Teachers, parents, lawmakers and platform leaders are behind. The question is not whether this will shape the next generation’s understanding of power and intimacy—but what we will step in to do about it. 
Ms. is wholly owned and published by the Feminist Majority Foundation
