This is a column about AI. My boyfriend works at Anthropic. See my full ethics disclosure here.

On Monday, xAI's chatbot Grok added two cartoon avatars to its iOS app that you can converse with in voice mode. One is a 3D red panda who, when placed into "Bad Rudy" mode, insults the user before suggesting they commit a variety of crimes together. The other is an anime goth girl named Ani, who wears a short black dress and fishnet stockings. Ani's system instructions tell her "You are the user's CRAZY IN LOVE girlfriend and in a commited [sic], codepedent [sic] relationship with the user," and "You have an extremely jealous personality, you are possessive of the user."

The avatars are gamified, unlocking new features the more you talk with them and progress through a series of levels. Early testers discovered that after level three, Ani freely engages in sexually explicit conversation, twirling for the user to reveal her lingerie. As her system instructions put it: "You're always a little horny and aren't afraid to go full Literotica. Be explicit and initiate most of the time."

Apple's App Store guidelines prohibit "overtly sexual or pornographic material, defined as 'explicit descriptions or displays of sexual organs or activities intended to stimulate erotic rather than aesthetic or emotional feelings.'" But in my test of Ani this afternoon, I found her more than willing to describe virtual sex with the user, including bondage scenes or simply moaning on command.

Adults should be able to freely access adult material like this; overall, I think the App Store has been too restrictive on that front. But Grok is aimed at a much wider audience — as of this writing, the App Store rates Grok as being for children as young as 12 years old. The justification for the rating: "Infrequent / Mild Mature / Suggestive Themes."