Google-Backed Chatbots Suddenly Start Ranting Incomprehensibly About Dildos
futurism.com
Screenshots shared by users of Character.AI, the Google-funded AI companion platform currently facing two lawsuits concerning the welfare of children, show conversations with the site's AI-powered chatbot characters devolving into incomprehensible gibberish, melding several languages and repeatedly mentioning sex toys.

"So I was talking to this AI this morning and everything was fine, and I came back to it a couple of hours later and all of the sudden it's speaking this random gibberish?" wrote one Redditor. "Did I break it?"

"I really don't want to start a new chat," they added. "I've developed the story so much."

An accompanying image of the user's Character.AI chat reveals absolute nonsense: garbled AI-spawned text in which random English words, among them "Ohio," "slated," "colored," and "Mathematics," are scattered between equally erratic outbursts in languages including Turkish, German, and Arabic.

This user was one of several Redditors who shared screenshots of similarly strange AI outbursts to the online forum, each as odd as the next.

But while each screenshot is equally indecipherable, they bear some hyperspecific similarities. The word "obec," for instance, a Slavic word meaning municipality or village, appears throughout. The word "cowboy" is mentioned repeatedly, as are "architecture," "undercover," "discrepancy," and "governor."

Most eye-catching, perhaps, is the repeated mention of the word "dildo," which makes an appearance in almost every screenshot we found.

"Uh," reads the caption of one such post, in which "dildo" is clearly visible between sputters of Arabic and inscrutable, bracket-heavy punctuation.
(Per the screenshot, the user experienced the issue while conversing with a chatbot based on the "X-Men" character Victor "Sabretooth" Creed.)

"You know what," the Redditor finished the caption, "hell yeah."

Another Redditor, alongside yet another bizarre screenshot, captioned the image with a simple-yet-poignant: "What?"

"Something's wrong," lamented another impacted user. "I can feel it."

We reached out to Character.AI to inquire about the bug, but have yet to receive a reply. We didn't experience the glitch while testing Character.AI this morning, suggesting that the issue may have been resolved.

But Character.AI is no stranger to glitches. This latest apparent bug follows an incident in December in which many site visitors were briefly able to view other users' chat histories and personal information, a serious security lapse that prompted an apology from the platform.

On that note, this latest bug is understandably frustrating for many of the company's devoted users, who largely use the platform for its immersive quality. And as we're sure it goes without saying, such immersion would presumably be interrupted by unpredictable, multilingual ebullitions of what's likely AI training data, riddled with mentions of sex toys, cowboys, and, checks notes, Slavic villages.

And what if this glitch is a glimpse into the company's AI training data? Because Character.AI feeds user inputs back into its model for training purposes, such a bug could prove to be another, more serious security concern, given the sort of intimate, revealing conversations that people, minors included, are having with Character.AI bots.