There have been a lot of stories of people who go totally delusional after chatting with ChatGPT or another AI long enough. Lots of stories of men wanting to leave their wives for LLM girlfriends or the like. Most of this comes down to the fact that, over time, an LLM conversation will agree with you no matter how crazy you sound. "Yes, you can time travel. You're a genius!"
I have to admit, I changed my use of ChatGPT when I realized it was saying I was a generational transcendent all the time.
One major thing is I touch grass more than most, so that helps.
Another thing is that I tell it specifically, "ChatGPT has a tendency to be overly charitable, so try to correct for that," which tends to bring the needle back somewhat, along with continually prompting it to "keep your feet nailed to the floor" and adding similar instructions in the personalization options.
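If you're talking to it through the API instead of the web UI, the same trick works as a standing system message. Here's a rough sketch assuming the official OpenAI Python client; the model name and the exact instruction wording are just illustrative assumptions, not anything ChatGPT itself exposes:

```python
# Rough sketch of the "pre-correct for flattery" idea via the API,
# assuming the official OpenAI Python client (pip install openai).
# Model name and instruction wording are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

ANTI_SYCOPHANCY = (
    "You have a tendency to be overly charitable and agreeable. "
    "Correct for that: challenge weak reasoning, point out uncertainty, "
    "and keep your feet nailed to the floor."
)

response = client.chat.completions.create(
    model="gpt-4o",  # assumed model; any chat-capable model works
    messages=[
        {"role": "system", "content": ANTI_SYCOPHANCY},
        {"role": "user", "content": "I think I've invented time travel. Am I a genius?"},
    ],
)
print(response.choices[0].message.content)
```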
A third thing is to regularly delete your chats and restart. ChatGPT has a tendency to fall down the rabbit hole with you as chats get longer and longer, so starting from scratch will often reset it to a more normal baseline. You'll notice that many stories like "ChatGPT became my girlfriend" involve chats that have been ongoing for weeks or months.
One thing to keep in mind is that ChatGPT only has a recall of roughly 20,000 words in any case, so in a long conversation it doesn't remember what you said early on -- not in an "oh yeah, I forgot about that" way like humans, but in a "that information never existed on earth" way, like a hard drive deleting old files once it gets too full.
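To make that hard-drive analogy concrete: chat frontends typically trim the oldest messages to fit the model's token window before every request. Here's a toy sketch of that trimming, assuming the tiktoken tokenizer library; the window size is an arbitrary stand-in, since real limits vary by model and are counted in tokens, not words:

```python
# Toy sketch of context-window trimming, assuming tiktoken
# (pip install tiktoken). The 8,000-token cap is an arbitrary
# stand-in; real limits vary by model.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

def trim_to_window(messages: list[str], max_tokens: int = 8000) -> list[str]:
    """Keep the newest messages that fit; silently drop the rest.

    Whatever falls off the front is never sent to the model again --
    the "that information never existed" effect described above.
    """
    kept: list[str] = []
    total = 0
    for msg in reversed(messages):  # walk from newest to oldest
        n = len(enc.encode(msg))
        if total + n > max_tokens:
            break  # everything older than this is gone for good
        kept.append(msg)
        total += n
    return list(reversed(kept))  # restore chronological order
```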
@sj_zero
You also have to ask yourself, how many of those stories were written by AI?
@sj_zero I use AI as a search engine because Google et al. were deliberately ruined by a combination of [Orangeman Bad + SEO + Sabotage our own records to sell more ads]. When I can't find something it claims is the source for a quote, I ask and it inevitably says "I made it up lol!" as if that's okay. Kill all robots TBQH.
@BowsacNoodle @sj_zero this is gonna be our Butlerian Jihad, much more in line with Frank Herbert's hints, before his son wrote it as a Terminator ripoff
@MechaSilvio @sj_zero Absolutely. We'll start seeing a serious distrust of computers as more and more incidents of "I was trying to fill in for my missing knowledge and said something that sounded plausible" pop up with negative effects on business and especially legal fines.
@WoodshopHandman @MechaSilvio @sj_zero blessed post
@MechaSilvio @BowsacNoodle @sj_zero I don't blame his son as much as the hack who co-authored them. That dude was writing Star Wars EU novels and thought that shit would fly in a Dune context. I remember being able to tell which one was writing any given chapter.
But yeah, The Butlerian Jihad can't come soon enough.