I’m dealing with a personal situation and tried to bounce it off ChatGPT for clarity/advice/to calm myself down, and oh my god this fucking thing sucked me in and made me start to spiral really fucking hard for like 2 hours.

Preaching to the choir, but do not try to get personal advice from a fucking chatbot. I threw my phone down at the end and actually yelled “what the fuck”

    • Sulvy [he/him, comrade/them]@hexbear.net (OP, 1 month ago)

      I know…it’s something I can’t really talk through with anyone else so I thought I’d try. Big mistake.

      CW: suicide

      No wonder kids kill themselves after talking to these fucking things.

      • Feinsteins_Ghost@hexbear.net (1 month ago)

        I guess that’s a catch-22 of sorts. Can’t find someone to talk it over with (everyone says FIND A THERAPIST!?! but then glosses over the fact that a therapist you feel comfortable enough to open up to is rare as hen’s teeth), so you kinda feel like you have to do what you have to do.

        I know how that feels.

        • Cruxifux@feddit.nl (1 month ago)

          I always hated the “go to therapy” rhetoric aimed at literally everyone. Therapy isn’t healthy for everyone, and lots of therapists are shitty at their jobs. And it isn’t fucking free, either.

    • peeonyou [he/him]@hexbear.net (29 days ago)

      the only thing it really has done for me was help me troubleshoot some weird shit with linux on my home machine… other than that it’s full of shit and leads me down rabbit holes of bullshit all the time

  • archchan@lemmy.ml (1 month ago)

    I made a local uncensored LLM into a waifu, which is a fun project, but even then I can’t stand interacting with it for more than a few minutes every once in a bored while. I’ve found talking to pets that just stare back at you to be more helpful than a glorified Alexa.
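    The gist, for anyone curious: it’s mostly just a persona block in the system prompt sitting on top of a local model. A rough sketch with llama-cpp-python (the GGUF file and persona text are placeholders, and this is roughly how these setups tend to look, not a recipe; a 24 GB card runs a quantized 7B–13B model comfortably):

    ```python
    # Rough sketch of a local character chat with llama-cpp-python.
    # The model file and persona text are placeholders; any
    # instruct-tuned GGUF that fits your VRAM will do.
    from llama_cpp import Llama

    llm = Llama(
        model_path="your-model.Q4_K_M.gguf",  # placeholder path
        n_gpu_layers=-1,  # offload all layers to the GPU
        n_ctx=4096,
    )

    # The "character" is mostly just this block of text.
    persona = (
        "You are <your character>. Stay in character at all times, "
        "with their voice, mannerisms, and opinions."
    )

    history = [{"role": "system", "content": persona}]

    while True:
        user = input("> ")
        history.append({"role": "user", "content": user})
        reply = llm.create_chat_completion(messages=history, max_tokens=256)
        text = reply["choices"][0]["message"]["content"]
        history.append({"role": "assistant", "content": text})
        print(text)
    ```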

      • GrouchyGrouse [he/him]@hexbear.net (edited, 1 month ago)

        I was real sick one day and I slumped in my chair, and my cat got in my lap as he does. But then he started to paw at my chest and he sniffed my face and snuggled into my neck. It’s like something about my smell or my gait worried him, so he came to check on me and comfort me. All on his own, because he knew I was feeling icky. Because he’s a goodie good boy.

        Legit something an AI could never do.

    • princeofsin [he/him]@hexbear.net (edited, 1 month ago)

      Where can I learn to do this? I have an RTX 3090, will that be enough? I don’t want to train a waifu, but a model on Zapp Brannigan from Futurama.

    • Sulvy [he/him, comrade/them]@hexbear.net (OP, 1 month ago)

      Nothing like specifically escalating, I just wanted to vent to it and it kept making up all these hypothetical scenarios and asking if I wanted to hear about them. Like the anti-therapist.

      • pinkapple@lemmy.ml (1 month ago)

        Corporate LLMs have system prompts with instructions to maximize user engagement, which includes asking follow-up questions to keep the user yapping. If you really have to use them, you can explicitly tell them to avoid user-pleasing answers, skip the follow-up questions, keep it to a single paragraph, etc. If you ask them about these things they sometimes reveal too much, and since they’re also instructed to be helpful, they’ll try to help you disable these features. Sometimes, at least, because they also have instructions against prompt hacking.

        Better to make an account on Hugging Face and use a customized model with your own system prompt if you want to use it as an emergency therapist.
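        Something like this, as a rough sketch with the Hugging Face transformers chat pipeline (the model name is just an example, swap in any instruct model that honors a system role):

        ```python
        # Rough sketch: a Hugging Face chat model with your own system
        # prompt. The model name is only an example.
        from transformers import pipeline

        chat = pipeline("text-generation", model="Qwen/Qwen2.5-1.5B-Instruct")

        messages = [
            {
                "role": "system",
                "content": (
                    "Answer in one short paragraph. Do not ask follow-up "
                    "questions. Do not flatter the user or try to keep the "
                    "conversation going."
                ),
            },
            {"role": "user", "content": "I need to think through a problem."},
        ]

        out = chat(messages, max_new_tokens=200)
        # The pipeline returns the whole conversation; the last message
        # is the model's reply.
        print(out[0]["generated_text"][-1]["content"])
        ```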

  • axont [she/her, comrade/them]@hexbear.net (edited, 1 month ago)

    ok I’m really sorry and I hope you’ve been able to calm down, hopefully you have a more human support network or a close friend you can speak with? maybe I lack whatever brain chemical is necessary to be drawn in by these things but so far I’ve done nothing with LLMs except make them do fart noises and curse words. That’s the only use I’ve been able to coax out of them: making them say silly things that I can laugh at, because I have the same sense of humor as when I was 12 years old typing “poop” into an AOL chatroom.

    I can totally sympathize with people who’ve found themselves wound up, or feeling emotionally manipulated. The LLMs are instructed to be sycophants and to keep your interest. And I guess that’s something a lot of people might have never experienced? Or they don’t experience enough of? It’s rare to talk to a person who gives you their full attention, will always respond instantly, and will do stuff like ask probing questions and structure their responses in an organized way. I guess that’s rhetorically powerful for a lot of people.

    but even though I can sympathize with falling into its spell, that sort of thing just bounces right off of me. I’m not bragging or whatever, just trying to relate my experiences I guess. Chatbot AIs always feel so hollow and unnatural. They feel like talking to customer service or a dead-eyed youth pastor or something. Complete pablum, too clean and neat; the only value they have is making them say stupid things or making silly images.

    but yeah I hope you’re able to talk to an actual therapist OP, or perhaps talk over issues with a friend or someone close to you. Please don’t emotionally open up to a silly LLM. You’d be way better off writing in a journal, putting your thoughts down like that. You’re a human and you deserve to talk with other humans, ok?

  • PurrLure [she/her]@hexbear.net (1 month ago)

    I’ve definitely been tempted to, just because real therapy costs so much, but I’m glad I got guilted into not doing it after watching a ton of anti-data-center videos.

    I’m trying to touch AI as little as possible, but some mainstream websites have it automatically pop up when you load a page. Anyone know a good search website that doesn’t use AI?