It’s cheap, quick and available 24/7, but is a chatbot therapist really the right tool to tackle complex emotional needs?

  • Mastengwe@lemm.ee · 8 months ago

    No. It can’t. It’s programmed to mimic, nothing more. It’s doing exactly what its word-prediction mechanism drives it to do. It follows no logic, and doesn’t care about anything, including you.
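
    To make “word prediction” concrete, here’s a toy sketch of the idea; the corpus and model below are made-up stand-ins, nothing like a real LLM:

        # A toy illustration of pure next-word prediction: pick whatever
        # word most often followed the current one in the training text.
        # No goals, no understanding, no caring -- just frequency lookup.
        from collections import Counter, defaultdict

        corpus = "i feel sad . i feel fine . i feel sad today".split()

        # Count how often each word follows each other word (a bigram table).
        follows = defaultdict(Counter)
        for prev, nxt in zip(corpus, corpus[1:]):
            follows[prev][nxt] += 1

        def predict_next(word: str) -> str:
            """Return the statistically most likely next word."""
            candidates = follows.get(word)
            return candidates.most_common(1)[0][0] if candidates else "."

        # Generate a continuation purely by repeated next-word prediction.
        text = ["i"]
        for _ in range(5):
            text.append(predict_next(text[-1]))
        print(" ".join(text))  # prints: i feel sad . i feel

    Real models are vastly bigger and use learned networks instead of a lookup table, but the objective is the same: predict the next token.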

    This is just more evidence of how easily people can be manipulated.

        • kromem@lemmy.world · 8 months ago

          Pretty much. What’s actually programmed is the training mechanism: the model self-supervises, adjusting the weights of its neural network until it models the training data well.
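
          Roughly, the “programmed” part looks like the loop below; this is a minimal sketch, and the vocabulary size, data, and model shape are toy placeholders, not a real LLM:

              # The hand-written part of an LLM: a self-supervised training
              # loop that nudges the weights toward predicting the next token.
              import torch
              import torch.nn as nn

              vocab_size, embed_dim = 50, 16
              model = nn.Sequential(
                  nn.Embedding(vocab_size, embed_dim),
                  nn.Linear(embed_dim, vocab_size),  # next-token scores
              )
              optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
              loss_fn = nn.CrossEntropyLoss()

              # "Self-supervised": the targets are just the same token
              # stream shifted by one position -- no human labels needed.
              tokens = torch.randint(0, vocab_size, (1000,))
              inputs, targets = tokens[:-1], tokens[1:]

              for step in range(100):
                  logits = model(inputs)           # predicted token scores
                  loss = loss_fn(logits, targets)  # prediction error
                  optimizer.zero_grad()
                  loss.backward()                  # compute weight updates
                  optimizer.step()                 # nudge weights toward data

          What the billions of weights end up encoding after training is nobody’s design; only the update rule is.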

          We have next to no idea what the resulting network in a modern language model actually does, and it certainly isn’t programmed by hand.