ChatGPT’s new AI store is struggling to keep a lid on all the AI girlfriends
OpenAI: ‘We also don’t allow GPTs dedicated to fostering romantic companionship’

  • AllonzeeLV@lemmy.world
    2 years ago

    To be fair, it cares about you exactly as much as your OnlyFans crush.

    Probably a cheaper obsession.

      • veni_vedi_veni@lemmy.world
        2 years ago

This is just a PR stunt.

Nobody cares about anyone else’s psychological well-being, so let them have the new opioid. At least there’s no collateral damage with this.

      • cerulean_blue@lemmy.ml
        2 years ago

It’s not just simps. Boomers are very vulnerable to this technology, especially the lonely or widowed. For some reason, they can be recklessly trusting of internet relationships and technologies, which is why they keep getting exploited by romance and pig-butchering scams and why they can’t get enough Facebook Trump/COVID misinformation.

      • bane_killgrind@lemmy.ml
        2 years ago

Humans are a cancer on the world. If some percentage never leaves the house and therefore never procreates, we should exploit that mechanic.

        • Pretzilla@lemmy.world
          2 years ago

Good point, but they’re already here and taking up resources.

          Let’s give them something productive to do to pull their weight and fix some of the terrible existential issues the world is facing.

          Get out and plant some trees for example.

  • paddirn@lemmy.world
    2 years ago

I’d love to have an AI assistant/girlfriend like Joi from Blade Runner 2049, something I could jerk off to one minute, then have her prepare my taxes and order a pizza the next. However, these ChatGPT girlfriends all seem like they’re just subscription chatbots. Maybe someday we’ll get there and nerds will work up a local, open-source slutty AI girlfriend, but for now they’re all just crap.

  • _number8_@lemmy.world
    2 years ago

Why? Why not let people just retreat into fantasy? It’s probably healthier than many common coping mechanisms. I mean, it’s a chatbot; how much can you do with it?

Let people have their temporary salve to get them through whatever they were going through that drove them to this. And if it’s not temporary, okay, fine? Better to have some outlet than to be even more mentally isolated. Maybe in 50 years this will be common, who knows.

    • cyd@lemmy.world
      2 years ago

      Liability. Imagine an AI girlfriend who slowly earns your affection, then at some point manipulates you into sending bitcoins to a prespecified wallet set up by the model maker. Because models are black boxes, there is no way to verify by direct inspection that an AI hasn’t been trained with an ulterior agenda (the “execute order 66” problem).

      • dirthawker0@lemmy.world
        2 years ago

        Some guy in the UK was allegedly convinced by his chatbot girlfriend to assassinate Queen Elizabeth. He just got sentenced a few months ago. Of course he’s been determined to be psychotic, but I could imagine people who would qualify as sane getting too deep and reading too much into what an AI is saying.

      • Kittenstix@lemmy.world
        2 years ago

Yep, I was having a conversation with a guy who advises policymakers on AI; he had given a whole presentation at a school board meeting I went to a few nights ago.

He said that’s his highest recommendation for the lawmaker side: pass bills that push for opening up those black boxes so we can ensure transparency.

        • cyd@lemmy.world
          2 years ago

          Problem is, there isn’t a way to open up the black boxes. It’s the AI explainability problem. Even if you have the model weights, you can’t predict what they will do without running the model, and you can’t definitively verify that the model was trained as the model maker claimed.

          • Kittenstix@lemmy.world
            2 years ago

I see — my knowledge is surface deep, so I admit this is new information to me.

Is there no way to ensure LLMs are safe for, say, kids to use as an educational tool? Or does it just inherently come with some risk of exploitation, and we have to do our best to educate students about that danger?

    • devfuuu@lemmy.world
      2 years ago

These kinds of things are not temporary. We know that humans can’t control themselves and aren’t rational enough to “just use it a bit.” It’s highly addictive and leads people to remove themselves from reality.

  • danielfgom@lemmy.world
    2 years ago

It’s just a big money grab! Everyone is trying to get rich quick, like with the App Store. Everyone is hoping their bot breaks into the big time and makes them rich.

This is what is terrible about society. Few are making bots that help people; most make bots that appeal to base desires. A race to the bottom, if you will… will man ever learn???

  • afraid_of_zombies@lemmy.world
    2 years ago

    Why? Let it happen.

There are zero issues with consent. It is a chatbot, not a living or sentient being.

    There are zero issues with STDs or pregnancy scares.

No one is going to want this as a substitute for a real relationship, and if they do, you might be doing the world a favor by allowing them a way out of the dating pool.

Sure, it could be used for honeypots, but we’ve had those for pretty much all of history. People fake affection, or fake who they are, to get what they want.

    I just don’t see how this is at all different than the forms of erotica we already have and society has adapted to.