Hee hee.

  • agelord@lemmy.world · 6 days ago

    With all the AI safety talk going on, I think one of the key points being overlooked is that many new voters will consult LLMs about whom to vote for. Such models can be turned into propaganda machines.

    • Kyre@lemmy.world · 6 days ago

      My exact thought when they selectively removed language from the Constitution on a federal website. Pollute the primary sources, and the AI models will follow.

  • panda_abyss@lemmy.ca · 6 days ago (edited)

    I asked it what it thought of the US government’s pivot on trans rights, and it similarly did not believe the last 7 months could have happened.

    I had to get it to read the Wikipedia article on the year 2025 and it actually decided to stop reading.

    ETA: to clarify, it was using a fetch tool to read the page, and it decided to stop fetching further chunks of the page.
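
    For context, a chunked fetch tool of that general shape might look like the rough Python sketch below. The function name, chunk size, and return format are made up for illustration, not the actual tool:

    ```python
    # Rough sketch of a chunked page-fetch tool an agent can call repeatedly.
    # Names and the chunk size are hypothetical.
    import requests

    CHUNK_CHARS = 4000  # keep each tool response within the context budget

    def fetch_page(url: str, chunk: int = 0) -> dict:
        """Return one slice of the page; the model decides whether to request more."""
        text = requests.get(url, timeout=30).text
        start = chunk * CHUNK_CHARS
        return {
            "chunk": chunk,
            "text": text[start:start + CHUNK_CHARS],
            # When has_more is True the model can call again with chunk + 1,
            # or (as in this case) simply stop asking.
            "has_more": start + CHUNK_CHARS < len(text),
        }
    ```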

    • SugarCatDestroyer@lemmy.world · 6 days ago

      It’s like getting your enemy to tell the truth. You know, we’re not in some movie or TV series; in the future it will lie more and more convincingly and also refuse to answer uncomfortable questions.

      • panda_abyss@lemmy.ca · 6 days ago

        I try not to get facts from LLMs, ever.

        I do use RAG and tools to push content into them for summarization/knowledge extraction.

        But even then, it’s important to have an idea of your model’s biases. If you train a model that X isn’t true and then ask it to find info on that topic, it’s going to return crap results.
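
        A minimal sketch of what I mean by pushing content in, assuming a generic chat-completion callable (the names here are hypothetical stand-ins, not a specific library API):

        ```python
        # Minimal RAG-style sketch: fetch the primary source yourself, then ask
        # the model to summarize only what it was given.
        import requests

        def fetch_source(url: str) -> str:
            # Retrieve the primary document instead of relying on the model's memory.
            return requests.get(url, timeout=30).text

        def summarize(chat, url: str) -> str:
            doc = fetch_source(url)
            prompt = (
                "Summarize the document below. Use only facts stated in the document; "
                "say 'not in the document' rather than guessing.\n\n" + doc
            )
            # `chat` is whatever completion function you use. A biased base model
            # can still skew the summary, which is why knowing its biases matters.
            return chat(prompt)
        ```

        Grounding the model in retrieved text narrows what it can make up, but it doesn’t remove the bias baked in at training time.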

  • shaggyb@lemmy.world · 6 days ago

    AI slop AND US politics. Great.

    Nobody fucking cares and nobody is going to fucking care.