Google apologizes for ‘missing the mark’ after Gemini generated racially diverse Nazis::Google says it’s aware of historically inaccurate results for its Gemini AI image generator, following criticism that it depicted historically white groups as people of color.

  • UlrikHD@programming.dev
    2 years ago

    I’m pretty sure it’s generating racially diverse Nazis because companies tinker with the prompts under the hood to counterbalance biases in the training data. A naive implementation of generative AI wouldn’t output Black or Asian Nazis.
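    To be clear about the kind of under-the-hood tinkering I mean: roughly, a wrapper rewrites the user's prompt before it ever reaches the image model. This is only a toy sketch; the function names, the keyword heuristic, and the modifier list are all made up for illustration, not Google's actual implementation.

```python
import random

# Hypothetical modifiers a wrapper might append to prompts about people.
DIVERSITY_MODIFIERS = ["diverse", "of various ethnicities", "of different genders"]

# Crude keyword check for prompts that depict people.
PEOPLE_WORDS = ("person", "people", "soldier", "soldiers", "man", "woman", "group")

def rewrite_prompt(user_prompt: str) -> str:
    """Naively inject a diversity modifier into any prompt mentioning people.

    Note: this rewriting is context-blind, which is exactly why it can
    produce historically inaccurate results for prompts like the one below.
    """
    if any(word in user_prompt.lower() for word in PEOPLE_WORDS):
        modifier = random.choice(DIVERSITY_MODIFIERS)
        return f"{user_prompt}, {modifier}"
    return user_prompt

print(rewrite_prompt("a group of 1940s German soldiers"))
```

    The point is that the rewrite step has no idea what historical context the prompt carries, so it applies the same modifier to a 1943 Wehrmacht unit as to a modern office scene.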

    > It doesn’t have EQ or sociological knowledge.

    It sort of does (in a poor way), but they call it bias and try to dampen it.

    • kaffiene@lemmy.world
      2 years ago

      I don’t disagree. The article complained about the lack of nuance in generated responses, and I was responding to the ability of LLMs and generative AI to exhibit that. I agree with your points about bias.