• @arc99@lemmy.world
    135 days ago

    Hardly surprising. LLMs aren’t *thinking*; they’re just shitting out the next token for any given input of tokens.
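
    The generation loop being described can be sketched with a toy bigram model (a hypothetical stand-in for the actual network, which is vastly larger): given the tokens so far, predict a next token, append it, and repeat.

    ```python
    from collections import Counter, defaultdict

    # Toy sketch of next-token generation. A real LLM replaces the
    # bigram table with a deep network over a long context, but the
    # loop has the same shape: predict, append, repeat.

    def train_bigrams(corpus):
        """Count which token follows which in the training text."""
        follows = defaultdict(Counter)
        tokens = corpus.split()
        for a, b in zip(tokens, tokens[1:]):
            follows[a][b] += 1
        return follows

    def generate(follows, start, n=5):
        """Greedily emit the most frequent follower, n times."""
        out = [start]
        for _ in range(n):
            nxt = follows.get(out[-1])
            if not nxt:
                break  # no known follower: stop generating
            out.append(nxt.most_common(1)[0][0])
        return " ".join(out)

    model = train_bigrams("the cat sat on the mat the cat sat")
    print(generate(model, "the"))  # → the cat sat on the cat
    ```

    The point of the sketch: nothing in the loop inspects meaning; it only scores continuations against what it has seen.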