• ceenote@lemmy.world · 2 months ago

    So, as with Godwin’s law, the probability of an LLM being poisoned approaches 1 as it harvests enough data to become useful.

  • yardratianSoma@lemmy.ca · edited · 2 months ago

    Well, I’m still glad offline LLMs exist. The models we download and store are way less popular than the mainstream, perpetually online ones.

    Once I beef up my hardware (which will take a while, seeing how crazy RAM prices are), I’ll basically never need an online LLM again, because even now on my old hardware I can handle 7 to 16B parameter models (quantized, of course).
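
    To see why quantization makes this feasible on older hardware, here's a rough back-of-the-envelope estimate. The parameter counts match the comment; the bit widths and the overhead factor for KV cache and activations are illustrative assumptions, not exact figures:

    ```python
    def quantized_model_size_gb(n_params_billion, bits_per_weight, overhead=1.2):
        """Rough weight-memory estimate for a quantized LLM.

        overhead (assumed 1.2x) loosely covers KV cache and activations;
        real usage varies by runtime, context length, and quant format.
        """
        total_bytes = n_params_billion * 1e9 * bits_per_weight / 8 * overhead
        return total_bytes / 1e9

    # 7B and 16B models at common quantization widths
    for params in (7, 16):
        for bits in (4, 8):
            gb = quantized_model_size_gb(params, bits)
            print(f"{params}B @ {bits}-bit: ~{gb:.1f} GB")
    ```

    A 7B model at 4-bit lands around 4 GB, which fits comfortably in ordinary consumer RAM, while the same model unquantized at 16-bit would need roughly four times that.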

  • DarkSideOfTheMoon@lemmy.world · 2 months ago

    So programmers who lose their jobs could create multiple blogs and repos full of poisoned data and put the models at risk?