A while back there was some debate about the Linux kernel dropping support for some very old GPUs. (I can’t remember the exact models, but they were roughly from the late ’90s.)

It spurred a lot of discussion on how many years of hardware support is reasonable to expect.

I would like to hear y’alls views on this. What do you think is reasonable?

The fact that some people were mad that their 25 year old GPU wouldn’t be officially supported by the latest Linux kernel seemed pretty silly to me. At that point, the machine is a vintage piece of tech history. Valuable in its own right, and very cool to keep alive, but I don’t think it’s unreasonable for the devs to drop it after two and a half decades.

I think for me, a 10 year minimum seems reasonable.

And obviously, much of this work is for little to no pay, so love and gratitude to all the devs that help keep this incredible community and ecosystem alive!

And don’t forget to pay for your free software!!!

  • @RegalPotoo@lemmy.world
25 points · 4 months ago

    As long as someone is willing and able to maintain it.

It’s open source. All the work is done either by volunteers or by corporate sponsors. If it’s worth it to you to keep a GPU from the ’90s running on modern kernels and you can submit patches to keep up with API changes, then there’s no reason to remove it. The problem isn’t that the hardware is old; it’s that people don’t have the time to do the maintenance.

    • @WhatAmLemmy@lemmy.world
8 points · 4 months ago

However, when it comes to any proprietary hardware or software, the solution is simple: all companies should be required by law to open-source all software and drivers, regardless of age, when they discontinue support, including server-side code if the product depends on a server (massive for gaming).