A while back there was some debate about the Linux kernel dropping support for some very old GPUs. (I can’t remember the exact models, but they were roughly from the late ’90s.)
It spurred a lot of discussion on how many years of hardware support is reasonable to expect.
I would like to hear y’all’s views on this. What do you think is reasonable?
The fact that some people were mad that their 25 year old GPU wouldn’t be officially supported by the latest Linux kernel seemed pretty silly to me. At that point, the machine is a vintage piece of tech history. Valuable in its own right, and very cool to keep alive, but I don’t think it’s unreasonable for the devs to drop it after two and a half decades.
I think for me, a 10 year minimum seems reasonable.
And obviously, much of this work is for little to no pay, so love and gratitude to all the devs that help keep this incredible community and ecosystem alive!
And don’t forget to Pay for your free software!!!
What do you think is reasonable?
As long as possible, unless nobody uses it for anything that needs security (daily driver, server, enterprise, etc.). If you drop support, you are lazy and contributing to e-waste creation. In some cases it can be too difficult to support, but “too difficult” has a lot of meanings, most of which are wrong.
I think for me, a 10 year minimum seems reasonable.
That’s really not enough. The GTX 1080 is an almost 10-year-old card, but it’s still very competitive. Most of my friends even use 750s or similar-age hardware. And for software, any major updates just make it more enshittified now lol.
In principle I don’t disagree.
The problem is that supporting everything requires work and effort, which isn’t funded by a corporation or anything.
Hardware support is usually funded well enough, or has enough human resources, for it not to be a big problem imo. It’s OK to drop 30-year-old stuff that nobody uses, but dropping something just because rich people have hardware that’s a few years newer is bad.
Yeah, entirely missing my point.
As long as possible, as long as someone is using it, as long as someone can keep maintaining it.
If the main developer team can no longer maintain it then open-source it, put it in the public domain and set it free. Ditto for firmware and hardware documentation.
Companies oughta be forced to release all information they have on hardware they no longer maintain, and to disable any vendor-lock crap once the warranty ends.
Yes hardware gets old and in the computer realm it usually means it’s rendered obsolete, but that doesn’t mean it doesn’t have its uses.
As long as someone is willing and able to maintain it.
It’s open source. All the work is done either by volunteers or by corporate sponsors. If it’s worth it to you to keep a GPU from the ’90s running on modern kernels, and you can submit patches to keep up with API changes, then there’s no reason to remove it. The problem isn’t that the hardware is old; it’s that people don’t have the time to do the maintenance.
However, when it comes to any proprietary hardware/software, the solution is simple. All companies should be required by law to open source all software and drivers, regardless of age, when they discontinue support; including server-side code if the product depends on one (massive for gaming).
Don’t disagree with you, but yeah - good luck with that
I do not think that can be determined in the tech space by age alone. Popularity, usability, and performance are much more important factors.
It was already brought up in another comment: the GTX 10 series is a nice example. The GTX 1080 is, after 8 years, still a valid GPU for gaming, and the 1050 is a nice little efficient, cheap video encode engine that supports almost all modern widespread codecs and settings (except AV1).
I use 10-year-old hardware and it’s pretty capable on Linux.
We’ve reached a point of diminishing returns in the advance of this technology.
The fact that it’s open means you can get old versions of the kernel. I say we are very lucky we get the support we get, but as long as that older version is still available and open, it means no e-waste. Even 386s.
If there are no security updates, it does become e-waste, because severe vulnerability to all sorts of attacks makes it unsuitable for most use cases. Though it’s still better than nothing.
It is not that simple.
For hardware attacks, older hardware is probably safe, since the attacks are specific to certain newer features. I really doubt you can deliver a Spectre attack on anything up to the Pentium, or even later.
On the software side, some security bugs may only affect newer versions, since the vulnerable code didn’t exist in the older ones. Granted, there could also be security bugs in the older codebase that haven’t been discovered yet.
Usually, my computers drop in performance after around 10 years. They might contain parts that are a few years older by that time, so to be able to keep using them, I would suggest a minimum of 15 years.
Good point. If I know it’ll meet my needs, I’m sometimes inclined to buy tech that’s a few years old, especially if the newer version just adds cloud, AI, or something else I don’t want/need. In many cases it’s still marketed the same so I think end of support dates should be clearly marked on the product itself so the consumer can make an informed choice. Intentionally bricking a device should be treated as littering and the company should be responsible for disposal fees.
Linux is a different story because of the volunteer presence. If anything Linux should get subsidies for keeping e-waste out of landfills after the manufacturer has long abandoned the product.
My laptop is about 5 years old now and still runs as fast as the day I bought it, if not faster. I replaced the battery twice, but this thing could go another 5-10 years if I don’t drop it or spill something on it.
The thing is, Linux always gets touted as the way to save old hardware. Win 11 not supporting a bunch of perfectly good older computers is leading to a massive e-waste wave. I understand that kernel devs mostly do it for free, and resources are limited for maintaining support for hardware few people use anymore, but I think having a way to viably daily drive old hardware is really important for reducing e-waste and also just saving people’s money. I don’t like buying new tech unless it’s to replace something beyond repair, i.e. not just an upgrade for the sake of upgrading.
Obviously the problem is more socially systemic than just the decisions of Linux devs. I think the release cycle of new hardware is way too quick—if it were slower obviously that would reduce the workload for kernel devs, so hardware could be supported for longer (as they have less new hardware to work on supporting). And more generally we should have a mode of production not centred around profit, so that people don’t get punished (as in, they’re not getting paid but could be compensated for their time if they worked on something else) for spending time developing kernel support for old hardware.