Give it enough time and a new version of Slack will eventually grind to a halt on your puny 2019 CPU.
I have a T430s which was handed down to me by my boss 10 years ago. It has an i5 from that era and 8GB of RAM. I'm still doing web app development on it, same as 10 years ago, and I don't feel any need to change it. I'm actually afraid there is no better laptop I could replace it with when mine dies. I also can't imagine a better keyboard :(
He’s forgetting that software keeps getting slower. Forever and ever. With new hardware come new expectations for hardware by software vendors.
I am typing this on my early-2015 MacBook Pro, a dual-core i5 with 8GB of memory. For 99% of usage on Chrome and Firefox it is fine. (After 12 years of using Safari, it is still the worst browser for many tabs.)
Basically, the last really big jump in performance was the SSD. I have used M1-M3 MacBooks at work. While they are faster, the difference wasn't nearly as dramatic as the switch from HDD to SSD. Even on-device voice dictation and other AI features worked pretty well.
As many have stated, software is getting slower. Security and all the other requirements will likely put more burden on your machine. So there may be a need to upgrade this 2015 machine in the future, but as far as I am concerned, most of that has to do with memory rather than CPU performance. If I had a 2015 quad-core MacBook Pro with 32GB, I am sure it would last me till 2030.
ARM and Qualcomm have both caught up to Apple in CPU performance. Oryon and the Cortex X725 are now within ~12% (IIRC) of Apple's IPC, or even on par if you ignore a small class of workloads. The X730 and Oryon 2 are both expected to close or even exceed that gap. Unless the A19 / M5 pull some other magic tricks, we have basically made high CPU performance a commodity.
The machine I'm typing this on is a 'whitebox' build I put together in 2010.
I build computers to last - the specs were high-end at the time, and have been upgraded over the years (video card, RAID controller, SSDs, etc). Even though it's getting long in the tooth, the box is still reasonably performant today.
It's highly customized; the case sports thoughtful additions like sound-dampening foam, bespoke brackets for additional cooling fans (all Noctua of course), hardware thermostats & monitoring LCD, interior lighting that activates when you open a panel even if the machine is off (makes it a pleasure to work with when under a desk), etc.
Choices that really panned out well include: Infiniband (this was back when 10G NICs were stupid-expensive, but eBay was flooded with great second-hand Mellanox cards off universities), Areca (their RAID controllers and arrays were so easily upgradeable across generations), ECC RAM everywhere, and an external PCI-E expander (six x16 slots just weren't enough).
It has in the range of 1000 software titles installed, countless of them used regularly (guess I'm somewhat a jack of all trades). Specialized diagnostics and tooling track and isolate the changes made by software, which has helped manage things and prevent bloat accretion (a toy sketch of that idea is below). I periodically run benchmarks to ensure metrics like bootup time, disk transfer rates, etc. still match out-of-the-box numbers.
When you have to install and configure that many apps, migration is a real pain, which motivates longevity (and a collateral reduction of e-waste).
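For the curious, here's a toy Python sketch of that install-diffing idea. The script and its usage are hypothetical illustrations, not my actual tooling; real change trackers also watch the registry, services, scheduled tasks, and so on:

    # snap.py - snapshot a directory tree's file hashes, then diff two
    # snapshots to see what an installer added, removed, or modified.
    import hashlib, json, os, sys

    def snapshot(root):
        state = {}
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                path = os.path.join(dirpath, name)
                try:
                    with open(path, "rb") as f:
                        state[path] = hashlib.sha256(f.read()).hexdigest()
                except OSError:
                    pass  # skip locked/unreadable files
        return state

    def diff(before, after):
        return {
            "added": sorted(set(after) - set(before)),
            "removed": sorted(set(before) - set(after)),
            "changed": sorted(p for p in before.keys() & after.keys()
                              if before[p] != after[p]),
        }

    if __name__ == "__main__":
        # usage: python snap.py snap <root> <out.json>
        #        python snap.py diff <before.json> <after.json>
        if sys.argv[1] == "snap":
            with open(sys.argv[3], "w") as out:
                json.dump(snapshot(sys.argv[2]), out)
        else:
            with open(sys.argv[2]) as a, open(sys.argv[3]) as b:
                print(json.dumps(diff(json.load(a), json.load(b)), indent=2))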
You can also buy second-hand to save another 50-80% when you do upgrades because something catastrophically broke. I got a used but very good quality mid-tier Ryzen laptop for $200 from a few generations ago, added 32GB of memory and an NVMe drive, and it's an absurdly good computer for dev work.
My experience so far is that you can get around 7 years of heavy usage out of a premium product. It doesn't matter how much maintenance or care you give it (I treat mine like it owes me money now, but I was careful before); that's how far it goes without disappointing you.
I am also expecting to reuse my current daily drivers (like I did before) as backups or auxiliary machines. My laptop keyboard has some loose keys and my phone screen started to die, but they still have a lot of compute to give.
A 1080 Ti from 2017 still handles modern AAA games really well; I was able to play and finish Cyberpunk 2077 with no issues. Really a remarkable feat of engineering for its time.
> But here’s the thing: I don’t need it. I don’t have a single usecase for which I would need this much processing power. In fact, I could still use that i5 from 2011 and it would do everything I want it to do perfectly fine. I didn’t need to upgrade, I just wanted to.
If only we could have a bigger percentage of people that thought the same way. Then we might be able to get away from the insanity of marketing for new New NEW when what you have will do. Maybe these huge “tech” companies will be taken down a peg into more sane valuation territories. Maybe we’ll stop with the mounting piles of e-waste driven by the advertisers pushing FOMO of not having the shiniest.
A guy can dream though.
I figure I’ll slow my pace of upgrades even more than I have now and when the software becomes yet a larger pile of bloated nonsense shat out by clueless developers than it already is, I’ll switch back to writing letters.
This is simply not true. UI speed isn't increasing, because of systemic bloat (The Great EnFattening), but the throughput gains are immense.
A single NVMe SSD can now push over 10GB/s.
Main memory bandwidth is now over 100GB/s on midrange hardware.
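If you want to sanity-check numbers like that on your own box, here's a rough sequential-read sketch in Python. The file path is a placeholder; for serious measurements use a real tool like fio, and make sure the file is much bigger than your page cache or you'll be benchmarking RAM:

    # Crude sequential-read throughput check: read a big file in 64 MiB
    # chunks and report GB/s. Unbuffered open so Python adds little overhead.
    import time

    PATH = "/path/to/large_test_file"  # placeholder - use a multi-GB file
    CHUNK = 64 * 1024 * 1024

    total = 0
    start = time.perf_counter()
    with open(PATH, "rb", buffering=0) as f:
        while True:
            data = f.read(CHUNK)
            if not data:
                break
            total += len(data)
    elapsed = time.perf_counter() - start
    print(f"{total / 1e9:.1f} GB in {elapsed:.2f} s = "
          f"{total / 1e9 / elapsed:.2f} GB/s")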
My main machine is a 3.8GHz 8-core Ryzen with 64GB RAM and a GTX 1070 GPU, bought and self-built in ~2018, and it still seems pretty good for development and the odd game. Even the 1TB Toshiba SSD claims to be healthy according to the monitoring app. It just zips along and copes with everything, and I've never felt any temptation to upgrade anything.
Me back in 2005 would have thought this setup was science fiction.
I guess if you hold the religious belief that imagination is evil then this line of reasoning seems rational.
Alan Kay has talked about this many times: when a new technology comes around, most people just see it as doing the same things you could already do, just faster, rather than enabling entirely new ways of doing things.
By all metrics the web is a slow, buggy mess, but it's inherently different from a set of man pages and email addresses. While it's true that you don't "need" to do anything, are you sure that at no point in the next 30 years you will have a use case for a local L*M, as one example?
Now that there are great GPU-accelerated remote desktop options, I mostly just remote into more powerful machines. Even from a country away, the on-screen performance is almost like sitting at the machine, and as a bonus I don't hear every fan on my laptop going crazy. I've been a happy Parsec.app user for a while, but there are many other options (e.g. RustDesk has this).
This feels like a load of baloney to me. My most non-technical web browsing Microsoft Word-using relatives went from a 2012 Mac mini to an M4 Mac mini and the difference was night and day. They were extremely enthusiastic about the end result.
If they can tell it’s faster then certainly a technical person like myself can.
Also, that was an incredibly cheap upgrade. In 12 years they went from one $600 computer to another $600 one. That's right: the new one was the same price, so cheaper than the original after inflation. They've paid $50 a year to compute, and that's on the world's most premium brand of computers.
Sure, you don’t need to upgrade anything. And for now, the Ryzen 3600 is a fantastic "old" processor; it runs my game server and it's certainly capable.
But it’s not like you wouldn’t notice a far better experience someday in the future with an upgrade.
My 4-year-old personal desktop PC agrees with him in full. My shitty work laptop, which the big corpo paid less than $400 for, screams for an upgrade every second: it takes 10 seconds to open VSCode without any project (versus less than 2), and it can barely paint the external monitors when moving or resizing a browser window. It is also 4 years old.
I expect to replace the desktop components in a few years when something breaks. CPUs broken from age are extremely rare, but mainboards with bad memory contacts are pretty common; I've seen a lot that don't work that well after 8-10 years. I don't expect a desktop PC to work forever: the PSU will break within 10 years anyway, and the SSD will reach its write limit (I've worn out a few already). But right now performance is not a concern.
I just bought all new parts for an SFF build with a 9800X3D and 5090. So I guess this is a timely post. I know it's overkill for my needs but I love building PCs - been doing it since I was 12, and it was my first job (working at PC Club in SoCal).
My parents purchased a Samsung laptop in 2014. It came with an Intel 4th-generation (Haswell) quad-core i5 CPU and 8GB of DDR3 RAM. Although it's over 10 years old now, it still faithfully meets my parents' needs; all it needed was an upgrade to Windows 10. Soon, support for Windows 10 will end, but since Windows 11 is such unstable bloatware with random features, it seems we're going to move it to Windows 10 Enterprise LTSC instead of Windows 11. This would have been unimaginable in the past. Using a 1994 laptop in 2004, for example, would have been such a pain in the ass.
Energy consumption is nowadays the reason for a CPU upgrade.
It's often corporations' need for new revenue, and security, that keeps the upgrade march going. Just look at this TPM / Windows 10 upgrade issue.
My 2011 i5 desktop is still happily chugging away as a build server, home storage, and remote host. But oh yes, it will have to be nuked, thanks to MSFT policies.
The last desktop I built was in 2011. I built an absolute monster at the time. I ultimately repaired it and gave it away to someone else when I moved 2ish years ago (at ~11 years of age) and it is continuing to function as it was for that person. Last I discussed with them (a few months back) they're still successfully running everything they need on it.
That said, I enjoy the hardware improvements happening, because it allowed me to go from that huge full tower desktop with multiple GPUs and water cooling to doing everything I need in my life, pretty much, from a 14" M1 Max Macbook Pro. I replaced a huge, power-hungry device, with something that's tiny, portable, and can be powered off USB.
For me, I am not quite ready to replace my M1 Max MBP with an M4, but I am likely going to bump to an M5, simply for performance when editing photos in Lightroom / DxO Photo Lab, but that's more of a recent requirement. Before I got this M1 Max MBP, I had a 2015 15" MBP that worked just fine for 7 years, and would have kept going if it didn't have a bulging battery and I decided to upgrade rather than repair it. I may just stick to my M1 Max MBP in the end, I can be patient.
On the other hand, a lot of game developers have an incentive to push current hardware to the limit to impress their customers. It's too tempting to juice their games with minimal effort via high polygon counts and effects, at the expense of their customers' wallets. Try playing something like Path of Exile 2 (in early access now) on a good-sized desktop monitor or at 4K without a monster CPU/GPU.
They may need to buy a new server though. (Site is down at the time of writing this)
I just got rid of my 2013 Microsoft Surface Pro. It was still being used daily in my workshop, 11 years old. Core i5 processor, running Windows 10. I only got rid of it because the battery decided to become a spicy pillow one night, expanding until it cracked open the case and pushed out most of the touchscreen.
"640k is all you need!"
There is always new tech. Local LLMs and other high processing intensive things might be a thing people want. Not directly, but it may enable things they want. More viral TikTok videos. Maybe some kind of health monitoring. Maybe AR will finally get a compelling use case if it can identify everything in your field of view but it requires serious computing power. Maybe AR 3D movies where the characters show up in your house and adapt to your living room. Siri might suck, but lots of people want a "Star Trek" computer that actually understands them.
The point is not any specific example. Rather, it's that there's always something around the corner that needs more computing power. I have no idea what it will be, but I'm confident something will appear.
I felt like that from 2010 or thereabouts until two years ago; there wasn't really a use case for having a very fast machine. Now, with LLMs and Stable Diffusion, I think there is a use case that happily absorbs any compute I can buy, just like first-person shooters did in the '90s.
I splurged on an M4 Pro Mini with 64GB and 5TB between internal and external SSDs; then it's over to a ~30TB NAS if I need more.
All that cost less than a typical PC I’d build in the late 90s.
Even as a power user who codes, I can’t imagine what I’d need more for, unless I want to train AIs.
I have a 3950x in my desktop and I feel exactly the same way. I have the upgrade itch but I can't justify it in cost/benefit terms.
I don't even use that system much because my M1 Pro macbook can do almost all the same things.
"software gets slower to counteract hardware getting faster" is mostly true, but what's more true is that "software gets slower to counteract the developer's hardware getting faster". Devs (or their employers) aren't feeling too compelled to upgrade, and so they don't, and so software is staying fast(ish). Apple's annoying RAM-upgrade pricing is likely helping here, too.
(By the way, I've diverted my hardware-upgrade itch into photography gear)
I'm typing this on a ten-year-old iMac - i7 processor, 32GB of RAM... I paid a premium when I bought it, but it's still a whole lot of horsepower. I wish Apple would provide current-OS releases for it rather than just support for previous iterations, but I can respect that that would cut into new hardware sales, as otherwise I have no real reason to upgrade. My work has changed over the past few years, and at this point I don't ever see myself buying another laptop either - my iPad does everything I need when I'm on the go.
Certainly there are people with significant performance needs. For the rest of us there's a Mac Mini or an iPad.
I think I used my 2010 laptop for eight years. Upgrades: 120GiB (GB?) SSD.
My almost-12-year-old laptop is still OK for my job. I like its 3-button touchpad, and touchpads have no buttons nowadays. I could keep it going for a long time still, but for two things:
1. Spare parts: the RAM will fail (it's 1666 MHz), keyboards wear out (I've got one spare left), etc.
2. Support wanes for some old hardware. I already can't update NVIDIA driver past a certain release (I'm on Linux.)
Sooner or later I'll have to buy something new just to be able to read my screen or to cope with a failed irreplaceable part.
I game so I had to upgrade my GPU and CPU but I'm going to ride Windows 10 for as long as I can.
It's a coin toss whether I go Linux or Windows 11 once 10 becomes unusable.
I'm in this camp, perhaps a little more extreme: my daily driver laptop is a ThinkPad X230T, which I think is from 2011 or 2012. That's separate from a home lab - which I don't currently have, but for which I'd use hardware from a few years back if I ever need one again. The only thing that can kill older hardware is software bloat - honestly, the web is the biggest culprit here.
You’re probably not on Windows 10, because that makes sure you need a new computer. Personally, I wouldn't say I'd never need a new computer, but the i5 I bought 11 years ago does everything I need (as a software developer) other than playing back 4K video recorded on an iPhone.
Which means MS is forcing people like me to either buy a few new computers or to finally commit to Linux.
Yep, I have a Lenovo E420 (I think?) that I bought when I graduated in 2011. If I replace it, it will be due to things like the USB ports not working, not due to processing power being insufficient. I don't game anymore, I can watch video on it, I can use the Internet and word processing. What does one DO with a high powered processor?
Electron would love to have a word with you.
After starting to run some LLM models locally, I would like a faster CPU, maybe with dedicated "AI cores" or whatever they are called. But the old CPU still works, and I'd need a new motherboard, RAM, ... So I'll probably keep using my PC until one of the parts finally gives up.
We definitely had a good couple of years of stability, but I don't see the forever part. Certainly not for gamers.
Feels like game devs have come out of their COVID slumber and decided it's time to jack up requirements.
I recently did an interim update (5800X3D and 3090), so I'll try to hang on for a few more years.
Having to configure a new machine has put me off upgrading. If I have to spend time doing that, it takes away from more important things. Obviously it's worth changing for major QOL improvements, e.g. an eye-level laptop screen, but I've found very little that fits that metric.
I'm still using my Dell XPS 7100 from 2009. It has an AMD Phenom II X6 1045T. Only upgraded the GPU over the years. Added more RAM and an SSD. It still works like a charm. Even the original keyboard it came with is the same. No need to upgrade to anything new as of yet.
For me, the fun is spec'ing and building new PCs. I wish I could do it every year.
Then the pain is finding a home for my old PC.
I heard about a guy on Facebook who builds and configures PCs for free (free labor, not free parts). He only does a couple each year. That sounds like a pretty fun hobby.
Well, newer laptops are built with poor-quality plastics, so the hinge will break after 2-3 years. Older models are beasts, though. Even the budget Dell Inspiron 3520 (2013, ~$400) is still running fine as a YouTube streaming machine.
As long as your computer runs a browser and a terminal emulator, it is more than fine.
The one other use case that will need better hardware is gaming. And compiling is also always better when it's faster. Running LLMs locally will also profit from new hardware, though I guess there is almost never a use case for that.
I have never had a computer that was even close to being fast enough, and I doubt I ever will. I guess it depends on what you use them for.
I still deal with 20 minute compile times. Let me know when that drops to 10 seconds.
I thought the same about my 2020 Ryzen, until I started working with the Unity editor two months ago.
I'm reminded of the dead parrot sketch - this thing wouldn't "voom" if I put four million volts through it.
Don't worry, we software developers are going to ruin the software with AI features, so that you will need to upgrade to a Ryzen AI Max+ 395 just to run an editor.
My only beef with this has to do with power usage, which can make some of the older computers just not worth it.
This might be true of all tech. I don’t want a new tv ever. I want one as dumb and old as possible. No sense letting the fuckers try to catch up to my ad blocking tech.
I fired up my old iPod the other day to get tunes in my shop. It hasn't been updated in 7 years and still has all the music I like. Doesn't come with a subscription. Still has Genius playlists. Remember those? They were great. I'm so glad Apple hasn't thought to, or isn't able to, brick it.
The biggest improvement in computer speed over the last decade or so comes from more RAM, and much as Apple wants you to think so, it ain't scarce.
A five-year-old laptop with 16GB of RAM is totally fine for everything I ever do (32 if I want to splurge!). I mean, unless you want to run Windows 11 for some unknown reason. Which, to be perfectly clear, I do not suggest for anyone, ever. Did you know Linux can run security updates in like 45 seconds? Windows wants to download a 6GB patch and spend 45 minutes replacing your entire operating system every month. That alone should be enough to get everyone to nope.
If I don't build a new PC, what will my new 5090 run on?
I buy almost all tech refurbished / recertified - it's shocking how cheap off-lease equipment goes for on Newegg / Backmarket. Amazon has fantastic warehouse deals sometimes too, you just have to be patient and check occasionally. Especially with desktops, buy everything 'base' 4-8 years out of date, upgrade memory + storage, look for refurb monitor and peripheral deals, etc. Hell, I've gotten completely free shit (expensive stuff!) multiple times from Amazon because of UPS/FedEx screwing up or fudging deliveries! And don't discount Ebay either, sometimes pallets of crap end up on there trying to flog 100+ 'whatevers' at bananas prices, as well as finding weird/rare Chinese replicas of discontinued OEM hardware.
Given how much junk we (over)produce as a species, buying retail for a lot of this stuff just doesn't make sense unless you need it immediately for business or work purposes.
But what about my 450 open tabs?
If you game, you will.
Energy consumption is a reason for a CPU update.
My daily computer is a 7-year-old laptop. The battery has its issues, but it's still powerful enough for everything I do – except games, for which I have a refurbished 5-year-old desktop.
So... yeah, I tend to agree.
I told myself that if I bought high-end parts, I wouldn't need to upgrade every few years.
And yet I'm still tempted to upgrade every year :D
I just got a new M4Pro Mini (Apple). It replaces my M1Max 14-inch MBPro.
Bit zippier (not screaming), but it does have native support for Apple "Intelligence."
I was waiting for the M4Max/Ultra Studio, but, y'know, I realized that I have no need for that.
This has been working fine, for a couple of months. I suspect that I won't be replacing it, for a few years.
I probably will need to get a new iPhone, and maybe iPad, sometime in the next year or so (also for Apple Intelligence stuff), but I'm in no hurry.
I've mentioned this in other threads, but I run a small side business refurbishing and selling old laptops. One element of my work is saving retro machines for retrocomputing, old hardware interfaces, etc., but I also refurbish and sell for general use.
For the average person with average needs, there is no difference between, for example, a $100 Dell Latitude E5530 from 10+ years ago and a $600 Best Buy low-end Dell laptop from today, so long as the Latitude has been modestly upgraded with 8GB of RAM and a small, used SSD. Its 3rd generation i5 is more than enough to do anything they need. It even runs Windows 11 just fine, so long as you inform the customer about the need to manually install feature updates.
For the general public, buying new computers is an expensive scam that contributes massively to waste. The machines I refurbish would typically have been thrown out or 'recycled' (stripped for precious metals in an expensive process) if not for my intervention. There's no reason for this except number-go-up greed, and it should stop.