Because after a certain point I'm pretty sure it's all just diminishing returns. Of course, I could be wrong - I don't claim to be any sort of l33t computer expert. What do you think? Should a person stop upgrading after a point? Or like just stick with a level until there is a massive leap forward? I'm all about the second option myself.
With the computing world changing almost on a day-to-day basis, upgrading your computer regularly may seem like the right thing to do, but it isn't always necessary. If you only use your computer for basic Internet surfing and writing documents, then upgrading won't be necessary. If you do programming, use web development tools, game, or play and create videos, then upgrading will be. Before you upgrade, make sure you really need to, based on what you are currently doing and what you hope to be doing in the near future.
Most of these upgrades don't offer much benefit over the previous model of the same device. Sometimes it's all hype, when in reality not much has actually improved. I normally don't upgrade until the device I'm using dies or fails to do what it's intended for because higher specs are needed for it to do a decent job.
Upgrading becomes redundant or unnecessary if you do it every single year. If you always upgrade your computer and all you use is Microsoft Word, then I think the switch would be useless. If you use a computer for more demanding work and need up-to-date hardware, then you do need to upgrade often. The same goes for the operating system.
Well, if you feel you need to upgrade every year, then that's redundant. Say a new processor comes out the year after you just bought one; it's not time to upgrade yet. I don't think upgrades will ever become unnecessary unless the demands of the software stop exceeding the limits of the hardware.
More powerful hardware should be bought only when a computer becomes unable to properly perform the tasks a user wants. Take gaming, for example. If you want to play a newly released game and your PC can't handle it, it's time to upgrade. But if games already run perfectly at the highest settings and you still upgrade, then it's a waste of money, since you don't really need the extra power.
No, upgrades won't become redundant, since you always have to upgrade unless you want your system to become obsolete. Upgrades are necessary because the technology demands it. If my Windows XP machine could run modern applications, I wouldn't upgrade, and if my 1GB graphics card could handle all future games well, then I wouldn't upgrade anymore either.
I agree with you completely; a $5,000 build seems unnecessary to me and isn't needed for anything someone would do on a day-to-day basis. A build around $3,000 is the highest I would ever go, as that should get you the best equipment around at the time, including all top-quality peripherals.
It all really depends on your needs and what you use the computer for. In general, upgrading is not redundant in my opinion: with time, every program demands more resources and becomes harder to run. It is true that programs are better optimized with each new generation, but they are written with the computing power of average machines in mind. Without upgrading from time to time you risk ending up with an obsolete system after a while.

You can keep going with the old computer until it just can't take it anymore, and trust me, after a while it won't be able to run even the most basic programs. Or you can upgrade piece by piece, judging each time whether to buy a more expensive component that lasts longer or a cheaper one with a shorter lifespan; I usually choose the former. The problem with buying a whole new computer when the old one is obsolete is that you either put up with the frustration of a slow system for a while, or you replace it before it has given you everything it had. Upgrading components as the need arises is probably the best way to squeeze every cent out of the initial expense.

You also need to consider the purely economic aspect. Buying a whole new computer means spending a fairly large amount of money all at once, and not everyone can do that without touching the emergency nest egg. Upgrading part by part means you spend just a bit of money from time to time, which is probably easier for most people. But I'm biased, since I've been upgrading my computer for the last 8 years and I don't think it has even one original component left!
You have to be careful with upgrading. I knew from the get-go that certain parts only work with certain computers, but I wasn't sure exactly what that meant, so I researched and asked for advice since I'm planning to buy memory for the first time. You not only have to get the right capacity in GB, you also have to worry about the memory speed. If it's rated too high, your computer may not take it; if it's too low, you've just wasted your money and your computer won't show a significant difference.
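Since compatibility questions like this come up a lot, here's a minimal sketch of one way to check what's already installed before ordering more RAM. It assumes a Linux box with dmidecode available and root access; the field labels it looks for (Size, Type, Speed) are what dmidecode typically prints, but output varies by vendor, so treat it as a starting point rather than anything definitive.

```python
#!/usr/bin/env python3
"""Rough sketch: list the size, type and speed of installed RAM modules
by parsing `dmidecode -t memory` output on Linux.
Assumes dmidecode is installed and the script is run as root;
exact field labels can differ between BIOS vendors."""

import subprocess

def installed_modules():
    # dmidecode needs root privileges to read the SMBIOS tables
    out = subprocess.run(
        ["dmidecode", "-t", "memory"],
        capture_output=True, text=True, check=True
    ).stdout

    modules, current = [], {}
    for line in out.splitlines():
        line = line.strip()
        if line.startswith("Memory Device"):   # a new module block starts here
            if current:
                modules.append(current)
            current = {}
        elif ":" in line:
            key, _, value = line.partition(":")
            if key in ("Size", "Type", "Speed"):
                current[key] = value.strip()
    if current:
        modules.append(current)
    return modules

if __name__ == "__main__":
    for i, m in enumerate(installed_modules(), 1):
        print(f"Slot {i}: {m.get('Size', '?')} {m.get('Type', '?')} @ {m.get('Speed', '?')}")
```

Once you know the type and speed of what's already in the machine, you can match the new stick to it; as far as I know, mixing speeds usually just means everything runs at the slower module's speed, but matching avoids surprises.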
Like Seven Ways mentioned, it depends on your needs. Someone who only uses Word really shouldn't need to upgrade much until things start slowing down or their needs change. Someone who works primarily from their computer, uses demanding programs, etc., may need to upgrade more frequently.
I only upgrade when there's something wrong with the device or when it's inadequate and can't do what I want it to do, so that's not really often. Upgrading is unnecessary if you have no use for the new features; I think it would just be a waste of money to upgrade for the sake of upgrading.
I believe that once you get to the point where the computer can handle all the daily use you require, there's no longer a need to upgrade. Previously, my computer could handle everything I needed except watching 60fps videos, so I upgraded to a graphics card with 1GB of memory, and I haven't needed to upgrade since, because that took care of it. Unless you're a PC gamer, that is; then you'll need to upgrade once in a while to keep up with the requirements.