Graphics Cards - When are they ever good enough for the average user?
On the topic of graphics cards, can someone please help me understand something? I am only a very casual gamer (at best) and haven't really been into recent games in quite some time. Honestly, I just haven't had the time to take up the hobby on any kind of serious level. Of course, that's just me; we all have our priorities in life, and the limitations of our time and money.
But when I look at other forums (not this one), I get the distinct impression that serious gamers are continuously upgrading their rigs, and especially their graphics cards, every year or two. To them, the rest of us (on desktop computers) are using absolute dinosaur machines with no value.
Couldn't one argue that there is something of a racket with high-end modern games that demand ever more powerful GPUs year over year? In other words, it becomes a never-ending cycle in which serious gamers feel the need to upgrade again and again just to keep up with one another. Now, I am not suggesting that the average PC user should never upgrade older parts on a desktop PC. In fact, one of the best upgrades I ever made to my older PC was replacing the HDD with a new SSD. What a difference that made in performance.
Certainly, the PC I am currently using is an older one, and I make no pretenses about it. The graphics card itself is a late-2012 Nvidia GeForce GTX 650 Ti, and the computer itself is even older than that. But when a PC can still boot up quickly from a cold start, load web pages fast, play YouTube videos and movies when occasionally desired, run MS Office apps quickly, and perform other basic daily tasks, is it not still good enough?