Just like you, I and a lot of people do a lot more than game on our computers. If it doesn't hurt gaming, which I don't believe it does, but helps in content creation or other tasks, it's worth the price to some people. And yes, there are the enthusiasts and benchmarkers who just want the best of everything. If you do a lot of video/photo editing or encoding and time is important to you, it may be worthwhile to get a 6+ core CPU. True, the 4-core/8-thread versions will do the exact same work; it just takes them much longer. If you are building a strictly gaming computer, it isn't worth the extra money. But if you do a lot of CPU-intensive work like encoding or 3D modeling, it is worth it. In those areas, you do get top-dollar performance.
As far as sales taking off, I don't have a clue. Intel keeps building them, so sales must be making them money. I also know you see a lot more 6+ core CPUs around now than you did just a few years ago; they're becoming more common.
I think 6+ cores are a great thing. But yes, as a gamer, they need to be performing better than this, and soon. Maybe they will.
I wonder how Pascal would have done, or how the 1080 Ti would have fared compared to the 1080, if they had said "oh, for now gaming performance will be 13% lower, or much worse depending on the game."
To be clear...
What I've been saying is simply this: though there are no programs out now that can take full advantage of hexa- and octa-core CPUs, that doesn't mean they shouldn't be released now. Once they become the norm, software developers will create more consumer-based programs to take full advantage of those extra cores. I think it's a good thing the chips come out first, as it lays the groundwork for software developers to see the tech has moved forward and now they need to step up their game.
That said, what I also said is there's really no need to rush in now, as the tech (software/games) isn't ready yet.
Anyway, Steve brings up the point that other programs can benefit from hexa/octa-core chips as well, but most home users aren't using those types of programs. Even Adobe Photoshop and Adobe Lightroom don't greatly benefit from spending extra on a hexa/octa-core chip. As a photographer using both Adobe programs for photo editing, I've never once said I needed a hexa- or octa-core chip, because as I understand it, Adobe isn't there yet for those chips. Now, if I were doing content creation like video editing, I "might" look at one of these chips. But again, this is a small group compared to the gaming community, who are clamoring for every bit of performance they can get... and where a hexa/octa-core chip might win the day.
My two cents.
A game may not directly benefit from more cores, but many people are doing other things at the same time, recording, streaming, communicating, and the extra cores should leave more resources for the game itself.
I've been watching the AMD EPYC presentations, and I must say, Holy Bit!
They have really innovated (I'll spell it out in case someone from Intel reads this: I-N-N-O-V-A-T-E-D).
This was the most interesting part to me, as it also had the new security features (like HW memory encryption).
Why do I get the feeling that if this acquisition goes through I won't be seeing 4 phantom fans in Corsair Link anymore. I'll be seeing 8.
Private equity firm likely to purchase Corsair
PS: the last sentence, where I bolded $30.4B... way to go, dude.