Q3 2023 Xeon TDPs


Erlipton

Member
Jul 1, 2016
Alright, I hope this doesn't come off as too much of a rant, but I've been looking into the numbers for my next homelab piece. I take moral issue with Intel's newer lineup of Gold and Platinum CPUs, whose TDPs are equivalent to entry- to mid-tier consumer graphics cards. Imagine these populated in a high percentage of data center deployments on the planet (actually not hard to imagine at all)... where is this energy coming from? 200W-350W+ for a single CPU just to (presumably) stay competitive with AMD? Didn't we have a massive energy issue when people were mining Bitcoin like crazy on GPUs (admittedly more power consumption per card, but compare the scale of those operations to the data center deployments of these...)?

It feels to me like Intel is putting corporate profits ahead of the wellbeing of our planet. Do others share the sentiment that there is an ethical responsibility here to use the world's resources wisely? Intel sets the example for its customers to follow; if Intel operates without environmental bounds, customers will follow. And I don't even see an advantage in terms of operating costs: beating AMD marginally on paper doubles power costs and influences the rest of the global energy market...

Am I completely off my wagon?
 

BlueFox

Legendary Member Spam Hunter Extraordinaire
Oct 26, 2015
You must not have looked at AMD's TDPs recently? They're similar and the top end Epyc actually has a higher TDP than any Xeon (400W for AMD vs 350W for Intel).
 

SnJ9MX

Active Member
Jul 18, 2019
You must not have looked at AMD's TDPs recently? They're similar and the top end Epyc actually has a higher TDP than any Xeon (400W for AMD vs 350W for Intel).
The top-of-the-line AMD is also 40% faster than the top-of-the-line Intel in multithreaded work. The TDPs of those parts are 360W for AMD vs 350W for Intel.


See also Intel's 14th-gen i9-14900K drawing 428W as a desktop CPU to score a couple percent higher than an AMD part using less than half the power.


On a per-joule basis (recall 1 joule = 1 watt for 1 second), AMD chips get a lot more work done than Intel.
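To put numbers on the per-joule point, here's a quick Python sketch. The figures reuse the 350W/360W TDPs and the ~40% multithreaded speedup from above, treated (unrealistically, just for the arithmetic) as sustained power draw on the same fixed job:

```python
# Energy per task = power x time (1 joule = 1 watt for 1 second).
# All numbers here are illustrative, not measured benchmark results.
def energy_joules(power_watts: float, seconds: float) -> float:
    return power_watts * seconds

# Same job on both chips; assume the 360W part finishes ~40% faster.
intel_j = energy_joules(350, 100)        # 350W for 100s  -> 35,000 J
amd_j   = energy_joules(360, 100 / 1.4)  # 360W for ~71s  -> ~25,714 J
```

So a slightly higher TDP can still mean meaningfully less energy per unit of work, as long as the job finishes proportionally sooner.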
 

mr44er

Active Member
Feb 22, 2020
TDP is what you need to cool away in the worst case (i.e. the CPU is crunching a task under full load on all cores for x hours/days/months, etc.).
 

RolloZ170

Well-Known Member
Apr 24, 2016
Efficiency gets worse at higher clocks. If you need 6 GHz for the highest single-thread performance, you have to live with it, but it is not a must.
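The "efficiency gets worse at higher clocks" point follows from the classic dynamic-power approximation P ≈ C·V²·f: higher clocks need higher voltage, so power grows superlinearly. A tiny sketch with invented numbers (the constant and the voltage-vs-frequency pairs below are assumptions for illustration, not real silicon data):

```python
# Rough dynamic CMOS power model: P ~ C * V^2 * f.
# c is an arbitrary scaling constant chosen only for illustration.
def dynamic_power(c: float, volts: float, freq_ghz: float) -> float:
    return c * volts**2 * freq_ghz

# Hypothetical operating points: 4 GHz at 1.0V vs 6 GHz at 1.3V.
p_4ghz = dynamic_power(10.0, 1.0, 4.0)  # 40.0
p_6ghz = dynamic_power(10.0, 1.3, 6.0)  # ~101.4: ~2.5x the power for 1.5x the clock
```

That's why a chip chasing the last few hundred MHz of single-thread speed burns disproportionately more watts per unit of work.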