New Server Build


IamSpartacus

Well-Known Member
Mar 14, 2016
If you observe frame rates dipping too low when going to higher resolutions, then the CPU is not your problem. It's the GPU.
I haven't gamed in years, so I'm just going off what my friends and colleagues have reported to me, and I can't back up their claims. But the ones I'm basing this on all have 2080 Tis, so I assumed they're pretty set on the GPU front.
 

Dreece

Active Member
Jan 22, 2019
Having been in the computing sector for over 30 years, I can reassure you of one fact: users are never to be trusted. Man once blamed the lack of human sacrifice for drought and famine; assumptions are the bane of technology.

With respect to the 2080 Ti, and giving users of such top-end wizardry the benefit of the doubt, your friends could very well be out-powering the CPU. Processors are indeed still playing catch-up across the field, and with U.3 now upon us and GPUs thrashing massive amounts of data around the bus, I can easily believe the high base clock-rate requirements to drive intense, texture-rich 4K resolutions at over 60fps at the very least.

For regular folk with more modest GPU budgets, blaming the CPU is definitely the norm in the less-techy universe. In the high-end GPU sector, however, where cards are pushed to their absolute potential, I can at least imagine CPUs being hit pretty hard.

In my case, if I find something running a little 'laggy', I just dial back a level of detail to achieve an optimum balance of smooth playability and eye-pleasing visuals. However, when you're running on big screens, dialling back the beauty can quickly become quite displeasing, especially to beauty aficionados.
 

alex_stief

Well-Known Member
May 31, 2016
That's all nice in theory. In practice, increasing the resolution in the vast majority of games only increases the GPU workload. Outliers exist, e.g. when the field of view is tied to the resolution, as in some strategy games.
Respectable gaming outlets perform CPU tests at lower resolutions and GPU tests at higher resolutions for this exact reason: resolution does not tax the CPU.
As a side note: having a 2080 Ti does not save you from encountering a GPU limit, even at 1080p.
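To see why that testing split works, here's a toy frame-time model (made-up numbers, not from any benchmark): per-frame time is whichever of the CPU or GPU finishes last, and only the GPU side scales with the pixel count.

```python
# Toy frame-time model (illustrative only, not benchmark data).
# Assumption: CPU work per frame is roughly resolution-independent,
# while GPU work scales with the number of pixels rendered.

CPU_FRAME_MS = 7.0        # hypothetical CPU time per frame (game logic, draw calls)
GPU_MS_PER_MPIXEL = 3.0   # hypothetical GPU time per million pixels

RESOLUTIONS = {
    "1080p": 1920 * 1080,
    "1440p": 2560 * 1440,
    "2160p": 3840 * 2160,
}

for name, pixels in RESOLUTIONS.items():
    gpu_ms = GPU_MS_PER_MPIXEL * pixels / 1e6
    frame_ms = max(CPU_FRAME_MS, gpu_ms)  # whichever side finishes last sets the pace
    limiter = "CPU" if CPU_FRAME_MS >= gpu_ms else "GPU"
    print(f"{name}: {1000 / frame_ms:5.1f} fps ({limiter}-limited)")
```

In this toy model, only the 1080p run tells you anything about the CPU; from 1440p up, the GPU sets the frame rate no matter how fast the CPU is.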
 

Dreece

Active Member
Jan 22, 2019
I do not think anybody claimed resolution taxes CPUs 'directly', but there is obviously a correlation with scene complexity, and with prebuffering in some cases too.

Anyhow, this is all apples to oranges: without an actual game to benchmark, we have absolutely no idea what we're talking about or what conclusion we're drawing. That is why I asked for names of games to test.

Also, I do not think anybody claimed the 2080 Ti does not have its limits; obviously it does, but it can handle higher resolutions at higher FPS than its siblings.

The claim that intrigues me personally is that recent games drop in FPS if the CPU is not clocked high enough. I would like to confirm that out of curiosity, because it would not be economical for game companies to produce titles that demand top-end, high-clock CPUs; GPUs are a whole different story.
 

Dreece

Active Member
Jan 22, 2019
A quick Google search tracked down a few articles and Reddit posts where some games drop FPS on low-to-mid spec CPUs... so it doesn't sound like @IamSpartacus's friends are alone. It could simply be highly complex physics and AI algorithms kicking in at very busy scenes and overwhelming low-clock CPUs. This has been known since the early Nvidia days, before they purchased PhysX, which then helped things considerably for a few years and virtually killed ATI off. But in this day and age things are getting more adventurous, and some of these games appear to have rather complex large-world computations going on. To be a competitive gamer, you're probably going to need a pretty stout CPU for those complex scenarios. I guess it's all about the 'spike': that odd moment where a lot is happening on screen, accompanied by decision trees in the hundreds in code, plus the physics and so on. Poor use of multithreading/parallelism would indeed lead to a spike on core 0, which is quite strongly tied to the system, with a knock-on effect of the GPU waiting for the game engine to catch up.
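To make that 'spike' idea concrete, here's a toy simulation (made-up figures, no particular game or engine): a main thread that normally needs 8 ms per frame, with the occasional busy scene tripling it. That is enough to tank the 1% lows while barely moving the average.

```python
import random

# Toy "spike" simulation (made-up numbers, no particular game).
# Baseline: the main thread needs ~8 ms per frame. In ~2% of frames a burst of
# AI/physics work lands on that same thread and triples the frame time,
# so the GPU sits idle waiting for the engine to catch up.

random.seed(42)
frames = []
for _ in range(10_000):
    ms = 8.0
    if random.random() < 0.02:   # busy scene: decision trees, physics, etc. on core 0
        ms *= 3
    frames.append(ms)

avg_fps = 1000 / (sum(frames) / len(frames))
worst_1pct = sorted(frames)[int(len(frames) * 0.99):]   # slowest 1% of frames
low_1pct_fps = 1000 / (sum(worst_1pct) / len(worst_1pct))

print(f"average: {avg_fps:.0f} fps, 1% lows: {low_1pct_fps:.0f} fps")
```

The average looks healthy, but those occasional long frames are exactly the stutter that gets blamed on the CPU, and a faster single core shrinks them directly.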

again, this is totally pointless without an actual game to inspect.
 

stokedsurfer56

New Member
May 25, 2020
Interesting conversation. I'm also currently researching a new system, primarily for use as a workstation, but I would also like to do some gaming on the side. My needs require memory bandwidth (CFD), which is where Epyc really shines, so I'm looking at a single (for now) 7302 instead of a Xeon solution. Unfortunately, Threadripper is not so great for memory bandwidth and latency.

There seems to be very limited (if any) reliable information about Epyc Rome gaming performance. Looking strictly at single-core benchmark results (PassMark) and comparing against desktop processors with similar scores, Epyc Rome gaming performance could be mediocre (similar to a Ryzen 7 1700X, for example). But there is conflicting information when looking at something like the PC Builds Bottleneck Calculator, which indicates that the Epyc 7302P could perform very well. I have also come across a few non-specific comments about fairly good gaming performance, but I can't seem to find any gaming FPS benchmark results for Epyc systems.

Does anyone have any insight into this, or have real world performance numbers to share?
 

alex_stief

Well-Known Member
May 31, 2016
Let's only look at CPU-limited scenarios, because discussing CPU performance for cases where the GPU is the bottleneck is rather pointless.
For gaming, an Epyc 7302P will behave like a third-gen Threadripper CPU with lower clock speeds.
This can lead to very good results in games that are well optimized and make efficient use of many cores.
On the other hand, the vast majority of games still need high single-core performance. Here the Epyc CPU will be slower than even entry-level mainstream CPUs. Probably still playable unless you are an FPS junkie, but slower nonetheless.
It comes down to which types of games you play, whether you are CPU-bottlenecked, and which frame rates you deem acceptable.
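To put a rough number on that single-core gap, here's a back-of-the-envelope comparison using published boost clocks. The Ryzen 5 3600 is just my pick of an entry-level mainstream part for comparison; real game performance also depends on memory, cache and sustained boost, so treat this as a sketch, not a benchmark.

```python
# Back-of-the-envelope only: Epyc 7302P, Threadripper 3960X and Ryzen 5 3600 are all
# Zen 2, so single-thread speed scales roughly with boost clock. Published clocks;
# real games also care about memory, cache and boost behaviour.

epyc_7302p = 3.3    # GHz max boost
tr_3960x   = 4.5    # GHz max boost (3rd-gen Threadripper)
ryzen_3600 = 4.2    # GHz max boost (entry-level mainstream comparison)

for name, ghz in [("Threadripper 3960X", tr_3960x), ("Ryzen 5 3600", ryzen_3600)]:
    print(f"Epyc 7302P vs {name}: ~{epyc_7302p / ghz:.0%} of the single-core clock")
```

So figure roughly three quarters of the single-thread clock of a 3rd-gen Threadripper or a mainstream Ryzen, which lines up with 'playable but slower'.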
 

ReturnedSword

Active Member
Jun 15, 2018
Santa Monica, CA
It really depends on what game is being played on the workstation.

Older games, especially twitch-type FPS, are not very well optimized for multi-core systems. In addition, their netcode quirks mean that high-frequency, single-thread-strong CPUs tend to do better. This is only relevant at the top tier of competitive gaming, though, something that does not apply to most people.

Contrary to popular thought and Intel marketing, most games released in the last few years have increasingly made use of multiple cores. After all, quad cores have been mainstream for years now, though this also depends on the game engine. For example, Stellaris, a game I casually play, is fairly new but lags to an unplayable level as more is going on in the late game, because the Clausewitz Engine it is built on is ancient (and thus does not take advantage of multi-core systems well, regardless of what the developers claim).

In general, though, most gamers prefer better eye candy and larger resolutions whenever possible. At higher graphics-quality and resolution settings, games are usually clearly bottlenecked by the GPU. Case in point: the recently released $120 Ryzen 3 3300X is a quad core that is more than fine for 1440p and 4K gaming, provided a top-end GPU is used. Refer to the 4K benchmark charts at TPU here.

TL;DR, in most scenarios you'll be GPU-limited, so don't worry about the CPU that much.
 

Dreece

Active Member
Jan 22, 2019
What I've seen over the past few years especially is a rampant and quite blatant focus on frames-per-second benchmarks, and the conclusions numerous YouTube reviewers draw from comparisons of processor X vs processor Y on the same GPU. Most of these reviewers tend to focus on the 144Hz+ screen use case.

Personally, when I do game, I run at 60fps, which is the limit of my OLED TV, and I'm more interested in playability and realism. I can't quite understand the uber frame-rate-aholics, those hyper-glued competitive gamers with fridges packed full of brain-cell-killing energy drinks and $200+ mice and keyboards, and let us not forget the huge focus on input and display lag. When you walk into that world even for a second, you quickly grab your coat and exit. I'm fine with my Command & Conquer oldies and the occasional Far Cry; my input tools are a Bluetooth keyboard and a Logitech Anywhere MX mouse, and I'm lucky if I manage to down a hot cup of tea during a gaming session.
 

ReturnedSword

Active Member
Jun 15, 2018
Santa Monica, CA
Everyone has different requirements and motivations for their hardware choices, real or perceived. As long as they feel good about their selection, I don't see any harm in buying something. Of course, if someone needs advice, advice is all I can give; in the end it's up to that person to decide what they want.

I haven't played competitive FPS for years now, but when I did, I can attest that high frame rates mattered, and they still do: the netcode of those FPS (which are still massively popular) meant that a high frame rate was correlated with some competitive benefit. This led to other things, such as a high-end GPU, but run at low resolution and low quality settings. Whether that competitive benefit directly resulted in the player performing better can be contested, as most players do not play at a high enough level to warrant such extreme emphasis on frame rate. But that doesn't really matter, since, as with many things in life, people make choices based on emotions and whether they "feel good" about them. It's the same reason a friend bought a Corvette ZR1 when he can only drive it as fast as someone with a base Corolla.

Nowadays I enjoy single-player games at higher resolution and graphical quality, so I'm always GPU-limited. My GTX 1080 Ti can barely break 60 FPS at 4K, and not consistently. I was disappointed that the RTX 2080 Ti was so expensive when Turing is essentially an overclocked Pascal with RT and Tensor cores added. An RTX 2080 Ti still can't consistently average 60 FPS at 4K on Ultra quality, despite the massive price premium. So I will wait and see what the next generation of GPUs brings later this year.
 

Markess

Well-Known Member
May 19, 2018
Northern California
I haven't played competitive FPS for years now, but when I did, I can attest that high frame rates mattered, and they still do: the netcode of those FPS (which are still massively popular) meant that a high frame rate was correlated with some competitive benefit. This led to other things, such as a high-end GPU, but run at low resolution and low quality settings.
What I've seen over the past few years especially is a rampant and quite blatant focus on frames-per-second benchmarks, and the conclusions numerous YouTube reviewers draw from comparisons of processor X vs processor Y on the same GPU. Most of these reviewers tend to focus on the 144Hz+ screen use case.
I can't say from first-hand experience, but I do curate (and pay for a lot of) the hardware on my son's PC, so I've learned a lot through association. He's not at the pinnacle of gaming, but he's good enough to get asked to sidekick for "pro" gamers on occasion during their streaming sessions. So he's not bad either.

He tells me that with the boom in "Elite" gamers actually making a good living streaming their gaming sessions on platforms like Twitch, there's a focus both on FPS (for the competitive bonus that @ReturnedSword mentions) and on having quality settings high, so the screen-mirror sessions look good to the paying viewers. Not a lot of people are good enough to make a living having people pay to watch them game, but the ones who are successful make a LOT of money. Having a service like Twitch provide the infrastructure has lowered the barrier to entry enough that a lot more people are monetizing (or simply streaming their sessions for fun) than in years past. My son even did a paper on it for his high school "College and Careers" class. Pretty interesting stuff. I had no idea how much it's monetized.

Anyway, using the Corvette analogy, not a lot of people drive well enough to race cars for a living, but that doesn't stop a lot of people from buying fast cars, or modding their cars, as if they did. Gaming is the same, I guess. You get reviewers focusing on 144Hz because, for an optimum viewing experience, you want to sync your crazy-high FPS with the monitor refresh. And you get people building gaming rigs with GPUs that have as much RAM as the motherboard, which I find kind of crazy.

I was disappointed that the RTX 2080 Ti was so expensive when Turing is essentially an overclocked Pascal with RT and Tensor cores added. An RTX 2080 Ti still can't consistently average 60 FPS at 4K on Ultra quality, despite the massive price premium. So I will wait and see what the next generation of GPUs brings later this year.
I think the first release of RTX was to help drive developers to start pumping out RT games. I bought a new laptop late last year and was interested to see that, for non-RT games (which is what most of the laptop review benchmarks were based on), the mobile 1660 Ti actually outperformed mobile RTX SKUs in a lot of situations. Once there are more games and programs using RT, that will change. And the next generation of RTX will be much faster, I'm sure, while the Turing GTX offerings will only see modest improvements.
 

8Ringer

New Member
Jun 3, 2020
@Dreece what v3/v4 E5s are going to get the job done on new games? None that I'm aware of.
I just picked up an X10DRL-iT with 2x E5-2620 v3s for $300 on eBay. Another $80 for 32GB of ECC RAM and I'm in business for under $400. And when the 2620s become a bottleneck, I have AMPLE upgrade pathways with more cores and the v4 lineup.

Which is a roundabout way of saying this: my current gaming "rig" is an SFF i5-4690K and a GTX 970. It handles nearly anything modern (Forza Horizon, Doom 2016, The Outer Worlds, etc.) at High or Ultra @ 1080p without major issues, almost always above 60fps and certainly always above 30fps. So saying a v3/v4-gen CPU can't handle modern gaming simply isn't true, UNLESS you want crazy-high frame rates for V-sync or something. And at that point, is a virtualized machine really the right solution?

For the more casual gamer like myself, older stuff can handle 1080p gaming perfectly fine, particularly when it's a "living room" solution where you're playing on a large TV that is most likely limited to 60Hz.
 

IamSpartacus

Well-Known Member
Mar 14, 2016
I just picked up an X10DRL-iT with 2x E5-2620 v3s for $300 on eBay. Another $80 for 32GB of ECC RAM and I'm in business for under $400. And when the 2620s become a bottleneck, I have AMPLE upgrade pathways with more cores and the v4 lineup.

Which is a roundabout way of saying this: my current gaming "rig" is an SFF i5-4690K and a GTX 970. It handles nearly anything modern at High or Ultra @ 1080p without major issues and almost always above 60fps. So saying a v3/v4-gen CPU can't handle modern gaming simply isn't true, UNLESS you want crazy-high frame rates for V-sync or something. And at that point, is a virtualized machine really the right solution?

For the more casual gamer like myself, older stuff can handle 1080p gaming perfectly fine.
It really all boils down to how one defines "gaming." I'm with you: I'm a casual gamer and 1080p is plenty for me as well. But when someone says they want to build a "gaming" machine, I tend to compare that to a "gaming PC" with a higher-performance CPU that may or may not be overclocked. That's just what I default to, and it's obviously very subjective. But to your question about VMs: you absolutely can run a "gaming PC"-class VM with passed-through hardware, using certain CPUs/GPUs.
 

8Ringer

New Member
Jun 3, 2020
It really all boils down to how one defines "gaming." I'm with you: I'm a casual gamer and 1080p is plenty for me as well. But when someone says they want to build a "gaming" machine, I tend to compare that to a "gaming PC" with a higher-performance CPU that may or may not be overclocked. That's just what I default to, and it's obviously very subjective. But to your question about VMs: you absolutely can run a "gaming PC"-class VM with passed-through hardware, using certain CPUs/GPUs.
Fair enough, and I agree with you there.

And I guess my saying "would you want to be using a VM" was more about network lag: running the gaming VM in the closet or basement or wherever and then using VNC or something to display the output would be laggy even over a fast wired network. I use Parsec on occasion when I want to run a game on my MacBook from my PC (when the TV is in use), and the quality suffers when the MacBook is on WiFi (the PC is wired at 1Gb).

But yea, a "gaming PC" is a very subjective thing. Agreed.
 

IamSpartacus

Well-Known Member
Mar 14, 2016
Yea, if you're talking about streaming a high-performance gaming VM, there will be some performance degradation. But with a directly attached display and USB on that VM server, there is no performance hit that I have found.
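For anyone curious what the host side of "passed-through hardware" looks like, the usual first sanity check on a Linux hypervisor is whether the IOMMU groups isolate the GPU cleanly. A minimal sketch (assumes a Linux host with the IOMMU enabled in firmware and on the kernel command line; it just reads the standard sysfs layout):

```python
# Minimal sketch: list IOMMU groups on a Linux host, the usual first sanity check
# before passing a GPU through to a VM (VFIO). Assumes IOMMU is enabled in firmware
# and on the kernel command line; run this on the hypervisor, not inside the guest.

from pathlib import Path

groups = Path("/sys/kernel/iommu_groups")
if not groups.is_dir():
    raise SystemExit("No IOMMU groups found - is IOMMU enabled?")

for group in sorted(groups.iterdir(), key=lambda p: int(p.name)):
    devices = [dev.name for dev in (group / "devices").iterdir()]
    print(f"group {group.name}: {', '.join(devices)}")
```

If the GPU and its HDMI audio function sit in a group by themselves, passing them through to the gaming VM and attaching the display and USB directly is straightforward.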
 

8Ringer

New Member
Jun 3, 2020
Direct attached would solve that issue for sure.

I don't mind having my small gaming PC in the living room. It's unobtrusive in the tiny and subtle Node 202 case, silent since I replaced the CPU fan with a Noctua and added GPU-compartment fans, and it's only on when I need it on, so power usage isn't super relevant.

Directly attaching HDMI and USB from my basement up to the living room would be tricky, to say the least. But that's just my use case, and it's sort of born of laziness. Routing cabling to the basement is fully doable; I just don't want to cut drywall, drill holes in floors (while not puncturing the underfloor hydronic heating), and deal with that headache. And running a hypervisor rather than just FreeNAS on bare metal sounds like an extra layer I'd rather not deal with if I can avoid it. Having a discrete machine here is just easier for me.