Advice for remote gaming rig



Jul 16, 2015

I'm considering building a new box with the intention of hosting one or two VMs that'll run games over Parsec, with GPU resources shared or assigned per VM. I'll probably want to run a few other VMs alongside these, but nothing that requires a GPU. The box will be hosted close by, so latency shouldn't be much of an issue.

The last time I built myself a box it was a dual E5-2650 v1, so my knowledge of what's best for this kind of build is pretty limited.

What kind of CPU/GPU would you guys recommend? And what would be the most cost-effective solution to this?

I'm thinking something along the lines of an Nvidia GTX 1070 with NVMe/SSD-based storage for lower latency. I'm not sure how well AMD CPUs play with passthrough these days, so maybe an Intel-based CPU is the safer bet? Is there any specific hypervisor that would be a good fit for this? I've noticed that many GPU-cloud solutions use Xen — is it superior to something like ESXi?

The reason for not simply building a plain gaming rig is that the kids who'll be playing have special needs and can sometimes act out and slam/hit/break stuff — hence the plan to use a simple thin client as the frontend.

I'd ideally like to keep it under $1100 / €985 for the final build.

Any help, pointers, or "lolwut" input is appreciated.


Active Member
Mar 10, 2016
My main machine these days is a Threadripper and it works great. I have also read reports that Ryzen works well. So AMD/Intel should both be fine. The biggest unknown seems to be motherboard/BIOS related. Those determine the IOMMU grouping, so they make the biggest difference. Look for reports of known good boards for VFIO. Even if you don't intend to use Linux as the base OS, the VFIO users are good about posting IOMMU groupings and such so you can get a really good idea if the hardware will work before you buy.
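If you have any Linux machine handy (even a live USB on the board you're eyeing), you can dump the IOMMU grouping yourself before committing. A minimal sketch — it just walks sysfs, so it assumes IOMMU is enabled on the kernel command line (`intel_iommu=on` or `amd_iommu=on`); on a machine without IOMMU it prints only a zero count:

```shell
#!/usr/bin/env bash
# List each IOMMU group and the PCI devices inside it.
# Devices that share a group must be passed through together,
# so you want the GPU (and its audio function) isolated.
shopt -s nullglob
count=0
for group in /sys/kernel/iommu_groups/*; do
    count=$((count + 1))
    echo "IOMMU group ${group##*/}:"
    for dev in "$group"/devices/*; do
        echo "  $(basename "$dev")"   # PCI address, e.g. 0000:01:00.0
    done
done
echo "Found $count IOMMU groups"
```

Pipe each PCI address through `lspci -nns <address>` if you want human-readable device names next to the group numbers.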

I've only used QEMU/libvirt so I'm not sure how the other platforms compare. There is a lot of good info for setting up Win10 VMs for this platform, along with tutorials and such.

Keep in mind, Nvidia doesn't like its consumer cards running in VMs and tries to block them unless you buy expensive Quadro cards. Look up "code 43" to get an idea of whether you want to deal with it. I decided that I don't need top-end GPUs anyway, so I use RX 570s. Good enough for what I need, and they work fine in VMs.
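For reference, the usual workaround on QEMU/KVM is hiding the hypervisor from the guest driver. A sketch of the relevant flags only — the PCI address and vendor-id string here are placeholders, not values from this thread:

```shell
# Fragment: QEMU flags commonly used to dodge the "code 43" driver check.
# kvm=off hides the KVM CPUID signature from the guest;
# hv_vendor_id replaces the Hyper-V vendor string the Nvidia driver inspects.
qemu-system-x86_64 \
  -enable-kvm \
  -cpu host,kvm=off,hv_vendor_id=0123456789ab \
  -device vfio-pci,host=01:00.0,multifunction=on \
  ...
```

If you're on libvirt instead of raw QEMU, the equivalent is the `<hidden state='on'/>` KVM feature plus a Hyper-V `vendor_id` in the domain XML.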

If the main use is remote play, I suspect running a couple of GPUs at x8 or even x4 is unlikely to be an issue. Most consumer platforms are lane-limited for this sort of thing, which is one reason I went with TR and its 64 lanes. I suspect it's overkill for your needs, but if you can get a good deal it might be worth it. Motherboards for it tend to be pricey, though.

I'm not aware of any consumer cards that can do shared GPU / SR-IOV. Only a few really expensive cards. So every VM that needs GPU needs its own card, and the host needs one too. The host can be integrated or low end though, unless you need more for it. There are a few options to try to prevent the host from taking a card and running headless, but it gets harder to set up in some cases.
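On the "keeping the host off the passthrough card" point: the common approach on a Linux host is binding the guest GPU to `vfio-pci` at boot so the host driver never claims it. A sketch, assuming the IDs come from your own `lspci -nn` output — `10de:1b81`/`10de:10f0` below are just the GTX 1070 and its audio function as an example:

```shell
# /etc/modprobe.d/vfio.conf (config fragment)
# Bind these vendor:device IDs to vfio-pci instead of the normal driver.
# Replace the IDs with your card's GPU and HDMI-audio functions.
options vfio-pci ids=10de:1b81,10de:10f0
softdep nvidia pre: vfio-pci
```

Then rebuild the initramfs and reboot; the card should show `Kernel driver in use: vfio-pci` in `lspci -k`.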

Just to give a data point, I use an ASRock Taichi, and I hear the Ryzen version of the same board works well for this. I use a Win10 VM with a GPU for light gaming, but mostly for CAD/CAM work with a couple of programs that don't exist for Linux.