Affordable Shared Graphics Card for an ESXi (Home Server) Installation?


Your name or

Active Member
Feb 18, 2020
Hi,
I need a 3D accelerator (i.e. a graphics card) for some of my virtual OSes, since the software that runs on them doesn't work with remote access software alone.
I know "normal" graphics cards don't work for this, at least not as shared hardware. Are there affordable graphics cards that I can assign to a virtual Linux-based OS and then access via remote software? I know, I know: everything with "graphics" in the name is damn expensive, even at 10 years old.
When I look at the prices here, oh damn... Buy used graphics cards cheaply | ServerShop24
Thanks
 

mrpasc

Well-Known Member
Jan 8, 2022
Munich, Germany
A Quadro P4000 would work, but the price ServerShop24 is asking is much too high. With some patient searching it can be found for 100–150€ on eBay, or sometimes even on Kleinanzeigen.de.
 

Tech Junky

Active Member
Oct 26, 2023
I put an Arc A380 in my Linux box for $100. Works great for media transcoding and as a primary output. The onboard graphics needed a separate "monitor" for UEFI access, while the Arc just works.
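In case it helps anyone copying this setup, here's a rough sketch of what "media transcoding" on an Arc card looks like from Linux, assuming ffmpeg was built with VAAPI support and the Arc shows up as /dev/dri/renderD128 (both are assumptions, check your box).

[CODE]
# Sketch: hardware transcode on an Intel Arc card via VAAPI.
# Assumptions: ffmpeg has VAAPI support and the Arc is the first
# render node (/dev/dri/renderD128) -- verify with `ls /dev/dri`.
import subprocess

RENDER_NODE = "/dev/dri/renderD128"  # assumption: adjust to your render node

cmd = [
    "ffmpeg",
    "-hwaccel", "vaapi",
    "-hwaccel_device", RENDER_NODE,
    "-hwaccel_output_format", "vaapi",   # keep frames in GPU memory
    "-i", "input.mkv",
    "-vf", "scale_vaapi=w=1920:h=1080",  # scale on the GPU
    "-c:v", "h264_vaapi",                # encode on the GPU
    "-b:v", "6M",
    "output.mp4",
]
subprocess.run(cmd, check=True)
[/CODE]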
 

Your name or

Active Member
Feb 18, 2020
Hi,
I need to share one GPU with multiple virtual systems, not just assign one GPU to one virtual system.
The reason: my virtual Android works fine until I try to connect with remote software; then the display is black.
People mention that a "real" GPU is necessary for this. Since I run multiple VMs, and soon some virtual Linux machines as well, I can't pack 30 GPUs into my server.
None of the machines need much RAM, since they are only used for office-type work and run software that does one specific task 24/7.
 

mrpasc

Well-Known Member
Jan 8, 2022
Munich, Germany
Then you will need vGPU. The hardware part is affordable (an Nvidia Tesla P4, P40, or P100 is cheap, starting from ~75€ for a used P4), but the Nvidia licenses for shared vGPU aren't. And they are hard to buy for a homelab.
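For context on what the license actually gates: the GRID guest driver still loads without it, but (as far as I know) it throttles the vGPU once the grace period runs out. A quick, hedged way to check the state from inside a Linux guest, assuming `nvidia-smi -q` prints a License Status line (the exact wording can differ between driver branches):

[CODE]
# Sketch: check whether a vGPU guest has checked out an NVIDIA license.
# Assumption: the GRID guest driver reports a "License Status" line in
# the full `nvidia-smi -q` output; wording may vary by driver version.
import subprocess

out = subprocess.run(
    ["nvidia-smi", "-q"], capture_output=True, text=True, check=True
).stdout

for line in out.splitlines():
    if "License" in line:  # e.g. "License Status : Licensed (Expiry: ...)"
        print(line.strip())
[/CODE]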
 

Your name or

Active Member
Feb 18, 2020
but the Nvidia licenses for shared vGPU aren’t.
Oh, I need an additional license. Do you know what it's called, or what the ordering info is, so I can research what I would pay for it?
For the Android VM I found a solution that is janky as hell but works just OK.
 

CyklonDX

Well-Known Member
Nov 8, 2022
You could get a system with an integrated Intel iGPU (one that supports GVT-g); that's the cheapest way to get it working. I'm not 100% sure, but you can assign normal system memory as its VRAM.
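For reference, this is roughly how GVT-g slices get created on a plain Linux/KVM host (ESXi itself doesn't do GVT-g, and newer iGPUs dropped it): the iGPU exposes mdev types under sysfs, and you create one virtual GPU per guest by writing a UUID to the matching create node. The PCI address and type name below are assumptions that vary by CPU generation.

[CODE]
# Sketch: create one GVT-g virtual GPU (mediated device) on a Linux/KVM host.
# Assumptions: i915.enable_gvt=1 and the kvmgt module are already loaded,
# the iGPU sits at 0000:00:02.0, and a type like "i915-GVTg_V5_4" exists --
# list mdev_supported_types/ to see what your hardware really offers.
import uuid
from pathlib import Path

IGPU = Path("/sys/bus/pci/devices/0000:00:02.0")   # assumption
MDEV_TYPE = "i915-GVTg_V5_4"                       # assumption

vgpu_uuid = str(uuid.uuid4())
create_node = IGPU / "mdev_supported_types" / MDEV_TYPE / "create"
create_node.write_text(vgpu_uuid)  # new device shows up under /sys/bus/mdev/devices/

print(f"created GVT-g vGPU {vgpu_uuid}; attach it to the VM as a vfio-mdev device")
[/CODE]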


The best solution for Nvidia vGPU is to get a 2080/Ti and have northwestrepair (a guy on the net) solder 44–48GB of VRAM onto your card (or just get an RTX Titan / RTX 6000; with the RTX 6000 you won't need vgpu_unlock or BIOS changes), upgrade the BIOS, then use the vgpu_unlock that you can find if you Google. *(vgpu_unlock only works with 20-series cards and older. It doesn't work with Ampere, so you'll need a license for the A5000 and above.)

The AMD solution isn't really there. It only supports VMware, and even then support is crippled (the cards that can do it are limited).
 

eptesicus

Active Member
Jun 25, 2017
I'm glad you started this thread, as I'm currently looking to get back into VDI with Horizon on my vSphere 8 environment at home, and I'm in a similar spot.

The best solution for Nvidia vGPU is to get a 2080/Ti and have northwestrepair (a guy on the net) solder 44–48GB of VRAM onto your card (or just get an RTX Titan / RTX 6000; with the RTX 6000 you won't need vgpu_unlock or BIOS changes), upgrade the BIOS, then use the vgpu_unlock that you can find if you Google. *(vgpu_unlock only works with 20-series cards and older. It doesn't work with Ampere, so you'll need a license for the A5000 and above.)

The AMD solution isn't really there. It only supports VMware, and even then support is crippled (the cards that can do it are limited).
I'm gonna have to look into the 2080 Ti mod you mentioned. Thanks for sharing that. EDIT: I found the vgpu_unlock workaround you mentioned, and it doesn't work for ESXi.

The AMD S7150 x2 cards used to be great for vGPU, but I think there was a limitation in Horizon where instant-cloning wasn't supported, so they could only be used for manual clusters. It's a shame AMD isn't much into the VDI game anymore and the Intel VDI GPUs are too new to be cost-effective for the homelab.

@Your name or - How many sessions/desktops do you need? If only a max of two, you could put two individual GPUs in the system and do hardware passthrough, along with some DP dongles to make the system think a monitor is connected. If your host system can support more cards, you can put in as many as needed. There are also Thunderbolt external GPU enclosures to expand with. Otherwise, an old-ass card like a GRID K2 might work, but only if you can get the drivers working. No licensing is required for that, but again, it's at least 11 years old now.
 

CyklonDX

Well-Known Member
Nov 8, 2022
The S7150 only has really old driver stack support, and it's ancient. You will most likely need to run an outdated system to get it working.

Other VMware options would be AMD's:
V620 (up to 12 machines)
V420 / MI50 (up to 8 machines)
V340 / MI25 (up to 4 machines)
Each of those was meant for vGPU with ESXi; the question is whether the driver stack was built for your version of ESXi.
For Nvidia's solution you will need to pay for licenses even with old K2s.

(For the proper drivers for those cards you will either need to chat with AMD or look for SR-IOV ESXi drivers on the net; they do exist.)
Most likely you will need to use ROCm to get proper drivers.



(Note: last time I looked at the MI100 drivers for ESXi with a hex editor, they actually support MI50s too.)

Also worth sharing from the Discord about Nvidia GPU unlocking: some people did manage to get P40s working with their own license server.
(here's a link to that discord)
 

Your name or

Active Member
Feb 18, 2020
There are also Thunderbolt external GPU enclosures to expand with.
Do those require a special mainboard? I just use my Huawei server for everything. I've even switched to thin clients now and do it "Russian style", but it works PERFECTLY. Besides Remote Desktop, I use the thin client just for watching YouTube, videos and so on.
For gaming I will power up my i7 machine only when I need it. Sadly, many of my PCs and servers are about to die, so I need to be quick to keep them alive until I can fully switch to my Huawei/ESXi environment.
 

zachj

Active Member
Apr 17, 2019
I believe it should be possible to bypass the license requirement for Nvidia vGPU on virtual machines hosted on a VMware hypervisor (ESXi)… it may even be possible to bypass the requirement for a real GRID GPU.

Using OpenCore, the PCI device ID can be spoofed. It's essentially identical to the spoofing that's done in KVM on Linux/Proxmox, but using OpenCore because VMware doesn't let you spoof PCI device IDs.

(You actually can spoof a PCI device ID in the VMX file of a VMware virtual machine, but specifically for Nvidia vGPU your spoof is overridden at VM boot.)

The problem is I've tried it and I can't get it to work. Anyone able to give this a try?

All that said, there's really no compelling reason to muck about with spoofing when you can get a real GRID GPU for $80 and run your own fake license server using fastapi-dls.
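If anyone wants to try the fastapi-dls route, the guest side is roughly: pull a client configuration token from your self-hosted server, drop it where the GRID driver looks for it, and restart nvidia-gridd. A sketch is below; the /-/client-token endpoint and the ClientConfigToken path are how I remember the fastapi-dls README, so double-check against the repo before relying on it.

[CODE]
# Sketch: point a Linux vGPU guest at a self-hosted fastapi-dls license server.
# Assumptions: the DLS instance answers at https://dls.lan with a self-signed
# cert, the token endpoint is /-/client-token, and the GRID guest driver reads
# tokens from /etc/nvidia/ClientConfigToken/ (with FeatureType=1 in gridd.conf).
import datetime
import pathlib
import requests

DLS_URL = "https://dls.lan/-/client-token"               # assumption: your DLS host
TOKEN_DIR = pathlib.Path("/etc/nvidia/ClientConfigToken")

resp = requests.get(DLS_URL, verify=False)               # homelab self-signed cert
resp.raise_for_status()

stamp = datetime.datetime.now().strftime("%d-%m-%Y-%H-%M-%S")
token_file = TOKEN_DIR / f"client_configuration_token_{stamp}.tok"
token_file.write_bytes(resp.content)

print(f"wrote {token_file}; restart nvidia-gridd, then check `nvidia-smi -q` for the license")
[/CODE]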
 

eptesicus

Active Member
Jun 25, 2017
I believe it should be possible to bypass the license requirement for Nvidia vGPU on virtual machines hosted on a VMware hypervisor (ESXi)… it may even be possible to bypass the requirement for a real GRID GPU.

Using OpenCore, the PCI device ID can be spoofed. It's essentially identical to the spoofing that's done in KVM on Linux/Proxmox, but using OpenCore because VMware doesn't let you spoof PCI device IDs.

(You actually can spoof a PCI device ID in the VMX file of a VMware virtual machine, but specifically for Nvidia vGPU your spoof is overridden at VM boot.)

The problem is I've tried it and I can't get it to work. Anyone able to give this a try?

All that said, there's really no compelling reason to muck about with spoofing when you can get a real GRID GPU for $80 and run your own fake license server using fastapi-dls.
I found that the GPU Unlocking Discord has figured out a way to run your own licensing server for Nvidia GRID GPUs. They've added info on using it with Proxmox, ESXi, XenServer, and others. I have a P40 coming in next week and will be tinkering with that.
 

nk215

Active Member
Oct 6, 2015
I found that the GPU Unlocking Discord has figured out a way to run your own licensing server for Nvidia GRID GPUs. They've added info on using it with Proxmox, ESXi, XenServer, and others. I have a P40 coming in next week and will be tinkering with that.
Can you point to a link?

Please keep us updated on how you get a P40 working with your own licensing server on ESXi.

Thanks
 

zachj

Active Member
Apr 17, 2019
It's just fastapi-dls, right? Or did they find yet another way?

As I said in a different thread, I know fastapi-dls will work. It's just a bit more cringe to me than spoofing the PCI device ID. I reviewed the code from the GitHub repo and it looks benign to me, but I just expect Nvidia's phone-home will know what you're doing.
 

CyklonDX

Well-Known Member
Nov 8, 2022
That's what the driver typically does to check whether you have a valid license, and the point of the self-hosted license server is to have it talk to your own server instead. Once you set it up right, there's no fear NV will check anything. Just use the correct drivers. Obviously this is for non-commercial use only; if you plan on running a business with this, buy the license or use something else.
 

bayleyw

Active Member
Jan 8, 2014
If you just want to light up a VM with a GPU, you can get a Tesla M10, which is built out of four 750s behind a PCIe switch.