A little late to this party, but wanted to add another success story to the mix.
Setup:
- ESXi 6.0 Update 3
- Supermicro X10SRH
- Xeon E5-2630 V4 QS
- 64 GB
- GTX 970
Short version:
- Added hypervisor.cpuid.v0 = false on the W10 VM via vSphere VM Settings->VM Options->Advanced->Configuration Parameters.
- Passed the GTX 970 through to the VM via the standard vSphere web client.
- Installed the latest NVIDIA drivers (382 something).
- After passthrough (and many BSODs), confirmed that the GTX 970 is pciPassthru0 and added pciPassthru0.msiEnabled = false on the W10 VM via vSphere VM Settings->VM Options->Advanced->Configuration Parameters.
- Did NOT need to disable SVGA or the console display.
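For reference, the two advanced settings above end up as entries like these in the VM's .vmx file (assuming the card really is pciPassthru0 — check which pciPassthruN entry corresponds to your GPU, as noted above):

```
hypervisor.cpuid.v0 = "FALSE"
pciPassthru0.msiEnabled = "FALSE"
```

The first hides the hypervisor from the guest so the NVIDIA driver doesn't refuse to load; the second disables MSI interrupts for that passthrough device.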
Long version:
I had no issue passing the 970 through to the VM and getting the latest driver (382 something) to install on the VM after setting hypervisor.cpuid.v0 = false.
At this point, I had NOT set pciPassthru0.msiEnabled = false on the VM yet.
As soon as I connected an external monitor, though, I would continually get VIDEO_TDR_FAILURE BSODs on the VM. I was able to unplug the external monitor and use the VM console to disable the SVGA driver, but this did not resolve the BSOD when reconnecting the external monitor. The BSOD also happened fast enough that I could not even disable the VM console display from within Windows.
I eventually was able to modify a TDR timeout registry setting remotely on the VM before attempting to log in, which allowed me to get the console display disabled (but the video would still freeze/pause at the timeout value I set).
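The TDR tweak I'm referring to is the standard TdrDelay value under GraphicsDrivers; a .reg file along these lines should do it (the value is in seconds, and 10 here is just an example — this was only a diagnostic workaround for me, not the final fix):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\GraphicsDrivers]
"TdrDelay"=dword:0000000a
```

A reboot is required for the change to take effect, and it's worth reverting once the underlying issue is solved, as I did.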
Disabling the console display did NOT help either. I reverted the TDR registry setting remotely and set pciPassthru0.msiEnabled = false. This solved the BSOD, and I was able to keep the console display enabled (with the SVGA driver enabled as well) if I liked.
The 970 seems to be working normally, games seem to be running well, etc.