Hello,
I am having quite a strange issue with my VM and GPU passthrough: I only get near-native performance the first time I boot my guest OS. If I shut the guest down and start it back up, GPU performance is much worse. The only way I have found to resolve this is restarting the host.
Host: Arch
Guest: Windows 10 1803 (also occurred on 1709)
Ryzen 1700 @ 3.9GHz
G1 Gaming 1080 for guest
EVGA 770 for host
ASRock x370 Taichi (BIOS V3.70)
16GB DDR4 Corsair Vengeance @ 3066MHz
Samsung EVO 850 500GB for guest
WD Blue M.2 500GB for host
Corsair TX650
XML Pastebin: https://pastebin.com/pfk5s3vN
Using latest official packages for everything.
Please post your lspci -knn output, and also /etc/modprobe.d/vfio.conf.
The XML output suggests you use libvirt to manage the VM, is that correct?
Last edited by Lone_Wolf (2018-09-16 13:09:46)
Disliking systemd intensely, but not satisfied with alternatives so focusing on taming systemd.
(A works at time B) && (time C > time B ) ≠ (A works at time C)
Please post your lspci -knn output, and also /etc/modprobe.d/vfio.conf.
The XML output suggests you use libvirt to manage the VM, is that correct?
Yes that's correct.
lspci: https://pastebin.com/ThpMCSTg
vfio.conf:
options vfio-pci ids=10de:1b80,10de:10f0
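One quick way to confirm those ids are actually taking effect is to check which kernel driver each of the guest GPU's functions is bound to after boot. A minimal sketch, where the PCI addresses are placeholders (take the real ones for the 1080's VGA and HDMI-audio functions from your lspci -knn output):

```shell
# Print the kernel driver currently bound to each guest GPU function.
# 0000:0a:00.0 / 0000:0a:00.1 are assumed addresses -- substitute the
# real ones from `lspci -knn`. Both should report vfio-pci before the
# VM is started.
for dev in 0000:0a:00.0 0000:0a:00.1; do
    drv=$(basename "$(readlink -f "/sys/bus/pci/devices/$dev/driver" 2>/dev/null)")
    echo "$dev -> ${drv:-no driver}"
done
```

If one function shows nvidia or snd_hda_intel instead of vfio-pci, the host grabbed it first and performance after a guest restart can suffer.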
Thank you for your help.
Last edited by retalak (2018-09-16 14:53:42)
Post the output of the script at https://wiki.archlinux.org/index.php/PC … _are_valid
Also, dmesg / journalctl output from the first and second start of the VM, without a reboot in between, could help.
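For reference, a script of that kind boils down to walking /sys/kernel/iommu_groups and printing each group's member devices — a sketch along these lines (not necessarily identical to the wiki's version) lets you check whether the 1080 sits in a group of its own:

```shell
#!/bin/bash
# List every IOMMU group and the PCI devices it contains.
# Passthrough is only clean when the guest GPU's group holds nothing
# the host still needs (PCI bridges aside).
shopt -s nullglob
for g in /sys/kernel/iommu_groups/*; do
    echo "IOMMU Group ${g##*/}:"
    for d in "$g"/devices/*; do
        echo -e "\t$(lspci -nns "${d##*/}")"
    done
done
```

If the directory is missing or empty, IOMMU support is not enabled (check AMD-Vi in the BIOS and the kernel command line).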