Hello,
I'm having a problem with nvidia-378.13: videos in the shotcut video editor jump back and forth a few frames during playback (at least in GPU-accelerated mode; I haven't tested without it). I don't have this issue with the 375.26 drivers.
Since I need to get some stuff done, I would simply downgrade. However, /usr/lib/xorg/modules/extensions/libglx.so is now owned by xorg-server after the latest system updates, while the file was owned by nvidia-libgl before. I did a forced update (or rather downgrade) of nvidia, nvidia-utils and nvidia-libgl to 375.26 to overwrite libglx.so with the nvidia version; it works for the time being, but sooner or later there will be issues.
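For now I'm also holding the packages back so a regular -Syu doesn't pull them forward again; a minimal pacman.conf fragment for that (package names as above, to be removed once the bug is fixed upstream):

```ini
# /etc/pacman.conf -- hold the downgraded NVIDIA packages at 375.26
# until the playback issue is resolved, then remove this line again
IgnorePkg = nvidia nvidia-utils nvidia-libgl
```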
What is the best/sane way to use the older nvidia drivers?
Best regards
Ochi
Last edited by Ochi (2017-02-19 14:26:53)
Rebuild via ABS and remove that file from the nvidia package.
At some point there may be an ABI incompatibility and this will no longer work - but this would be true of any downgrade strategy or withheld package. This should be fine as a temporary workaround.
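A minimal sketch of that rebuild, assuming the 2017-era ABS tooling and the 375.26 build files (exact paths and the package to edit may differ on your system):

```shell
# Sketch: rebuild nvidia-libgl from ABS without the conflicting file.
# 'abs' and the /var/abs layout are the 2017-era tooling; adjust as needed.
abs extra/nvidia-libgl                  # sync the build files for the package
cp -r /var/abs/extra/nvidia-libgl ~/build && cd ~/build/nvidia-libgl
# Edit the PKGBUILD so the built package no longer ships
# /usr/lib/xorg/modules/extensions/libglx.so (now owned by xorg-server),
# e.g. by deleting that file from "$pkgdir" at the end of package().
makepkg -si                             # build and install the modified package
```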
EDIT: actually, on second thought, I'm a bit confused. You seem to be OK with the fact that yours is a (somewhat) short-term workaround, but you are concerned about long-term consequences. The long-term solution would be to file appropriate bug reports to fix the underlying problem; downgrading will never be a long-term solution. So your current approach is just as good as my suggestion to rebuild the package, and probably much easier.
"UNIX is simple and coherent..." - Dennis Ritchie, "GNU's Not UNIX" - Richard Stallman
Thanks, building from ABS and modifying the package sounds quite good. The issues I anticipated were more on the package-manager level: after the force-downgrade, it is already problematic to update to the newer version again, since libglx.so won't be there anymore, and reinstalling xorg-server without --force still won't restore the file.
But anyway, I think I have found the culprit and a nicer workaround: the new nvidia driver enables OpenGL threaded optimizations (see the nvidia driver README). Setting __GL_THREADED_OPTIMIZATIONS=0 when launching shotcut disables these optimizations, and the hiccups are gone. I will try to file a suitable bug report with the shotcut devs or nvidia (or both); I'm not sure whether it's a driver issue or bad GL calls/assumptions in the application.
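Concretely, the workaround is just an environment variable on the launching command; a tiny wrapper makes it reusable (the variable is from the nvidia README, the function name is only for illustration):

```shell
# Run a program with nvidia's threaded GL optimizations disabled.
# __GL_THREADED_OPTIMIZATIONS is documented in the nvidia driver README;
# the wrapper name no_gl_threads is just an illustrative choice.
no_gl_threads() {
    __GL_THREADED_OPTIMIZATIONS=0 "$@"
}

# e.g.: no_gl_threads shotcut
# quick check that the variable really is set for the child process:
no_gl_threads sh -c 'echo "$__GL_THREADED_OPTIMIZATIONS"'   # prints 0
```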