Has anyone managed to start GNOME 3.14 with NVIDIA Optimus?
Not with bumblebee, but with modesetting.
Any help is appreciated.
I don't know when it started working, but it works now.
I am using:
gnome-shell 3.15.90-1
mutter 3.15.90-1
nvidia 346.47-1
nvidia-libgl 346.47-1
nvidia-utils 346.47-1
Last edited by bhante (2015-02-25 06:02:07)
Offline
I am about to test using the Nvidia drivers with modesetting myself (been using bumblebee without problems but I am just curious).
The ArchWiki is indeed missing instructions for GDM, but after a bit of looking around I found this on the Gentoo wiki:
Add the following two lines to /etc/gdm/Init/Default:
exec xrandr --setprovideroutputsource modesetting NVIDIA-0
exec xrandr --auto
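For context, those two xrandr lines assume X is already starting with the NVIDIA card as the primary GPU, which on the ArchWiki PRIME setup is done via xorg.conf. A minimal sketch along those lines (the BusID is a placeholder; replace it with your own value from lspci):

```
Section "Module"
    Load "modesetting"
EndSection

Section "Device"
    Identifier "nvidia"
    Driver     "nvidia"
    # Placeholder BusID -- find yours with: lspci | grep -E 'VGA|3D'
    BusID      "PCI:1:0:0"
EndSection

Section "Screen"
    Identifier "nvidia"
    Device     "nvidia"
    Option     "AllowEmptyInitialConfiguration"
EndSection
```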
Offline
mutter is the culprit.
with mutter 3.12.2
it works fine.
with mutter 3.14.0
I am getting X Error of failed request: BadMatch (invalid parameter attributes)
Major opcode of failed request: (RANDR)
Minor opcode of failed request: (RRSetOutputPrimary)
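One thing worth double-checking when RandR requests fail like this: the provider names passed to `xrandr --setprovideroutputsource` ("modesetting", "NVIDIA-0") must match what your X server actually reports in `xrandr --listproviders`. A small sketch that pulls the names out of that output (the sample text here is hypothetical, since the names vary by driver and machine):

```shell
# Extract RandR provider names from `xrandr --listproviders` output.
# $sample is a hypothetical saved copy of that output; on a live system
# you would pipe the real command instead.
sample='Providers: number : 2
Provider 0: id: 0x1b8 cap: 0x1, Source Output crtcs: 4 outputs: 1 associated providers: 1 name:NVIDIA-0
Provider 1: id: 0x46 cap: 0xf crtcs: 3 outputs: 4 associated providers: 1 name:modesetting'
printf '%s\n' "$sample" | sed -n 's/.*name:\(.*\)$/\1/p'
```

These are the exact strings the `--setprovideroutputsource` arguments must use.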
If possible, test it and then reply. Thank you.
Offline
I'm having the same issue after update.
Offline
I tried. No solutions so far.
Offline
FWIW:
1) Updated last night, got Gnome 3.14 etc. (I update every night).
2) I don't use Gnome (I use Openbox), BUT I created a new user so I could start it up and see the new, shiny stuff.
3) I rebooted, LDM came up (I don't use GDM), I logged into Gnome as my brand new test user
4) After a few minutes, got the "Oh No..." screen from Gnome.
5) Switched to TTY 2, logged in and rebooted.
My machine info:
i7 processor, 16GB RAM, nVidia GT 730M Optimus (I have configured my machine to ALWAYS run nVidia card -- instructions in Wiki -- but this is an Optimus machine)
I have NOT run this down any further. I have NOT tried switching to GDM. I have NOT looked at any logs (it was time for me to head to bed). I will play with it more this evening.
Matt
"It is very difficult to educate the educated."
Offline
With the Intel card (nvidia disabled), mutter 3.14 works fine.
With the Optimus laptop setup (Offloading Graphics Display with RandR), every other window manager works fine except mutter >= 3.14.
Last edited by bhante (2014-10-17 13:09:05)
Offline
I have had blank screens with the same combination. I have tried plasma-next and awesome. Both work fine with NVIDIA Optimus. I reverted back to using Bumblebee to get GNOME 3.14 to work.
Offline
Yeah, I'm sure it's mutter and xrandr in my case as well. I have an Optimus laptop, but I have used xrandr to use only the nVidia card and not the Intel card.
What I wonder is: is this the way it's supposed to work now, or is it a bug in mutter that needs to be fixed? I didn't do proper research when I bought this laptop last year, because I wanted an nVidia card, but ONLY an nVidia card, not this Optimus stuff.
Matt
"It is very difficult to educate the educated."
Offline
I had the same problem (black screen on Optimus hardware when using modesetting). To make it work on my laptop I had to downgrade:
gdm to 3.12
mutter to 3.12
libinput (new version is apparently not backwards-compatible)
and some other libs (I think clutter and gjs) to make it all run and not hang when resizing firefox
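For anyone wanting to replicate the downgrade, the usual Arch route is reinstalling the old versions from the local package cache and then pinning them. A sketch with example filenames (your cached versions may differ):

```
# Reinstall the older packages from pacman's cache (filenames are examples):
#   pacman -U /var/cache/pacman/pkg/mutter-3.12.2-1-x86_64.pkg.tar.xz \
#             /var/cache/pacman/pkg/gdm-3.12.2-1-x86_64.pkg.tar.xz
#
# Then hold them back in /etc/pacman.conf so -Syu does not re-upgrade them:
IgnorePkg = mutter gdm
```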
I also had to make sure that /etc/gdm/Init/Default was properly modified. (I had this problem before, so I keep a git repository in /etc and add every file I modify to it, to make sure they're recoverable if overwritten.)
So now I have only some of gnome 3.14, but at least it's working.
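The git-repository-in-/etc trick above is easy to replicate. A minimal sketch, assuming git is installed; it is demonstrated against a scratch directory here, but for real use you would run the same commands in /etc as root:

```shell
# Keep hand-edited config files recoverable by tracking them in git.
set -e
etc=$(mktemp -d)           # scratch stand-in for /etc
cd "$etc"
git init -q
printf '%s\n' 'exec xrandr --setprovideroutputsource modesetting NVIDIA-0' \
              'exec xrandr --auto' > Default
git add Default
git -c user.name=me -c user.email=me@example.com commit -qm 'track GDM init script'
# If a package upgrade overwrites the file, restore the committed copy:
echo 'overwritten by package upgrade' > Default
git checkout -- Default
grep -c 'xrandr' Default   # -> 2: both tracked lines are back
```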
Offline
I left Gnome last year -- went to Cinnamon. Then went to enlightenment for a month after finally deciding I'd just go with Openbox.
I have used Gnome since 1998, up until the move to Cinnamon last year. And now I feel I will move to KDE 5/Plasma. I installed Plasma on a second user account and I really like it. Gnome's attitude has just got me fuzzed up since the GTK3 fiasco some time back. It smacks of the "Microsoft mentality", so I am probably leaving it behind.
But to get on topic, I will keep Gnome installed for the foreseeable future, in case mutter ever gets an update. I will test it out for grins and giggles if that happens.
Matt
"It is very difficult to educate the educated."
Offline
Has there been any development on this front? Anyone figured out the issue, or at least which codebase is responsible for it? Any upstream bug reports we may want to follow? I've been googling this periodically for the last couple of weeks and this thread still seems to be the most up-to-date on the issue.
In the meantime I've reverted to bumblebee, and man, that's a world of pain. Gnome is by far my favorite DE, but I might change for a while if things stay as they are...
Offline
Has anyone managed to fix this?
Offline
There is a bug report here.
https://bugs.archlinux.org/task/42452
So far it looks like I'm the only one who has voted for this bug.
Offline
I also voted for the bug, and according to the explanation in the comments by one of the bug's assignees, it seems fairly simple to solve (assuming you have some understanding of the codebase and of the GNOME team's compilation and pull-request procedures, which I don't). It does look more like an upstream issue than an Arch issue, though, so I tried to find anything about it on the mutter bug tracker, and it does not look like it's there: https://bugzilla.gnome.org/buglist.cgi? … %22+nvidia .
I might be looking in the wrong place, though. Does anyone know if that's the right place? If not, where would I find it? And if it is the right place, should I just file a bug report there, or is there a more appropriate procedure to let them know about it?
Offline
Well, at least there's a bug report now on mutter's own bug tracker:
Offline