This is an idea that's been floating around in my head for a while. I help administer a lab of Windows XP clients on a Windows domain, and I've been wondering how a similar system could be implemented on Linux.
The group policies are in general fairly simple. Configs could be generated by plugins (not unlike the GPO "snap-ins"), and with Arch, the ALPM libraries that form the backend to Pacman make software deployment simple.
There are probably gaping holes in my thinking, and if someone familiar with GPO and AD could tell me I'm an idiot and give reasons why, I would much appreciate it.
The one problem I see in such a setup is user-specific application deployment and access control.
When I assign an application to a user (using GPO under Windows), it is installed when the user logs on (actually, shortcuts are made when the user logs on and the application isn't installed until the user actually runs it, but that doesn't matter). But when the user logs off, is the application uninstalled? When another user logs on, could they navigate to the program's install directory and run it?
I don't even know how Windows handles this, obviously, but it seems wasteful to reinstall the application every time the user logs in.
Given that I don't really understand how it works in Windows, how could it work on Linux? I could chmod every file owned by the package so it's accessible only by root; that would be a workable solution. Once again, uninstalling it completely seems like a waste. I could just delete the .desktop files to hide it in whatever GUI the user is running, but then they could just use the command line and find it anyway. I could try to block the command line, but would that break GUI functionality in some way?
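Just to sketch the chmod idea, something like this might do it (using gimp purely as an example; pacman -Qlq lists the files a package owns):

    # Lock a package's files down so only root can touch them
    pacman -Qlq gimp | while read -r f; do
        [ -f "$f" ] && chmod go-rwx "$f"
    done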
As I said before, if someone could tell me I'm an idiot and give reasons, I would be very thankful.
And BTW, "Windows sucks" isn't a reason. I know it does, but it is currently far better in a lab environment because setups like this using GPO are possible. Windows has network administration down pat, and I'm hoping to start a branch of Arch (not really a branch, more like a few packages and a customized installer) that could give Linux similar functionality.
Offline
The nearest thing to AD on Linux that I know of is Fedora Directory Server. AD and Directory Server are both LDAP servers, just somewhat different ones. I don't know if there's really any centralized management solution (other than thin clients) on Linux; I wish there were.
I have used GPO just to automatically add file shares, printers, and some computer settings (like turning the XP firewall off); I haven't actually distributed software through it. If you just add an icon which installs the program, I don't think it will be uninstalled.
Offline
I'm just wondering how Windows keeps users from running applications that have been installed because they are assigned to other users. It probably caches the MSI files somewhere and uninstalls/reinstalls the application as necessary, although they could just play with permissions.
On Linux, Kerberos and LDAP can approach the functionality of AD and domains, but there's nothing like GPO.
I had a chat with a Samba developer lately, and they're having a lot of pain dealing with what Microsoft did to the LDAP standard when they made AD. Technically it's an LDAP server, but they made it case-insensitive, added and removed a few required entries, and in general modified the standard to suit their purposes.
I have done network logons using Kerberos and LDAP before, and I'm looking into using AFS for home directories (it's nice because you can get lots of data redundancy). I'm also looking into programming a system that could provide the same centralized management that GPO provides; the closest thing you can use right now is clustered SSH, which, while less efficient, could accomplish the same thing.
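For example, a quick and dirty stand-in for clustered SSH could be as simple as this (assuming passwordless root SSH and a made-up /etc/lab-clients file with one hostname per line):

    # Push the same command to every client in parallel
    while read -r host; do
        ssh "root@$host" 'pacman -Syu --noconfirm' &
    done < /etc/lab-clients
    wait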
Offline
You could also NFS-mount /usr and /home, use LDAP for centralized user information, and use some custom/funky ACLs (or SELinux policies) to only allow certain user groups to run certain apps. Then you could just drop users into the appropriate groups in the LDAP tree.
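For example (group name is made up, and this assumes the filesystem is mounted with ACL support):

    setfacl -m g:gimp-users:rx /usr/bin/gimp   # let the gimp-users group run it
    chmod o-rx /usr/bin/gimp                   # and take it away from everyone else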
"Be conservative in what you send; be liberal in what you accept." -- Postel's Law
"tacos" -- Cactus' Law
"t̥͍͎̪̪͗a̴̻̩͈͚ͨc̠o̩̙͈ͫͅs͙͎̙͊ ͔͇̫̜t͎̳̀a̜̞̗ͩc̗͍͚o̲̯̿s̖̣̤̙͌ ̖̜̈ț̰̫͓ạ̪͖̳c̲͎͕̰̯̃̈o͉ͅs̪ͪ ̜̻̖̜͕" -- -̖͚̫̙̓-̺̠͇ͤ̃ ̜̪̜ͯZ͔̗̭̞ͪA̝͈̙͖̩L͉̠̺͓G̙̞̦͖O̳̗͍
Offline
Well, NFS mounting /usr would break pacman, right?
I want to have standalone clients that get home directories off the network. I'm planning on LDAP/Kerberos for user management/authentication.
I know that the Microsoft approach to installing an application for a user is to install it in the user's home directory...
Following your mount /usr suggestion, is there some way I could have a share consisting of a software installation and somehow mount that? I could use permissions to make sure nothing strange happened, and if my GPO system automated the mounting...
I would have to modify the PATH, or would I?
I'm puzzling over the best approach here.
Installing/uninstalling is too inefficient.
Installing and changing permissions to limit access would work, but it could get messy.
I like the idea of SELinux; I'll have to look into that. I've only heard vague things about it.
The thing that just occurred to me is that, following the Microsoft approach but with a twist to save disk space, I could have the exclusive-access packages untarred onto an AFS share and mount that somewhere in the home directory. The problem I see there is that I couldn't mount multiple package shares in the same place, which makes life hard. Or could I... There must be some way to mount an AFS share so it combines with the local filesystem...
This probably sounds crazy and I'll probably call myself an idiot when I'm thinking more clearly. See the note at the end.
Just FYI, I'm planning to code something similar to the GPO system. It would work similarly, with plugins for different applications, and most of what it would do would be config file parsing and rewriting.
This system would also give me quite a bit of flexibility with this application-access problem, because not everything has to be pre-set with existing permissions; I could do crazy things like changing permissions on the fly.
I'm sorry if I sound incoherent here, I went to an all night lockin last night and my coffee isn't helping much.
Offline
I'm just wondering how Windows keeps users from running applications that have been installed because they are assigned to other users. It probably caches the MSI files somewhere and uninstalls/reinstalls the application as necessary, although they could just play with permissions.
That is usually installed from a shared network drive on a per-user basis (on Windows) in most large organizations.
If it's a local install, it will depend on the permissions the other user has; he/she might be able to see the program folder and still be unable to execute it.
Offline
Yes, that's what I've found: Windows just installs the application in the user's home directory (although not in the home directory on the server). You could just install it and change the permissions so it's unreadable, but that's not in any way automated, AFAIK.
It's driving me nuts that I can't find a simple way to accomplish this on Linux. I could install it in the home directory like Windows does, but I don't want to fill the file server(s) with application installations and make users angry because their quota has been destroyed by automated installs. If there were a way to install the application locally and merge the local install with the user's home directory, that would work.
Better yet, as I described in my last post, it would be amazing if you could install the package on a network share, and somehow merge that share with the local filesystem. I feel stupid asking this, but is that possible? Do mountpoints have to be empty directories, or could I mount an AFS share on /?
It's funny: I feel competent using Linux, but exploring this is making me question my most basic Linux knowledge. For example, I've started wondering whether a user could run a KDE session if their shell is set to /bin/false.
Offline
Would what I have described be possible with aufs (which is basically the new UnionFS)? Could I merge multiple shares and the user's home directory?
I'm going to have to read their documentation; I don't have time right now though.
Offline
To answer my question, yes, UnionFS/aufs could do this (either one supports it).
Could someone who has used UnionFS tell me if there is a significant performance hit? I would probably be combining an AFS share containing an installation of the package with the normal root fs. I would probably be using many shares of package installations actually. I could look into some automated way of combining packages that are only ever used together, but how would UnionFS perform if I combined ~100 read-only AFS shares with the standard writable / found on the local hard disk?
Is this even technically feasible?
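(For reference, I gather an aufs mount of that sort would look roughly like this, with made-up branch paths and a test mountpoint rather than / itself; the first, writable branch is where changes would land:)

    mount -t aufs -o br=/local/rw=rw:/afs/pkgs/gimp=ro:/afs/pkgs/inkscape=ro none /union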
Offline
*bump*
Offline
Hi arew
Group Policies in Windows are basically one thing: registry entries. They are just keys and values applied to the local registry. Of course, that's not true for software deployment, and of course there's no such thing as a registry under Linux; settings are applied (well, in most cases) through config files. So I guess it should be possible to apply settings like GPOs by rsyncing files in /etc and in ~. In a more advanced scenario, you could manage those settings through files like /etc/bashrc.gpo and include those files in the original config file. That way, the configs would stay "safe", if you know what I mean. It's just a thought of mine.
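For instance, something along these lines (the server name and rsync module are made up):

    # Pull the managed policy fragment from the central server
    rsync -a policyserver::gpo/etc/bashrc.gpo /etc/bashrc.gpo

and then a one-time line in the real config file (say /etc/bash.bashrc, as an example) to pick it up:

    [ -r /etc/bashrc.gpo ] && . /etc/bashrc.gpo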
The other part is the software deployment. As you already discussed, you could just make a script that installs the software the first time it's used (so you don't actually make a shortcut to the software itself, you link the shortcut to the install script, which removes the "old" shortcut once the software is installed). In addition, you create a group named after the application. Then you chown and chmod the binary (and its components) so that only the corresponding group can execute the application. The (only, IMHO) disadvantage is that you have to create a set of rules for each application (as you had to for software deployment under Windows; AFAIR .exes aren't allowed, you have to use .msi). But that's not much of an effort, and once you have it you can use it over and over again (or you could even share it on the net so everyone could use it, something like PKGBUILDs).
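A hypothetical install script behind such a shortcut could boil down to something like this (package and group names are just examples, and it would have to run as root, e.g. via sudo):

    pacman -S --noconfirm inkscape                    # real install on first use
    chgrp inkscape-users /usr/bin/inkscape            # restrict the binary to its group
    chmod 750 /usr/bin/inkscape
    rm -f "$HOME/Desktop/Install Inkscape.desktop"    # drop the placeholder shortcut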
Offline
My evil plot is to write a daemon that, when the user logs on, retrieves the list of applications that should be installed and downloads config, package, and any other updates from a central server (not unlike a domain controller). This would make management generally easier, because instead of just setting up an rsync server I can deploy specific configurations to specific clients. Package management also becomes easier because I don't have to manage groups for applications; the server would maintain a database and tell the client what it needs to have installed.
I could even emulate the detailed configuration that GPO allows instead of just having administrators write configs if I wanted to; I would just have to parse the config the way the application would. That would take time to perfect, but it's possible.
I guess I'm going to give up on the per-user installs. Windows, as it turns out, handles the installation of an application assigned to a user by installing it when the user logs on and leaving it there for others to access. I could cook up some sort of security policy (I still need to look into SELinux) that only allows certain users to run certain applications, but that would probably also come later.
Anyway, I've been getting a better picture of how all this should work, any advice/suggestions are still very welcome.
Oh, and reading back through your post again, do you think I should even bother to make a shortcut? When you install a MSI under Windows, it takes... forever. It's the new standardized Microsoft Minute (TM).
When you install packages with Pacman, the longest part of the installation is the download. Once that's done, you can install about 20 packages in about 30 seconds on an average system. Why not just run the install at startup or login? I could make the shortcuts work if I had to, but when you consider that the server these packages will be coming from is most likely on a LAN with the clients, package installation time becomes negligible. Heck, I could even have the clients cache the packages at startup if I had to.
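For the caching part, pacman can already download without installing, so the startup job could be as simple as this (the package list is just an example):

    pacman -Syw --noconfirm gimp inkscape    # -w downloads to the cache only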
Offline
Here's a thought. If you are running an Arch-specific network, you could easily enough set up a local repository for packages and updates, which lets you control which packages are distributed to the network.
What you do then is get the PKGBUILDs for the applications you want to install, and in each PKGBUILD set the permissions to the correct group for the application. This means that on install the app can only be run by the users you specify. Having done this, you can write a daemon that checks what packages the user needs at login; if a package is already installed it does nothing, and if it isn't, it uses pacman to install it. This basically means that even when the user logs out the app is still installed; it just can't be accessed by other users, as the binaries are not executable by them.
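A minimal sketch of that login check (in reality the package list would come from wherever you store the user's assignments):

    for pkg in gimp inkscape; do
        pacman -Q "$pkg" >/dev/null 2>&1 || pacman -S --noconfirm "$pkg"
    done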
An alternative to using the daemon would be to just install all the apps you need on the workstations, it doesn't really matter as they would still only be accessible to the right users.
Edit: The advantage of the latter is that applications can be upgraded across the network easily enough too.
Last edited by SiC (2008-10-28 12:23:00)
Offline
So I guess the installation part should be solved now. There are several ways. You could try them and give us feedback.
As for the configuration deployment: I didn't mean that you should use only an rsync server; my thought was that the rsync server is just the means of deployment. There still needs to be an application that drives the rsync server and that manages and applies configuration to users and so on. That part remains to be done. It needs to be close to the user management, as it is in AD. If you want those GPOs, you'll want central user management as well. Probably we should have a look at what LDAP is capable of.
and don't forget to report progress in this really interesting thread!
Cheers
Offline
Have a look at PolicyKit; it sounds like it might be useful here.
Offline