Mass deployment - it's essentially what the chaotic-aur repo is doing for you as well.
I wouldn't call 3 systems mass deployment, but there are similarities with unofficial repos that are used by their creator.
you hunt down the dependencies yourself and keep everything up-to-date
Not sure what you mean by hunting deps, but I am subscribed to aur packages I use and also use nvchecker for non-vcs packages to keep an eye on upstream.
My local custom repo has been a valuable asset to me and my workflow for at least 15 years now.
I have tried a few aur helpers in the past (including yaourt) but found I had less control and more problems with them than without.
Disliking systemd intensely, but not satisfied with alternatives so focusing on taming systemd.
clean chroot building not flexible enough ?
Try clean chroot manager by graysky
@LW
recursive aur deps: I want to build and use aur package A - but it depends on another aur-only package B (repeat)
a local repo doesn't help unless you do quite a bit of work that a helper could do for you
if I learned one thing from LinuxFromScratch: I don't want to deal with that manually if there are tools doing it for me
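The recursive-dependency problem above is just a build-ordering problem: before A can be built, every AUR-only dependency must already be built and sitting in the local repo. A minimal sketch of the ordering a helper automates (package names and the dependency graph here are hypothetical):

```python
# Why recursive AUR deps need ordering: emit each package only after
# all of its AUR-only dependencies have been emitted (depth-first).

def build_order(target, aur_deps):
    """Return a build order, dependencies first."""
    order, seen = [], set()

    def visit(pkg):
        if pkg in seen:
            return
        seen.add(pkg)
        for dep in aur_deps.get(pkg, []):
            visit(dep)
        order.append(pkg)  # appended only after all its deps

    visit(target)
    return order

# A depends on B, B depends on C -- all AUR-only:
deps = {"A": ["B"], "B": ["C"], "C": []}
print(build_order("A", deps))  # -> ['C', 'B', 'A']
```

Doing this by hand for a three-package chain is easy; for the deep trees discussed later in the thread it is exactly the bookkeeping a helper takes off your hands.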
That just means I have to add B to my local repo before I can build A.
Several of the aur packages I maintain have a more complicated setup. Building packages and putting them in my local repo makes it easy for pacman/makepkg to use them.
The unix philosophy of
Make each program do one thing well. To do a new job, build afresh rather than complicate old programs by adding new "features".
fits well with me.
Maybe you prefer a tool that can do many jobs ?
If you're gonna throw unix philosophies at each other I'll briefly remind everyone that they're currently using systemd (and some even try to excuse that w/ a disclaimer )
Whether you want/need something to auto-resolve and build dependency chains for you kinda depends on the package.
If you're interested in an AUR package that depends on an entire tree of other AUR packages that need rebuilding, it's not unreasonable to avoid doing this manually.
https://aur.archlinux.org/packages/aurutils might bridge both opinions here.
A custom local repository that is added to pacman.conf solves that.
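For reference, wiring a local repository into pacman is a small config change; a minimal sketch, assuming the database lives at /home/custompkgs (the path and the repo name "custom" are placeholders):

```ini
# /etc/pacman.conf -- example entry for a local repository.
# The database itself is created/updated with repo-add, e.g.:
#   repo-add /home/custompkgs/custom.db.tar.gz *.pkg.tar.zst
[custom]
SigLevel = Optional TrustAll
Server = file:///home/custompkgs
```

With that in place, pacman and makepkg resolve already-built local packages like any other repo package.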
I didn't know about local repositories. Serves me right for not meticulously reading the wiki...
So, a simple script that converts the entire AUR GitHub mirror into a local repository would be all that's needed to keep going whenever aur.archlinux.org goes down?
Or will using such a large repo cause performance or storage space issues?
Last edited by plp (Yesterday 20:19:18)
The repos would host actual packages; you'd have to constantly build everything from the AUR - which is neither realistic nor reasonable.
https://aur.chaotic.cx/ does something like that for a limited set of packages (and implies that you need to trust the repo next to the individual packages it builds)
When aur.archlinux.org goes down (and in general) use the github mirror.
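Since the GitHub mirror keeps one branch per package (as the branch counts in this thread suggest), a single package can be fetched with a single-branch clone. A small sketch that only constructs the command; "example-package" is a placeholder name:

```python
# Build (not run) the git command for fetching one package's branch
# from the AUR GitHub mirror at https://github.com/archlinux/aur.

def single_branch_clone_cmd(pkgname):
    """git invocation that clones only the named package's branch."""
    return [
        "git", "clone",
        "--single-branch", "--branch", pkgname,
        "https://github.com/archlinux/aur.git", pkgname,
    ]

print(" ".join(single_branch_clone_cmd("example-package")))
```

This avoids pulling the other ~140k branches when you only need one PKGBUILD.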
The repos would host actual packages; you'd have to constantly build everything from the AUR - which is neither realistic nor reasonable.
Oh, I hadn't realised that.
We're back to my old plan for a "utility service" that just provides the PKGBUILDs and associated resource files then. And the RPC.
BTW, does anyone know exactly how large the AUR is? Would it even be feasible to host a local copy of the GitHub mirror on an average-sized SSD?
Last edited by plp (Yesterday 20:33:26)
The github mirror currently has 141620 branches, many holding just 3-4 files of ~4kB each (the AUR doesn't host any sources), so ~5GB?
(Plus the git "overhead", i.e. the history, if you don't just want a snapshot) - are you genuinely worried that github will drop out?
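A back-of-the-envelope check of that ~5GB figure. The per-branch average used here (~36 KiB, i.e. a handful of small files) is a guess for illustration, not a measurement:

```python
# Rough size estimate for a snapshot of the AUR GitHub mirror:
# branch count from the thread times an assumed average branch size.
branches = 141620
avg_kib_per_branch = 36          # assumed: a few small files per package
total_gib = branches * avg_kib_per_branch / (1024 * 1024)
print(f"~{total_gib:.1f} GiB")   # roughly 4.9 GiB
```

So the order of magnitude (single-digit gigabytes, before git history overhead) holds up under that assumption.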
are you genuinely worried that github will drop out?
No, I was simply not aware of the actual size. 5GB is nothing, 5TB on the other hand would be infeasible for many people. Even though GitHub is unlikely to drop out, people will still probably want to run the utility service locally for performance reasons. Or (like the good libre-minded nerds they are) host mirrors of the whole thing on independent forges to avoid giving Microsoft the keys to the kingdom.
I'm currently using the instructions here to clone the entire thing, and will report once it's done.
Last edited by plp (Yesterday 22:21:07)
That just means I have to add B to my local repo before I can build A.
...
Maybe you prefer a tool that can do many jobs ?
okay - expand that to a 30-package multi-level tree - wanna deal with that by hand?
I'm not looking for a tool that does multiple things - but if the one thing I miss out on is proper package management - nah, not me
it's not about being unable to invest half an hour putting my package together - it's about the 25 minutes wasted by not using a tool that deals with it in 5 minutes
5TB on the other hand would be infeasible for many people
[main@main ~]$ zpool list
NAME   SIZE   ALLOC  FREE   CKPOINT  EXPANDSZ  FRAG  CAP  DEDUP  HEALTH  ALTROOT
vault  21.8T  9.44T  12.4T  -        -         12%   43%  1.00x  ONLINE  -
say what now again?
Last edited by cryptearth (Today 02:01:01)
for many people
I think you and the lonesome puppy are talking past each other.
The problem w/ most AUR helpers is that they in fact /do/ multiple things - and more specifically, one of those things (the really, really bad one) is that they blend AUR and repos together (and by inference also lure people into not scanning the PKGBUILDs *cough*)
So what you're looking for is an AUR helper that doesn't also wrap pacman - such exist, but are for some reason not the ones that frequently show up having gotten their users into a complete mess ¯\_(ツ)_/¯
The problem w/ most AUR helpers is that they in fact /do/ multiple things - and more specifically, one of those things (the really, really bad one) is that they blend AUR and repos together (and by inference also lure people into not scanning the PKGBUILDs *cough*)
Or perhaps AUR should have some sort of package verification system in place?
A simple way to do this without burdening the core developers would be to simply let users mark/sign packages as 'verified'. Packages with enough verifications could then be considered safe enough to install. This would of course depend on the verifiers' honesty, so perhaps there should also be some kind of mechanism in place for vetting them - like a scoring system based on the user account's age, packages submitted, etc.
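The scoring idea above could look something like this: weight each "verified" mark by the verifier's standing. All weights and thresholds here are made up purely for illustration:

```python
# Sketch of the proposed verifier-reputation scoring. Numbers are invented.

def verifier_weight(account_age_days, packages_submitted):
    """Crude reputation weight in [0, 2]."""
    age_part = min(account_age_days / 365, 1.0)    # caps after one year
    work_part = min(packages_submitted / 10, 1.0)  # caps after ten packages
    return age_part + work_part

def package_score(verifiers):
    """Total trust score from (age_days, pkgs_submitted) pairs."""
    return sum(verifier_weight(a, p) for a, p in verifiers)

# Two seasoned verifiers outweigh five brand-new (possibly sybil) accounts:
seasoned = [(800, 20), (500, 12)]
newbies = [(3, 0)] * 5
print(package_score(seasoned) > package_score(newbies))  # True
```

As the replies below this post point out, though, even a well-weighted score only measures reputation at one point in time - it can't stop a trusted maintainer from turning malicious later.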
Sorry for digressing, but it's always nice to try and identify new problems to fix. Should I put this in another thread? Am I just being annoying?
say what now again?
21TB is 3 times what I have, yet 5TB would still waste over 40% of your free space.
Sorry, couldn't help talking past you again
BTW:
I'm currently using the instructions here to clone the entire thing, and will report once it's done.
I went to bed and woke up, and git is still adding branches. It's on letter 'p'.
Last edited by plp (Today 09:28:02)
Have you checked how big the repo already is?
The 5GB were a rough estimate based on the branch count and a reasonable average package size - but I've no idea how big the branches really are because of the backlog…
The AUR already has a voting system, but that doesn't verify anything - I can package an awesome tool, get plenty of votes (even from experienced users who read the PKGBUILD) and then change it to distribute malware.
Think of the AUR as structured wiki articles on how to package foobar - they're very convenient to apply, but you still need to read and understand them first.
Have you checked how big the repo already is?
As I already mentioned, git hasn't finished downloading/adding all the branches yet. Still on letter 'p'.
A simple way to do this without burdening the core developers would be to simply let users mark/sign packages as 'verified'. Packages with enough verifications can then be considered safe enough to be installed. This will of course depend on the verifiers' honesty, so perhaps there should also be in place some kind of mechanism for vetting them. Like a scoring system based on the user account's age, packages submitted, etc.
Come on Jia Tan! I'm ready for round two. Bring it on, shake it baby
The AUR already has a voting system, but that doesn't verify anything - I can package an awesome tool, get plenty of votes (even from experienced users who read the PKGBUILD) and then change it to distribute malware.
Think of the AUR as structured wiki articles on how to package foobar - they're very convenient to apply, but you still need to read and understand them first.
This is the reason, by the way, friendly sea doggo puppy. You can have verification and all those things, but there is always some trust placed in the hands of the developers of every piece of software. Of course, here we trust a lot in the package maintainers of the core and extra repos, because we know from years of experience that they are nice people - but sometimes along comes a Jia Tan who tries to steal your stuff, like the fox in Dora the Explorer :C. But hey, we have a lot of pretty neat Boots and Doras saying no to the fox
21TB is 3 times what I have, yet 5TB would still waste over 40% of your free space.
Maybe btrfs with deduplication? Not sure, to be honest.
str( @soyg ) == str( @potplant ) btw!