Hello!
I need some advice.
I have to administer multiple (>20) computers, all running Arch Linux.
The problem is that these computers are connected via 3G Internet (or even worse), so I can't update them regularly.
Sometimes I need to install a small "system" program (htop, for example), only to find that it pulls in a huge pile of dependencies, and that isn't always possible over such a connection.
The solution I see is to make my own copy of the repository, "freeze" it, and use only that for these computers (and update them from it).
I can roughly imagine how to do this, but maybe there is a guide or something similar for Arch.
Or should I consider another distribution? Arch was chosen for its minimalism: in fact, I only need a basic system, X and Openbox, and I don't want to overcomplicate things.
Thanks for your attention.
No need to create your own - it already exists...
https://wiki.archlinux.org/index.php/Arch_Linux_Archive
Do remember, though, that by running an outdated system you won't be able to get any help here on the forums for any issues that may arise as a consequence. It also has massive security implications.
Arch is rolling release, so I would say it's a pretty bad fit for your intended purpose. However, there is the Arch Linux Archive, which allows you to target a snapshot of the repositories as they were at a specific point in time.
https://wiki.archlinux.org/index.php/Ar … cific_date
This may allow you to scratch your particular itch while remaining on Arch Linux.
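For reference, pinning a machine to an Archive snapshot is a one-line mirrorlist change. A minimal sketch, where the date 2020/02/18 is just an example and should be whatever day you want to freeze on:

```
# /etc/pacman.d/mirrorlist -- make this the only Server line,
# so every repo resolves against the same frozen snapshot
Server = https://archive.archlinux.org/repos/2020/02/18/$repo/os/$arch
```

After changing the mirrorlist, the wiki suggests running `pacman -Syyuu` so the local sync db and installed packages match the snapshot, downgrading where necessary.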
I would still recommend something like Debian instead though.
Sakura:-
Mobo: MSI MAG X570S TORPEDO MAX // Processor: AMD Ryzen 9 5950X @4.9GHz // GFX: AMD Radeon RX 5700 XT // RAM: 32GB (4x 8GB) Corsair DDR4 (@ 3000MHz) // Storage: 1x 3TB HDD, 6x 1TB SSD, 2x 120GB SSD, 1x 275GB M2 SSD
Making lemonade from lemons since 2015.
Or should I consider another distribution? Arch was chosen for its minimalism: in fact, I only need a basic system, X and Openbox, and I don't want to overcomplicate things.
Debian stable from a netinstall image, perhaps? The limited updates would make it easier for your connection. Some people even put out spins with openbox as the desktop...
EDIT: ninja'd by WorMzy.
Last edited by Head_on_a_Stick (2020-02-19 14:09:15)
The solution I see is to make my own copy of the repository, "freeze" it, and use only that for these computers (and update them from it).
I can roughly imagine how to do this, but maybe there is a guide or something similar for Arch.
There is, in fact, a guide for this: Network shared pacman cache#Read-only cache
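For anyone skimming, the read-only variant on that wiki page boils down to serving one machine's package cache over HTTP and listing it as the clients' first mirror, so pacman tries it first and falls through to a real mirror for anything the cache doesn't have. A rough sketch, assuming darkhttpd and a cache host at 192.168.0.5 (both just examples):

```
# on the cache host: serve the local package cache read-only
darkhttpd /var/cache/pacman/pkg --port 8080

# on each client, first line of /etc/pacman.d/mirrorlist:
Server = http://192.168.0.5:8080
```

Any HTTP file server works in place of darkhttpd; the point is that the cache host never writes anything on behalf of the clients.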
--
saint_abroad
If I install a package from the archive like the wiki says:
pacman -U https://archive.archlinux.org/packages/ ... packagename.pkg.tar.xz
will that package later be updated automatically by an ordinary pacman -Syu?
Yes.
Hey Dmitry,
sounds like you need something like a local cache for packages.
You can use my IPFS-mirror for this purpose:
If all the computers are stationary and you just want a centralized server that caches the packages, you can set up IPFS on one computer, change its config so the web gateway listens on an address in the local network, and then configure that computer as the mirror server on all the other machines like this (assuming, as an example, that your network is 192.168.0.0/24 and the "server" is on 192.168.0.5):
# IPFS
Server = http://192.168.0.5:8080/ipns/pkg.pacman.store/arch/$arch/default/$repo
The requests will be proxied into the IPFS network; the server will download the packages onto its local storage and keep them there until the storage limit (10 GB by default) is nearly full, then start discarding them.
You can fetch the updates on all computers at the same time with a systemd-timer or a cronjob like this:
pacman -Syuw --noconfirm
This reduces the amount of traffic needed, since the db files are all the same and you get the same package versions on all computers. Since it runs in the background, you don't have to do anything.
You can then install the updates without fetching a new db with 'pacman -Su', causing zero additional traffic.
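A sketch of such a systemd timer; the unit names are made up, and note that the merits of scheduling -Syuw are debated in the replies below:

```
# /etc/systemd/system/pacman-fetch.service (hypothetical name)
[Unit]
Description=Download pending package updates
Wants=network-online.target
After=network-online.target

[Service]
Type=oneshot
ExecStart=/usr/bin/pacman -Syuw --noconfirm

# /etc/systemd/system/pacman-fetch.timer
[Unit]
Description=Run pacman-fetch.service daily

[Timer]
OnCalendar=daily
Persistent=true

[Install]
WantedBy=timers.target
```

Enable it with `systemctl enable --now pacman-fetch.timer`.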
Best regards
Ruben
Last edited by RubenKelevra (2020-03-10 15:23:09)
You can fetch the updates on all computers at the same time with a systemd-timer or a cronjob like this:
pacman -Syuw --noconfirm
Please don't suggest things like this to other people. This is equivalent to running `pacman -Sy` as far as partial upgrades are concerned.
RubenKelevra wrote:You can fetch the updates on all computers at the same time with a systemd-timer or a cronjob like this:
pacman -Syuw --noconfirm
Please don't suggest things like this to other people. This is equivalent to running `pacman -Sy` as far as partial upgrades are concerned.
Can you be more specific? How does pulling the db and the packages into the cache on multiple computers at the same time affect how an update is applied?
Because you've updated the pacman db without updating the system. If you then proceed to install something using pacman -S, it can leave your system in a broken state.
pacman -Sy is fine [1] and is not a partial upgrade; you just have to avoid using pacman -S until you have performed a full upgrade.
pacman -Sy is fine [1] and is not a partial upgrade; you just have to avoid using pacman -S until you have performed a full upgrade.
That is a very unpopular opinion, and I told him, before he committed that, that he was going to make an enemy of the support staff because of it.
Please do not consider this to be good advice.
Managing AUR repos The Right Way -- aurpublish (now a standalone tool)