hello there,
A few months ago an idea crossed my mind: would it be possible to use a torrent/p2p protocol to transfer package files across the globe?
In fact, most of us already have a package cache growing on our hard drives, so it wouldn't be difficult to serve packages to other Archers faster than before and reduce the load on the mirrors.
There would be only one weakness, package authenticity, but the normal md5sum check should do its job well.
Another drawback comes from ISPs that dislike P2P and try to throttle its bandwidth...
What about package deltas?
Anyhow, what would be the limits of implementing it this way?
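The integrity check the post has in mind is just the usual digest comparison; a minimal sketch, with a made-up file name and the expected sum computed locally only for illustration (normally it would come from the repo database):

```shell
# Sketch: verify a package received over p2p against a published md5 digest.
printf 'fake package data' > bash-4.2-1-x86_64.pkg.tar.xz
expected=$(md5sum bash-4.2-1-x86_64.pkg.tar.xz | cut -d' ' -f1)
# ... the file would now arrive from a peer; re-hash it on receipt ...
actual=$(md5sum bash-4.2-1-x86_64.pkg.tar.xz | cut -d' ' -f1)
if [ "$actual" = "$expected" ]; then
    echo "checksum OK"
else
    echo "checksum MISMATCH: discard this copy" >&2
fi
```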
do it good first, it will be faster than do it twice the saint
Not terribly difficult.
For a while I had a shell wrapper for aria2 that I used as the XferCommand in pacman.conf.
It downloaded parts of files from each mirror, and worked with a few .torrent files I generated for it (fooling around).
I don't recall whether aria2 supported HTTP as a torrent source at the time (or even whether it does now), as it was quite a while ago (years? yikes).
Overall though, I found it much slower. Torrents are great for large files (huge files), but for lots of smallish files, it didn't seem like a real win -- at least in the testing I did long ago. Your mileage may vary.
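A minimal sketch of such a wrapper, assuming pacman's standard XferCommand placeholders (%o output path, %u URL); the script name and aria2c options are illustrative, not the original wrapper:

```shell
# Write a hypothetical XferCommand wrapper; in pacman.conf it would be wired as:
#   XferCommand = /usr/local/bin/aria2-xfer %o %u
cat > aria2-xfer <<'EOF'
#!/bin/sh
out="$1"    # %o: where pacman expects the downloaded file
url="$2"    # %u: the package URL
# Resume interrupted downloads and open several connections to the mirror.
exec aria2c --continue=true --max-connection-per-server=4 \
     --dir="$(dirname "$out")" --out="$(basename "$out")" "$url"
EOF
chmod +x aria2-xfer
```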
"Be conservative in what you send; be liberal in what you accept." -- Postel's Law
"tacos" -- Cactus' Law
"t̥͍͎̪̪͗a̴̻̩͈͚ͨc̠o̩̙͈ͫͅs͙͎̙͊ ͔͇̫̜t͎̳̀a̜̞̗ͩc̗͍͚o̲̯̿s̖̣̤̙͌ ̖̜̈ț̰̫͓ạ̪͖̳c̲͎͕̰̯̃̈o͉ͅs̪ͪ ̜̻̖̜͕" -- -̖͚̫̙̓-̺̠͇ͤ̃ ̜̪̜ͯZ͔̗̭̞ͪA̝͈̙͖̩L͉̠̺͓G̙̞̦͖O̳̗͍
There have been some ideas about introducing binary deltas, torrents, p2p sharing, etc., but there hasn't been much going on lately in this regard:
https://bbs.archlinux.org/viewtopic.php … 90#p882590 - I don't think anyone stepped up and the repo went offline
https://bbs.archlinux.org/viewtopic.php?id=90970
As far as I can guess, a torrent-based implementation might only need a server for tracking where the files are located.
Small files could be grouped on the fly, if they aren't already grouped by how they are used. Big packages like LibreOffice, the kernel, or the JDK, just to mention a few, might be split.
On the other hand, it's a question of how most Archers would welcome this method, because I believe there's not much to change. It could then be torrent-based or aMule-like, giving a distributed package library to gather files from, one which wouldn't suffer lags the way congested servers do.
I've looked at those links and I'd say they are quite different implementation proposals.
So for easy final use with pacman, I still prefer aria2c as the downloader; I could work out a better configuration to grab files from several mirrors and/or split them into pieces a few MB in size. Actually, I upgraded to pacman 3.5 and powerpill no longer works.
I'll just try to adapt Xyne's code for pacman compatibility. The approach stays the same: fill the cache with the necessary packages and then invoke pacman to start the upgrade.
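For the several-mirrors part, aria2c's input-file format already supports listing the same file on multiple URIs per line, with an indented option line naming the output file; a sketch with made-up mirror names:

```shell
# Build an aria2c input file: the URIs on one line point at the same file on
# different mirrors, so aria2c can fetch different pieces from each.
pkg="core/os/x86_64/bash-4.2-1-x86_64.pkg.tar.xz"
{
    printf '%s\t%s\n' \
        "http://mirror1.example.org/archlinux/$pkg" \
        "http://mirror2.example.org/archlinux/$pkg"
    printf ' out=%s\n' "$(basename "$pkg")"   # option line for the URIs above
} > aria2-input.txt
cat aria2-input.txt
# Then download, split into ~5 MB pieces:
#   aria2c --input-file=aria2-input.txt --split=4 --min-split-size=5M
```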
So for easy final use with pacman, I still prefer aria2c as the downloader; I could work out a better configuration to grab files from several mirrors and/or split them into pieces a few MB in size. Actually, I upgraded to pacman 3.5 and powerpill no longer works.
You're doing something wrong then, it works fine here. Perhaps your db has already been upgraded? I don't think there's a downgrade path for that.
Allan-Volunteer on the (topic being discussed) mailn lists. You never get the people who matters attention on the forums.
jasonwryan-Installing Arch is a measure of your literacy. Maintaining Arch is a measure of your diligence. Contributing to Arch is a measure of your competence.
Griemak-Bleeding edge, not bleeding flat. Edge denotes falls will occur from time to time. Bring your own parachute.
People, before anyone even THINKS about this, we need a proper way to sign packages with pacman. Otherwise anyone could spread malware through such a p2p system.
People, before anyone even THINKS about this, we need a proper way to sign packages with pacman. Otherwise anyone could spread malware through such a p2p system.
In the current situation of unsigned packages, p2p is actually safer than single-mirror downloads, because any mirror with a different file generates an error. Please do your research first on what package signing is and the benefits it brings.
"any mirror with a different file generates an error"
My point is that you cannot know for sure that someone doesn't change the file in a clever way such that file size and hashes stay the same, thereby tricking the user into downloading these files without noticing any problem/generating an error. Admittedly, this sounds very hard to pull off in practice, but depending on which P2P network is used (and especially which hashing function it uses), this is not necessarily impossible. You are correct that bittorrent should be safe, though.
Hashing functions are chosen for their extremely low chance of collision. If a collision does happen with a forged file, it is very improbable that both files have the same size, and even if by chance they do, it's really doubtful the forged file could do or carry anything meaningful; it would probably be random garbage.
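The avalanche behaviour is easy to see: two same-size inputs differing in a single byte get completely unrelated digests (file names and contents here are arbitrary):

```shell
# One changed byte, same file size, yet the md5 digests differ completely.
printf 'package payload version A' > f1
printf 'package payload version B' > f2   # same length, one byte differs
h1=$(md5sum f1 | cut -d' ' -f1)
h2=$(md5sum f2 | cut -d' ' -f1)
[ "$h1" != "$h2" ] && echo "same size, different digests"
# prints "same size, different digests"
```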
Ah, good taste! What a dreadful thing! Taste is the enemy of creativeness.
Picasso
Perfection is reached, not when there is no longer anything to add, but when there is no longer anything to take away.
Saint Exupéry
I'm still caressing the idea of downloading packages from several locations, using the fastest method available.
Since servers have their limits, huge packages can slow the download.
I found that parallel download of split files improves speed by up to 400%. P2P would overcome the server limits, and it wouldn't cost any additional space, as long as every Archer keeps a package cache.
Distributing new packages by p2p would also be possible. I don't see package faking as much of a threat, since no profit would arise from this software.
I guess most people don't care that much, or perhaps they're downloading other stuff while pacman does the updates, or their download speed is e.g. 5 Mbps, so using 100 Mbps servers is enough for them.
I still think extofme wanted something similar https://bbs.archlinux.org/viewtopic.php … 71#p761771
Bandwidth shaping can be adjusted if aria2c is the downloader, for the purpose of getting packages quickly.
Secondly, reducing mirror load is also worthwhile, and P2P is proven to alleviate bottlenecks.
For my own use I have a bash script that composes a multi-mirror links input file to give to aria2c for the download, thanks to 4javier of www.archlinux.it. A bit unripe indeed.
There are several steps in that script which I'd prefer to bring under one program. I don't know C, so accessing libalpm is complicated, AFAIK. package-query looks great, it would only take some shell piping that I'm not familiar with.
Also, Xyne gave us pacman2aria, but there are still several steps before completion, which could be prone to failures.
I'm looking into the airpac code to get some ideas from it, but my Python knowledge is scarce, and so is my time. I'd like a libalpm wrapper to become available.
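One way to reduce those steps, sketched here with a faked URL list and hypothetical mirror names (on a real system the list would come from something like `pacman -Sup`):

```shell
# Rewrite each package URL so the same file is offered from several mirrors,
# producing an aria2c input file that fills the package cache in one pass.
urls="http://mirror1.example.org/archlinux/core/os/x86_64/bash-4.2-1-x86_64.pkg.tar.xz"
mirrors="http://mirror1.example.org/archlinux http://mirror2.example.org/archlinux"
: > upgrade-input.txt
for u in $urls; do
    path=${u#*/archlinux/}          # repo-relative path of the package
    line=""
    for m in $mirrors; do
        line="$line$m/$path "       # all URIs on one line = same file
    done
    printf '%s\n' "$line" >> upgrade-input.txt
done
cat upgrade-input.txt
# Fill the cache, then hand over to pacman:
#   aria2c -d /var/cache/pacman/pkg -i upgrade-input.txt && pacman -Su
```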