Here is my idea: create a program (or even a script) that would use BitTorrent and pacman for fast downloads of big package sets (e.g. KDE or GNOME):
1. A tracker with compressed groups of packages (so you don't download them individually)
2. Once done downloading, extract the bundle
3. pacman -U everything to install
Don't know the practicality of this, but that's what ideas are for.
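A rough sketch of what that workflow could look like as a script; the bundle torrent name, download directory, and exact tool invocations below are illustrative only, not an existing setup:

# Rough sketch of the workflow above, assuming a hypothetical
# "kde-group.torrent" published by a tracker of bundled packages.
# The aria2c, tar and pacman invocations are illustrative only.
import glob
import os
import subprocess

TORRENT = "kde-group.torrent"   # hypothetical bundle torrent
DEST = "/tmp/pkg-bundle"

os.makedirs(DEST, exist_ok=True)

# 1. download the compressed group over BitTorrent (aria2c accepts .torrent files)
subprocess.run(["aria2c", "--dir", DEST, TORRENT], check=True)

# 2. once done downloading, extract the bundle
bundle = glob.glob(os.path.join(DEST, "*.tar.gz"))[0]
subprocess.run(["tar", "-xzf", bundle, "-C", DEST], check=True)

# 3. pacman -U everything to install
packages = glob.glob(os.path.join(DEST, "*.pkg.tar.gz"))
subprocess.run(["pacman", "-U"] + packages, check=True)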
I need to find a way out so everyone can find their way out.
Registered Linux User: 485581
Offline
Hmm... interesting... like everyone shares their /var/cache/pacman/pkg dir.
I would..
is this what you mean?
Offline
I really don't know; I was thinking more along the lines of using pacman the same way, but with BitTorrent. Less load on the servers means faster downloads, right?
like:
# bitman -S kde
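For what it's worth, a hypothetical "bitman" could probably just wrap pacman and a torrent client. A sketch of that, where the .torrent-next-to-each-package convention is entirely made up:

# Sketch of a hypothetical "bitman" front end: let pacman resolve the
# target list, then fetch each package with a torrent-capable client
# instead of plain HTTP/FTP.  It assumes every mirror URL has a matching
# .torrent next to it, which is not the case today.
import subprocess
import sys

targets = sys.argv[1:] or ["kde"]

# pacman -Sp prints the URLs it would download for the given targets
result = subprocess.run(["pacman", "-Sp"] + targets,
                        capture_output=True, text=True, check=True)
urls = result.stdout.split()

for url in urls:
    torrent = url + ".torrent"   # hypothetical companion torrent on the mirror
    subprocess.run(["aria2c", "--dir", "/var/cache/pacman/pkg", torrent],
                   check=True)

# finally let pacman install from the now-populated cache
subprocess.run(["pacman", "-S"] + targets, check=True)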
Last edited by z.s.tar.gz (2009-03-21 17:24:10)
I need to find a way out so everyone can find their way out.
Registered Linux User: 485581
Offline
Just curious... have you seen powerpill yet? If you just want simultaneous/segmented downloads, that should be good enough.
I've thought of using torrents to share packages before too (especially for old packages no longer in the repos that live on in people's caches), but I never got further than a few suggestions. There might be a bittorrent protocol in pkgd's future if I ever get around to it (I have ideas for extending its topology floating around in my head).
My Arch Linux Stuff • Forum Etiquette • Community Ethos - Arch is not for everyone
Offline
Xyne wrote: I've thought of using torrents to share packages before too (especially for old packages no longer in the repos that live on in people's caches), but I never got further than a few suggestions. There might be a bittorrent protocol in pkgd's future if I ever get around to it (I have ideas for extending its topology floating around in my head).
Since powerpill uses aria2, it seems like you've already done quite a bit of the work.
Offline
Great! I like it!
It would decentralize the repositories and speed things up. They are doing it with git, so why not with pacman?
Plus, everyone could put up custom-compiled packages (with their own CFLAGS, like core2 or such) and share them... so I could download a firefox built for my processor and share it with the next peer.
Offline
There is a catch with that: in some places P2P traffic is heavily throttled, and I've seen cases of corrupted P2P data. Throttling can be done by the ISP/school, and the data corruption will only go away with encryption.
But that's a nice idea, sharing older packages and/or custom packages.
R00KIE
Tm90aGluZyB0byBzZWUgaGVyZSwgbW92ZSBhbG9uZy4K
Offline
Hmm... interesting... like everyone shares their /var/cache/pacman/pkg dir.
I would..
is this what you mean?
Sounds like a pretty good idea; like you said, it would save server resources. I would be worried about people altering package files though. Not sure if that could be a problem or if I'm just being paranoid.
Desktop: E8400@4ghz - DFI Lanparty JR P45-T2RS - 4gb ddr2 800 - 30gb OCZ Vertex - Geforce 8800 GTS - 2*19" LCD
Server/Media Zotac GeForce 9300-ITX I-E - E5200 - 4gb Ram - 2* ecogreen F2 1.5tb - 1* wd green 500gb - PicoPSU 150xt - rtorrent - xbmc - ipazzport remote - 42" LCD
Offline
If it were to be done, it would still have to use the md5 sums from the mirrors to make sure the packages are the same as the ones in the repo.
This is just my theory of how it would have to work if it were going to be done.
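Something along these lines, assuming the usual sync-db layout (the database path and example package name below are just illustrations):

# Sketch of the check described above: compare a downloaded package's md5
# against the MD5SUM recorded in pacman's sync database (a tar archive of
# per-package "desc" files, e.g. /var/lib/pacman/sync/extra.db).
import hashlib
import tarfile

def parse_desc(text):
    """Turn a desc file (%KEY% blocks) into a dict of lists."""
    entries, key = {}, None
    for line in text.splitlines():
        if line.startswith("%") and line.endswith("%"):
            key = line.strip("%")
            entries[key] = []
        elif line and key:
            entries[key].append(line)
    return entries

def md5_from_syncdb(db_path, pkg_filename):
    """Return the recorded md5 for pkg_filename, or None if not found."""
    with tarfile.open(db_path) as db:
        for member in db.getmembers():
            if not member.name.endswith("/desc"):
                continue
            desc = parse_desc(db.extractfile(member).read().decode())
            if desc.get("FILENAME") == [pkg_filename]:
                return desc.get("MD5SUM", [None])[0]
    return None

def md5_of_file(path):
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Example (package name is made up):
# ok = md5_of_file("foo-1.0-1-i686.pkg.tar.gz") == \
#      md5_from_syncdb("/var/lib/pacman/sync/extra.db", "foo-1.0-1-i686.pkg.tar.gz")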
I'm working on live CDs based on Arch Linux. http://godane.wordpress.com/
Offline
BitTorrent does integrity checking on every single piece -- no extra checking needed besides the normal md5 pacman does.
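For illustration, the piece check amounts to something like this, where piece_length and pieces are assumed to come from an already-decoded .torrent file:

# Illustration of the per-piece check BitTorrent clients already perform:
# the .torrent's info dict stores a 20-byte SHA-1 digest for every piece,
# and a downloaded piece is thrown away unless its hash matches.
import hashlib

def verify_pieces(data: bytes, piece_length: int, pieces: bytes) -> bool:
    digests = [pieces[i:i + 20] for i in range(0, len(pieces), 20)]
    for index, expected in enumerate(digests):
        chunk = data[index * piece_length:(index + 1) * piece_length]
        if hashlib.sha1(chunk).digest() != expected:
            return False   # corrupt piece; a real client would re-download it
    return True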
Offline
That sounds like a really nice idea. Sure, we don't really need it since there are so many mirrors and all, and it's not like there is too much of a load on them (is there?), but it would be nice, since then old packages etc. could still be floating around.
But I don't know about custom packages. I'm not against the idea of allowing them to integrate into the system, but I would never use them myself. (I'm paranoid about unofficial packages.) So make sure there is enough separation that people who only want official packages can stick to them.
EDIT: I don't understand the idea of the original poster. I just think that everyone sharing their /var/cache/pacman/pkg/ would be cool.
Last edited by sokuban (2009-03-22 21:47:07)
Offline
old idea is old:
http://bbs.archlinux.org/viewtopic.php?id=2679
http://bbs.archlinux.org/viewtopic.php?id=9399
http://bbs.archlinux.org/viewtopic.php?id=9547
http://bbs.archlinux.org/viewtopic.php?id=33161
still, a neat idea.
"Be conservative in what you send; be liberal in what you accept." -- Postel's Law
"tacos" -- Cactus' Law
"t̥͍͎̪̪͗a̴̻̩͈͚ͨc̠o̩̙͈ͫͅs͙͎̙͊ ͔͇̫̜t͎̳̀a̜̞̗ͩc̗͍͚o̲̯̿s̖̣̤̙͌ ̖̜̈ț̰̫͓ạ̪͖̳c̲͎͕̰̯̃̈o͉ͅs̪ͪ ̜̻̖̜͕" -- -̖͚̫̙̓-̺̠͇ͤ̃ ̜̪̜ͯZ͔̗̭̞ͪA̝͈̙͖̩L͉̠̺͓G̙̞̦͖O̳̗͍
Offline
There you go using the search function again...
Offline
Can't we just add in rsync support? This could allow us to save bandwidth by updating the old packages in our cache folder.
More like copying the older package to the new name/version, then adding a .rsync at the end until it is fully updated with the diff.
It could be better than xdelta, since with xdelta you have to make the diff beforehand for it to work.
PS: My understanding is that rsync does it on the fly, whereas with xdelta you have to make the diff of the package on the server, then download it and manually apply it.
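A sketch of how that could look, assuming an rsync-capable mirror; the mirror URL and package names are made up, and note that gzip-compressed packages only delta well if they were built with gzip --rsyncable:

# Sketch of the rsync idea: copy the old cached package to the new file
# name, then let rsync's delta transfer fill in the differences from an
# rsync-capable mirror.
import shutil
import subprocess

CACHE = "/var/cache/pacman/pkg/"
MIRROR = "rsync://mirror.example.org/archlinux/extra/os/i686/"  # hypothetical

old_pkg = "firefox-3.0.6-1-i686.pkg.tar.gz"   # already in the cache
new_pkg = "firefox-3.0.7-1-i686.pkg.tar.gz"   # the update we want

# seed the destination with the old package so only changed blocks transfer
shutil.copy(CACHE + old_pkg, CACHE + new_pkg)
subprocess.run(["rsync", "--inplace", MIRROR + new_pkg, CACHE + new_pkg],
               check=True)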
I'm working on live CDs based on Arch Linux. http://godane.wordpress.com/
Offline
I don't know. It was just an idea.
Seems that it has more problems than solutions, so 'oh well'
I need to find a way out so everyone can find their way out.
Registered Linux User: 485581
Offline
This could be potentially dangerous.
Archi686 User | Old Screenshots | Old .Configs
Vi veri universum vivus vici.
Offline
Yes it could be. But then again, what can't be dangerous?
The concept is good, but it doesn't offer that big an advantage over standard pacman, so at the moment it is somewhat pointless to build BitTorrent into package management.
Last edited by z.s.tar.gz (2009-03-23 00:29:14)
I need to find a way out so everyone can find their way out.
Registered Linux User: 485581
Offline
On the other hand, we could use Bittorrent to transfer distro ISO releases! Has anyone thought of that yet?!?
Offline
CheesyBeef wrote: On the other hand, we could use Bittorrent to transfer distro ISO releases! Has anyone thought of that yet?!?
no wai.
Offline
Would it be that hard to implement? I.e. (thinking out loud here):
Xyne (thank you for all your great work, btw) has a pkgd daemon which distributes packages over the LAN. If you had a central tracker (it could be hosted on gerold) which creates a torrent whenever a package is uploaded to ftp.archlinux, the package could be SHA/MD5-summed at the tracker, and since the torrent keeps its own integrity check it would be (in theory) pretty hard to spoof the download. Back to Xyne's pkgd: it already does what we need, a central server to distribute files. All that would need distributing would be the torrent files, plus some backend to initiate the download via torrent into the package cache and then run pacman again.
I might have a go at dry coding something later
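Something like this dry-coded sketch for the torrent-creation part, perhaps; the tracker URL is a placeholder and real infrastructure would obviously differ:

# Hash the package in pieces and bencode a minimal single-file metainfo.
import hashlib
import os

PIECE_LEN = 256 * 1024
TRACKER = "http://tracker.example.org/announce"   # hypothetical

def bencode(value):
    """Minimal bencoder for ints, strings/bytes and dicts."""
    if isinstance(value, int):
        return b"i%de" % value
    if isinstance(value, str):
        value = value.encode()
    if isinstance(value, bytes):
        return b"%d:%s" % (len(value), value)
    if isinstance(value, dict):
        return b"d" + b"".join(bencode(k) + bencode(v)
                               for k, v in sorted(value.items())) + b"e"
    raise TypeError("cannot bencode %r" % value)

def make_torrent(pkg_path):
    pieces = b""
    with open(pkg_path, "rb") as f:
        for chunk in iter(lambda: f.read(PIECE_LEN), b""):
            pieces += hashlib.sha1(chunk).digest()
    meta = {
        "announce": TRACKER,
        "info": {
            "name": os.path.basename(pkg_path),
            "length": os.path.getsize(pkg_path),
            "piece length": PIECE_LEN,
            "pieces": pieces,
        },
    }
    with open(pkg_path + ".torrent", "wb") as out:
        out.write(bencode(meta))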
Offline
I'd be interested in helping out with this, even if it's just a proof of concept, as I think the only way to work out whether this is worth doing is to test it out.
It's a very interesting idea.
"is adult entertainment killing our children or is killing our children entertaining adults?" Marilyn Manson
Offline
I'm not sure that BitTorrent would be the best solution. A P2P model would work, sure, but BitTorrent requires .torrent files, and maintaining them would be tedious. Every single package would need one of these .torrent files, otherwise there's no way of knowing what to download or from whom (well, there is some form of magnet links for BitTorrent, but I haven't looked into it). You can't just share files in a repo-wide .torrent, and you can't just share your cache, because the torrent would quickly get invalidated and people wouldn't be able to finish it (DHT might help).
A more traditional P2P system would work well, though. Having some daemon running in the background, sharing the files in the cache, would work. There would still need to be some way of identifying specific packages and versions; I'm not sure that $pkgname-$version-$release-$arch.pkg.tar.gz would be sufficient, as corrupt downloads may occur.
Having a distributed network of caches would be great, but is it really necessary? There are plenty of mirrors for packages, and xdeltas are (at least partially) supported. Those two things already reduce network traffic by a significant amount.
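Just to illustrate the daemon idea, a minimal sketch that publishes a hash index of the local cache, so peers could request an exact package instead of trusting the file name alone; the port and the JSON index format are invented here:

import hashlib
import json
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

CACHE = "/var/cache/pacman/pkg"

def build_index():
    """Map each cached package file name to its sha256 digest."""
    index = {}
    for name in os.listdir(CACHE):
        if not name.endswith(".pkg.tar.gz"):
            continue
        with open(os.path.join(CACHE, name), "rb") as f:
            index[name] = hashlib.sha256(f.read()).hexdigest()
    return index

class IndexHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps(build_index()).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8317), IndexHandler).serve_forever()   # arbitrary port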
Offline
There you go using the search function again...
guilty!
"Be conservative in what you send; be liberal in what you accept." -- Postel's Law
"tacos" -- Cactus' Law
"t̥͍͎̪̪͗a̴̻̩͈͚ͨc̠o̩̙͈ͫͅs͙͎̙͊ ͔͇̫̜t͎̳̀a̜̞̗ͩc̗͍͚o̲̯̿s̖̣̤̙͌ ̖̜̈ț̰̫͓ạ̪͖̳c̲͎͕̰̯̃̈o͉ͅs̪ͪ ̜̻̖̜͕" -- -̖͚̫̙̓-̺̠͇ͤ̃ ̜̪̜ͯZ͔̗̭̞ͪA̝͈̙͖̩L͉̠̺͓G̙̞̦͖O̳̗͍
Offline
Couldn't the torrent be made by a script, as suggested, so that every time a new package is uploaded a torrent file is created at the same time?
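It could, in principle; a minimal polling sketch, re-using a make_torrent() helper like the one sketched earlier in the thread (the upload directory and interval are placeholders):

import os
import time

UPLOAD_DIR = "/srv/ftp/extra/os/i686"   # hypothetical upload directory

def watch(make_torrent, interval=60):
    """Create a companion .torrent for every new package that appears."""
    seen = set(os.listdir(UPLOAD_DIR))
    while True:
        time.sleep(interval)
        current = set(os.listdir(UPLOAD_DIR))
        for name in sorted(current - seen):
            if name.endswith(".pkg.tar.gz"):
                make_torrent(os.path.join(UPLOAD_DIR, name))
        seen = current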
"is adult entertainment killing our children or is killing our children entertaining adults?" Marilyn Manson
Offline