Nothing really constructive to add here. But one of my only gripes with Arch was always that I could never find a repo that performed well (probably my fault for not trying hard enough).
Your script fixed my problem and it works great.
Thanks a lot for all your trouble!
Vince
Repo download speeds vary ... I have been using pacget for a month now and I am happy.
Mr Green: I like Landuke!
Hi! I tried this script and it does seem to improve my download speeds, mainly for large files (for smaller ones, speeds seem to have decreased). Also, the irritating timeout problems (which force me to restart the whole pacman procedure) I've been experiencing with other methods don't seem to rear their ugly head with pacget. However, I keep getting the following error for some packages:
Downloading: pango-1.14.7-1.pkg.tar.gz
repo: current
0/0 Bytes 0% - 0.00 KB/s 26 connections
terminate called after throwing an instance of 'std::bad_alloc'
what(): St9bad_alloc
/usr/bin/pacget: line 19: 3646 Aborted aria2c -t20 -m2 -l /var/log/pacget.log $mirrors -o $file.pacget
Any ideas?
Btw, thanks for this great script!
P.S. I'm using the instructions and script from the wiki.
Thanks in advance!
It is nice, except now I can't see my cute yellow coloured C munching things.
Oh well. It'll be back with pacman 3.
Nice performer...thanks for the improvement!!! 8)
Prediction...This year will be a very odd year!
Hard work does not kill people but why risk it: Charlie Mccarthy
A man is not complete until he is married..then..he is finished.
When ALL is lost, what can be found? Even bytes get lonely for a little bit! X-ray confirms I am spineless!
Asalamu Alikum,
I sometimes get the timeout errors too (not 100% sure on the fix). I've made a few minor changes to my local copy of the script but never bothered to upload them. They may help limit the timeout issues.
Basically I limited it to 5 mirrors, and made sure that ftp.archlinux.org is the first mirror in the list. I'll update the wiki when I get home from work tonight.
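Roughly, the change looks like this (just a sketch of the idea, not the actual diff; I'm assuming the mirrors come out of the repo file's Server = lines, and the variable names here are made up):
# keep ftp.archlinux.org first, then cap the list at 5 mirrors total
repo_file="/etc/pacman.d/$repo"
all=$(sed -n 's/^Server *= *//p' "$repo_file")
primary=$(echo "$all" | grep 'ftp\.archlinux\.org')
rest=$(echo "$all" | grep -v 'ftp\.archlinux\.org' | head -n 4)
mirrors=$(printf '%s\n%s\n' "$primary" "$rest")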
That error you're getting: does it only come up on specific packages, or does it come up randomly? Like, does pango fail every time, or only sometimes? I'll try downloading pango when I get home and see if I get that error.
PS: You guys should check out http://bbs.archlinux.org/viewtopic.php?p=212356 (drakosha's mirror sorting script).
It sorts mirrors from fastest to slowest. If you're gutsy, you can try my version (linked below), but it still needs to be worked on, and make sure to back up your repo files. I haven't used drakosha's script yet so I can't comment on it, but I'd definitely check out his first.
With a sorted mirror list you shouldn't need more than 5 mirrors in pacget to max out your connection.
My sort mirrors script: http://bbs.archlinux.org/viewtopic.php?p=212356#212356
If you try both sorting scripts please PM me feedback on which is better. If mine is worse I'll stop working on it and just use drakosha's script myself.
Wa alaikumsalam!
sabooky wrote: That error you're getting: does it only come up on specific packages, or does it come up randomly? Like, does pango fail every time, or only sometimes? I'll try downloading pango when I get home and see if I get that error.
As far as I know, they are random, though I haven't done any conclusive tests to prove that.
As per my post in http://bbs.archlinux.org/viewtopic.php?t=27764 , I am currently using the prozilla download accelerator (available in Arch's repos) in place of aria2 and wget. Speeds are pretty good and it almost maxes out my 800 kb/s connection, though only for larger files. The most important thing is that it solves my timeout problems. I have no idea how to use prozilla for more than one mirror though.
Btw, are you on a router, sabooky? I suspect that may be the cause of the timeouts on my connection.
Hope this helps!
Thanks in advance!
The script seems to work for me, except that for almost every repo it gets a download error:
Downloading: unstable.db.tar.gz
repo: unstable
(using one mirror)
0/0 Bytes 0% - 0.00 KB/s 1 connections
The download was not complete because of errors. Check the log.
aria2 will resume download if the transfer is restarted.
Downloading: community.db.tar.gz
I've checked the log, but none of it really makes much sense to me; it looks like it only reports successes.
If I need to post that too, just tell me which part.
Yea, I get that error sometimes myself. If the log doesn't have any error messages then I have no idea what causes it. When it gets really annoying I just comment it out in pacman.conf, do my `pacman -Sy`, then put it back for `pacman -Su`. Not exactly the best way to do it.
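For anyone wondering, the line I toggle is the XferCommand in /etc/pacman.conf; something like this (the exact pacget arguments here are from memory, so check your own config):
# comment this out before `pacman -Sy`, put it back for `pacman -Su`
XferCommand = /usr/bin/pacget %u %o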
islamguide.com:
Yea, I'm on a router but I really don't think that it's causing any problems.
Also, I never updated the wiki because I realized the copy of the script that's in this thread is up to date, if not newer than my local copy.
PS: Maybe if I have time in the future I'll rewrite the script in Python and try to make it work a bit better.
sabooky wrote: I sometimes get the timeout errors too (not 100% sure on the fix). I've made a few minor changes to my local copy of the script but never bothered to upload them. They may help limit the timeout issues.
Is the timeout in aria2? I notice that some segments finish for me while others stall, and I have to resume the transfer. Might be worth filing a bug.
Simpler/Faster downloads with error recovery - http://www.metalinker.org/
Been a while, but I decided to look back at my pacget script and try to figure out the problems now that I've gotten a bit better at coding. I noticed two things:
1. Wow my code is ugly.
2. The problem with the timeouts isn't from my script, it's from aria2c and the fact that ftp.archlinux.org resolves to 2 different IP addresses:
$ nslookup ftp.archlinux.org
Name: ftp.archlinux.org
Address: 209.85.41.132
Name: ftp.archlinux.org
Address: 209.85.41.133
This sometimes causes this error:
#aria2c --allow-overwrite=true -l log ftp://ftp.archlinux.org/extra/os/i686/extra.db.tar.gz
Tue Mar 20 17:26:20 2007 - NOTICE - Adding URL: ftp://ftp.archlinux.org/extra/os/i686/extra.db.tar.gz
0/0 Bytes 0% - 0.00 KB/s 1 connections
The download was not complete because of errors. Check the log.
aria2 will resume download if the transfer is restarted.
Here's the error in the logfile:
Tue Mar 20 17:26:21 2007 - ERROR - CUID#1 - Download aborted.
Tue Mar 20 17:26:21 2007 - ERROR - exception: Size mismatch 287542 != 287802
(The full log is posted on pastebin.)
The reason for the size mismatch is that on the first connection the hostname resolved to 209.85.41.132, and on the second it went to 209.85.41.133.
The two servers report different sizes for the db file. Apparently, when aria2c doesn't find an .aria2 control file, it starts the connection process fresh, creates an .aria2 file, records the download as a "segment", and stores the "size" of the package in it. The second time around it resolves the server again, asks for the size, sees that the size has changed, and concludes the server has replaced the file completely, so it spits out an error and quits.
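You can check the mismatch yourself by asking each address for the file size; curl -I works on ftp:// URLs and reports the size as Content-Length (assuming both IPs still answer, of course):
# compare the size each server reports for the same db file
for ip in 209.85.41.132 209.85.41.133; do
    curl -sI "ftp://$ip/extra/os/i686/extra.db.tar.gz" | grep -i content-length
done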
One "fix" would be to change the server from ftp.archlinux.org to one of its IPs (tho this is hacky, and if the ip changes it won't work).
If anyone has any ideas on a good way to work this please feel free to post on here, or PM me. Though at this point it really does seem like the problem is out of my hands and is on aria2c's part. Maybe someone could recommend another good download manager that supports segmented downloading from multiple servers.
Last edited by sabooky (2007-03-20 22:06:40)
Well, I finally got around to working on the script. At first I was going to rewrite the whole thing, but that was going badly; it was looking worse than the original.
I finally gave up and was just gonna patch up the old one when I noticed Djclue917 rewrote my script for me. The script is MUCH nicer now, easier to manage and a lot clearer.
Anyways, I patched his (much nicer looking) version. Basically you're going to need to download dnsutils for the fix.
The fix works by checking with dig whether the host resolves to more than 1 IP; if it does, it takes one of the IPs, replaces the hostname with it, and passes that to aria2c.
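The relevant bit looks roughly like this (a simplified sketch of the idea, not the exact code from the script; $url stands in for whatever mirror URL is being fetched):
# pin the hostname to a single IP before handing the URL to aria2c
host=$(echo "$url" | sed 's|.*://||; s|/.*||')
ips=$(dig +short "$host" | grep -E '^[0-9.]+$')
if [ "$(echo "$ips" | wc -l)" -gt 1 ]; then
    ip=$(echo "$ips" | head -n 1)
    url=$(echo "$url" | sed "s|$host|$ip|")
fi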
So there you have it, the glitch is finally fixed.
Please report any more problems here, or PM me.
sabooky wrote: The problem with the timeouts isn't from my script, it's from aria2c and the fact that ftp.archlinux.org resolves to 2 different IP addresses.
I haven't tested this, but the latest aria2 0.10.2 might have fixed this bug:
To cache resolved hostname:
* src/AbstractCommand.h, src/AbstractCommand.cc
(resolveHostname): Put outside #ifdef ENABLE_ASYNC_DNS clause.
Added dns cache.
Simpler/Faster downloads with error recovery - http://www.metalinker.org/
It would be nice to have this integrated into pacman by default.
This way the load is shared over different servers, and the user can enjoy a higher download speed.
In most cases the bandwidth limit of the internet provider will cap the download speed, so the transfer just ends up spread over several servers at a reduced per-server speed.
Jan
Last edited by Lontronics (2007-03-28 19:31:57)
I think that would be really good.
AFAIK no other distribution's tools do this. It could be a real time-saving benefit to Arch users, and groundbreaking too.
Simpler/Faster downloads with error recovery - http://www.metalinker.org/
aria2 0.11.0+ now has the ability to download multiple files concurrently: "* Now downloads multiple files concurrently. See -j option."
Could this, along with delta/diff packages, be the next evolution in package updates? It should at least be a nice improvement.
Simpler/Faster downloads with error recovery - http://www.metalinker.org/
For some reason I'm not grasping this concept. How does the -j option work? Like I tried:
#aria2c -j2 www.test.com/foo.zip www.anothersite.com/bar.zip
and also tried
#aria2c -j2 -M foo.metalink -M bar.metalink
I think I'm misunderstanding it or something.
It's confusing. I asked for an example on the site and in the documentation.
here's the example the author gave me:
aria2c -l /tmp/log.txt -j 15 -i urls.txt -d /tmp
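From what I understand of the docs, the -i file lists one download per line, and you can give several mirrors for the same file on one line separated by a TAB. So something like this (hostnames made up):
# urls.txt: foo gets two mirrors (tab-separated), bar gets one
http://mirror1.example.com/foo.pkg.tar.gz	http://mirror2.example.com/foo.pkg.tar.gz
http://mirror1.example.com/bar.pkg.tar.gz
Then aria2c -j2 -i urls.txt downloads foo and bar at the same time, each from its listed mirrors.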
Simpler/Faster downloads with error recovery - http://www.metalinker.org/
I have a problem: I followed the wiki's instructions and installed pacget; however, it gives me this error:
sh: /usr/bin/pacget: /bin/bash^M: bad interpreter: no such file or directory
sh: /usr/bin/pacget: /bin/bash^M: Success
The last time I used pacget I installed it the same way and it worked fine. I chmod'd the thing to 755 and put in the XferCommand.
The mirror I selected during the installation was ftp.archlinux.org. I kinda think that was a little stupid, but I thought that if I use pacman I should use the official mirror as the main one so that I don't download out-of-date things. (I just installed recently with Duke.)
Thanks.
I think it's a line-ending problem. Probably the file was uploaded from Windows (it has DOS line terminators instead of UNIX ones). If you open a Windows file in Linux, you see all these pesky ^Ms, which cause scripts to go haywire. I personally use emacs to get rid of them... You can do it with any good command-line text editor, but the GUI editors usually don't show the ^Ms.
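If you don't want to hand-edit them out, something like this should work too (assuming the script really is at /usr/bin/pacget as your error says; the .bak keeps a backup copy):
# strip the DOS carriage returns (^M) in place, keeping a backup
sed -i.bak 's/\r$//' /usr/bin/pacget
There's also a dos2unix utility that does the same thing.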
Last edited by abhidg (2007-08-03 15:51:00)
Oh, is that so? I did take it from Windows because I hadn't gotten a browser on Linux yet. So I just need to delete every ^M? (Or get it from Linux.)
Thank you very much for your help.
Last edited by sokuban (2007-08-06 04:14:55)