brendan wrote:your patch says it requires libdownload>=1.3
I can't seem to find that in the repos
$ pacman -Qip /tmp/pacman-3.3.1-1-x86_64.pkg.tar.gz
Name           : pacman
Version        : 3.3.1-1
URL            : http://www.archlinux.org/pacman/
Licenses       : GPL
Groups         : base
Provides       : None
Depends On     : bash  libarchive>=2.6.0  libdownload>=1.3  pacman-mirrorlist
...
Actually, his patch says nothing about libdownload; this is just a broken package that someone built. You should probably find out what PKGBUILD he used...
OK, how do I fix this? I now have a perfectly working Arch PC that can't install a DE. Kind of annoying. If I manually change to wget or curl, will it work?
I need this machine to do work on, and right now I don't have a working OS.
brendan wrote:OK, how do I fix this? I now have a perfectly working Arch PC that can't install a DE. Kind of annoying. If I manually change to wget or curl, will it work?
I need this machine to do work on, and right now I don't have a working OS.
Either use abs and edit the pacman PKGBUILD to apply the patch, or try using XferCommand (rough sketches of both below).
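A rough sketch of the abs route, in case it helps; the build directory is just an example, and you still have to add the patch to the PKGBUILD's source array and apply it in build():
abs
cp -r /var/abs/core/pacman ~/pacman-build
cd ~/pacman-build
# edit PKGBUILD: add the patch to source(), apply it in build(), bump pkgrel
makepkg -s
pacman -U pacman-3.3.1-*-x86_64.pkg.tar.gz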
Actually, I would like to know, for everyone hit by this bug, whether using wget or curl with XferCommand leads to perfectly stable downloads, without interruptions of any kind.
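For anyone who wants to test that, a minimal XferCommand setup in /etc/pacman.conf could look like the line below; take the exact wget flags as a sketch rather than a recommendation (-c resumes partial downloads, -O writes to the file pacman expects):
XferCommand = /usr/bin/wget -c -O %o %u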
pacman roulette : pacman -S $(pacman -Slq | LANG=C sort -R | head -n $((RANDOM % 10)))
Using wget as the command, it is perfectly stable. I just installed gnome and gnome-extra (and, because I made a mistake, 32-bit kdemod-complete), and I'm now reinstalling 64-bit kdemod-complete after deleting the previous one. Pretty intensive stuff, I would say, and it's perfectly stable, albeit a lot uglier than normal pacman (wget dumps a lot of stuff when downloading...)
brendan wrote:Using wget as the command, it is perfectly stable. I just installed gnome and gnome-extra (and, because I made a mistake, 32-bit kdemod-complete), and I'm now reinstalling 64-bit kdemod-complete after deleting the previous one. Pretty intensive stuff, I would say, and it's perfectly stable, albeit a lot uglier than normal pacman (wget dumps a lot of stuff when downloading...)
It's really never interrupted? Note that when interrupted, it could go on using a second mirror, etc.
So you could have a complete download in the end, but you actually used 10 mirrors to get to that point.
Just to be sure this is not the case, try enabling only one mirror in your mirrorlist.
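For reference, "only one mirror" just means a single Server line in /etc/pacman.d/mirrorlist with everything else commented out; the mirror below is only an example:
Server = http://mirrors.kernel.org/archlinux/$repo/os/x86_64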
pacman roulette : pacman -S $(pacman -Slq | LANG=C sort -R | head -n $((RANDOM % 10)))
shining wrote:brendan wrote:Using wget as the command, it is perfectly stable. I just installed gnome and gnome-extra (and, because I made a mistake, 32-bit kdemod-complete), and I'm now reinstalling 64-bit kdemod-complete after deleting the previous one. Pretty intensive stuff, I would say, and it's perfectly stable, albeit a lot uglier than normal pacman (wget dumps a lot of stuff when downloading...)
It's really never interrupted? Note that when interrupted, it could go on using a second mirror, etc.
So you could have a complete download in the end, but you actually used 10 mirrors to get to that point.
Just to be sure this is not the case, try enabling only one mirror in your mirrorlist.
It only messed up once, where it stopped and restarted during the reinstall of kdemod4-complete 64-bit. Without using wget it would have failed 4 lines in.
But yes, my connection is on the dodgy side, so it's very possible it dropped a little, but not as often as every 4 packages like pacman was telling me. My netbook on the same router never saw a drop in connection; I set it to ping google.com infinitely and display only failures. It only spat something out once during the install of kdemod, but that was with quite a big number of packages.
shining: wget actually "waits". If a connection stops abruptly for whatever reason, it just pauses (goes to 0 KB/s). Then when the connection is back up, it simply continues. As such, one mirror can appear to suffice. Whether pacman falls back to the next mirror depends on the wget configuration, i.e. how long wget waits before stopping (returning an error code). I don't think this is the same as the number of "retries", but I could be wrong.
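For anyone curious, that patience can be tuned on the wget XferCommand line itself; the values below are only illustrative, not something I have tested against flaky links (--timeout caps how long wget waits on a stalled connection, --tries and --waitretry control the retry behaviour):
XferCommand = /usr/bin/wget --timeout=15 --tries=3 --waitretry=10 -c -O %o %u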
} else if(retval != 0) {
    /* download failed */
    pm_printf(PM_LOG_DEBUG, "XferCommand command returned non-zero status "
            "code (%d)\n", retval);
    ret = -1;
I believe pacman needs an error return for falling back to the next mirror. By default, using pacman's default fetcher, if a connection stops it returns an error, and pacman then falls back to the next mirror.
And as for curl, it's the same: it returns an error if a connection is disrupted. But anyway, it will not resume without -C, so pacman needs:
XferCommand = /usr/bin/curl -C - %u > %o
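As a side note (and going from memory of the commented example shipped in pacman.conf, so treat it as a sketch), adding -f makes curl exit non-zero on HTTP errors as well, which gives pacman the failure signal it needs to move on to the next mirror:
XferCommand = /usr/bin/curl -f -C - %u > %o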
Moreover, not everyone here has such an unstable connection. It could just be a momentary loss of packet transmission, though I don't know why the reporters here have only faced this recently (wasn't the migration to libfetch done in January?). As for me, it's been this way for about half a year (ever since I cut my cable contract and got myself an HSDPA one), so it matches up.
I need real, proper pen and paper for this.
My bad, built pacman against the latest PKGBUILD and reuploaded:
shining wrote:brendan wrote:Using wget as the command, it is perfectly stable. I just installed gnome and gnome-extra (and, because I made a mistake, 32-bit kdemod-complete), and I'm now reinstalling 64-bit kdemod-complete after deleting the previous one. Pretty intensive stuff, I would say, and it's perfectly stable, albeit a lot uglier than normal pacman (wget dumps a lot of stuff when downloading...)
It's really never interrupted? Note that when interrupted, it could go on using a second mirror, etc.
So you could have a complete download in the end, but you actually used 10 mirrors to get to that point.
Just to be sure this is not the case, try enabling only one mirror in your mirrorlist.
It only messed up once, where it stopped and restarted during the reinstall of kdemod4-complete 64-bit. Without using wget it would have failed 4 lines in.
But yes, my connection is on the dodgy side, so it's very possible it dropped a little, but not as often as every 4 packages like pacman was telling me. My netbook on the same router never saw a drop in connection; I set it to ping google.com infinitely and display only failures. It only spat something out once during the install of kdemod, but that was with quite a big number of packages.
But it must drop somehow. I am not sure what wget does to handle this more gracefully though.
Another user who had the same problem ( http://mailman.archlinux.org/pipermail/ … 07777.html ) sent me a wireshark trace.
It shows a LOT of TCP previous segment lost / dup ack / retransmission.
This seems to indicate a lot of packet loss.
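In case anyone else wants to capture a similar trace, something along these lines should do (the interface name and the HTTP-only filter are assumptions about your setup); the resulting .pcap can then be opened in wireshark:
tcpdump -i eth0 -w pacman-download.pcap port 80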
Could you also check the behavior of other downloaders to get a better picture?
Thanks
pacman roulette : pacman -S $(pacman -Slq | LANG=C sort -R | head -n $((RANDOM % 10)))
shining: wget actually "waits". If a connection stops abruptly for whatever reason, it just pauses (goes to 0 KB/s). Then when the connection is back up, it simply continues. As such, one mirror can appear to suffice. Whether pacman falls back to the next mirror depends on the wget configuration, i.e. how long wget waits before stopping (returning an error code). I don't think this is the same as the number of "retries", but I could be wrong.
I see, that's interesting. Though it's not really a bug/misbehavior of libfetch; it's just less nice behavior for bad connections, I guess.
} else if(retval != 0) {
    /* download failed */
    pm_printf(PM_LOG_DEBUG, "XferCommand command returned non-zero status "
            "code (%d)\n", retval);
    ret = -1;
I believe pacman needs an error return for falling back to the next mirror. By default, using pacman's default fetcher, if a connection stops it returns an error, and pacman then falls back to the next mirror.
It really doesn't do that with XferCommand? I really thought it did. I just reviewed the code again, and I don't see why it would not work.
And as for curl, it's the same: it returns an error if a connection is disrupted. But anyway, it will not resume without -C, so pacman needs:
XferCommand = /usr/bin/curl -C - %u > %o
Good to know.
Moreover, not everyone here has such an unstable connection. It could just be a momentary loss of packet transmission, though I don't know why the reporters here have only faced this recently (wasn't the migration to libfetch done in January?). As for me, it's been this way for about half a year (ever since I cut my cable contract and got myself an HSDPA one), so it matches up.
No, it was done with the pacman 3.3.0 release on 2009-08-02.
With libdownload, I would expect the behavior to be similar to the libfetch one with a fixed pacman: in case of too much packet loss, it just prints an error and switches to the next mirror.
pacman roulette : pacman -S $(pacman -Slq | LANG=C sort -R | head -n $((RANDOM % 10)))
Ahh, alright. Well, after your fix, I don't see any more problems. It's just back to the intended behaviour (trying fallback mirrors), so that's good.
shining wrote:It really doesn't do that with XferCommand? I really thought it did. I just reviewed the code again, and I don't see why it would not work.
It does. But wget does not give up (timeout settings?), that is why. Curl, however, does give up. So for really troubled connections, wget was and still is the better option, because there is simply no other cure aside from subscribing to a better ISP. I will stick to libfetch because wget has ugly output.
I need real, proper pen and paper for this.
i686 version of the patched pacman:
Having gruesome no-space-left-on-drives issues here as well.
Thanks for pointing out what's going on.
SO: WHERE are these BIG files, and can I delete them?
yvonney wrote:Having gruesome no-space-left-on-drives issues here as well.
Thanks for pointing out what's going on.
SO: WHERE are these BIG files, and can I delete them?
Where all the pacman cache is: /var/cache/pacman/pkg/...
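For example, to see how big the cache is and trim it with pacman's own cache-cleaning options (-Scc wipes the whole cache, so only use it if you really want everything gone):
du -sh /var/cache/pacman/pkg
pacman -Sc     # remove cached packages that are no longer installed
# pacman -Scc  # remove every cached package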
Allan - Volunteer on the (topic being discussed) mailing lists. You never get the attention of the people who matter on the forums.
jasonwryan-Installing Arch is a measure of your literacy. Maintaining Arch is a measure of your diligence. Contributing to Arch is a measure of your competence.
Griemak-Bleeding edge, not bleeding flat. Edge denotes falls will occur from time to time. Bring your own parachute.
I'd cleared all that out to my archives; pacman/pkg is empty...
I did find 4 gig of log files though!
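If it helps anyone else hunting for the space, a rough way to see what is eating /var (the paths and the size threshold are just examples):
du -x /var | sort -n | tail -n 20
find /var/log -type f -size +100M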