
#1 2007-11-14 04:55:42

hussam
Member
Registered: 2006-03-26
Posts: 572
Website

Command line download accelerator

Prozilla rocks, but lately it segfaults a lot. Does anyone know of any other command line download accelerators? I'm looking for something that can split downloads into multiple connections like prozilla does.
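
For anyone unfamiliar with how these tools split a download: roughly, the client requests separate byte ranges of the same file over several connections and writes each piece into place. Here is a rough, untested sketch of fetching a single range with libcurl; the URL, output file name, and range are made up for illustration, and a real accelerator would run several of these transfers at once:

#include <stdio.h>
#include <curl/curl.h>

/* Write whatever arrives on this connection into the segment file. */
static size_t write_cb( char *data, size_t size, size_t nmemb, void *userp )
{
    return fwrite( data, size, nmemb, (FILE *) userp );
}

int main( void )
{
    CURL *curl;
    CURLcode res;
    FILE *out = fopen( "segment0.part", "wb" );    /* hypothetical output file */

    curl_global_init( CURL_GLOBAL_DEFAULT );
    curl = curl_easy_init();
    if( !curl || !out )
        return 1;

    curl_easy_setopt( curl, CURLOPT_URL, "http://example.com/big.iso" );
    /* Ask the server for bytes 0..1048575 only; each parallel connection
       of an accelerator would use its own range like this. */
    curl_easy_setopt( curl, CURLOPT_RANGE, "0-1048575" );
    curl_easy_setopt( curl, CURLOPT_WRITEFUNCTION, write_cb );
    curl_easy_setopt( curl, CURLOPT_WRITEDATA, out );

    res = curl_easy_perform( curl );
    if( res != CURLE_OK )
        fprintf( stderr, "download failed: %s\n", curl_easy_strerror( res ) );

    curl_easy_cleanup( curl );
    curl_global_cleanup();
    fclose( out );
    return 0;
}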

Last edited by hussam (2007-11-14 04:56:23)

Offline

#2 2007-11-14 05:04:02

Allan
Pacman
From: Brisbane, AU
Registered: 2007-06-09
Posts: 11,384
Website

Re: Command line download accelerator

aria2?

Offline

#3 2007-11-14 06:18:25

somairotevoli
Member
Registered: 2006-05-23
Posts: 335

Re: Command line download accelerator

I like axel; it's in the AUR.

Offline

#4 2007-11-15 09:48:20

floke
Member
Registered: 2007-09-04
Posts: 266

Re: Command line download accelerator

+1 for axel

Offline

#5 2007-11-15 21:56:45

hussam
Member
Registered: 2006-03-26
Posts: 572
Website

Re: Command line download accelerator

Thanks for the suggestions. I will install axel.

Edit: I installed axel and removed prozilla. It's running very nicely.
Btw, axel --version says 2002; does anyone know if it is still being developed?

Last edited by hussam (2007-11-16 09:46:33)

Offline

#6 2007-11-16 21:17:51

FizDev
Member
From: Canada
Registered: 2007-06-29
Posts: 57

Re: Command line download accelerator

hussam wrote:

Btw, axel --version says 2002; does anyone know if it is still being developed?

Well, since Axel is now in the "Unmaintained stuff" section of the official site... I suppose he stopped developing it.

Axel's official site: http://wilmer.gaast.net/main.php/axel.html

Offline

#7 2007-11-17 03:06:10

MrWeatherbee
Member
Registered: 2007-08-01
Posts: 277

Re: Command line download accelerator

Just tried axel, and for the files it will download, it is very fast.

However, based on my [limited] use, it seems unable to download anything larger than 2 GB (2,147,483,648 bytes).*

For example, it fails to download DVD ISOs (the test ISOs ranged from ~3.5 to 4.3 GB) and gives these errors for the attempted downloads:

Initializing download: http://somemirror.org/pub/some-iso/DVDs/some/releasename/release/some-1.0-dvd-i386.iso
File size: -2147483648 bytes
Opening output file some-1.0-dvd-i386.iso
Crappy filesystem/OS.. Working around. :-(
Starting download

Connection 3 finished                                                          
Write error!

Downloaded 0 bytes in 0 seconds. (0.00 KB/s)
Segmentation fault

Notice the File size in the error message.
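
As a side note, -2,147,483,648 is exactly INT_MIN for a signed 32-bit integer, so the output is consistent with the file size being kept in a plain 32-bit int somewhere (a guess on my part; I haven't traced the relevant code). A tiny standalone sketch of the wrap-around on a typical two's-complement system:

#include <stdio.h>
#include <stdint.h>
#include <limits.h>

int main( void )
{
    /* 2 GiB held in a 64-bit type: one more than INT_MAX. */
    int64_t two_gib = INT64_C( 1 ) << 31;             /* 2147483648 */

    /* Forcing it into a signed 32-bit int wraps to INT_MIN on gcc/x86
       (strictly implementation-defined), i.e. the -2147483648 printed
       above as the "File size". */
    int32_t as_int = (int32_t) two_gib;

    printf( "INT_MAX       = %d\n", INT_MAX );        /* 2147483647  */
    printf( "2 GiB         = %lld\n", (long long) two_gib );
    printf( "as 32-bit int = %d\n", as_int );         /* -2147483648 */
    return 0;
}

The 2,147,483,647 bytes reported by the resumed download further down fits the same picture: it is INT_MAX, the largest value such an int can hold.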

If the File size cannot be retrieved initially (as happens with a certain Sabayon mirror, for example), the download starts successfully, as shown in the output below (note the absence of the File size line):

Initializing download: ftp://mirror.cs.vt.edu/pub/SabayonLinux/SabayonLinux-x86-3.4f.iso
Opening output file SabayonLinux-x86-3.4f.iso
Starting download

[  0%] [0123] [ 723.4KB/s] [48:04]

Downloaded 10123.1 kilobytes in 13 seconds. (723.41 KB/s)

If the Sabayon download is canceled and then resumed, the file begins downloading again, but notice that the State file found line now puts the size of the download at 2,147,483,647 bytes (i.e., 10,366,085 bytes downloaded plus 2,137,117,562 to go); in actuality, the Sabayon ISO is ~4,424,314,000 bytes:

Initializing download: ftp://mirror.cs.vt.edu/pub/SabayonLinux/SabayonLinux-x86-3.4f.iso
Opening output file SabayonLinux-x86-3.4f.iso
State file found: 10366085 bytes downloaded, 2137117562 to go.
Starting download

[  1%] [0123] [ 701.6KB/s] [49:03]

Downloaded 20.6 megabytes in 30 seconds. (701.57 KB/s)

I'm not sure what would happen if I let this download continue. Would it:

- eventually cancel due to errors;
- continue until it reached 2,147,483,648 bytes and then report a successful download (though it would really be incomplete); or
- continue past the reported size and complete the full 4,424,314,000-byte download?

I tested with ReiserFS and ext3 file systems.**

Is anyone besides me seeing this behavior with Axel? I can use other downloaders (aria2, wget, and even prozilla) with no problems at all.

Thanks.

---------
*Note 1:
My "2 GB limit" theory is based primarily on the 2,147,483,648-byte (2 GB) boundary suggested by various parts of Axel's output. The theory would be more sound if I had test files of exact sizes to download; however, I was only able to test with files below 1 GB or above 3 GB. I have no actual results for files between 1 GB and 3 GB, since files in that range are awkward sizes and hard to find.

**Note 2:
Looking at the source code suggests that the author's idea of a "crappy filesystem/OS" is one that can't handle seeks to past-EOF areas (this relates to Axel's habit of writing all the data into the right place in the output file at download time):

/* And check whether the filesystem can handle seeks to
   past-EOF areas.. Speeds things up. :) AFAIK this
   should just not happen: */
if( lseek( axel->outfd, axel->size, SEEK_SET ) == -1 && axel->conf->num_connections > 1 )
{
    /* But if the OS/fs does not allow to seek behind
       EOF, we have to fill the file with zeroes before
       starting. Slow.. */
    axel_message( axel, _("Crappy filesystem/OS.. Working around. :-(") );
    lseek( axel->outfd, 0, SEEK_SET );
    memset( buffer, 0, axel->conf->buffer_size );
    i = axel->size;
    while( i > 0 )
    {
        write( axel->outfd, buffer, min( i, axel->conf->buffer_size ) );
        i -= axel->conf->buffer_size;
    }
}

However, since I tested with:

- "crappy file system [ReiserFS] / OS [Arch]" and
- "crappy file system [ext3] / OS [Arch]"

both of which are evidently capable of seeking past EOF (no such message appears for smaller files), it seems that the "crappy" message is triggered by a file-size problem rather than by any real lack of seeking capability.
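
A hedged guess at the mechanism: POSIX lseek() fails with EINVAL when the resulting offset would be negative, so if axel->size has already wrapped to a negative value (as the "File size: -2147483648" line suggests), the check above would fail on any filesystem, regardless of its past-EOF behavior. Here is a small standalone sketch (the scratch file name is made up) showing that a past-EOF seek succeeds on an ordinary Linux filesystem while a negative offset does not:

#include <stdio.h>
#include <string.h>
#include <errno.h>
#include <fcntl.h>
#include <unistd.h>

int main( void )
{
    /* Hypothetical scratch file, just for the demonstration. */
    int fd = open( "seek-test.tmp", O_WRONLY | O_CREAT | O_TRUNC, 0644 );
    if( fd == -1 )
    {
        perror( "open" );
        return 1;
    }

    /* Seeking 1 GiB past EOF is fine; nothing is allocated, and the
       file only grows (sparsely) once data is written at that offset. */
    if( lseek( fd, (off_t) 1 << 30, SEEK_SET ) == -1 )
        printf( "past-EOF seek failed: %s\n", strerror( errno ) );
    else
        printf( "past-EOF seek OK\n" );

    /* A negative offset, however, is rejected with EINVAL, which is
       exactly what an overflowed 32-bit size would hand to lseek(). */
    if( lseek( fd, (off_t) -2147483648LL, SEEK_SET ) == -1 )
        printf( "negative-offset seek failed: %s\n", strerror( errno ) );

    close( fd );
    unlink( "seek-test.tmp" );
    return 0;
}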

Offline

#8 2007-12-12 17:37:40

twanj
Member
From: Pompano Beach, FL
Registered: 2006-08-02
Posts: 47
Website

Re: Command line download accelerator

Axel hasn't been maintained for years AFAIK, hence the 2 GB limit.

Why not use a modern downloader like aria2? Its author is actively working on making the program better and listens to users.


Simpler/Faster downloads with error recovery - http://www.metalinker.org/

Offline

#9 2007-12-12 20:26:05

MrWeatherbee
Member
Registered: 2007-08-01
Posts: 277

Re: Command line download accelerator

twanj wrote:

Axel hasn't been maintained for years AFAIK, hence the 2 GB limit.

Why not use a modern downloader like aria2? Its author is actively working on making the program better and listens to users.

Yes. I prefer prozilla but use aria2 as well on occasion. My preference for prozilla is that it seems consistently faster when used with pacman, and I have not had any of the segfaults that Hussam says he experienced.

However, since several posts recommended axel, and because the original poster decided upon axel, I thought I'd test it out and post the results.

Simply being 'unmaintained' (even for quite some time) does not by itself explain the file size limitation I experienced, and since I didn't find anything about it in the axel documentation or on the website, maybe the info is useful to someone. :)

Offline

#10 2007-12-13 19:00:41

twanj
Member
From: Pompano Beach, FL
Registered: 2006-08-02
Posts: 47
Website

Re: Command line download accelerator

MrWeatherbee wrote:

Simply being 'unmaintained' (even for quite some time) does not by itself explain the file size limitation I experienced, and since I didn't find anything about it in the axel documentation or on the website, maybe the info is useful to someone. :)

Most software didn't support 2 GB+ file sizes back then. Apache only recently added support, and many things still don't (Firefox, I think).


Simpler/Faster downloads with error recovery - http://www.metalinker.org/

Offline

#11 2008-09-07 17:21:01

dsr
Member
Registered: 2008-05-31
Posts: 187

Re: Command line download accelerator

Sorry to revive an old thread, but I just discovered that this problem with axel and large files has been fixed in SVN. I prefer axel to aria2 since it doesn't spend time preallocating space before the download and joining the pieces together afterward. Now that I can use axel for large files, I don't need aria2 for anything other than torrents and powerpill. Yay!

Offline

#12 2008-09-25 13:14:38

dsr
Member
Registered: 2008-05-31
Posts: 187

Re: Command line download accelerator

Axel 2.0 was released earlier this month and is in [community]. It supports large files!

Offline
