Linux wget: your ultimate command line downloader

Posted: October 14, 2008 in LINUX, SYSTEM UTILITY

Download a single file using wget

$ wget http://example.com/file.tar.gz
$ wget ftp://example.com/pub/file.tar.gz

(The example.com URLs throughout this post are placeholders; substitute the real download URL.)
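If you want the downloaded file saved under a different local name, wget's standard -O option can be used (the file names here are just illustrative):
$ wget -O latest.tar.gz http://example.com/file.tar.gz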

Download multiple files on the command line using wget

$ wget http://example.com/file1.tar.gz ftp://example.com/pub/file2.tar.gz

i) Create a variable that holds all the URLs, then use a BASH for loop to download each file:
$ URLS="http://example.com/file1.tar.gz ftp://example.com/pub/file2.tar.gz"

ii) Use the for loop as follows:
$ for u in $URLS; do wget "$u"; done

iii) However, a better way is to put all the URLs in a text file and use wget's -i option to download all the files:

(a) Create a text file using vi:
$ vi /tmp/download.txt

Add the list of URLs, one per line, for example:
http://example.com/file1.tar.gz
ftp://example.com/pub/file2.tar.gz

(b) Run wget as follows:
$ wget -i /tmp/download.txt

(c) Force wget to resume a download
You can use the -c option with wget. This is useful when you want to finish a download started by a previous instance of wget after the network connection was lost. In such a case you can add the -c option as follows:
$ wget -c http://example.com/file.tar.gz
$ wget -c -i /tmp/download.txt
Please note that not all FTP/HTTP servers support the download resume feature.
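One quick way to check whether an HTTP server supports resuming (an extra check, not part of the original tip; the URL is a placeholder) is to fetch only the response headers with wget's --spider and -S options and look for Accept-Ranges:
$ wget -S --spider http://example.com/file.iso 2>&1 | grep -i 'accept-ranges'

If the server advertises Accept-Ranges: bytes, the -c option should work against it.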

Force wget to download all files in the background, and log the activity in a file:

$ wget -cb -o /tmp/download.log -i /tmp/download.txt

OR

$ nohup wget -c -o /tmp/download.log -i /tmp/download.txt &

nohup runs the given COMMAND (in this example wget) with hangup signals ignored, so that the command can continue running in the background after you log out.
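To keep an eye on a backgrounded download, simply follow the log file it writes:
$ tail -f /tmp/download.log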

Limit the download speed to a given number of bytes/kilobytes per second.

This is useful when you download a large file, such as an ISO image. Recently one of our admins started to download a SuSE Linux DVD on a production server for evaluation purposes. Soon wget started to eat up all the bandwidth; no need to predict the end result of such a disaster.
$ wget -c -o /tmp/susedvd.log --limit-rate=50k http://example.com/suse-dvd.iso

Use the m suffix for megabytes (--limit-rate=1m). The above command will limit the retrieval rate to 50KB/s. It is also possible to specify a disk quota for automatic retrievals to avoid a disk DoS attack. The following command will be aborted when the quota (100MB) is exceeded:
$ wget -cb -o /tmp/download.log -i /tmp/download.txt --quota=100m
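Note that, per the wget manual, the quota never affects downloading a single file given on the command line; it only applies to recursive retrievals and URL lists. A quick illustration (URLs are placeholders):
$ wget --quota=100m http://example.com/big.iso    # quota ignored: single URL
$ wget --quota=100m -i /tmp/download.txt          # quota enforced: URL list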
Use an HTTP username/password on an HTTP server:
$ wget --http-user=foo --http-password=bar http://example.com/protected/file.tar.gz
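Keep in mind that passwords given on the command line are visible to other users in the process list. A safer variant (a sketch; foo/bar are placeholder credentials) is to store them in ~/.wgetrc and lock the file down:
$ cat >> ~/.wgetrc << 'EOF'
http_user = foo
http_password = bar
EOF
$ chmod 600 ~/.wgetrc
$ wget http://example.com/protected/file.tar.gz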
Download all mp3 or pdf files from a remote FTP server:
Generally you can use shell special characters, aka wildcards, such as *, ?, and [] to specify selection criteria for files. The same can be used with FTP servers while downloading files.
$ wget ftp://example.com/pub/downloads/*.pdf
$ wget ftp://example.com/pub/music/*.mp3

OR

$ wget -g on ftp://example.com/pub/downloads/*.pdf

Use aget when you need a multithreaded HTTP download:
aget fetches HTTP URLs in a manner similar to wget, but segments the retrieval into multiple parts to increase download speed. It can be many times as fast as wget in some circumstances (it is just like FlashGet under MS Windows, but with a CLI):
$ aget -n=5 http://example.com/soft1.tar.gz

The above command will download soft1.tar.gz in 5 segments.
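Since aget stitches the file together from several segments, it is worth verifying the result whenever the site publishes a checksum (an extra step, not part of the original article):
$ md5sum soft1.tar.gz

Compare the printed hash against the one published on the download page.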

Command to resume file download with wget

After reading the man page, I found the -c option. It will continue getting a partially downloaded file. This is useful when you want to finish a download started by a previous instance of wget, or by another program.
$ wget -c http://example.com/file.iso
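To see -c in action (the URL is a placeholder), start a download, interrupt it part-way with Ctrl+C, then rerun the same command with -c; wget will pick up where the partial file left off:
$ wget http://example.com/big.iso
$ wget -c http://example.com/big.iso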

Here is a quick tip: if you wish to perform an unattended download of a large file, such as a Linux DVD ISO, use wget as follows:

$ wget -bqc http://example.com/linux-dvd.iso


=> -b : Go to the background immediately after startup. If no output file is specified via the -o option, output is redirected to wget-log.

=> -q : Turn off wget's output, which also saves disk space (the log file will not grow).

=> -c : Resume a broken download, i.e. continue getting a partially-downloaded file. This is useful when you want to finish a download started by a previous instance of wget, or by another program.

This tip will save you time while downloading large ISO images from the internet.
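Because -q suppresses all output, the usual wget-log file will stay empty, so check on an unattended download by watching the partial file itself (the file name is a placeholder):
$ ls -lh linux-dvd.iso    # the size should keep growing
$ pgrep wget              # prints a PID while wget is still running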

You can also use the nohup command so the download keeps running after you exit from the shell prompt:
$ nohup wget http://example.com/linux-dvd.iso &
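If you forgot nohup and wget is already running in the foreground, bash can still rescue it (a sketch; this assumes wget was started with -o or -q so it is not writing to the terminal). Press Ctrl+Z to suspend the job, then resume it in the background and detach it from the shell:
$ bg
$ disown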

