From Archiveteam
Revision as of 10:06, 24 August 2012 by Anonymous (talk) (minor updates)

GNU Wget is a free utility for non-interactive download of files from the Web. Using Wget, it is possible to grab a large chunk of data, or mirror an entire website with its complete directory tree, using a single command. In the tool belt of the renegade archivist, Wget tends to get an awful lot of use. (Note: Some people prefer to use cURL. If it can back up data, it's useful.)

This guide will not attempt to explain all possible uses of Wget; rather, this is intended to be a concise intro to using Wget, specifically geared towards using the tool to archive data such as podcasts, PDF documents, or entire websites. Issues such as using Wget to circumvent user-agent checks, or robots.txt restrictions, will be outlined as well.

Mirroring a website

When you run something like this:

wget http://icanhascheezburger.com/

...Wget will just grab the first page it hits, usually something like index.html. If you give it the -m flag:

wget -m http://icanhascheezburger.com/

...then Wget will happily slurp down anything within reach of its greedy claws, putting files in a complete directory structure. Go make a sandwich or something.

You'll probably want to pair -m with -c (which tells Wget to continue partially-complete downloads) and -b (which tells Wget to fork to the background, logging to wget-log).

If you want to grab everything in a specific directory - say, the SICP directory on the MIT Press website - use the -np flag:

wget -mbc -np http://mitpress.mit.edu/sicp

This will tell Wget to not go up the directory tree, only downwards.

User-agents and robots.txt

By default, Wget plays nicely with a website's robots.txt. This can lead to situations where Wget won't grab anything, since the robots.txt disallows Wget.

To avoid this: first, you should try using the --user-agent option:

wget -mbc --user-agent="" http://website.com/

This instructs Wget to not send any user agent string at all. Another option for this is:

wget -mbc -e robots=off http://website.com/

...which tells Wget to ignore robots.txt directives altogether.

You can add --wait=1 to insert a one-second delay between requests, to be nicer to the server.
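
Putting the options above together, a polite-but-persistent mirror might look like this (website.com stands in for the real target, as in the earlier examples):

```shell
# mirror recursively (-m), resume partial downloads (-c),
# fork to the background logging to wget-log (-b),
# ignore robots.txt, send no user-agent string,
# and wait 1 second between requests
wget -m -c -b -e robots=off --user-agent="" --wait=1 http://website.com/
```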


Compression

Wget doesn't use compression by default! This can make a big difference when you're downloading easily compressible data, like human-language HTML text, but it doesn't help at all when downloading material that is already compressed, like JPEG or PNG files. To enable compression, use:

wget --header="Accept-Encoding: gzip" http://website.com/

This will produce a file (if the remote server supports gzip compression) that uses the .html extension, but is actually gzip-encoded, which can be confusing.
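
One way to spot and recover such a file (a local sketch; page.html stands in for a file Wget saved, and the echo simulates what the server sent):

```shell
# simulate a gzip-encoded body that got saved with an .html extension
echo '<html>hello</html>' | gzip -c > page.html

# gzip files start with the magic bytes 1f 8b
head -c 2 page.html | od -An -tx1

# rename and decompress to get readable HTML back
mv page.html page.html.gz
gunzip page.html.gz
cat page.html
```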

Any vaguely modern server can sustain thousands of simultaneous text downloads, with video and large images being the big-ticket items. But sites using outdated hardware, or run by habitual whiners, will complain when a site scrape uses 200 megabytes of transfer when it could have used 100.

Tricks and Traps

  • A standard methodology to prevent scraping of websites is to block access via user agent string. Wget is a good web citizen and identifies itself. Renegade archivists are not good web citizens in this sense. The --user-agent option will allow you to act like something else.
  • Some websites are actually aggregates of multiple machines and subdomains working together. (For example, a site called dyingwebsite.com may have additional machines like download.dyingwebsite.com or mp3.dyingwebsite.com.) To account for this, add the following options: -H -Ddyingwebsite.com
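
For the user-agent trick above, one common approach is to impersonate a browser (the string below is just an example browser identification, not anything Wget requires; website.com is a placeholder):

```shell
# claim to be Firefox instead of identifying as Wget
wget -m -c --user-agent="Mozilla/5.0 (Windows NT 6.1; rv:10.0) Gecko/20100101 Firefox/10.0" http://website.com/
```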

Wget for Windows

Windows users can download Wget for Windows, part of the GNUWin32 project. After installation, you will probably want to add it to your Path so that you can run it directly from the command prompt instead of specifying its absolute file path (i.e. "wget" instead of "C:\Program Files\GNUWin32\bin\wget.exe").

These are the instructions for Windows 7 users. Prior versions should be relatively similar.

  1. Install Wget
  2. Right-click My Computer and select Properties
  3. Select Advanced System Settings from the left
  4. Click the Environment Variables button in the bottom-right corner
  5. Under System Variables, find the Path variable and click Edit
  6. Carefully insert the path to Wget's bin folder followed by a semi-colon. Getting this wrong could cause some nasty system problems
    • Your Wget path should be inserted like this: C:\Program Files\GnuWin32\bin;
  7. When done, click OK through all the dialog boxes you opened
  8. The changes should apply immediately under Windows 7. Older versions may require a reboot
  9. To test the settings, open a command prompt and enter "wget"

Parallel downloading
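
Wget itself fetches one file at a time, but you can run several copies at once. A rough sketch using xargs (urls.txt is a hypothetical file with one URL per line):

```shell
# run up to 4 wget processes in parallel, one URL each;
# -c lets interrupted downloads be resumed on a later run
xargs -n 1 -P 4 wget -c < urls.txt
```

Keep the parallelism low; the point above about sites on outdated hardware applies four times over here.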

