Sucking websites

Reader Tyler Moynihan is interested in doing a little offline reading. He writes:

Can I download an entire website so that I can view its contents as though I were on the web? What I mean is, could I hook my laptop up online and download all of a site and all of its links? Then, when my computer was done, I could jump on my next flight, open up Safari or some other program, and “browse” the fully downloaded site. And if this is possible, could I do it on command and/or schedule my computer to download a website every day, week, or month?

Geek that I am, I’m going to batch-process your questions and provide this single answer:

The tool I’d try is Limit Point Software’s $25 Blue Crab. By default it will suck the guts out of a website up to 99 layers deep and save the site in your user’s Documents folder. You needn’t grab the entire site, however: the program lets you choose how many layers down you want to go or, if you like, download just a single page.

To save time, you can filter what it downloads—for example, you can tell it to skip files larger than a certain size or modified before or after a particular date. It can even be configured to enter login information for password-protected sites.
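If you’re comfortable with the command line, the same sort of depth-limited mirroring and scheduling can be sketched with the free `wget` tool (not part of a stock Mac install; this assumes you’ve added it via a package manager such as MacPorts or Homebrew). The URL, folder, and schedule below are placeholders, not sites from the question:

```shell
# Mirror a site up to 5 links deep, rewriting links so pages work
# offline, fetching images/CSS needed to render each page, and never
# climbing above the starting directory. example.com is a placeholder.
wget --recursive --level=5 --convert-links --page-requisites \
     --no-parent --timestamping --quota=50m \
     --directory-prefix="$HOME/Documents/mirror" \
     https://example.com/

# --timestamping re-downloads a file only when the server copy is
# newer, and --quota=50m caps the total download at 50 MB.

# To repeat the download automatically (say, daily at 3 a.m.), add a
# line like this to your crontab via `crontab -e`:
# 0 3 * * * wget --recursive --level=5 --convert-links --page-requisites --no-parent --timestamping --directory-prefix="$HOME/Documents/mirror" https://example.com/
```

This is only a rough equivalent of what Blue Crab does through its interface; the app remains the friendlier choice.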

Give the 7-day demo a try and see how you like it.
