Introduction
Have you ever landed on a website full of pages or content you wanted, but without enough time to look through the site on the spot? If so, a website ripper is the software you need to download the whole site to your computer.
Once downloaded, you can browse the whole site whenever you want, even without an internet connection. The products reviewed here can download an entire website, including the images, style sheets and everything else the site owners uploaded to the server, so the site looks the same offline as it does online.
I hope this review helps you choose the free website ripper that best matches your requirements.
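As a rough illustration of what these tools do under the hood (this is not code from any of the reviewed products), the sketch below saves a single page together with the images and style sheets it references, using only Python's standard library; the URL and folder name are placeholders you would replace.

```python
# A minimal sketch of what a ripper does for one page: fetch the HTML,
# find the images and style sheets it references, and save everything
# into a local folder for offline viewing.
import os
import urllib.request
from urllib.parse import urljoin, urlparse
from html.parser import HTMLParser

class AssetCollector(HTMLParser):
    """Collects the URLs of images and style sheets referenced by a page."""
    def __init__(self):
        super().__init__()
        self.assets = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and attrs.get("src"):
            self.assets.append(attrs["src"])
        elif tag == "link" and attrs.get("rel") == "stylesheet" and attrs.get("href"):
            self.assets.append(attrs["href"])

def rip_page(url, folder="ripped_page"):
    os.makedirs(folder, exist_ok=True)
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")

    # Save the page itself.
    with open(os.path.join(folder, "index.html"), "w", encoding="utf-8") as f:
        f.write(html)

    # Save each referenced asset next to it, skipping any that fail.
    collector = AssetCollector()
    collector.feed(html)
    for asset in collector.assets:
        asset_url = urljoin(url, asset)
        name = os.path.basename(urlparse(asset_url).path) or "asset"
        try:
            urllib.request.urlretrieve(asset_url, os.path.join(folder, name))
        except (OSError, ValueError):
            pass

if __name__ == "__main__":
    rip_page("https://example.com/")  # placeholder URL
```

The real products go much further, of course: they follow links to other pages, rewrite them so they keep working offline, and let you filter what gets downloaded.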
Rated Products

HTTrack
A free, cross-platform utility that downloads entire websites to your computer for offline browsing.
Platforms/Download: Linux | Mac OS | Windows (Desktop)
Version reviewed: v3.47.19
Gizmos Freeware
Our Rating: 3.5/5
Read more...
PageNest
A free website ripper that copies sites to your hard drive for reading offline.
Platforms/Download: Windows (Desktop)
Version reviewed: 3.30
Gizmos Freeware
Our Rating: 3.5/5
Read more...

Local Website Archive
A free, lightweight ripper that lets you download web pages and other documents from the web, with good browser integration.
Platforms/Download: Windows (Desktop)
Version reviewed: 12.2
Gizmos Freeware
Our Rating: 2.5/5
Read more...

Getleft
An open-source freeware product that lets you download entire websites or single web pages.
Platforms/Download: Windows (Desktop)
Version reviewed: 1.2
Gizmos Freeware
Our Rating: 2.5/5
Read more...

QuadSucker/Web
A website downloading tool that downloads four files simultaneously for faster performance.
Platforms/Download: Windows (Desktop)
Version reviewed: 3.5
Gizmos Freeware
Our Rating: 2.5/5
Read more...
Related Products
Other software products to be reviewed:
- WebRipper
- WinWSD WebSite Downloader
- WebFetch
- wget + wget GUI
- CyberArticle
Editor
This software review is copy-edited by Victor Laurie.
Comments
There may be some confusion about the purpose of these programs. Downloading an entire website is one thing; downloading individual pages from that website on their own is arguably something else.
I have been using HTTrack for about two years. If you want to download a mirror image of a website, it works well. If you want to pick and choose what to download beyond that, such as only one page of links within that site, it creates a mess.
Usually I start with only one page of a given website, and try to download everything on that one page, having no use for the other pages. I input the specific URL but HTTrack insists on downloading the entire root as well.
Moreover, if I want to exclude certain file types, I must input them one by one, but if I want to include those file types, I can choose them all with just a few clicks. Consequently, if I want to download only the text files, or html, or PDFs from a site, it is a longer process, as I have to manually exclude every other file type known to man.
There is also some confusion about how deep to go when downloading links on the page in question: does it download only the linked files, or also the files referenced by those linked files? You can choose how deep to go, but you won't know whether you selected the right option until the downloads finish, half an hour later.
Today I have finally given up, as I think it would be faster to manually download the 100 text files on the web page in question than to entrust HTTrack to do it automatically.
In all fairness I must say that I may have simply overlooked all the features that I needed or else missed them in the manual.
All in all, the programmer has been very generous to donate his work to the world, even though it may not be useful to me in particular, so for that he must be commended.
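As an aside for readers puzzled by the "depth" setting discussed above: depth 0 keeps only the starting page, depth 1 also fetches the pages it links to, and so on. The sketch below illustrates the idea in Python; it is only an illustration, not how HTTrack or any of the other products is implemented, and the URL is a placeholder.

```python
# Hypothetical illustration of crawl depth, not how any reviewed product works:
# depth 0 keeps only the starting page, depth 1 follows its links once, etc.
import urllib.request
from urllib.parse import urljoin
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects the href targets of all <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def crawl(url, depth, seen=None):
    """Return the set of page URLs reachable within `depth` link hops of `url`."""
    seen = seen if seen is not None else set()
    if url in seen:
        return seen
    seen.add(url)
    if depth == 0:
        return seen
    try:
        html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
    except (OSError, ValueError):
        return seen  # skip pages that cannot be fetched
    collector = LinkCollector()
    collector.feed(html)
    for link in collector.links:
        crawl(urljoin(url, link), depth - 1, seen)
    return seen

print(crawl("https://example.com/", depth=1))  # placeholder URL
```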
I tried a lot of these, including Darcy Ripper, SiteSucker and Free Download Manager, but for me nothing seems to beat HTTrack. I've had some problems with it on the command line in Linux, but the HTTrack GUI for Windows has been more successful in some cases.
Hi, I used to use WinHTTrack, and beyond basic usage, its options lend themselves to expert use. For a long time, and again now, I also found its GUI wasn't updating during downloads as it did originally.
Suddenly I thought of FDM (Free download manager).
As well as handling normal downloads and torrents, it also features site ripping (HTML spider).
http://www.freedownloadmanager.org/
That it does this is not obvious. Look at the Downloads tab; to the right is a double angle arrow. Click on this and you will see more tabs, including HTML Spider.
Much simpler than WinHTTrack.
Finding the options is tricky.
Right-click on the URL of the page you want to download once it has been added to FDM and look for "Web page downloading settings".
Thanks for that info. I was able to use the portable version that I already had on my hard drive. Problem free.
I spent a lot of time with HTTrack and it sometimes takes a long time. I think PageNest has a better option than HTTrack: it auto-collects what should be downloaded. PageNest auto-collects the linked pages you prefer without downloading everything else, and that saves a lot of time!
Excuse me, I am looking for something that can schedule automatic downloads of a web page and loop for a certain duration, e.g. start copying a website every day at 10 am and continue downloading every 10 minutes until 4 pm. Which of these programs can meet this requirement?
Thanks
It looks like FDM Downloader could, for example. See the Scheduler tab.
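If a dedicated scheduler turns out to be too limited, a plain script can also handle this kind of recurring grab. The sketch below (Python standard library only; the URL is a placeholder) fetches a page every 10 minutes between 10:00 and 16:00 and saves each copy with a timestamp.

```python
# A rough sketch of a recurring grab: fetch a page every 10 minutes between
# 10:00 and 16:00 and save each copy with a timestamp. The URL is a placeholder,
# and the loop runs until you stop the script yourself.
import datetime
import time
import urllib.request

URL = "https://example.com/"   # replace with the page you want to copy
INTERVAL = 10 * 60             # seconds between downloads

while True:
    now = datetime.datetime.now()
    if datetime.time(10, 0) <= now.time() <= datetime.time(16, 0):
        stamp = now.strftime("%Y%m%d_%H%M")
        try:
            with open(f"snapshot_{stamp}.html", "wb") as f:
                f.write(urllib.request.urlopen(URL).read())
        except OSError:
            pass  # skip this round if the download fails
    time.sleep(INTERVAL)
```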
New version of HTTrack: Version 3.48-13 (06/08/2014)
I am having a really bad time with PageNest: the PC freezes and there are no step-by-step instructions (yes, I've read the instructions).
Can anyone suggest a dedicated forum or some clear instructions? I'm getting desperate to copy my site before my subscription runs out, so if you can help, please don't delay.
Thanks
Please explain what you mean by that.
Thanks, but I tried the first choice and it's even harder to use. I've invested so much time in this one that I'm sticking with it and slowly learning.