As technology advances, people run into more and more problems that are waiting to be solved. Naturally, everyone wants to find the simplest way, but sometimes the simplest way is the worst one.
Tasks have become so complex that nobody in the world is free of mistakes. There are always unpredictable things, missing settings, or missing elements, which is why we need to test our websites as effectively as possible. But how do you check all the links on a page at once? (Checking all the links can mean finding viruses, too.)
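For intuition, this is what any link checker does under the hood: fetch the page, collect the href targets, then request each one and flag those that fail. Below is a minimal Python sketch of that idea, using only the standard library; it is not how Link Sleuth itself is implemented, and the target URL is just a placeholder.

```python
# Minimal link-checker sketch (not Link Sleuth's implementation):
# fetch one page, collect its <a href> targets, and report any that
# fail to answer. Standard library only.
from html.parser import HTMLParser
from urllib.error import HTTPError, URLError
from urllib.parse import urljoin
from urllib.request import Request, urlopen

class LinkCollector(HTMLParser):
    """Collects every href attribute found on <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def check_links(page_url):
    html = urlopen(Request(page_url, headers={"User-Agent": "link-check"})).read()
    collector = LinkCollector()
    collector.feed(html.decode("utf-8", errors="replace"))
    for href in collector.links:
        url = urljoin(page_url, href)              # resolve relative links
        if not url.startswith(("http://", "https://")):
            continue                               # skip mailto:, javascript:, ...
        try:
            # HEAD avoids downloading the body; some servers reject it,
            # in which case a GET would be the fallback.
            urlopen(Request(url, method="HEAD",
                            headers={"User-Agent": "link-check"}), timeout=10)
            print("OK    ", url)
        except HTTPError as e:
            print("BROKEN", e.code, url)           # e.g. 404 Not Found
        except URLError as e:
            print("ERROR ", url, e.reason)         # DNS failure, timeout, ...

check_links("https://example.com/")                # placeholder URL
```

A real checker like Link Sleuth also follows links recursively to crawl a whole site and runs many requests in parallel, which this sketch deliberately omits.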

Various programs exist for this purpose, but only some of them are free to use.
I have tried one named Link Sleuth.
It is simple to use, and it checks for broken links, too. If you had network issues during a run, you can re-check the broken links with the Ctrl+R key combination (to be 100% sure the problem is not on your side).

[Screenshot: adding the URL to check]

[Screenshot: links reported as not found]

At the end of the check you can generate a sitemap and a useful report, too.

[Screenshot: the generated report]
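To give an idea of what a sitemap contains, here is a hedged Python sketch that writes the standard sitemaps.org XML format for a couple of hypothetical URLs; Xenu's own export options may differ in the details.

```python
# Sketch: write a minimal sitemaps.org-style sitemap.xml for a list of
# URLs (hypothetical ones here, standing in for a crawl's results).
from xml.etree.ElementTree import Element, ElementTree, SubElement

def write_sitemap(urls, path="sitemap.xml"):
    urlset = Element("urlset",
                     xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for u in urls:
        loc = SubElement(SubElement(urlset, "url"), "loc")
        loc.text = u
    ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

write_sitemap(["https://example.com/", "https://example.com/about"])
```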

System requirements: Microsoft Windows 95/98/ME/NT/2000/XP/Vista/7, but it also runs under Fedora 13, Red Hat 8, Ubuntu, and OS X via Wine or WineBottler, and on a Mac under CrossOver.

Checking speed depends on the amount of RAM in your PC. The program is really old by now: the last version is from September 4, 2010.
I don't claim this program is the best option, but it does a decent job, and you can find more issues faster with it than by clicking through the page links manually. I suggest it for personal use only.
