Some stats: since Warrick's web interface was first made public in July 2007, we have recovered 4,287 websites and 3,508,091 URIs. That's a lot of missing material the public wants back.
I've also updated Warrick:
- Removed Google SOAP API usage since Google no longer supports it.
- Alerted Microsoft about a bug in their cache, which they have since fixed.
- Added some robustness in the face of a flaky Yahoo cache.
- Added the ability to specify an input file (-i option) which contains URLs to recover (thanks to James Young for his contribution).
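To illustrate the new -i option, here is a minimal sketch. The input-file format (one URL per line) and the script name warrick.pl are assumptions; adjust the path to wherever your Warrick checkout lives.

```shell
# Build an input file listing the URLs to recover, one per line
# (assumed format for the -i input file).
cat > urls.txt <<'EOF'
http://www.example.com/
http://www.example.com/about.html
EOF

# Then point Warrick at the file (commented out here; requires a
# working Warrick installation, and "warrick.pl" is an assumed name):
# perl warrick.pl -i urls.txt
```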
Happy website reconstructing.
Hi,
Thanks for the nice software.
It works perfectly on WinXP, but I have problems on the latest Ubuntu 10.04.
I have installed everything (Perl and SOAP::Lite),
but I get a "command not found" error.