Wget for fun

Wget is a nice little piece of software that everyone should know. With it you can check a site, download an entire collection of files from an FTP server, or grab a whole photo gallery. Just open your terminal and follow these steps.

GNU Wget is a free software package for retrieving files using HTTP, HTTPS and FTP, the most widely-used Internet protocols. It is a non-interactive command-line tool, so it can easily be called from scripts, cron jobs, terminals without X Window support, and so on.
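
For example, since wget runs happily with no terminal attached, a minimal sketch of a crontab entry (the URL and output path here are placeholders I made up) could fetch a page quietly every night at 3am; -q is wget's quiet flag and -O names the output file:


0 3 * * * wget -q -O /tmp/status.html http://example.com/status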

GNU Wget has many features that make retrieving large files or mirroring entire web or FTP sites easy; here are some interesting options.

All these commands are meant to be run from a Linux terminal.


Basic use: download a file when you know its HTTP (or FTP) URL:


wget http://kernel.org/pub/linux/kernel/v2.6/patch-2.6.23.bz2
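

If a big download like that gets interrupted halfway, wget's -c option picks it up where it stopped instead of starting over; this sketch just reuses the same kernel.org URL from above:


wget -c http://kernel.org/pub/linux/kernel/v2.6/patch-2.6.23.bz2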


Using Wget for Recursive Downloads


wget -r http://my.site.todownload.com/


The -r option tells wget to recursively download everything from the given URL.
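
A bare -r will happily follow links upward and leave the saved pages pointing at the live site. A slightly friendlier sketch (same placeholder site, with a made-up /docs/ path to start from) adds --no-parent, so wget never climbs above the starting directory, and -k, so links in the saved pages are rewritten to work locally:


wget -r --no-parent -k http://my.site.todownload.com/docs/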


Using Wget for Recursive Downloads but limiting the number of levels to 2


wget -r -l2 http://my.site.todownload.com/


Here -r does the same as above, while -l tells wget to limit the recursion to the given number of levels, in this case 2 levels deep (otherwise the default is 5).
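
Going the other way, if you want the whole site with no depth limit at all, wget also has a -m (--mirror) shortcut; it is essentially -r with infinite depth plus timestamping (-N), so a second run only re-fetches files that have changed (placeholder URL again):


wget -m http://my.site.todownload.com/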


Using Wget for Recursive Downloads but limiting the types of files you want to download


wget -r -A.pdf -R.htm http://my.site.todownload.com/


This one tells wget to do a recursive download, accepting all files with the .pdf extension and rejecting all files with the .htm extension.
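
Both -A and -R also take comma-separated lists of suffixes, so a sketch for pulling only the images out of our placeholder site could look like this:


wget -r -A '.jpg,.png,.gif' http://my.site.todownload.com/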


Using Wget for Recursive Downloads from an FTP server with authentication


wget -r ftp://username:password@my.site/path/to/download


Here you tell wget to download recursively from an FTP server, passing the username and password directly in the URL.
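
Keep in mind that a password typed on the command line, whether in the URL or via wget's --ftp-user and --ftp-password options, ends up in your shell history and in the process list. One sketch around that, assuming your wget honours the documented ftp_user and ftp_password settings, is to keep the credentials in ~/.wgetrc and pass only the bare URL:


echo 'ftp_user = username' >> ~/.wgetrc
echo 'ftp_password = password' >> ~/.wgetrc
wget -r ftp://my.site/path/to/download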


Using Wget to check for dead links on your site


wget --spider -r -o log.txt http://yourdomain.com


In this example we tell Wget to act like a web spider: it will not actually download the pages, just check that they are there, and it will write the results to the file log.txt, which you can then open and search for a list of broken links.
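
A quick way to dig the dead links out of that log, assuming your version of wget flags missing pages with its usual "broken link" message (the exact wording can vary between releases), is something like:


grep -B 2 'broken link' log.txt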


Using Wget to download a photo gallery


for i in `seq -w 1 100`; do wget http://www.mysite.com/images/DSCF00$i.jpg; done


In this example we run a loop that goes from 1 to 100, downloading a different URL each time; really useful for quickly grabbing a gallery whose images are not linked from any page.
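
If your shell is a reasonably recent bash (zero-padded brace expansion needs bash 4 or later), the loop can be skipped entirely, since wget accepts several URLs at once; the host and filename pattern are just the example ones from above:


wget http://www.mysite.com/images/DSCF00{001..100}.jpg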


Finally, I forgot to tell you that wget is also usable on Mac and on Windows (where it requires Cygwin).


And let me know if you use wget in some fancy way.

Reposted from: http://www.linuxforums.org/articles/wget-for-fun_968.html