[linux] wget Command Usage Examples
The wget utility is the best option for downloading files from the internet. wget can handle pretty much every complex download situation, including large file downloads, recursive downloads, non-interactive downloads, multiple file downloads, etc.
In this article, let us review how to use wget for various download scenarios using 15 awesome wget examples.
1. Download a Single File
Download a single file from the internet and store it in the current directory.
$ wget http://www.openss7.org/repos/tarballs/strx25-0.9.2.1.tar.bz2
While downloading, it shows a progress bar with the following information:
- Percentage of the download completed (e.g. 31%)
- Amount of bytes downloaded so far (e.g. 1,213,592 bytes)
- Current download speed (e.g. 68.2K/s)
- Remaining download time (e.g. eta 34 seconds)
Download in progress:
$ wget http://www.openss7.org/repos/tarballs/strx25-0.9.2.1.tar.bz2
Saving to: `strx25-0.9.2.1.tar.bz2.1'
31% [=================>                      ] 1,213,592   68.2K/s  eta 34s
Download completed:
$ wget http://www.openss7.org/repos/tarballs/strx25-0.9.2.1.tar.bz2
Saving to: `strx25-0.9.2.1.tar.bz2'
100%[======================>] 3,852,374   76.8K/s   in 55s
2009-09-25 11:15:30 (68.7 KB/s) - `strx25-0.9.2.1.tar.bz2' saved [3852374/3852374]
2. Save with a Different File Name (Using wget -O)
By default, wget uses the word after the last forward slash ('/') in the download URL as the file name. That may not be an appropriate name for the file.
Wrong: In the following case, the file will be saved as download_script.php?src_id=7701.
$ wget http://www.vim.org/scripts/download_script.php?src_id=7701
Even though the downloaded file is in zip format, it gets stored under the following name.
$ ls download_script.php?src_id=7701
Correct: You can fix this by specifying the output file name with the -O option.
$ wget -O taglist.zip http://www.vim.org/scripts/download_script.php?src_id=7701
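Tip: URLs that contain characters such as ? or & can be interpreted by the shell before wget ever sees them, so it is safer to quote such URLs. A minimal sketch, reusing the same URL as above:
$ wget -O taglist.zip 'http://www.vim.org/scripts/download_script.php?src_id=7701'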
3. Limit the Download Speed (Using wget --limit-rate)
By default, wget uses the full available bandwidth. If you are downloading a huge file on a production server, you may want to throttle it. Use the --limit-rate option to limit the download speed.
The following example limits the download speed to 200 KB/s.
$ wget --limit-rate=200k http://www.openss7.org/repos/tarballs/strx25-0.9.2.1.tar.bz2
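The rate also accepts the m suffix for megabytes. A quick sketch that caps the same download at 2 MB/s:
$ wget --limit-rate=2m http://www.openss7.org/repos/tarballs/strx25-0.9.2.1.tar.bz2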
4. Resume an Interrupted Download (Using wget -c)
If a download was interrupted in the middle, you can resume it with the -c option.
$ wget -c http://www.openss7.org/repos/tarballs/strx25-0.9.2.1.tar.bz2
This is very helpful when a very large download gets interrupted in the middle. Instead of starting the whole download again, you can resume from where it was interrupted using the -c option.
Note: If a download is stopped in the middle and you restart it without the -c option, wget automatically appends .1 to the file name, since a file with the previous name already exists. If a file ending in .1 also exists, it saves the new download with .2 at the end.
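For very large files, -c also combines naturally with the background option -b covered in the next example; a minimal sketch resuming the same interrupted download in the background:
$ wget -c -b http://www.openss7.org/repos/tarballs/strx25-0.9.2.1.tar.bz2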
5. Download in the Background Using wget -b
For a huge download, put the download in the background using the wget option -b, as shown below.
$ wget -b http://www.openss7.org/repos/tarballs/strx25-0.9.2.1.tar.bz2
Continuing in background, pid 1984.
Output will be written to `wget-log'.
It initiates the download and gives the shell prompt back to you. You can always check the status of the download using tail -f, as shown below.
$ tail -f wget-log
Saving to: `strx25-0.9.2.1.tar.bz2.4'
     0K .......... .......... .......... .......... ..........  1% 65.5K 57s
    50K .......... .......... .......... .......... ..........  2% 85.9K 49s
   100K .......... .......... .......... .......... ..........  3% 83.3K 47s
   150K .......... .......... .......... .......... ..........  5% 86.6K 45s
   200K .......... .......... .......... .......... ..........  6% 33.9K 56s
   250K .......... .......... .......... .......... ..........  7%  182M 46s
   300K .......... .......... .......... .......... ..........  9% 57.9K 47s
Also, make sure to review our previous multitail article on how to use the tail command effectively to view multiple files.
6. Mask the User Agent and Make wget Look Like a Browser Using wget --user-agent
Some websites can refuse your download after detecting that the user agent is not a browser. In that case, you can mask the user agent with the --user-agent option and make wget look like a browser, as shown below.
$ wget --user-agent="Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.3) Gecko/2008092416 Firefox/3.0.3" URL-TO-DOWNLOAD
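If you always need a browser-like identity, you can also set it once in ~/.wgetrc with the user_agent setting instead of repeating the option; a minimal sketch, where the string is just an example value:
$ cat >> ~/.wgetrc << 'EOF'
user_agent = Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.3) Gecko/2008092416 Firefox/3.0.3
EOF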
7. Test a Download URL Using wget --spider
When you set up a scheduled download, you should check whether the download will work correctly at the scheduled time. To do so, copy the command line exactly from the schedule and add the --spider option to check it.
$ wget --spider DOWNLOAD-URL
If the given URL is correct, it will say:
$ wget --spider download-url
Spider mode enabled. Check if remote file exists.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Remote file exists and could contain further links,
but recursion is disabled -- not retrieving.
This ensures that the download will succeed at the scheduled time. But if you give a wrong URL, you will get the following error.
$ wget --spider download-url
Spider mode enabled. Check if remote file exists.
HTTP request sent, awaiting response... 404 Not Found
Remote file does not exist -- broken link!!!
You can use the spider option in the following scenarios:
- Check before scheduling a download.
- Monitor whether a website is available at certain intervals (see the sketch after this list).
- Check the pages in your bookmarks and find out which of them still exist.
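For the monitoring case, a minimal cron sketch, where the URL and log file path are only placeholders, that checks a site every 15 minutes and records failures based on wget's exit status:
*/15 * * * * wget --spider -q http://example.com/ || echo "$(date): example.com unreachable" >> $HOME/site-check.log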
8. Increase the Total Number of Retry Attempts Using wget --tries
If the internet connection is unreliable and the file is large, there is a chance the download will fail. By default, wget retries 20 times to make the download succeed.
If needed, you can increase the number of retry attempts using the --tries option, as shown below.
$ wget --tries=75 DOWNLOAD-URL
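On a flaky connection it can help to combine the retries with resuming, so that each retry continues the partial file; a short sketch, assuming the server supports resumed downloads:
$ wget -c --tries=75 DOWNLOAD-URL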
9. Download Multiple Files / URLs Using Wget -i
First, store all the download URLs in a text file, as shown below.
$ cat > download-file-list.txt
URL1
URL2
URL3
URL4
Next, give download-file-list.txt as an argument to wget using the -i option, as shown below.
$ wget -i download-file-list.txt
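The -i option combines with the options from the earlier examples; for instance, a sketch that fetches the whole list in the background, resumes partial files, and caps the bandwidth:
$ wget -b -c --limit-rate=200k -i download-file-list.txt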
10. Download a Full Website Using wget --mirror
The following is the command line to execute when you want to download a full website and make it available for local viewing (a concrete sketch follows the option list below).
$ wget --mirror -p --convert-links -P ./LOCAL-DIR WEBSITE-URL
- --mirror : turn on options suitable for mirroring.
- -p : download all files that are necessary to properly display a given HTML page.
- --convert-links : after the download, convert the links in the documents for local viewing.
- -P ./LOCAL-DIR : save all the files and directories to the specified directory.
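For example, a concrete sketch, where example.com and the local directory name are just placeholders:
$ wget --mirror -p --convert-links -P ./example.com-mirror http://example.com/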
11. Reject Certain File Types While Downloading Using wget --reject
If you have found a useful website but don't want to download its images, you can specify the following.
$ wget --reject=gif WEBSITE-TO-BE-DOWNLOADED
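--reject also accepts a comma-separated list of suffixes, so a sketch that skips the common image formats would look like this:
$ wget --reject=gif,jpg,jpeg,png WEBSITE-TO-BE-DOWNLOADED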
12. Log messages to a log file instead of stderr Using wget -o
Use this when you want the log messages redirected to a log file instead of the terminal.
$ wget -o download.log DOWNLOAD-URL
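This pairs well with the background download from example 5; a small sketch that writes the log to a custom file and follows it:
$ wget -b -o download.log DOWNLOAD-URL
$ tail -f download.log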
13. Quit Downloading When it Exceeds Certain Size Using wget -Q
When you want the download to stop once it exceeds 5 MB, you can use the following wget command line.
$ wget -Q5m -i FILE-WHICH-HAS-URLS
Note: This quota has no effect when you download a single URL. That is, irrespective of the quota size, everything will be downloaded when you specify a single file. The quota is applicable only to recursive downloads.
14. Download Only Certain File Types Using wget -r -A
You can use this in the following situations:
- Download all images from a website
- Download all videos from a website
- Download all PDF files from a website
$ wget -r -A.pdf http://url-to-webpage-with-pdfs/
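Similarly, -A accepts a comma-separated list of suffixes, so a sketch that grabs only common image formats from a site (the URL is a placeholder) would be:
$ wget -r -A jpg,jpeg,png,gif http://url-to-webpage-with-images/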
15. FTP Download With wget
You can use wget to perform FTP download as shown below.
Anonymous FTP download using Wget
$ wget ftp-url
FTP download using wget with username and password authentication.
$ wget --ftp-user=USERNAME --ftp-password=PASSWORD DOWNLOAD-URL
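Credentials can also be embedded directly in the FTP URL, although they then appear in your shell history and process list; a minimal sketch with placeholder values:
$ wget ftp://USERNAME:PASSWORD@ftp.example.com/path/to/file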