How to `wget` a list of URLs in a text file?

Let's say I have a text file containing hundreds of URLs, one per line, e.g.

http://url/file_to_download1.gz
http://url/file_to_download2.gz
http://url/file_to_download3.gz
http://url/file_to_download4.gz
http://url/file_to_download5.gz
....

What is the correct way to download each of these files with wget? I suspect there's a command like wget -flag -flag text_file.txt


The answer is:


Run the downloads in parallel with GNU parallel:

cat text_file.txt | parallel --gnu "wget {}"
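If the server can't handle hundreds of simultaneous connections, you can cap the number of concurrent jobs with parallel's -j flag (4 below is an arbitrary choice) and feed the file to parallel directly:

parallel --gnu -j 4 "wget {}" < text_file.txt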

For the basic, serial approach, try:

wget -i text_file.txt

(check man wget)
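-i combines with the usual wget flags; for instance, -c resumes partially downloaded files and -P puts everything into a target directory (downloads/ is just a placeholder name here):

wget -c -P downloads/ -i text_file.txt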


If you're on OpenWrt or using an old version of wget that doesn't give you the -i option:

#!/bin/bash
# Read the URL list line by line and download each file.
input="text_file.txt"
while IFS= read -r line
do
  wget "$line"
done < "$input"

Furthermore, if you don't have wget, you can substitute curl (or whatever tool you use to download individual files) in the loop above.
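As a sketch of the curl route (assuming one URL per line and no spaces in the URLs), xargs can feed the list in, with -O saving each file under its remote name:

xargs -n 1 curl -O < text_file.txt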


If you also want to preserve the original file name, try:

wget --content-disposition --trust-server-names -i list_of_urls.txt
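Here --content-disposition makes wget honor the filename the server sends in its Content-Disposition header, and --trust-server-names names each file after the final URL in a redirect chain rather than the original one.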