Run wget command on each line of a file and download the files (two at a time)
I have a file containing the URLs of files to download.
For example:
https://url-of/file1.zip
https://url-of/file2.zip
https://url-of/file3.zip
...
The command I am currently using to download the files is:
wget --continue --tries=0 https://url-of/file.zip
But now I need a bash script that will read the URLs from the file and download them two at a time.
The script I came up with so far is:
#!/bin/sh
cat file.txt | while read url
do
wget --continue --tries=0 "$url"
done
But it downloads only a single file at a time.
How can I edit it so that it downloads two files at a time?
You can put multiple read commands in the while loop. For "two at a time", put both downloads in the background and then wait for them to complete:
while read -r first && read -r second; do
wget "$first" &
wget "$second" &
wait
done < input.file
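One caveat: if the file contains an odd number of URLs, the final read -r second fails, the loop condition becomes false, and the last URL is never downloaded. A minimal sketch that handles that case, reusing the file name and wget options from the question, might look like this:

#!/bin/sh
# Read URLs in pairs and download them in parallel,
# handling an odd final URL as well.
while read -r first; do
    wget --continue --tries=0 "$first" &
    # Start a second download only if another URL exists.
    if read -r second; then
        wget --continue --tries=0 "$second" &
    fi
    # Wait for the background downloads before reading the next pair.
    wait
done < file.txt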
But @muru’s comment is preferable.