Have you ever used Secure Shell to access a remote machine, only to find yourself needing to download a file from a remote location? What do you do? Since you only have terminal window access to that remote machine, you can’t open a web browser and download the file as you normally would.
Fortunately, these Linux commands make it fairly easy to download files from a local or remote location. I’m going to show you three: wget, curl, and scp.
1. wget
The wget command is my go-to for downloading remote files to a local machine when a GUI isn’t available. There are two reasons I tend to default to wget: It’s the first command I learned to use for this purpose and it’s very simple.
Let’s say you need to download the file http://www.example.com/file.zip. You can do that with the command:
wget http://www.example.com/file.zip
The wget command has several handy options. For example, if a download is interrupted, you can pick it up where it left off with:
wget -c http://www.example.com/file.zip
Or maybe you want to download the file with a different name. For that, you’d just use the -O option (for output file), like this:
wget -O newname.zip http://www.example.com/file.zip
If you want to download the file to a specific directory, the command would be:
wget -P /home/$USER/Downloads http://www.example.com/file.zip
You can also create a text file with the full addresses to access multiple downloads. Say you create the file downloads.txt. In that file, you add one URL per line. You can then download all of those files with a single command:
wget -i downloads.txt
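A minimal sketch of that workflow, using placeholder URLs (the actual wget run is left commented out, since the addresses don't point to real files):

```shell
# Build a downloads.txt with one placeholder URL per line.
cat > downloads.txt <<'EOF'
http://www.example.com/file1.zip
http://www.example.com/file2.zip
EOF

# With the file in place, a single run fetches everything listed:
# wget -i downloads.txt
cat downloads.txt
```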
2. curl
Next, we have curl, which is a slightly different beast. If you run curl on a URL without any options, it prints the file's contents to standard output instead of saving them. To save the file under its remote name, use the -O (uppercase) option, like this:
curl -O http://www.example.com/file.zip
You can also save the remote file with a different name, like so:
curl -o newname.zip http://www.example.com/file.zip
Another handy curl feature: You can use what’s called globbing, which allows you to specify multiple URLs at once.
Let’s say you need to download file1.zip, file2.zip, file3.zip, file4.zip, and file5.zip, and you want to do so with a single command. Just use brackets, like so:
curl -O "http://www.example.com/file[1-5].zip"

With -O included, all five files download into the current working directory under their remote names. The quotes keep the shell from trying to interpret the brackets itself.
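One related trick worth checking against the curl man page: inside -o, the sequence #1 stands for whatever the bracket range matched, so each globbed file can be saved under a custom name. The command below is assembled and printed rather than executed, since the URL is a placeholder:

```shell
# "#1" in -o expands to the current match from the [1-5] range,
# producing archive_1.zip through archive_5.zip. Placeholder URL.
cmd='curl -o "archive_#1.zip" "http://www.example.com/file[1-5].zip"'
echo "$cmd"
```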
3. scp
The scp command is part of Secure Shell and copies files to and from a remote machine over an encrypted connection. Because scp works over SSH, you must be able to log into the remote machine as a valid user.
Let's say file.zip is on a remote machine on your network and you have a valid account on that machine. For example's sake, we'll say the remote machine's IP address is 192.168.1.11 and your username on it is olivia. To download the file from that machine, the command would be:
scp olivia@192.168.1.11:/home/olivia/file.zip file.zip
The above command would prompt you for olivia’s user password and, upon successful authentication, the file.zip file would be downloaded to the current working directory on the local machine.
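If you'd rather land the download somewhere other than the current directory, scp also accepts a local directory as the destination. Here's a sketch using the article's example values; the command is assembled and printed rather than run, since it needs a real remote host:

```shell
# Article's example values; substitute your own host, user, and paths.
REMOTE_USER=olivia
REMOTE_HOST=192.168.1.11
REMOTE_FILE=/home/olivia/file.zip
LOCAL_DIR=/tmp/downloads   # any local directory works as the destination

# Print the scp invocation; running it would prompt for the password.
echo scp "${REMOTE_USER}@${REMOTE_HOST}:${REMOTE_FILE}" "${LOCAL_DIR}/"
```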
Keep in mind that if you don’t have an account on the remote machine, you cannot use scp to download a file.
While all three of these Linux commands can be used to download files, if you want to know my preference, it’s wget all the way (unless I need to add a layer of security, at which point I use scp).
Source: https://www.zdnet.com/article/3-linux-commands-i-use-for-downloading-files-and-how-theyre-different/#ftag=RSSbaffb68