Curl Command in Linux with Examples


Introduction

curl is a command-line utility for transferring data from or to a server, designed to work without user interaction. With curl, you can download or upload data using one of the supported protocols, including HTTP, HTTPS, SCP, SFTP, and FTP. curl provides a number of options for resuming transfers, limiting bandwidth, proxy support, user authentication, and much more.

curl is used in command lines and scripts to transfer data. It is also used in cars, television sets, routers, printers, audio equipment, mobile phones, tablets, set-top boxes, and media players, and it is the Internet transfer engine for thousands of software applications in over ten billion installations.

curl is used daily by virtually every Internet-using human on the globe.

Installing Curl

On Ubuntu and Debian

sudo apt update
sudo apt install curl

On CentOS and Fedora

sudo yum install curl

How to Use Curl

The syntax for the curl command is as follows:

curl [options] [URL...]

In its simplest form, when invoked without any options, curl prints the specified resource to standard output.

For example, to retrieve the unixcop.com homepage you would run:

[root@unixcop ~]# curl unixcop.com
<html>
<head><title>301 Moved Permanently</title></head>
<body bgcolor="white">
<center><h1>301 Moved Permanently</h1></center>
<hr><center>nginx/1.14.2</center>
</body>
</html>
[root@unixcop ~]# 

As shown above, curl prints the server's response in your terminal. Here the server answered with a 301 redirect page rather than the homepage itself; the section on following redirects below shows how to handle this.

If no protocol is specified, curl tries to guess the protocol you want to use, and it will default to HTTP.
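To see this fallback without depending on an outside network, here is a minimal sketch that serves a file from a throwaway local web server (python3's stdlib server; the port and paths are arbitrary) and fetches it with a scheme-less URL:

```shell
# Serve a small file from a temporary local web server.
mkdir -p /tmp/curl-scheme-demo
printf 'scheme guessed\n' > /tmp/curl-scheme-demo/hello.txt
( cd /tmp/curl-scheme-demo && exec python3 -m http.server 8077 ) >/dev/null 2>&1 &
SRV=$!
sleep 1
# No "http://" prefix -- curl falls back to HTTP and prints "scheme guessed".
curl -s 127.0.0.1:8077/hello.txt | tee /tmp/curl-scheme-demo/fetched.txt
kill $SRV
```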

Save the Output to a File

Use the -o or -O option to save the output to a file.

Lowercase -o saves the output under a filename you choose, which in the example below is result.txt:

[root@unixcop ~]# curl -o result.txt unixcop.com
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   185  100   185    0     0    168      0  0:00:01  0:00:01 --:--:--   168
[root@unixcop ~]# 
[root@unixcop ~]# ls
anaconda-ks.cfg  result.txt
[root@unixcop ~]# 
[root@unixcop ~]# cat result.txt 
<html>
<head><title>301 Moved Permanently</title></head>
<body bgcolor="white">
<center><h1>301 Moved Permanently</h1></center>
<hr><center>nginx/1.14.2</center>
</body>
</html>
[root@unixcop ~]# 

We can also use curl to download a file and save it under a name we specify with the -o (lowercase) option:

[root@unixcop ~]# curl -o nodejs.tar.gz  https://nodejs.org/download/release/latest/node-v16.6.2-headers.tar.gz
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  545k  100  545k    0     0   241k      0  0:00:02  0:00:02 --:--:--  241k
[root@unixcop ~]# 
[root@unixcop ~]# ls
anaconda-ks.cfg  nodejs.tar.gz
[root@unixcop ~]# 

Uppercase -O saves the file under its original (remote) filename, as shown below:

[root@unixcop ~]# curl -O https://nodejs.org/download/release/latest/node-v16.6.2-headers.tar.gz
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  545k  100  545k    0     0   242k      0  0:00:02  0:00:02 --:--:--  242k
[root@unixcop ~]# 
[root@unixcop ~]# ls
anaconda-ks.cfg  node-v16.6.2-headers.tar.gz
[root@unixcop ~]# 

Download Multiple Files

To download multiple files at once, use multiple -O options, followed by the URL to the file you want to download.

In the following example we are downloading the nodejs and joomla files:

[root@unixcop ~]# curl -O https://nodejs.org/download/release/latest/node-v16.6.2-headers.tar.gz   \
>                      -O https://downloads.joomla.org/ar/cms/joomla3/3-9-28/Joomla_3-9-28-Stable-Full_Package.zip

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  545k  100  545k    0     0   242k      0  0:00:02  0:00:02 --:--:--  242k
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
[root@unixcop ~]# 
[root@unixcop ~]# ls
Joomla_3-9-28-Stable-Full_Package.zip  anaconda-ks.cfg  node-v16.6.2-headers.tar.gz
[root@unixcop ~]# 

Resuming interrupted downloads with curl

You can resume a download by using the -C - option.

If your connection drops during the download of a large file, you can continue it instead of starting the download from scratch.

The example below shows a download of the Ubuntu server ISO that failed when the network connection was interrupted, then was continued with the -C - option.

[root@unixcop ~]# curl -O http://releases.ubuntu.com/21.04/ubuntu-21.04-live-server-amd64.iso
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0 1119M    0 3559k    0     0   164k      0  1:55:57  0:00:21  1:55:36 46536
curl: (56) Recv failure: Connection reset by peer
[root@unixcop ~]# 
[root@unixcop ~]# curl -C - -O http://releases.ubuntu.com/21.04/ubuntu-21.04-live-server-amd64.iso
** Resuming transfer from byte position 3644781
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  1 1116M    1 3602k    0     0   270k      0  1:10:31  0:00:02  1:09:46  284k

The "Resuming transfer from byte position" message shows that the download continued from where it stopped once the connection was stable again.

Specify a Maximum Transfer Rate

The --limit-rate option allows you to limit the data transfer rate. The value is expressed in:

bytes, or kilobytes with the k suffix.

megabytes with the m suffix.

gigabytes with the g suffix.

In the example below we limit the download rate to 2 MB/s:

# curl --limit-rate 2m -O http://releases.ubuntu.com/21.04/ubuntu-21.04-live-server-amd64.iso
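To watch the cap take effect without downloading a large ISO, here is a rough offline sketch (again using python3's stdlib server as a stand-in; the port and file sizes are arbitrary). An 8 KB file capped at 2 KB/s should take a few seconds instead of finishing instantly:

```shell
# Create an 8 KB file and serve it locally.
mkdir -p /tmp/curl-rate-demo
head -c 8192 /dev/zero > /tmp/curl-rate-demo/blob.bin
( cd /tmp/curl-rate-demo && exec python3 -m http.server 8081 ) >/dev/null 2>&1 &
SRV=$!
sleep 1
# Cap the transfer at 2 kilobytes per second; "time" shows the slowdown.
time curl -s --limit-rate 2k -o /tmp/curl-rate-demo/out.bin http://127.0.0.1:8081/blob.bin
kill $SRV
```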

HTTP Headers of a URL

Use the -I option to fetch only the HTTP headers of the specified resource:

# curl -I --http2 https://www.ubuntu.com/
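The same flag can be tried offline against a throwaway local server (python3 stdlib, arbitrary port); its header lines differ from a real site's, but -I always prints only the response headers, never the body:

```shell
# Serve a file locally and request only its headers with -I (a HEAD request).
mkdir -p /tmp/curl-head-demo
printf 'body\n' > /tmp/curl-head-demo/page.txt
( cd /tmp/curl-head-demo && exec python3 -m http.server 8079 ) >/dev/null 2>&1 &
SRV=$!
sleep 1
# Prints a status line such as "HTTP/1.0 200 OK" plus headers; no body.
curl -sI http://127.0.0.1:8079/page.txt | tee /tmp/curl-head-demo/headers.txt
kill $SRV
```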

Dealing with HTTP 301 redirects

The remote HTTP server might answer with a redirect status code; for example, HTTP URLs are often redirected to HTTPS URLs with a 301 status code. Pass -L to follow the 3xx redirects and save the final file on your system:

# curl -L -O http://freeditorial.com/en/books/the-little-prince

Check Whether a Website Supports HTTP/2

Fetch the HTTP headers with -I along with the --http2 option.

If the remote server supports HTTP/2, curl prints HTTP/2 200.

Otherwise, the response is HTTP/1.1 200, as shown below:

[root@unixcop ~]# curl -I --http2 -s https://unixcop.com/ | grep HTTP
HTTP/1.1 200 OK
[root@unixcop ~]# 

If you have curl 7.47.0 or newer, you do not need to use the --http2 option, because HTTP/2 is attempted by default for all HTTPS connections.

[root@unixcop ~]# curl -I -s https://unixcop.com/ | grep HTTP
HTTP/1.1 200 OK
[root@unixcop ~]# 

Curl Follows Redirects

By default, curl doesn't follow HTTP Location headers.

If you try to retrieve the non-www version of gmail.com, you will notice that instead of the source of the page you get a 301 response pointing at the new location:

[root@unixcop ~]# curl gmail.com
<HTML><HEAD><meta http-equiv="content-type" content="text/html;charset=utf-8">
<TITLE>301 Moved</TITLE></HEAD><BODY>
<H1>301 Moved</H1>
The document has moved
<A HREF="https://www.google.com/gmail/">here</A>.
</BODY></HTML>
[root@unixcop ~]# 

You can use the -L option to instruct curl to follow any redirect until it reaches the final destination:

# curl -L gmail.com
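The behavior can also be reproduced offline with a tiny stand-in server (python3 stdlib; the /old and /new paths, port, and body text are all made up for this sketch) that answers /old with a 301 pointing at /new:

```shell
# Start a local server that redirects /old -> /new.
python3 - >/dev/null 2>&1 <<'PY' &
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == '/old':
            # Redirect, like gmail.com above.
            self.send_response(301)
            self.send_header('Location', '/new')
            self.end_headers()
        else:
            body = b'final destination\n'
            self.send_response(200)
            self.send_header('Content-Length', str(len(body)))
            self.end_headers()
            self.wfile.write(body)

HTTPServer(('127.0.0.1', 8082), Handler).serve_forever()
PY
SRV=$!
sleep 1
curl -s  http://127.0.0.1:8082/old                               # 301, empty body
curl -sL http://127.0.0.1:8082/old | tee /tmp/curl-redirect-demo.txt  # follows to /new
kill $SRV
```

Without -L the first request stops at the 301; with -L curl requests /new and prints "final destination".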

Curl to Transfer Files with FTP

Use the -u option and specify the username and password as shown below:

curl -u FTP_USERNAME:FTP_PASSWORD ftp://ftp.unixcop.com/

Once logged in, you can download a single file from the FTP server using the following:

curl -u FTP_USERNAME:FTP_PASSWORD ftp://ftp.unixcop.com/myfile.txt.tar.gz

To upload a file to the FTP server, use:

curl -T newfile.tar.gz -u FTP_USERNAME:FTP_PASSWORD ftp://ftp.example.com/

Grab a password-protected file with curl

Try any one of the following forms:

# curl ftp://username:password@ftp.example.com:21/path/to/database_backup_for_example.gz

# curl --ftp-ssl -u UserName:PassWord ftp://ftp1.example.biz:21/backups/21/07/2021/mysql.dump.sql.gz

# curl https://username:password@server1.example.com/file/path/data.tar.gz

# curl -u Username:Password https://server1.example.com/file/path/data.gz

Using Proxies

curl supports different types of proxies, including HTTP, HTTPS, and SOCKS. To transfer data through a proxy server, use the -x or --proxy option followed by the proxy URL.

The following command downloads the specified resource using a proxy on 192.168.44.1 port 8888:

curl -x 192.168.44.1:8888 http://linux.com/

If the proxy server requires authentication, use the -U or --proxy-user option followed by the username and password separated by a colon (user:password):

curl -U username:password -x 192.168.44.1:8888 http://linux.com/

Downloading file using a proxy server

Try any one of the following forms:

# curl -x proxy-server-ip:PORT -O url
# curl -x 'http://username:password@proxy.example.com:3128' -v -O https://dl.example.com/downloads/little_prince.pdf

Conclusion

curl is a command-line tool that allows you to transfer data from or to a remote host. It is useful for troubleshooting issues and downloading files.


MQ-Jr, unixcop Admin