
One can request only the headers using HTTP HEAD, via the -I option of curl(1):

$ curl -I /

Lengthy HTML response bodies are a pain to deal with on the command line, so I'd like to get only the headers as feedback for my POST requests. However, HEAD and POST are two different methods.

How do I get cURL to display only response headers to a POST request?

-D, --dump-header <file>
       Write the protocol headers to the specified file.

       This  option  is handy to use when you want to store the headers
       that a HTTP site sends to you. Cookies from  the  headers  could
       then  be  read  in  a  second  curl  invocation by using the -b,
       --cookie option! The -c, --cookie-jar option is however a better
       way to store cookies.

and

-S, --show-error
       When used with -s, --silent, it makes curl show an error message if it fails.

from the man page. So:

curl -sS -D - www.acooke.org -o /dev/null

dumps the headers to stdout and sends the body to /dev/null (that's a GET, not a POST, but you can do the same thing with a POST: just add whatever options you're already using for POSTing data).

Note the - after the -D, which indicates that the output "file" is stdout.
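
For the POST case, the same -D - trick can be wrapped in a small helper. A minimal sketch (the headers_only name, URL, and form data below are placeholders, not anything curl itself provides):

```shell
# headers_only: print only the response headers, discarding the body.
# Any extra curl options (e.g. the ones you use for POSTing) are passed through.
headers_only() {
  url=$1; shift
  # -sS: silent, but still report errors; -D -: dump headers to stdout;
  # -o /dev/null: discard the response body
  curl -sS -D - -o /dev/null "$@" "$url"
}

# Hypothetical POST usage:
# headers_only https://example.com/api -X POST -d 'name=value'
```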

  • The above comment is valid if you're using PowerShell. For cmd.exe use curl -s -D - http://yahoo.com -o nul
    – JJS
    Commented Jul 15, 2013 at 21:45
  • @JJS for me $null worked on Win7. Is it due to cLink installed on Windows? Commented Sep 27, 2013 at 15:58
  • The "-" in front of the URL may seem unimportant, but it's not. Commented Oct 13, 2013 at 15:38
  • @WahidSadik Why is that the case in particular? What's the function of the single dash?
    – mamachanko
    Commented Oct 29, 2013 at 8:48
  • @mamachanko -D takes an argument that says where the output should go. The single dash means it should go to stdout. Commented Oct 29, 2013 at 10:33

The other answers require the response body to be downloaded. But there's a way to make a POST request that will only fetch the header:

curl -s -I -X POST http://www.google.com

-I by itself performs a HEAD request, but it can be combined with -X POST to perform a POST (or any other) request while still fetching only the header data.

  • This answer is actually correct because web servers can return different headers based on the request method. If you want to check headers on GET, you have to use a GET request.
    – chhantyal
    Commented Aug 31, 2016 at 15:34
  • This is the most correct answer, in my opinion. It is easy to remember, it actually sends a GET request and doesn't download the whole response body (or at least doesn't output it). The -s flag is not necessary.
    – skozin
    Commented Oct 11, 2016 at 14:38
  • @JeffPuckettII Well, kinda nitpicking I would say. You can replace GET with POST in the above command and it will work as expected. "Or any other" is key there.
    – chhantyal
    Commented Nov 22, 2016 at 16:54
  • This does not work when you actually want to POST some data. Curl says: Warning: You can only select one HTTP request method! You asked for both POST Warning: (-d, --data) and HEAD (-I, --head).
    – SebastianH
    Commented Dec 1, 2016 at 18:15
  • @nickboldt The point here is that a server might respond differently to a HEAD request than to a POST or GET request (and some servers actually do that), so -X HEAD is no reliable solution here.
    – siracusa
    Commented Jan 11, 2017 at 5:21

The following command displays extra information:

curl -X POST http://httpbin.org/post -v > /dev/null

You can ask the server to send just the headers instead of a full response:

curl -X HEAD -I http://httpbin.org/

Note: In some cases, a server may send different headers for POST and HEAD, but in almost all cases the headers are the same.

  • It's unfortunate that the other answer won, because this is the correct answer - it doesn't unnecessarily transfer a ton of data.
    – Daniel
    Commented Jun 10, 2016 at 13:30
  • @dmd If I understand the cURL manual for -X, --request correctly, -X HEAD still results in “a ton of data”, but there is -I, --head which should result in what you are anticipating. Commented Jul 29, 2016 at 22:59
  • Problem with -X HEAD is that the server might respond differently, since it now receives a HEAD request instead of a GET (or whatever the previous request was)
    – Grav
    Commented Aug 24, 2016 at 9:44
  • Warning: Setting custom HTTP method to HEAD with -X/--request may not work the Warning: way you want. Consider using -I/--head instead.
    – Dorian
    Commented May 31, 2017 at 17:57
  • @bfontaine a perfect example of an XY problem
    – tijko
    Commented Feb 23, 2023 at 15:35

For long response bodies (and various other similar situations), the solution I use is always to pipe to less, so

curl -i https://api.github.com/users | less

or

curl -s -D - https://api.github.com/users | less

will do the job.

  • These are not equivalent: the first issues a HEAD request, to which many servers respond differently; the second issues a GET request, which is more like what we are looking for here.
    – glasz
    Commented Aug 3, 2019 at 16:53
  • This is useful, but not an answer to the question. Therefore, I am voting down. Commented Mar 10, 2024 at 16:58

Much easier, and this also follows redirects:

curl -IL http://shortlinktrack.er/in-the-shadows
  • -I is an alias of --head; the man page states that it fetches the headers only
  • -L is an alias of --location; the man page states that curl will follow the Location header if there is one
  • Awesome! Clean and simple. Thank you! Commented Dec 5, 2024 at 13:46

Maybe it is a little extreme, but I am using this super-short version:

curl -svo. <URL>

Explanation:

-v prints debug information (which includes the headers)

-o. sends the web page data (which we want to ignore) to ., which is a directory and thus an invalid destination, so the output is discarded.

-s no progress bar, no error information (otherwise you would see Warning: Failed to create the file .: Is a directory)

Warning: the command always fails in terms of exit code, whether the URL is reachable or not. Do not use it in, say, conditional statements in shell scripts...

  • Why use -o. instead of -o /dev/null?
    – bfontaine
    Commented Jul 10, 2019 at 9:14
  • @bfontaine -o. is used versus -o /dev/null for brevity
    – exebook
    Commented Jul 14, 2019 at 7:09
  • @bfontaine there are other answers that show how to do this the most correct way; this one is here to show the short alternative that does basically the same thing.
    – exebook
    Commented Jul 19, 2019 at 13:35
  • You should clarify in your answer that this command always fails. curl -svo. <url> && echo foo won’t print foo because -o. makes curl return a non-zero (= error) code: curl: (23) Failed writing body.
    – bfontaine
    Commented Jul 19, 2019 at 14:20
  • A "solution" that ends with returning an error is not a valid solution. It's a happy accident. If something goes wrong, you have no way of knowing, because you've already swallowed the error.
    – 333kenshin
    Commented Dec 2, 2021 at 19:03

The other answers have not worked for me in all situations. The best solution I could find, which works with POST as well, taken from here:

curl -vs 'https://some-site.com' 1> /dev/null

  • I had to put the URL between quotes to get this working.
    – CHW
    Commented Jul 19, 2017 at 9:09
  • Whether this is necessary or not might depend on the URL and the shell used. I improved the answer accordingly. Thanks. Commented Jul 19, 2017 at 15:50

headcurl.cmd (windows version)

curl -sSkv -o NUL %* 2>&1
  • I don't want a progress bar -s,
  • but I do want errors -S,
  • not bothering about valid https certificates -k,
  • getting high verbosity -v (this is about troubleshooting, isn't it?),
  • no output (in a clean way),
  • and I want to forward stderr to stdout, so I can grep against the whole thing (since most or all output comes on stderr).
  • %* means "pass all parameters of this script on" (see stackoverflow.com/a/980372/444255); usually that's just one parameter: the URL you are testing.

real-world example (on troubleshooting proxy issues):

C:\depot>headcurl google.ch | grep -i -e http -e cache
Hostname was NOT found in DNS cache
GET HTTP://google.ch/ HTTP/1.1
HTTP/1.1 301 Moved Permanently
Location: http://www.google.ch/
Cache-Control: public, max-age=2592000
X-Cache: HIT from company.somewhere.ch
X-Cache-Lookup: HIT from company.somewhere.ch:1234

Linux version

for your .bash_aliases / .bashrc (a function is more robust than an alias here, since $@ inside an alias does not refer to the alias's arguments):

headcurl() { curl -sSkv -o /dev/null "$@" 2>&1; }
  • This will download the body and consume bandwidth and time. @siracusa's answer (stackoverflow.com/a/38679650/6168139) doesn't have this overhead.
    – rushi
    Commented Dec 2, 2019 at 9:55
  • If & when you want POST, add -X POST to the passthrough parameters; if you want GET, use GET (i.e. the default), as responses may differ. Unless you do heavy curling in production scripts (not for diagnosis and development) I don't care about a bit of bandwidth.
    – Frank N
    Commented Dec 2, 2019 at 10:03
  • I am planning to use it to see whether files on the server are updated or not, using 'Last-Modified'. The files themselves are large, some are in GBs, and I am usually on cellular internet, so this large bandwidth is an issue for me.
    – rushi
    Commented Dec 2, 2019 at 10:06
  • That would be hacky. I don't need to do this as siracusa's answer performs the task accurately.
    – rushi
    Commented Jan 17, 2020 at 12:35

The -w, --write-out <format> option can be very helpful. You can get all HTTP headers, or a single one:

$ curl -s -w '%{header_json}' https://httpbin.org/get -o /dev/null
{"date":["Sun, 18 Feb 2024 13:47:12 GMT"],
"content-type":["application/json"],
"content-length":["254"],
"server":["gunicorn/19.9.0"],
"access-control-allow-origin":["*"],
"access-control-allow-credentials":["true"]
}

$ curl -s -w '%header{content-type}' https://httpbin.org/get -o /dev/null
application/json
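
--write-out also makes header-level checks easy to script. A minimal sketch that fetches nothing but the status code (the URL is a placeholder; note that %{http_code} has been available for a long time, while the %header{} and %{header_json} variables above need a fairly recent curl):

```shell
# Grab only the HTTP status code; headers and body are not printed.
url="https://example.com/health"   # hypothetical endpoint
code=$(curl -s -o /dev/null -w '%{http_code}' "$url")
if [ "$code" = "200" ]; then
  echo "endpoint is up"
else
  echo "unexpected status: $code" >&2
fi
```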


To get only the response headers, use the silent flag -s alongside -i, then output only the first 10 lines using the head command.

curl -si 0:80 | head
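
head cuts blindly after 10 lines, so it can truncate a long header block or leak the start of the body. A sketch that instead stops exactly at the blank line separating headers from body (example.com is a placeholder):

```shell
# Print everything up to (and including) the first blank line, i.e. the
# header block; \r\{0,1\} also matches the CR of CRLF-terminated lines.
curl -si http://example.com | sed '/^\r\{0,1\}$/q'
```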
