616
votes

One can request only the headers using HTTP HEAD, as with the -I option in curl(1):

$ curl -I /

Lengthy HTML response bodies are a pain to read on the command line, so I'd like to get only the headers as feedback for my POST requests. However, HEAD and POST are two different methods.

How do I get cURL to display only response headers to a POST request?


8 Answers

833
votes
-D, --dump-header <file>
       Write the protocol headers to the specified file.

       This  option  is handy to use when you want to store the headers
       that a HTTP site sends to you. Cookies from  the  headers  could
       then  be  read  in  a  second  curl  invocation by using the -b,
       --cookie option! The -c, --cookie-jar option is however a better
       way to store cookies.

and

-S, --show-error
       When used with -s, --silent, it makes curl show an error message if it fails.

and

-L/--location
      (HTTP/HTTPS) If the server reports that the requested page has moved to a different location (indicated with a Location: header and a 3XX response
      code), this option will make curl redo the request on the new place. If used together with -i/--include or -I/--head, headers from  all  requested
      pages  will  be  shown.  When authentication is used, curl only sends its credentials to the initial host. If a redirect takes curl to a different
      host, it won’t be able to intercept the user+password. See also --location-trusted on how to change this. You can limit the amount of redirects to
      follow by using the --max-redirs option.

      When curl follows a redirect and the request is not a plain GET (for example POST or PUT), it will do the following request with a GET if the HTTP
      response was 301, 302, or 303. If the response code was any other 3xx code, curl will re-send the following  request  using  the  same  unmodified
      method.

from the man page. So:

curl -sSL -D - www.acooke.org -o /dev/null

follows redirects, dumps the headers to stdout, and sends the body to /dev/null. That's a GET rather than a POST, but you can do the same thing with a POST: just add whatever option you're already using for POSTing data.

Note the - after -D, which indicates that the output "file" is stdout.
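For example, a POST version might look like this (a sketch; the URL and form field are placeholders, and -d supplies the POST data):

curl -sSL -D - -d "key=value" https://example.com/endpoint -o /dev/null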

204
votes

The other answers require the response body to be downloaded. But there's a way to make a POST request that will only fetch the header:

curl -s -I -X POST http://www.google.com

An -I by itself performs a HEAD request; it can be overridden with -X POST to perform a POST (or any other) request while still getting only the header data.

81
votes

The following command displays extra information:

curl -X POST http://httpbin.org/post -v > /dev/null

You can ask the server to send just the headers, instead of the full response:

curl -X HEAD -I http://httpbin.org/

Note: In some cases, the server may send different headers for POST and HEAD, but in almost all cases the headers are the same.

58
votes

For long response bodies (and various other similar situations), the solution I use is to pipe the output to less, so

curl -i https://api.github.com/users | less

or

curl -s -D - https://api.github.com/users | less

will do the job.
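The same approach works for a POST; for example (a sketch, using the httpbin.org endpoint from another answer and a placeholder form field):

curl -s -D - -d "key=value" http://httpbin.org/post | less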

29
votes

Maybe it is a little extreme, but I use this super-short version:

curl -svo. <URL>

Explanation:

-v prints debug information (which includes the headers)

-o. sends the web page data (which we want to ignore) to a file, . in this case, which is a directory and thus an invalid destination, so the output is discarded.

-s no progress bar, no error information (otherwise you would see Warning: Failed to create the file .: Is a directory)

Warning: the command always fails in terms of exit code, whether the URL is reachable or not. Do not use it in, say, conditional statements in shell scripting...
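If you do need a meaningful exit code, a variation is to discard the body via /dev/null instead of an invalid destination (a sketch):

curl -svo /dev/null <URL>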

19
votes

Much easier (this is what I use to avoid shortlink tracking) is the following:

curl -IL http://bit.ly/in-the-shadows

…which also follows redirects.

14
votes

The other answers have not worked for me in all situations; the best solution I could find (which works with POST as well), taken from here, is:

curl -vs 'https://some-site.com' 1> /dev/null
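For a POST with a body, the same pattern applies, since -d makes curl send a POST (a sketch with a placeholder form field):

curl -vs -d "key=value" 'https://some-site.com' 1> /dev/null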

6
votes

headcurl.cmd (Windows version)

curl -sSkv -o NUL %* 2>&1
  • I don't want a progress bar -s,
  • but I do want errors -S,
  • not bothering about valid https certificates -k,
  • getting high verbosity -v (this is about troubleshooting, isn't it?),
  • no output (in a clean way).
  • oh, and I want to forward stderr to stdout, so I can grep against the whole thing (since most or all output comes on stderr)
  • %* means "pass all parameters of this script on" (see https://stackoverflow.com/a/980372/444255); usually that's just one parameter: the URL you are testing

Real-world example (troubleshooting proxy issues):

C:\depot>headcurl google.ch | grep -i -e http -e cache
Hostname was NOT found in DNS cache
GET HTTP://google.ch/ HTTP/1.1
HTTP/1.1 301 Moved Permanently
Location: http://www.google.ch/
Cache-Control: public, max-age=2592000
X-Cache: HIT from company.somewhere.ch
X-Cache-Lookup: HIT from company.somewhere.ch:1234

Linux version

for your .bash_aliases / .bashrc (as a shell function, since an alias cannot use $@ to reference its arguments):

headcurl() { curl -sSkv -o /dev/null "$@" 2>&1; }
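Usage mirrors the Windows example above, e.g.:

headcurl google.ch | grep -i -e http -e cache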