POST large stdin as multipart/form-data with cURL
Telling cURL to read a large file (several gigabytes) directly and POST it as multipart/form-data works:
$ # This works
$ curl localhost -F 'f=@large_file.txt'
However, cURL fails when trying to read the same amount of data from standard input:
$ cat large_file.txt | curl localhost -F 'f=@-'
curl: option -F: is badly used here
curl: try 'curl --help' for more information
(What I want to do in practice is tar a directory and stream the archive directly in an HTTP request: tar -cf - large_dir/ | curl localhost -F 'f=@-')
I think this is because cURL buffers all of stdin in memory before sending any of it in a request:
-F, --form <name=content>
...
Tell curl to read content from stdin instead of a file by using
- as filename. This goes for both @ and < constructs. When stdin
is used, the contents is buffered in memory first by curl to
determine its size and allow a possible resend. Defining a
part's data from a named non-regular file (such as a named pipe
or similar) is unfortunately not subject to buffering and will
be effectively read at transmission time; since the full size is
unknown before the transfer starts, such data is sent as chunks
by HTTP and rejected by IMAP.
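Per that last sentence, data taken from a named non-regular file is not buffered, so one possible workaround is to route the stream through a FIFO instead of stdin. A sketch (the FIFO path is arbitrary, and the part is then sent chunked, as the man page notes):

```shell
# Sketch: feed curl a named pipe instead of stdin. Per the man page
# excerpt above, non-regular files are read at transmission time
# rather than buffered; the part is sent with chunked encoding.
fifo=$(mktemp -u)                  # arbitrary temp path for the FIFO
mkfifo "$fifo"
tar -cf - large_dir/ > "$fifo" &   # writer runs in the background
curl localhost -F "f=@$fifo;filename=large_dir.tar"
rm "$fifo"
```

The `;filename=` suffix just gives the part a sensible filename, since the FIFO's own name is a throwaway temp path.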
Is there any way to have cURL construct a request body in multipart/form-data format as it reads from stdin, and to stream that data to a server, without buffering it in memory or saving it anywhere?
I don't need the Content-Length header to be set.
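One direction I'm considering, sketched under the assumption that curl -T - streams stdin with chunked transfer encoding rather than buffering it, is to build the multipart framing by hand and let curl treat the whole body as an opaque upload:

```shell
# Sketch: construct the multipart/form-data framing manually and
# stream the result through curl -T - (chunked upload of stdin).
# "X-STREAM-BOUNDARY" is an arbitrary boundary string chosen here;
# it must not occur anywhere in the payload.
boundary="X-STREAM-BOUNDARY"
{
  printf -- '--%s\r\n' "$boundary"
  printf 'Content-Disposition: form-data; name="f"; filename="large_dir.tar"\r\n'
  printf 'Content-Type: application/octet-stream\r\n'
  printf '\r\n'
  tar -cf - large_dir/             # the streamed part body
  printf '\r\n--%s--\r\n' "$boundary"
} | curl localhost -X POST -T - \
    -H "Content-Type: multipart/form-data; boundary=$boundary"
```

This trades curl's own -F handling for hand-rolled framing, so it only works if the server accepts chunked requests without a Content-Length.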