Hacker News — samueloph's comments


> edit: even curl itself - which created the original document linked above - has http 3 just in an experimental build.

It's not experimental when built with ngtcp2, which is what you will get on distros like Debian 13-backports (plain Debian 13 uses OpenSSL-QUIC), Debian 14 and onward, Arch Linux and Gentoo.

Reference: https://curl.se/docs/http3.html


You can test it on Debian experimental, or use a Debian container:

$ podman run debian:experimental /bin/bash -c 'apt install --update -t experimental -y curl && curl --version'

Version 8.13.0~rc3-1+exp1 is syncing to the repositories and has HTTPS RR support enabled.
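If you want to check a particular build, `curl --version` lists HTTP3 in its "Features" line when a QUIC backend is compiled in. A small sketch of that check (the sample features line below is made up; on a real system you would feed in the actual `curl --version` output):

```shell
# Sketch: report whether a curl "Features" line advertises HTTP/3.
# The sample line is illustrative, not from a specific build.
has_http3() {
  case " $1 " in
    *" HTTP3 "*) echo "HTTP/3 supported" ;;
    *)           echo "no HTTP/3" ;;
  esac
}

has_http3 "alt-svc AsynchDNS HTTP2 HTTP3 HSTS IPv6 SSL"
# prints HTTP/3 supported
```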



There's a serious regression in the fixes: https://github.com/RsyncProject/rsync/issues/702

It impacts those who need to use `-r` (recursive) together with `-H` (preserve hard links).


The fix was merged an hour ago, roughly an hour after you made this comment (at which point they were still working on it).



HTTP/3 is enabled in the curl package on Debian:

https://samueloph.dev/blog/debian-curl-now-supports-http3/

Daniel also has a more up-to-date post on HTTP/3:

https://daniel.haxx.se/blog/2024/06/10/http-3-in-curl-mid-20...


> In unstable you get a timely fix from upstream, in stable you get a fix by the security team.

For anything that is serious enough, you will get a security fix straight away for both unstable and testing (through the testing-security repository).

For things that are not really that important, yes, you will get the fix later.


> There is also --remote-header-name (-J) which takes the remote file name from the header instead of the URL, which is what wget does.

I don't think that's the case; that behavior is opt-in, as indicated in wget's manpage:

> This can currently result in extra round-trips to the server for a "HEAD" request, and is known to suffer from a few bugs, which is why it is not currently enabled by default.
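For illustration, here is a rough sketch of the filename extraction both tools perform on that header. Real parsers also handle unquoted values and RFC 5987 encoded filenames; this simplified version does not, and `cd_filename` is a name I made up:

```shell
# Sketch: pull the filename out of a Content-Disposition header, which is
# roughly what curl -J and wget --content-disposition act on (simplified).
cd_filename() {
  printf '%s\n' "$1" | sed -n 's/.*filename="\([^"]*\)".*/\1/p'
}

cd_filename 'Content-Disposition: attachment; filename="report.pdf"'
# prints report.pdf
```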


As others pointed out, you can do that. You can also set those flags in .curlrc, or write a script if you want to download multiple URLs in parallel (not possible with an alias), or now you can just use wcurl :)

Note: wcurl sets a few more flags than that; it also encodes the whitespace in the URL and downloads multiple URLs in parallel.
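The whitespace encoding can be sketched in a couple of lines. This is a simplified illustration (wcurl's actual handling may differ), and `encode_spaces` is a name I made up:

```shell
# Sketch: replace literal spaces in a URL with %20, similar in spirit to
# the whitespace encoding wcurl applies before handing the URL to curl.
encode_spaces() {
  printf '%s\n' "$1" | sed 's/ /%20/g'
}

encode_spaces 'https://example.com/my file.txt'
# prints https://example.com/my%20file.txt
```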


“or now you can just use wcurl :)”

Sadly I can’t, since it depends on the util-linux version of getopt, which means it fails on BSD and macOS systems. Understandable, since that getopt is always available on the specific target the script was written for, and it does make life easier.


Argh, we are looking into fixing that now.

I knew getopt was Linux-specific, but I thought the only impact was that the long-form arguments (--opt) would not work. It turns out it doesn't run at all.

We should be able to fix this within the next few days, thank you!
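For anyone hitting the same portability wall: the shell builtin `getopts` (POSIX) works the same on Linux, BSD, and macOS, at the cost of long options, which is exactly what the external util-linux `getopt` binary adds. A minimal sketch, with illustrative function and option names (not wcurl's actual code):

```shell
# Sketch: portable short-option parsing with the POSIX getopts builtin,
# instead of the Linux-only util-linux getopt binary.
parse_opts() {
  OPTIND=1  # reset so the function can be invoked more than once
  while getopts "o:" opt "$@"; do
    case "$opt" in
      o) echo "output=$OPTARG" ;;
    esac
  done
}

parse_opts -o file.txt
# prints output=file.txt
```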


> We should be able to fix this within the next few days, thank you!

It's fixed now, should work in non-linux environments.


That might be a nice improvement, but I believe people want a single command without any arguments for this use case.


I can't speak for other people, but for myself I'd rather have it as a --option than another command. I dislike tools that install a lot of different commands; it just gets harder to remember them all.


That argument doesn't make sense at first glance: how are lots of different commands harder to remember than lots of different --options?


One command can list all its options with `tool -h`?


alias wget='curl --wget', then?

