Daniel Stenberg has observed the 25th anniversary of the curl open source project with the publication of curl 8.0.0, the 215th release of the command line tool, and a modest tele-celebration. The name curl, originally rendered as "cURL" to emphasize its function, stands for "Client for URLs" or "Client URL Request Library" or …
OK Grandad, it's time for your nap now. We'll talk about wget later. I've got some important multithreaded stuff to do with axel first.
I know, I know...in your day you didn't need multiple connections and threads....yes...2 weeks to get an ISO...sit down, shut up, here's your tea.
He did say that he preferred wget unless he was "doing something gnarly"
For myself I use wget loads more than I use curl, as
$ curl -O ftp://serv1/pub/u*
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
curl: (78) RETR response: 550
$ wget ftp://serv1/pub/u*
works fine exactly as you'd expect it to.
> in your day you didn't need multiple connections and threads....yes...2 weeks to get an ISO...sit down, shut up, here's your tea.
If your link is so slow you need multiple threads to avoid an ISO download taking 2 weeks, then perhaps you should make sure your soggy rusk isn't touching the string between your baked bean cans.
Yes, curl can do lots of things that wget can't, but a lot of the time we use these tools to download files from file servers, and wget is often the better choice in that simple use case.
I was actually suggesting axel, which is basically wget on steroids.
cURL and wget are not the only options out there.
Sometimes you need multiple connections to get around transfer limits on a remote server...for example, some places that serve ISOs might limit you to well below what your internet connection is actually capable of...the way around this is multiple connections on multiple threads...allowing you to blast a download without worrying about the limit set on the other end.
wget is fine if you want small files and the transfer speed doesn't matter...but if you absolutely have to have a large file as soon as possible, axel is your friend.
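The multi-connection trick described above can be sketched with plain curl and its byte-range option (this is essentially what axel and aria2c automate). Everything here is made up for illustration — the URL is hypothetical and the file size is hard-coded; a real script would take the size from the `Content-Length` of a HEAD request. The loop is a dry run that only prints the commands it would issue.

```shell
# Sketch: split one download into byte-range chunks fetched by
# separate curl invocations. URL and size are made-up values.
url="https://example.com/big.iso"   # hypothetical server
size=1000000                        # pretend Content-Length, in bytes
parts=4
chunk=$((size / parts))

for i in $(seq 0 $((parts - 1))); do
    start=$((i * chunk))
    if [ "$i" -eq $((parts - 1)) ]; then
        end=$((size - 1))           # last chunk absorbs the remainder
    else
        end=$((start + chunk - 1))
    fi
    # Dry run: drop the 'echo' (and append '&', with a 'wait' after
    # the loop) to actually fetch the ranges concurrently.
    echo curl -s -r "$start-$end" -o "part$i" "$url"
done
# Afterwards: cat part0 part1 part2 part3 > big.iso
```

Each connection then gets its own slice, so a per-connection cap at the server no longer bounds the total rate.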
cURL and libcurl in particular have been (and will be again, no doubt) *really* useful, especially as I *haven't* been putting network comms into every program: that means I've had to refresh my knowledge of the library each time, relying on the docs and the plentiful supply of examples, which have never yet let me down.
Having the curl program be little more than a wrapper around libcurl is a really, really good way to arrange things. When my programs go bonkers I can test out the comms independently using curl (batch files for the win to deal with all those CLI args!), knowing that the underlying lib is the same one my code invokes; as a last resort, I can run curl under the debugger and see what I've done differently!
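The "batch file for all those CLI args" idea might look something like the sketch below: keep the exact options that mirror the program's libcurl setup in one script, so the same request can be replayed by hand through the same library. The endpoint and headers are hypothetical stand-ins, and the script only prints the invocation; uncommenting the last line would actually send it.

```shell
# Sketch: a replay script pinning down the curl options that
# correspond to my program's libcurl configuration.
# The URL and headers are made-up examples.
url="https://api.example.com/v1/status"     # hypothetical endpoint

set -- --silent --show-error \
       --header "Accept: application/json" \
       --max-time 10 \
       "$url"

echo "curl $*"        # dry run: show the exact invocation
# curl "$@"           # uncomment to actually send the request
```

Since curl and the program share one libcurl underneath, any difference in behaviour points at my own option handling rather than at the library.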
I wouldn't, I'd use Insomnia...which is a nice API testing front end for curl that can generate code blocks for you when you've got your API query just right...or I'd just bang together a script in the language I intend to use with the API.
Curl wouldn't help me test an API in a way that matters for the purpose of software development.
If anything, using *just* curl would slow me down quite a lot.