Curl, the URL fetcher that can, marks 25 years of transfers

Daniel Stenberg has observed the 25th anniversary of the curl open source project with the publication of curl 8.0.0, the 215th release of the command line tool, and a modest tele-celebration. The name curl, originally rendered as "cURL" to emphasize its function, stands for "Client for URLs" or "Client URL Request Library" or …

  1. SJA

    curl <3

    curl <3

    1. David 132 Silver badge
      Happy

      Re: curl <3

      bash: 3: No such file or directory

  2. DrXym Silver badge

    wget

    Curl is a lot more powerful, but unless I'm doing something gnarly I prefer wget. Simpler syntax for just fetching and saving something from an endpoint without having to supply extra arguments.
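
    For example (example.com is just a placeholder host here), the difference in defaults looks something like:

    $ wget https://example.com/file.iso

    saves file.iso into the current directory with no extra flags, whereas

    $ curl https://example.com/file.iso

    dumps the body to stdout unless you remember to add -O (or -o somename).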

    1. Anonymous Coward
      Anonymous Coward

      Re: wget

      Ok Grandad, it's time for your nap now. We'll talk about wget later. I've got some important multithreaded stuff to do with axel first.

      I know, I know...in your day you didn't need multiple connections and threads....yes...2 weeks to get an ISO...sit down, shut up, here's your tea.

      1. Anonymous Coward
        Anonymous Coward

        Re: wget

        He did say that he preferred wget unless he was "doing something gnarly"

        For myself I use wget loads more than I use curl as

        $ curl -O ftp://serv1/pub/u*

        % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                       Dload  Upload   Total   Spent    Left  Speed
        0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0

        curl: (78) RETR response: 550

        $

        Whereas

        $ wget ftp://serv1/pub/u*

        works fine exactly as you'd expect it to.
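
        (That 550 is the giveaway: curl just hands "u*" to the server as RETR u*, while wget does the FTP glob itself. curl does have its own URL globbing, just not shell-style wildcards; assuming the files were numbered, something like this would work:)

        $ curl "ftp://serv1/pub/u[1-5]" -o "u#1"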

        > in your day you didn't need multiple connections and threads....yes...2 weeks to get an ISO...sit down, shut up, here's your tea.

        If your link is so slow you need multiple threads to avoid an ISO download taking 2 weeks then perhaps you should make sure your soggy rusk isn't touching the string between your baked bean cans.

        Yes, curl can do lots of things that wget can't, but a lot of the time we use these tools to download files from fileservers, and wget is often the better choice in that simple use case.

        1. Ozan

          Re: wget

          Wasn't the literal reason for wget to download files reliably across networks? It does that well.

          curl is not a file downloader per se. It is a full-fledged URL tool with an awesome library.

          1. Anonymous Coward
            Anonymous Coward

            Re: wget

            Yes :-)

            man wget

            Wget - The non-interactive network downloader.

            man curl

            curl - transfer a URL

            But curl is part of a standard RHEL install and wget isn't, so the workflow is often to use curl -O to download the local repo file and then install wget for the rest.
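
            Something along these lines (the repo URL here is made up):

            $ curl -O http://mirror.internal.example/myorg.repo

            $ sudo cp myorg.repo /etc/yum.repos.d/ && sudo dnf install -y wget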

        2. Anonymous Coward
          Anonymous Coward

          Re: wget

          I was actually suggesting axel, which is basically wget on steroids.

          cURL and wget are not the only options out there.

          Sometimes you need multiple connections to get around transfer limits on a remote server...for example, some places that serve ISOs might limit you to well below what your internet connection is actually capable of...the way around this is multiple connections on multiple threads...allowing you to blast a download without worrying about the limit set on the other end.

          wget is fine if you want small files and the transfer speed doesn't matter...but if you absolutely have to have a large file as soon as possible, axel is your friend.
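
          Something like this, if memory serves (the URL is just an example, and -n sets the number of simultaneous connections):

          $ axel -n 8 https://example.com/distro.iso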

  3. that one in the corner Silver badge

    Solving hairy problems with curl

    cURL and libcurl in particular have been (and will be again, no doubt) *really* useful, especially as I *haven't* been putting network comms into every program: that means I've been having to refresh my knowledge of the library each time and rely on the docs and the plentiful supply of examples. Which have never yet let me down.

    Having the curl program be little more than a wrapper around libcurl is a really, really good way to arrange things. When my programs go bonkers I can test out the comms independently using curl (batch files for the win to deal with all those CLI args!), knowing that the underlying lib is the same one my code invokes: as a last resort, I can run curl under the debugger and see what I've done differently!
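
    (The endpoint below is made up, but that's the sort of thing I mean:)

    $ curl -v https://api.example.com/health

    for a quick look at the headers, or

    $ curl --trace-ascii dump.txt https://api.example.com/health

    for a full byte-level dump to compare against what my own code is sending.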

  4. Dr Who

    Different things

    wget and axel are file downloaders.

    curl is a way of interacting with a URL in much more complex ways. How are you going to test an API call that requires a POST request, a JSON-encoded payload and a basic auth username with wget?
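
    With curl that's a one-liner (the URL, credentials and payload here are invented purely for the sake of example):

    $ curl -u alice:secret -H "Content-Type: application/json" -d '{"widget":"test"}' https://api.example.com/v1/widgets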

    1. Anonymous Coward
      Anonymous Coward

      Re: Different things

      I wouldn't, I'd use Insomnia...which is a nice API testing front end for curl that can generate code blocks for you when you've got your API query just right...or I'd just bang together a script in whatever language I intend to use to interact with the API.

      Curl wouldn't help me test an API in a way that matters for the purpose of software development.

      If anything, using *just* curl would slow me down quite a lot.

  5. nautica Silver badge
    Thumb Up

    Donations are in order.
