Whatever happened to local caching?
Why are builds like this continually grabbing from *any* server?
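Even something as crude as this in the fetch step would stop the same tarball being pulled down over and over (the URL, filename and cache path here are placeholders, not what the script actually uses):

```sh
# Hypothetical fetch step: only hit the remote server if the tarball
# isn't already sitting in a local cache directory.
TARBALL="gmp-x.y.z.tar.xz"                    # placeholder name/version
URL="https://example.org/gmp/$TARBALL"        # placeholder mirror
CACHE="$HOME/.cache/build-sources"

mkdir -p "$CACHE"
if [ ! -f "$CACHE/$TARBALL" ]; then
  wget -O "$CACHE/$TARBALL" "$URL"
fi
tar -xf "$CACHE/$TARBALL"
```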
> This build was configured to run parallel simultaneous tests on 100 different types of computers/architectures
And all of these architectures need their own copy of the GMP code? Surely they're just building from the same sources with the usual flurry of #ifdef's?
For that matter, what about the rest of the ffmpeg source code and its dependencies? Are these being grabbed afresh for every single architecture? That certainly appears to be the case, if I'm reading the updated script correctly[1].
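To be concrete about the "same sources" point: autoconf-style projects such as GMP build happily out of tree, so one extracted source tree can feed every target - only the build directories and configure flags differ. A rough sketch (the target triplets are made-up examples):

```sh
# One shared, already-extracted source tree; one build directory per target.
SRC="$PWD/gmp-src"                                    # extracted once, reused
for HOST in x86_64-w64-mingw32 aarch64-linux-gnu; do  # made-up example targets
  mkdir -p "build-$HOST"
  (cd "build-$HOST" && "$SRC/configure" --host="$HOST" && make -j"$(nproc)")
done
```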
Not only does the DDoS aspect appear to be built in, but why aren't all these copycat builders pointing out that it wastes *their* resources as well (time, bandwidth and electricity)?[2]
[1] And the author of that update wants to have a careful think - any guesses whether running a fetch from *any* version control server - in this case, a Mercurial server - is going to be more costly for both ends than a wget of a single tarball from an FTP server? Or could a version control client perhaps pull down *just* the recent changes - oops, no, we deleted the local copy, can't do that!
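For comparison, keeping the clone around and pulling only the new changesets looks roughly like this (repository URL and paths are illustrative):

```sh
# Incremental update: clone once, then later runs pull only what changed.
REPO="https://example.org/some-dependency"          # placeholder hg repo
DEST="$HOME/.cache/build-sources/some-dependency"

if [ -d "$DEST/.hg" ]; then
  hg pull -u -R "$DEST"      # fetch just the new changesets and update
else
  hg clone "$REPO" "$DEST"   # first run pays the full clone cost once
fi
```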
[2] oh, it is just Github's bandwidth and electricity? Well, waste away!