Reply to post: Cache?

Ready for pull rate limits? Docker outlines 'next chapter' as Google tells customers how to dodge subscriptions

Henry Wertz, Gold badge


Seems like you'd be able to use some kind of cache; I mean, I doubt you're deploying 100 different images in those 6 hours, it'd be 100 pulls of the same image (or a few images). Seems like you'd REALLY want to cache that somewhere, for speed purposes if nothing else. Must admit, I can't think of how I'd burn through 16-odd Docker image pulls an hour either.

(Edit: Didn't notice the thread on caching just above. But seriously, who would not want to use a cache if they're doing more than an occasional Docker pull?)
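For what it's worth, the kind of cache described above can be set up with Docker's own `registry` image running as a pull-through cache of Docker Hub. A minimal sketch, assuming a single host with the daemon and mirror side by side (the port and mirror name are arbitrary choices, not anything Docker mandates):

```shell
# Run the official registry image as a pull-through cache (proxy) of Docker Hub.
# REGISTRY_PROXY_REMOTEURL is the documented env var for proxy mode.
docker run -d --name registry-mirror -p 5000:5000 \
  -e REGISTRY_PROXY_REMOTEURL=https://registry-1.docker.io \
  registry:2

# Then point the Docker daemon at the mirror in /etc/docker/daemon.json
# and restart dockerd; subsequent pulls hit the local cache first:
#
#   {
#     "registry-mirrors": ["http://localhost:5000"]
#   }
```

With that in place, repeated pulls of the same image are served locally and only cache misses count against Docker Hub's rate limit.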
