Mozilla returns crypto-signed website packaging spec to sender – yes, it's Google

Mozilla has published a series of objections to web packaging, a content distribution scheme proposed by engineers at Google that the Firefox maker considers harmful to the web in its current form. At its developer conference earlier this month, Google engineers talked up the tech, which consists of several related projects – …

  1. IgorS

    Can we get Web caching back, please?

    Looks like they are trying to fix what they have broken themselves!

    Before the hard push for HTTPS, most Web content could be cached, for example by Squid.

    No need for complicated schemes to get great performance... all you needed was an HTTP cache somewhere close.

    Then HTTPS-everywhere mania kicked in, and now every single load has to go back to the origin!

    Most Web pages we consume have zero privacy needs; they come from public Web pages.

    There is really no need for encryption in most of them!

    The need for integrity is indeed very valuable, but that could be easily achieved in a cache-friendly manner.

    Why is/was that not done?

    Just usual business interests of big players, or is there something else I cannot see?

    1. Anonymous Coward
      Anonymous Coward

      Re: Can we get Web caching back, please?

      This is a common misconception. The worry isn't just that content might be read, but that it might be rewritten in transit. The only (current) way to protect against this is to sign the content with the client's public key, something which is impossible to cache and is handily part of encryption.

      1. Nick Kew

        Re: Can we get Web caching back, please?

        There are very good reasons to rewrite content in transit. Examples range from the classic case where content contains internal links that won't resolve for external readers and need to be resolved at a gateway, through to services like accessibility enhancement for blind readers, or translation. Not to mention all kinds of content syndication frameworks.

        There are times when security concerns trump the usefulness of such things. But if reading El Reg really needed us to know the origin of the contents, we wouldn't have Anonymous Cowards.

      2. Luke McCarthy

        Re: Not sure the comparison is valid

        This could be achieved by other means, for example cryptographic signing of content, which would still allow caching and ensure the content cannot be rewritten by third parties.

      3. IgorS

        Re: Can we get Web caching back, please?

        > > The need for integrity is indeed very valuable, but that could be easily achieved in a cache-friendly manner.

        > The worry isn't just that content might be read, but that it might be rewritten in transit.

        Which part of the above "integrity" sentence did you not understand?

        integrity == "cannot be rewritten"

        And it can be easily solved with e.g. "server side signing", which is indeed cache-friendly.

        No encryption needed.
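
        To make the "server side signing" idea concrete, here is a minimal sketch (assuming the third-party Python "cryptography" package; key distribution and rollover are glossed over): the origin signs the page bytes once, any cache can store and re-serve both the page and the detached signature, and clients verify against the origin's published public key, all without an encrypted channel.

        from cryptography.exceptions import InvalidSignature
        from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

        origin_key = Ed25519PrivateKey.generate()   # held only by the origin server
        public_key = origin_key.public_key()        # published out-of-band for clients to fetch

        page = b"<html><body>Public article body</body></html>"
        signature = origin_key.sign(page)           # detached signature, cacheable next to the page

        # Any client (or intermediate cache) can check integrity without TLS:
        try:
            public_key.verify(signature, page)
            print("content is intact")
        except InvalidSignature:
            print("content was modified in transit")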

        1. Nick Kew

          Re: Can we get Web caching back, please?

          Indeed. And, better than that, you can sign a normalised version of the page, as in taking a DOM and using it to sign the contents while excluding irrelevant markup from what gets signed. Though that needs a Standard.
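
          As a rough illustration of what signing a normalised page might look like (the normalisation here is purely ad hoc and illustrative, nothing standardised, hence the need for a Standard): parse the markup, keep only tags, sorted attributes and collapsed text, and hash that instead of the raw bytes. The resulting digest is what would be signed, as in the sketch above.

          import hashlib
          from html.parser import HTMLParser

          class Normaliser(HTMLParser):
              def __init__(self):
                  super().__init__()
                  self.parts = []
              def handle_starttag(self, tag, attrs):
                  # sort attributes so their order can't change the digest
                  self.parts.append(tag + "".join(f"{k}={v}" for k, v in sorted(attrs)))
              def handle_data(self, data):
                  text = " ".join(data.split())   # collapse insignificant whitespace
                  if text:
                      self.parts.append(text)
              # comments fall through to the default handler and are ignored

          def normalised_digest(html_text):
              n = Normaliser()
              n.feed(html_text)
              return hashlib.sha256("|".join(n.parts).encode()).hexdigest()

          a = "<p class='x'>Hello   world</p>"
          b = "<p  class='x'><!-- tracker -->Hello world</p>"
          print(normalised_digest(a) == normalised_digest(b))   # True: same signable content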

    2. Anonymous Coward
      Anonymous Coward

      Re: Can we get Web caching back, please?

      > Most Web pages we consume have zero privacy needs; they come from public Web pages.

      > There is really no need for encryption in most of them!

      Frankly, I don't think you have a clue. Every insecure page is an access route into a machine.

      That login button you just clicked that redirects you to a secure site? Replaced by one to a phishing site.

      That page you just loaded? Now it's got malicious Javascript and Flash applets injected into it.

      Browsing your favourite news website? Now your ISP or the government has a record of every page you've viewed. Public wifi hotspot? Now everyone else might have a log too.

      If you don't see the value of knowing that the content you receive is the content sent by the original site, and is only visible to you and the origin, then I would suggest you have a very narrow view of computer security.

      1. Anonymous Coward
        Anonymous Coward

        Re: Can we get Web caching back, please?

        Frankly, I think you've been watching too many movies.

        1. Anonymous Coward
          Anonymous Coward

          Re: Can we get Web caching back, please?

          Probably has been; Phorm never happened. https://en.m.wikipedia.org/wiki/Phorm

          1. Anonymous Coward
            Anonymous Coward

            Re: Can we get Web caching back, please?

            Ok then, as the downvoted AC, I'll bite:

            > Frankly, I don't think you have a clue. Every insecure page is an access route into a machine.

            Which, frankly, is untrue.

            > That login button you just clicked that redirects you to a secure site? Replaced by one to a phishing site

            He clearly mentioned "most web sites", not ones which you have to log in to.


            > That page you just loaded? Now it's got malicious Javascript and Flash applets injected into it.

            If that's going to happen, then you're screwed anyway, because your system is so full of holes that its security depends on putting trust in the websites you visit... Or do you consider HTTPS to be some sort of virus scanner too?

            But you know you're talking bollocks - you wouldn't visit ANY website if you knew it had the capability to P0WN you just by loading some flash/js module.

            > Browsing your favourite news website? Now your ISP or the government has a record of every page you've viewed.

            They can know every site you visit anyway, HTTPS or no HTTPS.

            > Public wifi hotspot? Now everyone else might have a log too.

            You can't add arbitrary circumstances to his situation to make your point. He probably never uses a public wi-fi.

            Besides, while I'm sipping a latte, I really don't care who knows I'm reading "The Register".

            > If you don't see the value of knowing that the content you receive is the content sent by the original site, and is only visible to you and the origin, then I would suggest you have a very narrow view of computer security.

            No. You do. You can't see the wood for the trees if you act like the sky is falling in at every opportunity... Whilst you're smugly sitting there watching your https sites, someone has broken into the warehouse, and is stealing your goods.

            As I said, too many movies.

            1. Anonymous Coward
              Anonymous Coward

              Re: Which, frankly, is untrue.

              Reasons? Or should we just take your anonymous word for it?

              You're going to need to bring something more than "No. You do" to convince readers on here.

              This isn't Facebook.

              1. alferdpacker

                Re: Which, frankly, is untrue.

                I think we are being trolled. I particularly like the (faux) lack of self-awareness in "You can't add arbitrary circumstances to his situation to make your point. He probably never uses a public wi-fi."

                Also, "He clearly mentioned "most web sites", not ones which you have to log in to."... I can't think of any site I visit these days that I don't log in to. Logged into the reg right now. Even the likes of avclub or imgur.

                1. John Brown (no body) Silver badge

                  Re: Which, frankly, is untrue.

                  "I can't think of any site I visit these days that I don't log in to."

                  Really? You must have a very limited list of sites you visit. I'd estimate that maybe 1% or less of sites I visit require a login.

      2. ParksAndWildlife

        Re: Can we get Web caching back, please?

        Frankly, you don't have a clue about cryptography.

        Encryption protects confidentiality. Signing protects integrity.

        All versions of SSL and TLS can be decrypted through man-in-the-middle attacks. All the HTTPS encryption of the origin does not guarantee that the content received is from the origin. Signed content from the origin does improve the integrity, whether that content is sent encrypted or not.

        1. Anonymous Coward
          Anonymous Coward

          Re: Can we get Web caching back, please?

          Going to need a citation for viable MitM attacks on all protocol versions, please. Also, the TLS (and by extension "HTTPS") specs specify the use of MACs to ensure integrity:

          https://tools.ietf.org/html/rfc4346#section-1

          "The primary goal of the TLS Protocol is to provide privacy and data integrity between two communicating applications [...] The connection is reliable. Message transport includes a message integrity check using a keyed MAC."

    3. fuzzyfelt

      Re: Can we get Web caching back, please?

      If only there was a network of caching proxies trusted by the page owner, installed near the end user with the page owner's private key and certs, that would deliver content...

      1. JohnFen

        Re: Can we get Web caching back, please?

        "trusted by the page owner"

        And the user.

      2. IgorS

        Re: Can we get Web caching back, please?

        I am well aware of the current CDN model. And it is a security nightmare!

        The content owners have to give up any control to get the needed speedup.

        I was under the impression that Google's proposal was about improving exactly this.

        1. Anonymous Coward
          Anonymous Coward

          Re: Can we get Web caching back, please?

          If by improving you mean preventing people from disabling all the non-essential Google links and Javascript, sure.

        2. sabroni Silver badge

          Re: I was under the impression that Google's proposal was about improving exactly this.

          Google would like us to believe that's what this is about. That in itself is enough to warrant cynicism.

    4. asdf

      Re: Can we get Web caching back, please?

      I encrypt everything I can over the internet just to make it harder for anyone and everyone in power to spy on me or anyone else. Not much to hide really (a quiet domesticated life) but not going to make it easy for fscks like Facebook, Lexis Nexis, or any government agency to get data for free that they monetize or weaponize and then use to contribute to the dystopia.

    5. Crazy Operations Guy

      Re: Can we get Web caching back, please?

      Clients will still cache pages and objects no matter the transport. Caching on the network is fairly pointless now, as the bulk of your data is going to be single-view images and videos anyway (e.g. images whose URI changes based on the person looking at them, like how a social media site will provide a fresh URI for each user viewing the same image).

      I do use a decrypting/re-encrypting proxy to do content filtering and malware detection. I did do some caching, but I ended up with a less than 1% cache hit rate and only saved a piddling amount of bandwidth, as the items that were in the cache were tiny icons, 1K CSS files and other minuscule files that the client would cache anyway, while it was missing on all the image and video data, which made up something like 99% of our traffic.

      1. Bronek Kozicki

        Re: Can we get Web caching back, please?

        I think the best use case for such packages is client-side scripts. They have to be downloaded by the client, often contain unsigned components from 3rd parties (thank you, npm) and do not change that often.
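
        There is already a partial answer for third-party scripts in the form of Subresource Integrity, which is worth mentioning here as a related (if narrower) mechanism: the page author pins a hash of the exact bytes they vetted, and the browser refuses anything else. A minimal sketch of generating the integrity value (the file path and CDN URL are placeholders):

        import base64
        import hashlib

        with open("vendor/bundle.min.js", "rb") as f:   # hypothetical npm-built bundle
            digest = hashlib.sha384(f.read()).digest()

        integrity = "sha384-" + base64.b64encode(digest).decode()
        print(f'<script src="https://cdn.example.com/bundle.min.js" '
              f'integrity="{integrity}" crossorigin="anonymous"></script>')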

    6. JohnFen

      Re: Can we get Web caching back, please?

      Although I installed an HTTPS MITM system on my home servers in order to mitigate the security problems that DoH brings, it occurs to me that it's also useful for dealing with the problem you're citing.

    7. A random security guy

      Re: Can we get Web caching back, please?

      I think I understand where you are coming from. HTTPS can do encryption and/or authentication of the traffic, not of a web page per se. HTTPS does guarantee the source and freshness of the page. HTTP can easily be hijacked with an MITM attack and older or wrong pages inserted in the stream.

      That squid you so love: perfect MITM tool. And you do know how many of our routers are entirely hackable. So your DNS query can return an IP address for a server in some other country.

      Even if a page is signed, it may be older than the latest version. What will end up happening is that you will have to reinvent something similar to HTTPS to make sure the pages arrive in order from the right server.

      1. kevfrey

        Re: Can we get Web caching back, please?

        HSTS (Strict HTTPS) makes MITM proxies, legitimate or otherwise, ineffective and non-functional.
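
        For anyone curious whether a particular site sets the HSTS header kevfrey is referring to, here is a quick stdlib check (the domain is just a placeholder):

        from urllib.request import urlopen

        with urlopen("https://example.com/") as resp:
            hsts = resp.headers.get("Strict-Transport-Security")

        # e.g. "max-age=31536000; includeSubDomains; preload" on sites that enforce it
        print(hsts or "no HSTS header set")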

        1. JohnFen

          Re: Can we get Web caching back, please?

          True, which is why such sites literally no longer exist for me.

    8. VikiAi
      Megaphone

      Orrrrrrrrr...

      ...could we just be served web pages that have information in them, rather than being 98% bloated* adverts, oh-my-lookie-at-the-pretty spinning doodads, and kilobyte after kilobyte of copy-paste spyware scripts.

      *Small, efficient, non-annoying ads I can tolerate as a cost of doing business.

      1. Crazy Operations Guy

        Re: Orrrrrrrrr...

        Kinda like the original model, where an advertiser would pay the website to display the ad, then send them a simple JPG, GIF and/or chunk of HTML, and the website owner would drop it into the site's pages. The site carrying the ad then got paid when the advertiser saw its URL in the Referer field of incoming HTTP requests. Figuring out how much was owed was a simple matter of a few command line utilities (like grep and sed). But then, that led to situations where websites were able to cut the ad men out of the deal, and if there is anything that useless scum like admen hate more than being cut out, I've never heard of it.

        I miss the days when you'd be on a forum like an amateur aircraft builder's forum and you'd have simple static images, or simple, non-eye-searing GIFs, provided to the operator of the forum by various aircraft parts companies and the like. The image was a plain link to the store it was advertising: no redirects, no third parties, nothing but a simple link. Periodically, the advertised store would pull out their web access logs, count the number of unique visitors and the number of visitors who bought something, then cut a check.
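
        The accounting really was that simple; a rough sketch of it in a few lines (the log path and forum domain are placeholders, and the log is assumed to be in Apache combined format):

        from collections import Counter

        referred = Counter()
        with open("/var/log/apache2/access.log") as log:   # hypothetical store access log
            for line in log:
                fields = line.split('"')
                # combined format: ip - - [date] "request" status bytes "referer" "user-agent"
                if len(fields) >= 6 and "aircraft-forum.example" in fields[3]:
                    referred[line.split()[0]] += 1          # key on client IP

        print(f"{len(referred)} unique visitors referred, {sum(referred.values())} requests")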

    9. Hawkeye Pierce

      Re: Can we get Web caching back, please?

      > Then HTTPS-everywhere mania kicked in, and now every single load has to go back to the origin!

      Absolute 100% codswallop. Your local browser is more than capable of caching HTTPS resources and will be doing so on every HTTPS site you visit unless that site is explicitly instructing the browser not to.
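
      What gets cached client-side is governed by the response headers the site chooses to send, not by the transport. A minimal sketch of the server side of that (values are illustrative, and a real site would of course serve this over HTTPS):

      from http.server import BaseHTTPRequestHandler, HTTPServer

      class Handler(BaseHTTPRequestHandler):
          def do_GET(self):
              body = b"<html><body>cacheable page</body></html>"
              self.send_response(200)
              self.send_header("Content-Type", "text/html")
              # the browser may keep this for an hour; "no-store" would forbid caching entirely
              self.send_header("Cache-Control", "public, max-age=3600")
              self.send_header("Content-Length", str(len(body)))
              self.end_headers()
              self.wfile.write(body)

      if __name__ == "__main__":
          HTTPServer(("localhost", 8080), Handler).serve_forever()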

      And to pick up on one other comment made here... using HTTPS does not prevent a man-in-the-middle from seeing what DOMAINS you're accessing, but it does prevent them from seeing what PAGES within that domain you're reading. That anti-government Facebook page you read? No, the MITM can't see that you're accessing that.

      1. Luke McCarthy

        Re: Can we get Web caching back, please?

        Your local browser can cache content that you have looked at, but an ISP for example would be unable to cache content that multiple customers are accessing.

        1. Aitor 1

          Re: Can we get Web caching back, please?

          The whole point of https is about them NOT knowing, thank you very much.

        2. Ben Tasker

          Re: Can we get Web caching back, please?

          > Your local browser can cache content that you have looked at, but an ISP for example would be unable to cache content that multiple customers are accessing

          Except, most ISPs partner with and host the boxes of various large CDNs.

          So, as long as the content you're accessing is served via one of those CDNs, you're still going to get served from an on-net device rather than having to hit a peering point.

          And for "large CDNs" above you can substitute in the following names as a minimum

          - Akamai

          - Google

          - Edgecast

          - Netflix

          - Cloudflare

          Most have quite a few others too.

          On-net caching is still very much a thing. What's changed (and this is pretty crucial) is that in this model the ISPs get the caching benefit but none of the access to what you're doing (preventing them from injecting ads or profiling your viewing habits), because the boxes are controlled by the CDN providers and the ISPs just provide the connectivity.

      2. JohnFen

        Re: Can we get Web caching back, please?

        "Your local browser is more than capable of caching"

        I assumed that he wasn't talking about browser caching.

    10. eldakka

      Re: Can we get Web caching back, please?

      The lack of cacheability started way before HTTPS became common.

      In the late 90's I administered our proxy solution for an enterprise that had ~30k desktops behind it.

      Our internet-facing caches (we also had internal ones for corporate intranet sites) had 40GB per-server caches. May not sound like much, but we are talking the late 90s here.

      Over that time period we could see a serious degradation to the efficiency of the caches (how much traffic was served from caches vs total internet incoming data).

      This was primarily due to the proliferation of 'active' pages: JSPs and ASPs and other CGI-based dynamic content. Each page was created on the fly for every page-load, with many pulling data from databases and other backing stores. None of these are cacheable, as most sites that used these technologies disabled caching in the page headers (which can, of course, be ignored by your cache, but then how do you know, as the caching provider, whether it's a valid directive or a bad implementation?). And even if the page was cached, the next time it was hit the URL would be unique, because the page itself crafted links with unique IDs in the URL; each hit produces a new URL that won't match an existing entry in the cache. Many of our own public-facing websites were dynamic themselves, with the webservers doing nothing more than forwarding requests to active pages that pulled all the content directly from databases.
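
      A toy illustration of that unique-URL problem: a cache keyed on the full URL never gets a hit when every page load embeds a fresh session id, even though the rendered content is identical.

      import uuid

      cache = {}

      def fetch(url):
          if url in cache:
              return cache[url], "HIT"
          cache[url] = f"<rendered page for {url}>"   # stand-in for hitting the origin
          return cache[url], "MISS"

      for _ in range(3):
          url = f"https://shop.example/catalogue?session={uuid.uuid4()}"
          print(fetch(url)[1])   # MISS, MISS, MISS: 0% hit rate despite identical content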

      Sure, HTTPS made this worse, but face it, most of the web these days is dynamic, uncacheable content. The static HTML pages that remain are usually so small, data-wise, compared to the real (dynamic) content that implementing a cache probably costs more money than the bandwidth those cacheable pages would save.

      And it's not just HTTPS or the general migration to an active internet (dynamic page content) either; a lot of it is DRM. Video is a huge percentage of all data these days (Netflix, YouTube, etc.), and all of that traffic is encrypted for DRM, not for user security, so that segment would be encrypted irrespective of the general move to HTTPS. It's the video traffic that chews up the bandwidth and would ideally be cached, at least for multi-user networks (corporates, apartment-building networks, educational institutions, etc.), but 'piracy', therefore DRM, therefore it's encrypted whether you want HTTPS or not.

  2. Ian Michael Gumby
    Boffin

    Hmmm man in the middle?

    How do you know that the web page you requested, which is being delivered by a third party... is actually the web page you wanted? And how do you know that it's the most recent version? (e.g. dynamic content)

    1. A random security guy

      Re: Hmmm man in the middle?

      Worse: if I know that an older page had a vulnerability and the new one doesn't, I'd ship you the old one. Since you are trying very hard not to hit the original server, you will get exploited.

    2. Nick Kew

      Cacheualty

      There's quite a lot to address that, going as far back as HTTP/1.1 and the caching framework the Web grew up on. Dynamic content with third-party caches is a long-solved problem.

      What HTTPS solves is the malicious MITM. The entirely benign cache is a casualty.

  3. Chris Stephens

    W T F... This is so ADS CAN'T BE BLOCKED. Right now you can DNS-block ad sites; no longer with this tech. This tech is to make it impossible to block ads, from the largest ad maker. Why is that not in the story?

    1. Chronos

      Because the motive was mentioned in an earlier story about changing the way Chrome[ium] exposes the API that uBlock Origin et al use, so it went without saying. GoOgle are trying to reinvent the way client-server works in a way that removes all choice from the client and forces the server elements that aren't GoOgle into a framework giving GoOgle and a few selected (read: monied) providers control of the traffic.

      Enjoy the www while you can. The halcyon days are over if this isn't stopped firmly with a clue-by-four, a classic Etherkiller or several cattle prods. The trick is going to be getting the ordinary users to care.

      1. JohnFen

        "The halcyon days are over if this isn't stopped firmly with a clue-by-four, a classic Etherkiller or several cattle prods."

        The halcyon days were over years ago. The web has been declining ever since. I'm now seriously thinking that I may see the day when the web becomes completely useless to me.

        1. Chronos
          Devil

          Search with simple terms is almost useless now, John. With aggregators, price comparison sites, ad men and so on, your first page of SEO'd-up-the-ladder results might as well not be there, for all the use they are. You need a long list of advanced operator strings to make search work, even to get '90s AltaVista levels of accuracy.

          Sometimes, it is incredibly irritating. If tech is your bag, you're more likely to get decent information from a well aimed post on what's left of Usenet than the web.

  4. doublelayer Silver badge

    How often

    The rest of these comments cover the very real security implications of this. I agree on all of that. However, I also need to ask another important question: when would this actually be of much use?

    We no longer use the internet of the 1990s, where everyone saw the same page, the pages didn't change frequently, and network bandwidth was limited. Nearly every site and page online falls into one of these categories, none of which would benefit from this at all:

    1. Sites with user-specific content, requiring direct contact with the server. Webmail, anything with a login page, etc. falls under this. Obviously, nobody gets an advantage if this is cached, and for privacy and security, this would have to be encrypted from server to device.

    2. Sites with very dynamic content, requiring that visitors receive up-to-date versions of the page. Yes, for some sites you could get a bit of cache benefit from a short caching window, perhaps five minutes, but that only works if the site gets accessed by a lot of people and only changes every once in a while. News sites might work this way, but many other sites update more frequently than that.

    3. Sites that offer small pages. When a site infrequently updates, the pages from that site are often very small, so there isn't much downside to a direct connection.

    4. Sites that are rare. Many other sites will be accessed so rarely that, by the time someone else wants a page, either the cache has evicted the copy, or the copy has expired.

    We need a page that updates relatively infrequently, has no user-specific content, no private content, is accessed by a bunch of people who all want a small enough set of pages that the cache keeps them, and has large enough files that the cache provides a real benefit. The only one like this that comes to mind is Wikipedia. Of course, you could always save some pages yourself or keep the whole thing offline. Any other sites that come to mind?

  5. Anonymous Coward
    Anonymous Coward

    same-origin policy

    The same-origin policy protects against DNS rebinding attacks that could allow a malicious website or application to get inside the LAN.

    https://medium.com/@brannondorsey/attacking-private-networks-from-the-internet-with-dns-rebinding-ea7098a2d325

    Take TCL/Alcatel's cloud for instance.

    It appears that TCL was able to register a private internet address (192.168.33.33) for their cloudy applications on their smartphones:

    https://www.virustotal.com/#/ip-address/192.168.33.33
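
    For context, what the same-origin policy actually compares is the (scheme, host, port) tuple, which is why DNS rebinding is nasty: re-pointing the same hostname at 192.168.33.33 leaves the tuple unchanged, so the browser raises no objection. A small sketch of that comparison (domains are placeholders):

    from urllib.parse import urlsplit

    def origin(url):
        parts = urlsplit(url)
        port = parts.port or {"http": 80, "https": 443}.get(parts.scheme)
        return (parts.scheme, parts.hostname, port)

    print(origin("https://app.example/page") == origin("https://app.example:443/other"))  # True: same origin
    print(origin("https://app.example/page") == origin("https://other.example/page"))     # False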

  6. jonathan keith

    Can't see a single benefit to this.

    Please feel free to point out the benefits I've obviously overlooked, but from where I'm sitting, web packaging offers absolutely no improvements to end users at all, and in fact seems like it would be actively detrimental.

    1. Giovani Tapini

      Re: Can't see a single benefit to this.

      There are some possible performance benefits, as it turns the whole internet into Tor nodes... For almost every other aspect I agree with you. It removes a number of layers of security and is quite unfriendly to work with.

  7. Andy Non Silver badge

    Not directly related but...

    I used to provide a number of shareware / freeware packages that I'd developed, all available from my website. One day I had a complaint that my software "contained malware", which it didn't, but a little research showed me that a scammer had cloned my website (and many other sites that offered software) and put a wrapper around my software, bundling adware and malware with it. Google was happily linking to the corrupted version of my site and software.

    Bundling up other people's sites and content sounds like a recipe for mischief or fraud.

    1. Nick Kew

      Re: Not directly related but...

      Of course it is. That's why anyone who's remotely serious about distributing executable contents will PGP-sign their packages.

      I take it the time you refer to was a more innocent era. Not this century.

      And of course, you can't entirely protect dumb users from counterfeits!

      1. Andy Non Silver badge

        Re: Not directly related but...

        Thing was, they didn't interfere with my package or installer; they just bundled the whole thing plus malware into another package, so signing my part of the contents wouldn't have made any difference. They chained the installation of their outer package to the installation of my inner package.

        1. Nick Kew

          Re: Not directly related but...

          You sign your package. Then you can demonstrate to the world that the bundle with added malware is nothing to do with you.

          Clearly tell your users to check the signature and you can firmly blame them for ignoring your advice.

  8. pavel.petrman

    Web, the new desktop...

    ... and with it Google, the new Microsoft. Remember the times when it was nearly impossible to avoid Microsoft in personal computing? Schools requiring pupils to deliver slideshows in Powerpoint, authorities requiring and publishing solely in Word? Websites "best viewed on Internet Explorer at 1024x768"? Here it comes again, only this time users and programs alike have moved to the web and touchscreens. In touchscreens, the Google domination is apparent already, and with the web one can see their push for domination taking clearer shape every day. If one looks a bit further than what the news already reports as fait accompli, one can see their efforts at turning everything non-Google into a second-class citizen, together with restraining the liberties of users in the new first class (see the proposals for API changes in the browser add-on standard, and the lack of network traffic accessibility therein, for just one example). Don't be evil my firefoxhole.

    1. NATTtrash
      Paris Hilton

      Re: Web, the new desktop...

      "Remember the times when it was nearly impossible to avoid Microsoft in personal computing? Schools requiring pupils to deliver slideshows in Powerpoint, authorities requiring and publishing solely in Word?"

      Why do you think this is past tense?

      As a person who has to present at congresses regularly: just Google (no pun intended) "instruction for presenters" and see what you get back. Sometimes, if they are savvy, they allow you ppt instead of the usual pptx. About 20 years ago you had the occasional "nerd" who wanted to use his Apple, and his Apple only, but with the introduction of centralised server setups that presenters upload their presentations to, it has become more and more of a mono-culture. To the point that now: want to do PDF? (I know, I know, not perfect either, but hey, indulge me.) Nope, sorry, PowerPoint only! And adding to that, maybe authorities have become a bit more PDF-minded as far as communication is concerned, but ISO-standard/Open Document they are (in my experience) certainly not. How many times have we heard: "Oh, can you send me the Word document please?"

      1. Franco

        Re: Web, the new desktop...

        I get that constantly. I send my CV in PDF format, so it can't be edited or have recruitment agency tat added to it, and they frequently object to it and ask for Word format.

        Back on topic, I am inclined to believe that this is another attempt by Google to bypass ad-blocking though.

        1. Spacedinvader
          Meh

          Re: Web, the new desktop...

          They could just right click - open with - word...

          1. Franco

            Re: Web, the new desktop...

            I can only assume you don't deal with a lot of recruitment consultants. They would have to pay attention to what you were saying to hear that advice.

    2. Vincent Ballard

      Re: Web, the new desktop...

      I had to install Word just this week because a client sent a docx with a form to be filled out and returned and LibreOffice didn't support the ActiveX controls in the form.

      1. Ken Hagan Gold badge

        Re: Web, the new desktop...

        Did you bill them for the WORD licence?

  9. Mahhn

    Google is drooling

    They can't wait to have complete control over what you see. Think that editing all the hyperlinks in Gmail (so they get ad hits) was bad? Everything you see will be modified by Google to give them the best propaganda advantage over your financial and political situation.

    Google has gone full evil.
