Boffin suggests Trappist monk approach for Spectre-Meltdown-grade processor flaws, other security holes: Don't say anything public – zip it

A computer engineering professor has an interesting idea for how to handle the public disclosure of serious vulnerabilities: don't. Professor Gus Uht, engineering professor-in-residence at the University of Rhode Island, USA, argues that everyone would be safer if those who discover serious vulnerabilities refrain from …

  1. VikiAi

    Know thy enemy (bugs in this case)

    The second a fix for an unreported vulnerability is distributed, black-hats are all over it to see what was changed and how to exploit it before everyone has had a chance to test and deploy the patch (BIG assumption that the majority of users ever get around to it!), so what is the point in 'quietly' patching vulnerabilities again?
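    That patch-diffing step is mechanical. A toy sketch of the idea (a hypothetical `read_record` function, with Python's difflib standing in for the binary-diffing tools attackers actually use) shows how a "quiet" fix points straight at the vulnerable code:

```python
import difflib

# Two hypothetical builds of the same function: the quietly patched one
# differs only where the bug was fixed.
unpatched = """\
def read_record(buf, length):
    data = buf[:length]
    return data
"""

patched = """\
def read_record(buf, length):
    if length > len(buf):  # bounds check added by the fix
        raise ValueError("length out of range")
    data = buf[:length]
    return data
"""

# Lines present only in the patched version point straight at the flaw.
diff = difflib.unified_diff(unpatched.splitlines(), patched.splitlines(),
                            lineterm="")
added = [line[1:] for line in diff
         if line.startswith("+") and not line.startswith("+++")]
print(added)
```

    The added lines hand an attacker both the location of the bug and a hint of how to trigger it in unpatched systems.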

    1. Christopher Reeve's Horse

      Re: Know thy enemy (bugs in this case)

      And if the vulnerabilities were unknown, there'd be fewer reasons for consumers to have to buy new kit, cough *planned obsolescence* cough, cough, pardon me.

    2. The Man Who Fell To Earth Silver badge

      Re: Know thy enemy (bugs in this case)

      The problem with Prof. Uht's proposal is that to be successful, it requires trusting software vendors to do the right thing and devote resources to fixing bugs. Experience shows that there's no disinfectant like sunshine to motivate vendors of any kind of products to "fix" defects, and this is especially true of the software industry. Hide the security vulnerabilities, and the software providers will adopt "security through obscurity".

      "Obscurity is to security what camouflage is to armor." - Page 44

      1. Charles 9

        Re: Know thy enemy (bugs in this case)

        The problem with your last line is that, sometimes, camouflage is your only hope because your adversary has access to superior technology (like an attack craft) that can tear apart nigh anything on a mobile base. All he needs to know is where to attack and you're dead. So, as they say, you can't hit what you don't know about. So I'll counter your line with my own.

        "Sometimes, the only defense you really have is to make the enemy think you're not there at all."

        1. Doctor Syntax Silver badge

          Re: Know thy enemy (bugs in this case)

          "sometimes, camouflage is your only hope because your adversary has access to superior technology"

          The superior technology may well be rendering your camouflage useless anyway, in which case all it contributes is a false sense of security.

          1. Charles 9

            Re: Know thy enemy (bugs in this case)

            In which case they already know your location anyway, and You Are Already Dead. A false sense of security (like Security Theater) still provides confidence, which can be useful when facing the inevitable bullets.

    3. Mark 85

      Re: Know thy enemy (bugs in this case)

      so what is the point in 'quietly' patching vulnerabilities again?

      Simple, they don't. Well, some do, but Intel is one who doesn't quietly patch: one has to go to Intel, look for patches and then install them. I suspect there are just too many PCs out there with Intel chips for them to figure out which computers have their chips, where they are located, and then attempt to push a fix. The Net is just too big for that sort of thing, and if Intel quietly pushed out a fix to every machine (server or PC), it might cause other issues due to bandwidth. Come to think of it, we've not seen any stats on what percentage of processors have been patched.

  2. Wellyboot Silver badge

    Keeping mum

    Is Prof. Uht sponsored by the electronic voting industry?

    1. a_yank_lurker

      Re: Keeping mum

      No, just clueless. Many will privately report bugs to the vendor and monitor to see if they are fixed in a timely manner. However, too many, including Slurp, have been known to sit on bugs because they do not want to spend the money to fix them. Only public shaming seems to get them to do the right thing.

  3. DJV Silver badge

    My first reaction is to smack my forehead, and think, "What an idiot!"

    However, a lot of it depends on the company or organisation that owns/maintains the software/service/device upon which the flaw exists.

    If the company is known to be responsible and can indicate privately to the whitehat discoverer that the flaw needs X amount of time to fix (where X might be relatively small for software and probably a lot larger where hardware is involved), then, if X sounds reasonable, the discoverer should allow the company that amount of time (plus a bit of leeway) to develop and push out the update. For those companies that are known to ignore such things and only react when the faeces hit the rotating-blade-type air circulating device, public disclosure should not be held back, as that is the only way to get the flaws fixed.

    In the end the way a flaw should be reported, patched and eventually made public should depend on an agreed, responsibly thought out timeline for that particular flaw that prioritises public security (and doesn't just pay lip service to it).

    With regard to Spectre/Meltdown, El Reg's disclosure was probably timely, as Intel has, in the past, been known to downplay things until its hand is forced.

    1. DropBear

      No. I'll just stick with "what an idiot!". Keeping vulnerabilities quiet is not a valid approach if your aim is to get them fixed. The one, single and only thing that causes that is announcing them publicly. Anyone arguing against it is an utter idiot at best and actively means you harm at worst.

      1. quxinot

        I'll agree with 'what an idiot!' pretty rapidly.

        Specifically for the more mass-market users: when they hear about an important flaw that requires patching, they're more likely to turn the updates back on and get the fix. Most users have been repeatedly taught that updates just mean 'we screwed up the UI and broke stuff', so if they are aware enough to turn updates off, they do.

        1. Doctor Syntax Silver badge

          "they're more likely to turn the updates back on and get the fix"

          And have other upgrades break stuff. Sometimes you can't win.

  4. Paul

    the price of zero day vulnerabilities

    The price of zero day vulnerabilities has been increasing over the years, very significantly for some platforms.

    This suggests either that it's getting harder to find significant vulnerabilities, and/or that the value of a security break-in has increased a lot too.

    So even if the "white hats" decided that trying to find vulnerabilities was a bad thing and stopped altogether, the "black hats" have a big financial incentive to carry on, and of course the latter will do their best to keep them secret which makes things less secure for everybody and reduces the chance of a fix.

    Personally, I'd prefer to keep going with the good guys finding bugs and getting paid for responsible disclosure. I can't see a better way, other than revolutionising the way software is developed so that such bugs become unlikely or impossible to make!

  5. Nick Ryan Silver badge

    Security through obscurity

    ...or in other words, pretending that because you can't see it, there isn't a problem.

    1. VikiAi

      Re: Security through obscurity

      Ravenous Bugblatter Beast of Traal confirmed!

      Aaaaaand I should have read ahead to the next sub-thread vvvv

  6. The Oncoming Scorn Silver badge
    Thumb Up

    Mind-bogglingly Stupid

    ...that they can be easily eluded by closing one's eyes or wrapping a towel around one's head. They think that if you can't see them, they can't see you. If you carve your own name on the memorial before this little disappearing trick, a Beast will probably assume it already ate you in a fit of absentmindedness, as its mind is quite small and frequently absent.

  7. ThatOne Silver badge

    You don't say

    > allowing the flaws to be secretly fixed by vendors

    That's where he lost me. It's well established how much vendors just love 'wasting' money on fixes when they aren't forced to. And what forces them to do it is the public naming & shaming, followed by the customer pressure of "fix it or else". If nobody knows about it there is no pressure, ergo no reason to fix anything.

    (All right, some might, but the majority definitely won't. A cent not spent is a cent earned.)

    1. Dave 126 Silver badge

      Re: You don't say

      The professor addresses your point in his original post:

      "One argument for full disclosure is that companies will not fix vulnerabilities unless they are forced to. However, at the risk of excusing less-than-ideal behavior, looking at the situation from a company’s point-of-view shows that inattention to a fix may be reasonable. There are a plethora of vulnerabilities and bugs that need to be fixed at any given time, and resources are limited, so where should such resources be allocated? Logically, it would be to address the problems having the highest potential for damage, that is to minimize overall risk. "

      1. DropBear

        Re: You don't say

        "The professor addresses handwaves your point in his original post". Fixed.

        1. Robert Helpmann??

          Re: You don't say

          "The professor handwaves your point in his original post".

          He addresses it, just not in a convincing manner. We already have plenty of examples of a variety of approaches, from immediate full disclosure to reporting directly to vendors with no public disclosure, and we have seen responses ranging from completely inadequate to robust. We have enough data to make up our minds about which approach works best in most cases, and that's really what's important. More to the point, what the professor proposes simply flies in the face of real-world evidence.

          Also, having the public's anxieties ratchet up is necessary in as much as if there is no anxiety about security flaws, there will be no patches deployed or fixes made. Complacency is the enemy of security!

          1. Mark 85

            Re: You don't say

            Also, having the public's anxieties ratchet up is necessary in as much as if there is no anxiety about security flaws, there will be no patches deployed or fixes made. Complacency is the enemy of security!

            Most people don't read tech news, nor does the popular press cover it unless it's something really big. Then again, most users have their PCs set to "auto" for patches. Servers are a different critter in this, as they are watched and administered. If the users were paying attention and administering their own boxes, chances are Win10 wouldn't have as large a share of PCs as it does now.

      2. ThatOne Silver badge

        Re: You don't say

        > The professor addresses your point in his original post

        No, he just gives a potential excuse why they might not fix something. An excuse all too easy to wield if bugs are not disclosed as there is always something more pressing/important to do (management bonus, golf meeting, etc.).

        (I didn't downvote you, BTW)

  8. Mephistro

    Professor Uht...

    ... seems to think that there aren't already big teams of state- and criminal-group-sponsored hackers looking specifically for this kind of vuln. The responsible thing to do would be private disclosure to the company involved only IF -and that's a big "if"- the companies affected acted quickly to patch the holes.

    If Professor Uht really thinks that, e.g., Intel will be in a hurry to patch one of these holes -something that automatically implies making it public, as explained in the article, and letting customers drag them through the courts for decades, costing them many billions in lawyers, compensation, company prestige and lost sales- he's living in Lalaland.

    I've been advising my enterprise customers to postpone new PC purchases -when possible- until the next generation of microprocessors is available, and many other techies have been doing the same.

  9. Tom 38

    With greatest respect to El Reg's team

    If they can spot that there is something awry with some unannounced security fixes, professional blackhats would also be aware.

    1. Spazturtle Silver badge

      Re: With greatest respect to El Reg's team

      Yeah, I first learnt about Meltdown in December 2017, when somebody made a post on Reddit saying that there were some strange patches being made to the Linux kernel. Changes to core parts of the kernel that appear only to reduce performance, for no apparent reason, are a key sign that there is a major security issue, and before New Year people had figured it out.

      In fact, the first details started coming out in November 2017.
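      (As an aside, on current Linux kernels you can read the outcome of those patches directly: the kernel reports each CPU vulnerability's mitigation status under sysfs. A minimal Python sketch, assuming the standard /sys/devices/system/cpu/vulnerabilities interface, which simply won't exist on non-Linux or pre-2018 systems:)

```python
import glob
import os

# Modern Linux kernels report the status of each CPU-vulnerability
# mitigation (Meltdown's KPTI among them) as one small text file per flaw.
VULN_DIR = "/sys/devices/system/cpu/vulnerabilities"

def mitigation_report(path=VULN_DIR):
    """Map vulnerability name -> kernel-reported status; empty dict if
    the interface isn't present."""
    report = {}
    for entry in glob.glob(os.path.join(path, "*")):
        try:
            with open(entry) as f:
                report[os.path.basename(entry)] = f.read().strip()
        except OSError:
            pass  # unreadable entry; skip rather than fail
    return report

print(mitigation_report() or "no sysfs vulnerability interface found")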

  10. Anonymous Coward

    Fair Disclosure Request

    Would Dr. Uht be willing to disclose any associations that he may currently have, or may have had in the past, with Intel Corporation?

    Google is your friend.

    I've heard of coincidences. I just never met one in person.

  11. doublelayer Silver badge

    An open letter

    I'd like to write an open letter to people who think this professor's approach is the right one.

    Dear members of the computing community:

    You're wrong. No, really. Completely wrong. I don't know what leap of logic you took, but while there might have been logic when you went up, there is none where you came down. You clearly need to be let in on a few facts of how security vulnerabilities work.

    When a researcher finds a vulnerability, they identify it with enough precision, and report it. They could release it publicly, but few do, and usually only if it's a thing that will never be fixed. They usually don't, both because it's a bad idea and because they might get paid for their hard work. So they report it to a company, which hopefully does its homework and figures out how bad a problem this is and how they're going to fix it.

    You see that "hopefully"? That's because sometimes they DON'T. They leave their product vulnerable, keeping the customers at risk, completely ignoring the researcher, and making a mockery of security. And that, my friends in the audience, is not a very nice thing. So sometimes a bug has to be disclosed so the company will get up and actually do something, or at least so they can be held responsible for their negligence. Do you know the word negligence? Do you know that it happens sometimes?

    Now, let us surmise that a company has proceeded with our hopefully and fixed their bug. Yay, the patch is released. The vulnerability is gone. Yeah... Do you remember that whole WannaCry thing? It was kind of a big deal back in May of 2017, when a lot of things suddenly started breaking. That bug was patched in March, and a lot of people didn't have the patch. Maybe that is because a lot of people are lazy and incompetent. Actually it definitely was. But another set were unaware how critical the patch was. That's what publicity does. It informs the IT literate that they need to get fixing, and it alerts those who are not IT literate to find someone who is to fix their stuff because it can be broken. This, in turn, results in less broken stuff.

    You can disclose improperly or in a counterproductive way. No contest. So what? You can drive in an improper way, too, but we don't ban driving, because we're better off being able to get places quickly. Having something that can be done improperly isn't fixed by never doing that thing again. It is fixed by finding the ways to do it improperly, and not doing those. If it's critical enough, it's done by putting incentives in place not to do it improperly.

    Welcome back to logic. Let me help you up. Now, if you'd like to start researching again, that's fine, but maybe run your output past us next time. After all, you seem to have been doing it improperly, and we don't want anyone hearing about it and deciding there will be no more research.

    1. Dave 126 Silver badge

      Re: An open letter

      There's a bit more nuance in the Professor's original post than in your rebuttal of it. Specifically, he directly addressed the argument that companies won't fix bugs unless threatened by full public disclosure, but your post reads as if he hasn't considered it at all.

      I'm not saying he's right and you're wrong, but there's no point in attacking an overly simplified version of his views.

      The game is chess, not draughts. The least bad approach - since perfect security doesn't appear to be an option at this point - will be found through wargaming, simulation and analysis.

      1. doublelayer Silver badge

        Re: An open letter

        I'm aware of that, and perhaps my joke is not the style of rebuttal you would submit to the professor personally. That doesn't change the fact that he is wrong. It doesn't change all of the reports made to all of the companies that never did anything (for an example, see the tracking watches that have been known to be insecure for a year but are still insecure, about which multiple articles have been posted here in the past week). He seems to think that negligence doesn't exist: either the company fixes their thing, or they calculate that it is not important enough and are fixing something more important. That's not true; companies sometimes choose not to fix anything because they are spending all their time on the next product. That is a problem, it happens too often, and something needs to be done about it. Disclosure is a way to get something done, and if this professor refuses to acknowledge that the problem exists, he can't help to fix it. So I would argue that my comments, informal and inappropriate in tone as they would be if I submitted them as an official rebuttal, are still accurate.

  12. Anonymous Coward

    "The world has been shaken up by the disclosure; was that necessary and helpful?"

    It seems someone had gotten too comfortable about the foundation under all their ${everything}, so yes, they and their reputation needed to get jostled a bit, and if everyone feels it then everyone can remember-- kind of like "informed consent of the governed" or something approximately as quaint. If that reputation survived (due specifically to some responsible behaviour as opposed to widespread ignorance), and someone learned to be less hubristic, then yeah that seems pretty helpful too. I could be saying this 15 years ago about MS and it wouldn't be spelled very differently. (but 15 weeks ago they're doing even worse things. that's the joke)

  13. Ashentaine

    > "The world has been shaken up by the disclosure; was that necessary and helpful?"

    The chaos and loss of trust that would occur if knowledge of such a major vulnerability were actively suppressed and then later leaked with incorrect information -or worse, some baddie managing to cause real damage and that being what exposed it- seem like they would be far more damaging in the long run.

  14. redpawn

    Famous saying...

    See no evil, speak no evil, be a weevil

  15. A.Lizard

    Should this guy be teaching?

    1. Evil Auditor Silver badge

      Yes indeed, he should. I'd suggest modern mythology is his subject.

    2. Doctor Syntax Silver badge

      "Should this guy be teaching?"

      I was wondering what branch of engineering he was teaching in and how to avoid any products his students might have had a hand in.

  16. eldakka

    As others have noted in the comments, there are many issues with this proposal, some of which I can think off the top of my head enumerated below:

    1) Many bugs only get patched under the threat of public disclosure and shaming of the companies. Therefore if vulnerabilities were kept secret, many of them would only get patched once in-the-wild exploits were found and already being actively exploited.

    2) How can you possibly notify all relevant vendors if a bug is found? For example, the Meltdown bug required operating-system software mitigations. From memory, only "the majors" were contacted to patch the issue - Microsoft, Linux, Google, a few others. I don't believe FreeBSD was notified. What about other smaller or niche O/S vendors? In this proposal, these other vendors may never know they need to patch their O/S until an active exploit was detected.

    3) Just because no-one knows of an active exploit doesn't mean it's not being exploited. If there are local configuration settings that can be applied to mitigate the exploit, even though no actual patches have been released, then under this proposal no-one would know to use those settings.

    1. Dave 126 Silver badge

      He addressed your point 1 in his original post. I can't help but feel this Reg comments thread would look a little different if the Reg had just reproduced his original post in full (it's not long) rather than select excerpts.

      Again, I'm not saying he's right and you're wrong, but your post would be more interesting if it engaged with his stated reasoning surrounding the incentive of companies to fix bugs.

      1. eldakka

        What, you mean this statement from the 2nd paragraph of that original post:

        "looking at the situation from a company’s point-of-view shows that inattention to a fix may be reasonable."

        No, that does not address the problem. That tells me the author is a shill for the companies. The point of view that matters is the customer's, not the company's, i.e. that of the ones who are exposed to the direct effects of any exploits.

        1. eldakka


          Replying to my own post because I've realised something could be misconstrued if the context of the comment history isn't taken fully into account.

          The statement "That tells me the author is a shill for the companies" refers to the "he" of the post I am replying to, that is, Prof Uht, the author of the editorial this Reg story is about; it is not a reference to the Reg's story or journalist.

    2. Doctor Syntax Silver badge

      "I don't believe FreeBSD was notified."

      AIUI they didn't know anything about it until El Reg's story went out. All this tells us is that if something is responsibly disclosed to the vendor, the vendor should take steps to responsibly disclose it to all those who could be affected.

  17. Blazde Silver badge

    Science 101

    Apart from notifying regular users of what bad actors inevitably learn about earlier, and forcing the hand of lazy vendors, there is a pure science argument here. Publishing research spurs further research. It allows results to be replicated, deeper insights to be gained by others, and the frontier of knowledge progresses wider and faster as a result.

    Spectre/Meltdown is an adequate case study for this because new variants have continued to be discovered, in more chips, and the widespread open research into side-channel attacks (and paging specifically) before discovery meant software fixes had already been developed ahead of time. It meant the flaws were discovered several times semi-independently, as is often the case with these things (and highlights the futility of keeping them secret). The end result will be more comprehensive hardware fixes, in more chips, and more secure design practices in future. That even suits Intel, because they're less likely to go through another round of bad publicity several years down the line.
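    The core idea behind that whole research programme is tiny, which is why it keeps being rediscovered. A toy sketch (a hypothetical `leaky_compare`, nothing to do with Spectre's actual cache channel): any code whose amount of work depends on a secret leaks that secret to anyone who can measure the work.

```python
# Toy side channel: an early-exit string comparison does secret-dependent
# work, which an attacker can observe as time. Here loop iterations stand
# in for wall-clock time.
def leaky_compare(secret, guess):
    steps = 0
    for s, g in zip(secret, guess):
        steps += 1
        if s != g:
            return False, steps      # bails out at the first mismatch
    return secret == guess, steps

_, near = leaky_compare("hunter2", "hunter9")  # wrong only in the last char
_, far = leaky_compare("hunter2", "xunter2")   # wrong in the first char
print(near, far)  # the closer guess does measurably more work
```

    Publishing that observation once, decades ago, is what let later researchers spot the same pattern in caches, branch predictors and speculative execution.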

    Of course conventional wisdom should be challenged but it's surprising to see a research professor of all people advancing these naive, worn-out closed-science arguments.

    (Please think of the information Gus, it just wants to be free)

  18. Anonymous Coward

    Spectre and Meltdown?

    Citing those flaws as a reason not to disclose seems, well, wrong. If ever there were a flaw that needed rapid exposure, Meltdown and Spectre were it. Forget Intel / AMD for a moment; there are probably billions of devices out there (phones) vulnerable to Spectre that are never, ever, going to receive a patch to fix it. For those platforms, the earliest possible disclosure was the only way forward. And from that, disclosure of the rest (Intel / AMD) would have followed anyway.

    Ok, so I know that the flaws on x86 platforms were identified before those on ARM, but the x86 community not telling the ARM community about this kind of vulnerability wouldn't have been great. X86bods: "Here's a load of flaws in our hardware, now patched, this is how it works". ARMbods: "Gee well thanks for telling everyone EVENTUALLY, now we've billions of devices that have been vulnerable for yonks and there's nothing we can do to fix it."

    Arguably, earliest possible disclosure on Intel platforms was important too; Intel can't fix everything they've built over the past decades. Meltdown was such a monumental clanger of a flaw, so readily exploited, that keeping it quiet was probably a bad idea. I do wonder what Intel were hoping for; unexplained microcode updates not really admitting the fact that their previous decades of production had a serious cock-up?

    It's far easier with software; one bug, one author, one userbase, one fix easily rolled out to all users in a reliable fashion (apart from Android...).

  19. amanfromMars 1 Silver badge

    Money, Money, Money Makes the World go Round and Keeps the Narratives Churning and a'Changing*

    There are effectively an infinite number of unknown vulnerabilities ... What then is the point of actively ‘discovering’ new vulnerabilities and disclosing them?

    What is the point? Oh please, one point is immediately obvious surely to everybody and anybody.

    "Discover" unpluggable vulnerabilities whose stealthy disclosure would crash and crush formerly assumed and presumed invincible systems is a constant source of unbelievably generous fiat wealth which provides bounty with decisions/agreements/promises equally for either disclosure or an extended non-disclosure.

    *The magic secret being ..... being able to ensure future directions and/or disclosures are also user friendly rather than simply and exclusively systems catastrophic.

    IT aint rocket science. Even SMARTR Humans are easily bought off until such times as their discoveries and exploitation of new vulnerabilities are mitigated or superseded and turned to another more mutually advantageous beneficial direction. But do not be expecting to pay peanuts whenever the cost of failure to engage with such ACTivIT and principals is realistically priced in the trillions.

  20. Nano nano

    Capability ..

    What price Roger Needham's CAP architecture now .... would have saved us from overflow-type exploits a long time ago ...

    1. Anonymous Coward

      Re: Capability ..

      None, as people today can BS their way around a wrong answer but not around a missed deadline. Between doing it fast and doing it right with no possibility of doing both at once, fast wins.

  21. Peter Galbavy

    He's obviously a tenured academic who has never had to work for a living.

  22. imanidiot Silver badge

    So? Responsible Disclosure?

    Isn't what he's arguing for basically the ground rules for responsible disclosure? Keep schtum, report the problem to the company, and give them enough time to fix it, or at least to give you a response and a reasoned argument for why they need more time or won't fix it. Then, if the company is being an arsehat, and only AFTER the previously set deadline has expired without action or response, do you disclose to put pressure on the company.

    I think the professor is being a bit naïve in his thinking if he's arguing we should wait to disclose until it's either patched or exploits are out there, because the past has shown we won't know of many of the exploits in use until it's far, far too late.

    1. Michael H.F. Wilkinson Silver badge

      Re: So? Responsible Disclosure?

      Waiting until exploits are out there would also be hampered if we do not know what exploits to look for. Of course it makes sense to alert the manufacturer of the vulnerable hardware or software before going public, and there is a case for waiting some time before going public, but only telling the general public about serious threats LONG after discovery is simply not on.

    2. S_W

      Re: So? Responsible Disclosure?

      Yes, there are several faulty assumptions in his thinking.

      Firstly, that vulnerabilities are being immediately made public. In most cases they are not.

      Secondly, that black hats are learning about these vulnerabilities from the disclosures. While some will, if the vulnerability exists it may already have been discovered and be under active exploitation.

      Third, that the only way to be safe is with a vendor-provided patch. In many instances a configuration change can eliminate the vulnerability, albeit at the cost of functionality or performance.
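      A concrete instance of that trade-off: browsers responded to Spectre by coarsening their high-resolution timers, trading timing precision (functionality) for safety long before hardware fixes existed. A minimal Python sketch of the idea (the `resolution` knob is hypothetical, not any browser's actual API):

```python
import time

# Toy version of a real post-Spectre mitigation: coarsen the clock so
# scripts can no longer distinguish cache hits from misses.
def coarse_timer(resolution=0.001):
    """A clock rounded down to 'resolution' seconds."""
    return (time.perf_counter() // resolution) * resolution

t0 = coarse_timer()
_ = sum(range(1000))   # a microsecond-scale operation
t1 = coarse_timer()
# With a 1 ms timer the work above is usually invisible: t1 - t0 will
# very likely read 0.0, which is exactly the point of the mitigation.
print(t1 - t0)
```

      The cost is real, though: every legitimate use of precise timing (benchmarks, animation scheduling) gets degraded along with the attack.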

    3. Blazde Silver badge

      Re: So? Responsible Disclosure?

      No, he's not arguing for responsible disclosure. He's arguing for security by obscurity. He says it may not make sense for vendors to assign resources to fixing a disclosed vulnerability until it's being actively exploited. In his view there are infinite vulnerabilities and limited resources to fix them, so white-hats are actually making things worse by finding(*) new vulnerabilities.

      (*) They don't just find vulnerabilities, he says "white-hatters seek out or *create* weaknesses or vulnerabilities".

      1. Anonymous Coward

        Re: So? Responsible Disclosure?

        So IOW, he's saying bug hunting is like hunting the Hydra. For every bug you squash, two more are being exploited without your knowledge AND you're planting a big fat bullseye on the one you disclose, which may not even be fixable (we're just lucky at this point to not have a Game Over bug yet--one that cannot be fixed without an overhaul so drastic as to nuke the whole landscape). I mean, we're already hitting some serious architectural and physical dilemmas: for example, a need for highly-secure computing or communications in an environment with very little available power.

    4. Mark 85

      Re: So? Responsible Disclosure?

      we won't know of many of the exploits in use until it's far, far too late.

      Well.. probably the 5-Eyes know and wouldn't want those exploits or faults revealed.

      1. amanfromMars 1 Silver badge

        Re: So? Responsible Disclosure?

        Well.. probably the 5-Eyes know and wouldn't want those exploits or faults revealed. .... Mark 85

        Howdy Doody, Mark 85,

        Until such times as the likes of a 5-Eyes gets in touch, with your designation being an obscure subject of vested interest, it is only a possibility that they know, and those sorts of wannabes are myriad and scattered about everywhere working in Great Game Changing Fields completely in the dark and thoroughly out of their depths and familiar comfort zones/areas of traditional expert tease.

        Such then would suggest that they know a little/more than just a little about how to lead with everything rather than them being universally recognised as an enthused and confused cuckold to crazy naked emperor type psychopaths/sociopaths/psychotics more into the mad and bad rather than the rad and the good.

  23. _LC_
    Paris Hilton

    Can't we just shoot him and not tell anyone?

    Just saying...

  24. JimmyPage Silver badge

    "Professor Uht was not available for comment today"

    For some reason it amused my tiny mind that an article about security by obscurity should end with that sentence.

    If it was engineered, please have a ----------->

    1. Blazde Silver badge

      Re: "Professor Uht was not available for comment today"

      He actually was available, but answered the phone with: "Hallo! The professor is Uht!". So they hung up.

  25. iron Silver badge

    Corporate shill

    So who is paying this corporate whore? Intel? Diebold? Oracle?

  26. Anonymous Coward
    Anonymous Coward

    Insecurity by obscurity

    "Professor Gus Uht, engineering professor-in-residence at the University of Rhode Island, USA, argues that everyone would be safer if those who discover serious vulnerabilities refrain from revealing the details to the public, allowing the flaws to be secretly fixed by vendors and developers..."

    ... or not.

  27. WibbleMe

    Well, the design of my car wheels might make them explode at high speed, but I'm not allowed to know.

    1. G.Y.

      air bags

      or your air-bags (just hypothetically) ...

  28. Fatman

    RE: "Shhhhh Don't tell anyone in the public about your recently discovered bug"

    <quote>Professor Gus Uht, engineering professor-in-residence at the University of Rhode Island, USA, argues that everyone would be safer if those who discover serious vulnerabilities refrain from revealing the details to the public, allowing the flaws to be secretly fixed by vendors and developers, and updates pushed out before anyone crafts suitable exploits to hack victims.</quote>

    OK Prof, what do you do about those vendors who just sit on their hands and ignore the bugs???

    My opinion of your position is lower than the bottom of a cesspool!

  29. a_yank_lurker

    Human Behavior

    Companies are like people. Some are responsible and will fix a bug when alerted in a timely manner. Others are lazy and will only get off their asses when shamed. Finally, some are unethical and do not care; again, the only hope of getting a bug fixed is public shaming. And of course there are all kinds of shades of gray in between. So to get bugs fixed there must be the threat of public shaming, with the real risk of a lawsuit if the bug is not fixed. This is the real world.

    Also, it is unknown how many unreported bugs are being used in the wild; it is some number greater than zero. Since we do not know what is being exploited in the wild, we do not know how severe a security risk they pose.

    Publicizing bugs also alerts the public to watch for a major update to X, or to go and get that update. Keeping it quiet might mean many never update X at all.

    1. Charles 9

      Re: Human Behavior

      And what if, as they say, the cure is worse than the disease? A vulnerable machine may well be preferable to a brick.

  30. Michael Wojcik Silver badge

    No news here

    Tourist academic writes unreviewed opinion piece on subject he hasn't studied, rehashing arguments that were developed much further by more-knowledgeable commentators years ago. Happens all the time; indeed, this sort of thing makes up the bulk of the letters in CACM, for example.

    As the Avett Brothers put it: "Ain’t it like most people? I’m no different / We love to talk on things we don’t know about".

  31. devTrail

    Withdraw his degree

    Is he really a professor? How come he is unable to understand how complex the chain of dependencies in modern software is? For every library, for every system API, there are dozens of dependent modules maintained by different teams. It's not just a matter of fixing things at the source; it's about informing all the parties involved, and if something can't be fixed at the source immediately, the dependent modules can adopt mitigation strategies of their own. Spectre and Meltdown are exactly the example he shouldn't have raised. How come he didn't notice that, even though the underlying issue was not patched immediately, a lot of workarounds were quickly released? First, all the browsers reduced the precision of the timing functions exposed to their Javascript engines; then came all the other patches one by one, and the basic compiler changes were among the last to be released.

    Furthermore, it's not just about the software being developed: an enterprise network is a complex environment, software updates must be properly planned, and sysadmins must know what they are dealing with.
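    [Ed.: the timer-precision workaround mentioned above can be sketched roughly as follows. This is a toy Python model of the general shape of what the Javascript engines did to functions like performance.now(): round a high-resolution timestamp down to a coarse granularity and optionally add random jitter. The 100µs granularity and the jitter amount here are illustrative assumptions, not any real browser's actual policy.]

    ```python
    import random

    def coarsen_timestamp(t_us: float, granularity_us: float = 100.0,
                          jitter_us: float = 0.0) -> float:
        """Reduce the resolution of a high-resolution timestamp (microseconds).

        Rounding down to a coarse granularity denies attack code the
        fine-grained clock a cache-timing side channel needs; optional
        jitter further blurs repeated measurements.
        """
        # Floor to the nearest multiple of the granularity.
        coarse = (t_us // granularity_us) * granularity_us
        if jitter_us:
            # Add a small random offset so averaging attacks get noisy data.
            coarse += random.uniform(0.0, jitter_us)
        return coarse
    ```

    With a 100µs granularity, two events 30µs apart will usually report the same timestamp, which is exactly what starves a Spectre-style gadget of the timing resolution it needs.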

  32. SNAFUology


    Some bugs are a horrendous #SNAFU - aka Intel's Spectre/Meltdown - where fixes take repeated testing in a myriad of situations; keeping quiet is very difficult in that case. After all, how do you know how difficult it is to fix unless you try it and get feedback on the result?

  33. G.Y.


    Way back when, a prof told Intel about the flaky FDIV instruction; he was ignored until the web exploded. Not a security issue.

    A few weeks ago, a mother told Apple about flaky FaceTime (her son hit the problem); she was ignored ("go pay for a developer license 1st") until the web exploded. Very much a security issue.

  34. Anonymous Coward
    Anonymous Coward

    A very different perspective, perhaps....

    Professor Uht is no dummy - very, very much a HW guy. I think that might color his attitude towards vulnerability reporting, in terms of how, and how quickly, remediation might be possible or effective.

    "Dr. Uht has been on journal editorial boards, including the Journal of Instruction-Level Parallelism. He was Co-General Chair and Program Chair of the Boston Area Architecture Workshop (BARC-2006). Prof. Uht has served on several program committees and NSF review panels. He has contributed to many publications, including: COMPUTER magazine, IEEE Micro magazine, IEEE Transactions on Computers, IEEE Transactions on Parallel and Distributed Systems, and various conference and symposia proceedings. In 2001 and 2005 Prof. Uht was recognized for “Outstanding Contributions to Intellectual Property” by URI. He holds eight U.S. Patents and has one patent application filed; these are all in computer architecture, especially instruction-level parallelism and better-than-worst-case design. Dr. Uht is the recipient of the URI 1998 Aurelio Lucci Faculty Excellence Award in Electrical Engineering. Prof. Uht worked for four years at IBM on mainframe main- and extended- memory development. He also worked at the Laboratory of Nuclear Studies at Cornell University. He is a licensed Professional Engineer (P.E.) in the states of New York, Pennsylvania and Rhode Island. Dr. Uht was the College of Engineering representative of the local chapter of the Sigma Xi science honor society, and is also a member of the Eta Kappa Nu electrical engineering honor society, the IEEE (Senior Member), and the ACM."
