"X Windows"?
No such animal. ;)
X.org, which develops the open-source X Window System for Linux and other Unix-y desktops, has warned security flaws have been discovered in the code – and some of them have been hanging around for 27 years. The bugs can be exploited by applications to crash the window system, or run malicious code as the root user if the X …
Argh gonna drown myself in bleach.
No no no. Retcon it as a successful attempt to troll old X hands. Success!
(That said, I admit I upvoted the original post too, since I've been using X11 since, oh, 1988 I think. Probably still have the protocol and xlib reference books around somewhere.)
There's always Wayland and Mir.
X's biggest problem stems from the era it existed in. Back in the days when the Internet was entirely inhabited by academics and military people. "The Internet" was like a little village where people mostly knew each other, and you could leave the front door unlocked without fear of being ransacked.
Not today.
The good news is that very few people have X ports directly exposed. The default configuration on Linux systems has long been for X to listen only on Unix domain sockets, so the only things I'd really worry about are X11 forwarding over SSH to untrusted hosts, or some possible attack vector via WebGL.
Of those, only the latter is of particular concern, since it's possible to coax someone into clicking a malicious link with specially crafted WebGL code to perform something nasty. Sandboxing in the browser may help, but it looks like the X.org people are on top of it already.
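If you want to sanity-check your own exposure, a quick sketch (swap in netstat -ltn if your box predates ss, and remember display :N puts the port at 6000+N):

  # is the X server listening on TCP at all?
  ss -ltn | grep ':60[0-9][0-9]'
  # was the running server started with -nolisten tcp?
  ps -ef | grep '[X]org'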
"Exactly Stuart. It's impossible to look 30 years into the future and predict anything*, let alone what part of your code may be exploited."
Let's be honest... You don't need a crystal ball to tell you that it's bad practice to try and dereference a pointer that *may* be invalid.
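To put that in code - a contrived C sketch of the point, with made-up names, not the actual X.org bug:

  #include <stdio.h>

  struct client { int id; };

  void handle_request(struct client *c) {
      if (c == NULL) {                  /* the cheap check that saves you */
          fprintf(stderr, "no client\n");
          return;
      }
      printf("client %d\n", c->id);     /* safe: c was checked above */
  }

One branch. Known in 1985, known now.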
In 1985, before these bugs appeared, it was already a well-established fact that dereferencing a NULL pointer was singularly a "very bad idea." So bad that my Amiga 1000 and its Lattice C compiler were both able to detect and crash the task that was idiotic enough to do it. [Enforcer and Mungwall had many more capabilities than just this, detecting stack under/overflow, memory out of bounds (e.g. buffer overflow), ....] By 1987 I had the tools running at boot on both my Amiga 1000 and my shiny-shiny A2000. That didn't help for games, but for anything else it allowed me to quickly sort the good from the dead-on-arrival crap.
It wasn't just me: all the sysops I worked with around the Amiga were running such tools. Especially if you were a Librarian on boards and, in my case, multiple fora on CompuServe. Bad assumption, but I *thought* BSD had such protections built in. It was running on the A2000/030/882. [I also had Mac and PC, the PC through the Bridgeboard.]
So stating that "[i]t's impossible to look 30 years into the future and predict anything*, let alone what part of your code may be exploited" is complete, utter, bull-shite.
Nonsense. The moment I got my first ISP account ever, I saw people trying to get into my Linux box. This was ancient dialup; I'm not even sure my modem was up to snuff in terms of speed. That still didn't stop people on the other side of the planet from probing my box.
If the "consumer Internet" was dangerous pretty much day one, then the "academic Internet" must have been similarly dangerous.
My last "pre-Internet" employer certainly took their security seriously. A bug of this kind would not have been exposed to dog+world even then.
From then to now, the number of attacks per day isn't even all that different.
You have to realise how long ago 1987 is. This is the year before the Morris worm (which came as a real shock I can tell you). An era when default passwords to allow the DEC engineer access to your VAX was normal. Also five years before the CERN conference where I first heard about the world-wide-web which was at that stage an internal tool that just a few non-particle-physicists were starting to use.
You have to realise how long ago 1987 is. This is the year before the Morris worm (which came as a real shock I can tell you). An era when default passwords to allow the DEC engineer access to your VAX was normal.
Exactly… admittedly I was just 3 at the time, so no personal memories, but as I say, the Internet was a "village" compared to what we have today.
If you had access via some system, chances are the person you were attacking knew who operated the machine you were attacking from, and a quiet word would soon have your access revoked. It was a totally different world.
Sorry, but "hacking" was already common in those days, to get more access or more resources on shared machines. Mitnick started breaking into systems in the early '80s.
As soon as systems became connected - and the Internet was not the only network available - they became a target, and someone broke into them.
In 1983 I took a day off from work and went to see "War Games." Stopped off and bought a Commodore 64, I was so enamored of the idea. BBSes and AOL in the late '80s. The first real ISPs - Mindspring at the time was one - didn't appear until the early '90s, IIRC.
1987? I don't think there was anything we could call the internet then. In '89 at the school I worked at, it was all one could do to install Mosaic (later we got Netscape) and make a SLIP connection to a university account for some real WWW stuff. Not very many web sites to visit in those days...
Home computers in 1987 were very rare. No one I knew used one at work; all that was just starting, with 8088/8086 machines with 10MB HDs used mostly for word processing. Engineers still had drafting boards; there was no AutoCAD, and if there was, it had such a small number of users as to be virtually non-existent.
I'm always wary of stories of how much some people claim to be involved with computers in the early to mid 80's, most of these stories are BS...
I'm always wary of stories of how much some people claim to be involved with computers in the early to mid 80's, most of these stories are BS...
What, you mean such as claiming to have installed Mosaic in 1989, a full four years before it was released? Your dates are well off. Home computers were commonplace by 1987, although admittedly that's before the PC became the default option - it's four or five years after the Spectrum/BBC/C64 and around the time the Amiga and ST were beginning to gain traction. And yes, AutoCAD certainly did exist, and when I first encountered it only three years later it was even fairly mature, for 2D at least. Of course draughting boards were still around - they still are. An architect friend of mine still does most work on paper for the simple reason that the effort needed to redraw existing plans electronically negates any advantage of computerisation when modifying an existing building.
If no one you know used one for work at that point that says something about the people you know, not the state of the technology. Computers were around and in common use significantly earlier than you claim.
1987? I don't think there was anything we could call the internet then.
There most certainly were both internets and the Internet in 1987. The IP Internet had existed since 1983, and the NCP Internet before then.
No, there was no HTTP, much less graphical browsers. But the Internet is not the web. Hell, we even had SMTP in '87.
"X's biggest problem stems from the era it existed in." - this would be true for Windows as well, wouldn't it? Back in the days of Windows machines with no Internet and often no LAN at all, you could leave your machine unlocked in your office and no one would touch it... but of course this is a lame excuse for Windows, X Window, Linux, and whatever other software is still in use today.
A lot of code still in use was written many years ago - and now, with much more effort going into uncovering vulnerabilities - more sophisticated researchers and more sophisticated tools - lots of vulnerabilities will be discovered, many of them dating back many years.
"X's biggest problem stems from the era it existed in." - this would be true for Windows as well, wouldn't it?
No. Windows NT was born well into the internet era ('93), was conceived as a full multi-processing system, and failed to implement any serious multi-user security. X started in '84, when the Internet largely consisted of students sending emails and the odd bulletin board.
X's biggest problem stems from the era it existed in. Back in the days when the Internet was entirely inhabited by academics and military people.
Neither of those statements is particularly accurate.
X was a product of Project Athena, and was incorporated into the Andrew Project as well. It was initially used by highly-knowledgeable students at MIT and CMU, who were both capable of and motivated to find security holes (if only for their own amusement). X11R1 included quite a range of security features, and there was much discussion of their relative merits. Don't forget it was contemporaneous with Kerberos, also a product of Project Athena; PA made network security a primary goal.
And while the NSFNet backbone prohibited commercial traffic in the '80s, there were other Internet backbones, and some commercial entities used NSFNet for non-commercial traffic. There very definitely were commercial users on the Internet in '87.
According to the announcement linked to in the article, the '87 core protocol bugs are all integer overflows. That class of bug was not visible as a security issue in 1987 (at least in public discussions; who knows what the spooks might have been up to?).
Prior to Levy's "Smashing the Stack for Fun and Profit" article in '96, stack-smashing in general was not broadly seen as a major security threat - despite the Morris worm using a stack overflow as one of its attacks. (There was a perception that stack-smashing was too difficult to leverage in general.) Integer-overflow attacks, as a route to the same kind of memory corruption, didn't become prominent until the early 2000s.
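For anyone who hasn't seen the shape of that bug class, a contrived C sketch (made-up names, not the actual X.org code): a count arrives off the wire, the size multiplication wraps, the allocation comes out tiny, and the copy loop writes far past it.

  #include <stdlib.h>

  struct item { char data[8]; };

  /* 'count' comes from an untrusted client. With a 32-bit size_t,
     count = 0x20000000 makes count * sizeof(struct item) wrap to 0;
     malloc happily returns a zero-size buffer, and the loop below
     then writes 'count' items past the end of it. */
  struct item *read_items(const struct item *src, size_t count) {
      struct item *dst = malloc(count * sizeof *dst);   /* may wrap! */
      if (dst == NULL)
          return NULL;
      for (size_t i = 0; i < count; i++)
          dst[i] = src[i];
      return dst;
  }

  /* The fix is to reject the count before allocating:
     if (count > SIZE_MAX / sizeof(struct item)) return NULL; */

In '87 nobody wrote that guard, because nobody thought of 'count' as hostile.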
I have seen the reports generated by the static analysis software at Coverity and find them *very* revealing. Someone registered an opensource project with them (free for opensource) and it found a couple _thousand_ "hey, you might want to have someone look at these couple of lines right here" reports. And some of those were "I'm pointing at the dubious use of a variable's value here, but that dubious value was set hundreds of lines earlier" gotchas.
I'm sure it would have generated an "unchecked possible use of null pointer" report on the mentioned bug line. And there are more than a couple other such services besides Coverity, free for some opensource projects at least.
May I ask WTF? How does a large critical opensource project not use free tools?
How does a large critical opensource project not use free tools?
The answer is that these are not large opensource projects, they are mostly tiny opensource projects that receive little to no attention apart from "make it continue to work". Large opensource projects are the ones that lots of people want to work on.
I'm sure it would have generated an "unchecked possible use of null pointer" report on the mentioned bug line.
That's a very bad thing to be sure of.
Anyone with extensive experience with static-analysis tools for C knows that they suffer from a lot of false positives, a lot of false negatives, or both. Some are better (sometimes much better) than others, and most are worth using. But they will not catch everything, unless they emit so many spurious warnings that they're useless; and using them against a large, old codebase is a major project.
Dynamic-analysis tools have their own limitations too, of course.
That doesn't mean analysis tools shouldn't be used. As Peter van der Linden wrote in Expert C Programming, one of the worst decisions in C's history was deciding to make lint a separate program; that applies even more for more modern analyzers, including free ones like splint and cppcheck. But they're no silver bullet.
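For the curious, running one of the free ones costs nothing but CPU time. A sketch using cppcheck (flags vary by version, the file and variable names here are made up, and the output line is only roughly what current releases print):

  # run most checks over a source tree
  cppcheck --enable=warning,style --inconclusive src/
  # a null-dereference finding comes out looking roughly like:
  # [src/dispatch.c:142]: (warning) Possible null pointer dereference: client

Then comes the real work: deciding which of the hundreds of findings are real.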
How does a large critical opensource project not use free tools?
But all such projects do use those tools - unless, of course, none of the unpaid volunteer developers who work largely without supervision on projects that interest them personally take the time to do it.
How does anyone drive a car without checking the tires and lights before every journey? It's free, after all.
They're open source. Are you volunteering to do the analysis?
> you never hear about 27 year old Windows bugs, do you?
Yes, you do. On these esteemed pages, no less. And they are without exception greeted by howls of derision from those who claim that Open Source == Completely Bug Free And Perfect (Except When Microsoft Open Sources Anything). And there are a depressing number of those people.
I do on my notebook too - but servers are headless.
Just read up on the patch, and at the end is this:
Users can reduce their exposure to issues similar to the ones in this advisory via these methods:
Configure the X server to prohibit X connections from the network by passing the -nolisten tcp command line option to the X server. Many OS distributions already set this option by default, and it will be set by default in the upstream X.Org release starting with Xorg 1.17.
Disable GLX indirect contexts. Some implementations have a configuration option for this. In Xorg 1.16 or newer, this can be achieved by setting the -iglx X server command line option. This option will be the default in Xorg 1.17 and later releases.
I already used -nolisten to disable port 6000, and just added the -iglx thingy.
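If anyone else wants to do the same: the flags go wherever your X server gets started. A minimal sketch assuming startx/xinit - display managers each have their own knob for server arguments (lightdm's xserver-command, for instance):

  startx -- -nolisten tcp -iglx
  # then confirm nothing is listening on TCP port 6000:
  ss -ltn | grep :6000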
Ah, the good old days. When we were running whole applications on our colleagues' Unix workstations, displaying on ours, or even (for a laff) a third person's screen. Distributed compute power, leveraged in a way so integrated it makes even today's co-working environments look primitive. Ah, the Apollo/Domain converged network root, which I first... [continues at some length]
No surprises there then, as quoted from the original Unix-Haters Handbook: "and why the hell does my display now have weird hexadecimal digits that require some kind of obscure networking protocol? - it is by the will of Allah that I throw this Unix box out of the window!"
What's always caught my interest, as I'm sure it'll catch other people's, is that all Unix & Linux display UUIDs are unique to that platform, so if you can query the UUID, you can emulate their display. Bug fixes on the roll - bug fixes that fix what, exactly? Time to go read the CVEs...
This is what happens when you give a load of trolls access to unlimited budgets and bullshit like Echelon, and they want to be able to scan your display. Of course it needs a serial-line TTY on the display readout - how else would the scanner be able to emulate your screen!
Never fear, let's all piss money on it, to hell with other people's privacy. And to hell with their software. We need to know it all, especially what's on the VDU.
Shoulder surfing is considered illegal, but when security agencies do it from 8 miles away with kilowatts of power frying the operator, it's legal of course!
The Queen's assessment of pedophiles with their manuals wasn't lost on all of us. Go google Michael Aquino, a former general for the NSA, and don't forget to chant "Hail Satan!"
Bless you, little GCHQ guy that thumbs me down every time. You must have an interesting job, surrounded by all that villainy. Do you sleep well at night...
All hail Lucifer, all hallowed is his name,
His infernal kingdom comes,
His divine will shall be done, on the dirt as it is in heaven,
Forgive us our deeds as they need to be writ,
With torture and waterboarding shall it be so...
Hail the infernal kingdom of his majesty as we put his designs for that infernal encryption everywhere!
Never fear, my child, for you will not be left out. We're programmers, the devil's own spawn, and we're here to make life difficult for those that would rob us all of our fundamental rights to privacy!
http://cdn.arstechnica.net/wp-content/uploads/2013/12/nsa-ant-ironchef.jpg
http://cdn.arstechnica.net/wp-content/uploads/2013/12/nsa-ant-ginsu-640x828.jpg
http://cdn.arstechnica.net/wp-content/uploads/2013/12/nsa-ant-cottonmouth-i.jpg
http://cdn.arstechnica.net/wp-content/uploads/2013/12/nsa-ant-nightstand-640x828.jpg
http://www.9gridchan.org/
Oh boy, yeap, that must be how it feels to be stupid! "Hail Satan!"
Their claimed speedup over classical algorithms appears to be based on a misunderstanding of a paper my colleagues van Dam, Mosca and I wrote on "The power of adiabatic quantum computing." That speed up unfortunately does not hold in the setting at hand, and therefore D-Wave's "quantum computer" even if it turns out to be a true quantum computer, and even if it can be scaled to thousands of qubits, would likely not be more powerful than a cell phone!
LoL - "it's for you, it's destiny calling, they say they want their 12 million back!"
MM is a bot or a human bot wrangler. Your bot has to address the discussion somehow; random verbiage won't do. cf. amanfrommars, the Register's best comment bot so far:
http://forums.theregister.co.uk/user/31681/
Currently infesting El Reg and thedailybell.com forums. amanfrommars is pretty impressive. Its output tends to be largely chopped-up epigrams from popular culture, and its sentences are too long, but the grammar is always correct. Bots are interesting, but do we want them?
No mention of XGKS or PEX. Whew! In the clear. (None of my other X11 stuff ever made it out of IBM, as far as I know. Thank goodness.)
Or to put it another way, apparently no one's bothered doing a security analysis of XGKS or PEX yet...
(I almost want to dig up the XGKS metafile handling source. I have vague memories of fixing - this would have been around '89, I think - various overflows there, but I strongly suspect now that my "fixes" are rather fragile under deliberate attack. Oh well - it's not part of any stock X11 distribution, and probably isn't used much. Though I see Steve Emmerson migrated it to Sourceforge in 2000. Warms the cockles of my heart, that does.)