* Posts by fch

126 publicly visible posts • joined 7 Aug 2009


This may seem weird but don't give us all the chip funding, say Intel and friends


Re: The long game

Honestly, I can't see a fault with that approach. For many smaller companies, the buy-out would be taken as proof of success, and the marketing as well as cap-ex prowess of the buyer seen as the lever needed to take a superior product to stellar sales. After all, what good is it if you've designed the most fabulous CPU of the last 1000 years and adapted it perfectly to ASML's beeleeon-of-beeleeon-gigadollar chipmaking machinery ... if you can't afford so much as their waste heat, never mind the kit to do a real wafer run.

Acquisitions aren't necessarily evil; think of AMD purchasing NexGen and turning that tech into the Athlon and Opteron CPUs; without that acquisition, we'd probably be using overpriced, overheating Intel Itanic (a partner-buyout from HP) to this day. Or, gasp, Oracle SPARC (another acquisition, that). Damned. Acquisitions everywhere. Did I mention Apple bought PA Semi, instead of setting up their own chip design team from scratch ?

I suspect we'll see both chip design and manufacturing becoming more of an "-aaS" (as-a-Service) type of business, with only the largest users in-sourcing either design, or manufacturing, or both. In a way, by and large, we're already there; that's how TSMC runs their business. If the funding billions go to build a to-order shared fab, it may just become v2 of that. Or so the initiators must hope.

'Bigger is better' is back for hardware – without any obvious benefits


You're speaking from my soul there !

Somehow, it feels as if the increases in the "power at your fingertips" we've seen over the last decades have largely been used to ... make the eyecandy a little sweeter still. Advances in usability ? Word '365, or whatever it's called today, is not so far ahead of what Winword '95 could do; user interfaces haven't become more obvious or snappier either (but thanks to the huge increases in compute power, also not slower, even though every mouse pointer move in your Electron-framework app pushes around 100 REST API requests through a number of gRPC-to-JSON-to-YAML-to-binary-to-TOML data conversions. Oh, forgot about that old Java component in there that's using a REST-to-XMLRPC bridge thing. Anyway ...).

Mobile apps have seen advances; the interface on current Android or iOS devices beats 00's-era S60 or Windows Mobile any day, for sure. Desktop apps have stagnated, though, and for a long time. Still no voice dictation. Still no talking to my computer. It's not even taking advantage of my two widescreen displays ... still puts taskbar, menu bars, toolbars, ribbon bars ... vertically on top of each other, shrinking even further the space that already shrank from the screens getting wider ... so I can now see four top halves of A4 pages next to each other, alright. Thanks Microsoft, but please train the monkey brigade doing your UI reviews a bit more in practical tasks.

Still, much of the time it feels like modern computing is like modern cars - only available as huge SUV and ultra-huge-hyperluxurious-SuperSUV, with a fuel efficiency quite a bit worse than an early-1980s Corsa, but oh yes, it'll have air conditioning inside the tires as well as an exhaust pipe camera livestreaming the glow of the catalytic converter to the cloud.

Mitel VoIP systems used in staggering DDoS attacks


Re: man wtf

Back in the day, using such test functionality would have required opening the case, flipping the positions of 10 DIP switches on the config panel, closing the case again, and pushing the hidden "test enable" button that you could only reach by inserting a small needle into the corresponding port. That's when the device would finally react to the "test-enable" packets.

I guess the Internet of Things made all that much easier. Who'd want to touch hardware anyway.

Thousands of Firefox users accidentally commit login cookies on GitHub


Re: How does anyone manage to do this?

What is this "Dev Sandbox" thing you're talking about ? Only kiddie developers use sandboxes. Real Develoopsers commit sins^Wcode right on their live instances. And GitHub is free cloud storage for Real Develoopsers. All cool. Oops.

Research finds consumer-grade IoT devices showing up... on corporate networks


Re: Corporate networks with IOT devices

Makes "DOIT". The number of I-DO-ITs always converges on infinity.

A closer look at HPE's 'The Machine'


Re: Actually a quantum leap will not be enough

A "quantum leap" is a state transition that cannot possibly happen in a gradual way. Just because these effects in "real life" (aka the realm governed by the laws of physics) are most pronounced in the microscopic world doesn't mean they are a measure of the "smallest possible" at all.

A "revolution" in that sense isn't a state transition within the same system either, but one that replaces one system with another. In the realm of compute: doing maths with an abacus instead of fingers and sticks, or with the Analytical Engine instead of the abacus, and from there via the general-purpose programmable computer to the generalized quantum computer.

Where "The Machine" will fit in we'll hopefully find out soon!

Google offers 'INFINITY MILLION DOLLARS' for bugs in Chrome


Re: Google admits there are INFINITY MILLION bugs in Chrome!

The writer of the story doesn't get it; ∞ is clearly a peanut.

Now if they'd offer ∞ ∞ , that'd really draw the monkeys !

You can crunch it all you like, but the answer is NOT always in the data


Re: Stats 101

Last time I said that, some business boffins gave me a dozen downvotes.

Well, maybe because I couldn't hide my belief (which I can't be bothered in the slightest to back up by data) that the main usage of data mining in business is to justify decisions. Make a decision, then look for data to provide a justification.

Not the scientific method. But apparently very practical in business and politics.

Was Nokia's Elop history's worst CEO?


Re: Um, no...Stalingrad smartphon

Well, more like, sticking to their way of making phones - development teams "competing" with each other for ever more lofty and remote goals, using SymbianOS versions obsolete by the time Nokia set the project up, then burdening it with irrational half-way-wrencharound managerial politicking ...

... and they held on, after the enemies closed in, surrounded them, attacked from all sides, till the last bullet, the last drop of blood, the final breath ...

[ that said, Godwin's law ... let them and the thread rest in peace ]

Ellison: Sparc M7 is Oracle's most important silicon EVER


Re: Boastful bravado

You mean, the new Oracle salesman pitch will be:

"Would you want some free hot chips with your beeleeon-$$$-DB ?"

Oracle is doing something right ... Sun used to give the software away for free as long as you bought enough hardware. This will be the first case where you get the hardware for free as long as you shell out for the full DB license.

Oracle reveals 32-core, 10 BEEELLION-transistor SPARC M7


Re: Multi-core

Both gzip and bzip2 parallelize well only for compression - because that's a "blocked" operation, i.e. a fixed-size input block is transformed into a hopefully-(much-)smaller output chunk, and the chunks are then concatenated into the output stream. Since the output is a stream, there's no "seek index table" at the beginning, and hence one cannot parallelize the reverse direction in the same way: you only know where the next block starts once you've done the decompression and know how far the "current" one extends. While one can "offload" some side tasks, the main decompression job is single-threaded in both of the abovementioned implementations.

One can, though, obviously compress/decompress multiple streams (files) at the same time. That's what ZFS uses, for example - every data block is compressed separately, and hence compression/decompression on ZFS nicely scales with the number of CPU cores.
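A minimal sketch of that per-block approach (my illustration, not gzip's, bzip2's or ZFS's actual code; the block size is an arbitrary choice) - because each chunk is compressed independently, both directions can fan out across cores. Conveniently, zlib releases the GIL while crunching, so even Python threads get real parallelism here:

```python
# Hypothetical ZFS-style per-block (de)compression sketch.
import zlib
from concurrent.futures import ThreadPoolExecutor

BLOCK = 128 * 1024  # 128 KiB input blocks - an arbitrary pick

def compress_blocks(data: bytes) -> list[bytes]:
    """Compress fixed-size input blocks independently, in parallel."""
    blocks = [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]
    with ThreadPoolExecutor() as pool:  # zlib drops the GIL, so threads scale
        return list(pool.map(zlib.compress, blocks))

def decompress_blocks(chunks: list[bytes]) -> bytes:
    """Each chunk is self-contained, so the reverse direction scales too -
    unlike a single gzip stream, where chunk N+1 starts at an offset you
    only learn by decompressing chunk N."""
    with ThreadPoolExecutor() as pool:
        return b"".join(pool.map(zlib.decompress, chunks))
```

Note that keeping the list of chunk boundaries around is precisely the "seek index table" a plain gzip stream lacks.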

[ moral: better to use a compressing filesystem than to compress files in a 1980's filesystem ? ]


M/T processor vs. systems ... [ was: Re: Cache size ]

There's T-series/M-series CPUs - which are all Oracle SPARC.

Then there's T-series systems - which are all Oracle, using Oracle T4/T5 CPUs in the systems of the same name.

And there's systems colloquially termed "M-Series".

Of which only the M5/M6 (and M7 to come, unless Oracle chooses to rename the system before launch) are Oracle, and use Oracle SPARC CPUs of the same name.

The older Mx000 and current M10 systems, though, are designed by Fujitsu, and use Fujitsu's SPARC64-series CPUs (in the M10 series, the SPARC64-IX - the "commercial spawn" of the current K supercomputer). At Hot Chips, Fujitsu also presented the SPARC64-XI - to go into the post-K machine, and possibly later into (an update of) the M10 series of systems.

No one quote me on all these names and numbers, please - refer to the vendors' marketeeting departments for the canonical incomprehensible advice instead, and to their legal departments for even more incomprehensible guidance on trademark usage.

Massive news in the micro-world: a hexaquark particle


Haven't used the boffin icon for no reason. Couldn't help the inner scientist when someone talks about strange things in the context of quarks, as if the stuff stuff is made of weren't strange enough already. I'm sure Zaphod would approve.


For an early-morning comment on particle physics, that really tops it. Couldn't you find something more charming to say ? Commenting has clearly reached a bottom, but I don't dare predict whether things are going down further, or finally up again !

One more quark, that's when it'll get really interesting :-)

What can The Simpsons teach us about stats algorithms? Glad you asked...


Re: "prove causation" != "observe a [certain amount of] correlation"

I'm almost tempted to say "QED" ... as I've quoted without context to make my point more obvious. The mere mention of "cause" in the same context [ sentence ] as "statistical correlation" is a huge b*llsh*t honeypot. People have somehow gotten it into their heads that statistics prove causation, while the explicit mention (as you made it) of first needing a testable theory, with a claim of causation and a prediction of measurable changes, is so conveniently left out.

I do wonder who started down that slippery slope ... when I retire [ never ... ] I may do a PhD on the history of politics/economics to find out :-)


"prove causation" != "observe a [certain amount of] correlation"

If even the author of the article makes that basic mistake, then there's nothing left to conclude but you're correct - the purpose of statistics is to prove someone's point / back up someone's claim, not to ... learn anything from the data.

Go ... [massage] figure[s] !

Bing Maps COCKUP: Oracle UK HQ is 'Elvis Impersonators' joint


Dyslexia ...

makes Evils into Elvis. No doubt Oracle impersonates one or the other.

I need a Friday-beer-drinking-grammar-nazi combi icon.

Ancient carving of 'first human-built holy place' = Primitive Vulture Central


The beautiful cygnet ...

that's how fairy tales start, vulture-turns-swan ...

(troll icon for schmoozing up to El Reg ... can't not do it)

Low power WON'T bag ARM the server crown. So here's how to upset Intel


Re: Nothing new here

Intel hasn't always been the big name in computers and servers. In the 80's, there were minicomputer companies (anyone remember names there other than DEC VAX ?) and a big bunch of mainframe peddlers (anyone remember names there other than IBM ?). In the early 90s, the various UNIX vendors replaced the minicomputer ones, and the mainframe bunch shrank. Late 90s / early noughties, [LW]intel dug their way into the server space, and "UNIX proper" shrank. Intel today isn't more dominant in the server space than IBM was in the 70s (huge, but neither without alternative nor without competition), nor do they (Itanic, wink !) always succeed in the big-iron projects they start.

The memorial halls of the computer industry are littered with former "industry kingpins" that missed the next key trend, or invested too much money into the wrong projects.

"Look on my works, ye Mighty, and despair !" - that last one is self-referential, in the end, always.

Oracle drops shedload of CRITICAL vuln-busting Java patches


Re: They also provide installers without potentially unwanted extras...

... you mean, they really bother creating a no-op installer - something that just pretends to install Java but doesn't ?

(can imagine Oracle charging for that one ...)

Open ZFS wielders kick off 'truly open source' dev group


Gift horses ...

Apples and oranges. Even if either came for free, only one can turn into orange juice.

It's not a bad thing at all that it isn't GPLv2-licensed; but the fact that it's not licensed GPLv2-compatibly (as BSD or LGPL would've been), and/or not dual-licensed, does limit its applicability.

The result is a great open-source filesystem with far, far less traction than it deserves. If Sun wanted to make Linux developers envious by dangling all these nice technology carrots, they surely succeeded to a degree. But if you want to motivate contribution and/or use, inciting envy is more likely to achieve the opposite.

China's corruption crackdown killing off Unix


Re: Switching from big iron to x86 virtualisation


I haven't called you any names, really - apologies if you misunderstood me there. Nor have I ever claimed Linux scales to "ludicrous" (in the Spaceballs sense) "numbers of CPUs" (measured by whichever gauge). 32 CPU cores on Linux are "pretty much ordinary" these days, and such systems perform "good enough".

I also agree both that there are workloads which exceed what you can do with "whitebox HW running Linux", as well as there being hardware/software vendors providing systems capable of running such workloads. Interesting where you need it - if you need it. Technically fascinating ? You bet !

I've merely made the observation that the "good enough" frontier has continued to steadily encroach into that terrain. The pond full of "stuff Linux / x86 hw doesn't do [ well ]" hasn't dried out completely yet, although neither have I personally seen signs of the water level there rising.


Re: Switching from big iron to x86 virtualisation

I apologize to the readers for not having picked the proper icon first time round. After all, the original comment I replied to was about 32 CPUs, not 32 CPU sockets ... in any case, I agree that "more" of some sort will give you bragging rights amongst a certain audience.

Highly profitable the high end may still be, but I'd stand by the assertion that the number of sharks in that pond hasn't changed much, while the pond is drying out. And those who dip their feet in there are either very brave, very foolish, or so desperate as to have no other choice.


Re: Switching from big iron to x86 virtualisation

Refresh your tech knowledge. There are quite a few options in the x86 space these days that offer 16 or 32 CPU cores. Even if you count chips / sockets, you have a range of choices in the 8-socket space (remind me there, how many CPU sockets did a T5-8 have, again ... ?).

Yes, there's not that many x86 servers out there that can have 32 CPU sockets. If that's what counts for you, go IBM / Fujitsu / Oracle.

No, there's not exactly a huge number of usecases for iron of that size. A bunch of two-socket boxes behind a load balancer does, these days, often beat the "big fat box" approach not just on latency/throughput but much more so on price, and/or price/performance.

Or, to put it differently: CPU-performance-wise, what a Sun E10k did in 1997, a Samsung Galaxy S4 smartphone chip does today. Probably more. You just no longer need 64 CPUs for it.

Enterprise workloads have grown nowhere near as much as have the abilities of "cheap" (x86) hardware.

That's the bane of the non-x86 vendors: the same number of sharks in an ever-shrinking pond. Are those "big" boxes faster, can they process more data in a shorter time ? Yes on all counts, but the deciding question really is: "what box do I need for my workload ?".

New iPhones: C certainly DOESN'T stand for 'Cheap'


Amazing (non-)event

There's the first mass-market product with a 64-bit ARMv8 processor, yet no one considers it a breakthrough ... on the other hand, as with cars, having a sports car with a flat-12 engine doesn't mean acceleration / speed are any different from the competitor's V6. To be proven ...

Must admit that I'm otherwise sort of underwhelmed as well; good design/looks and well-thought-out usability are selling points for sure. Apple built their brand on that for decades, and haven't failed on that front with the new iPhone either.

But visible, in-your-face-cool features which weren't available in either Apple's own or its competitors' older models - where are those ? Apple hasn't even played catch-up here, much less leap-frog. And there are even unique Apple facilities the new baby could've used - like, how about a Thunderbolt interface to connect the phone to accessories, like, gasp, external storage ? How about a geeky change to the camera app to do manual focus / aperture, say, using the volume control buttons ? Or camera RAW, another first (?) in a smartphone - definitely if you did it for HD video using your own new codec ? Maybe a well-working OCR app that'd let me create my own ebooks from the old paper collection ?

It's an expensive development board to get your hands on ARMv8, if that's what you want ...

Google's Native Client browser tech now works on ARM


new year's greetings from the ActiveX zombie ?

... and there was me thinking HTML5 had succeeded in killing it finally, once and for all.

Foiled again ! The spectre never dies ...

Time to buy Microsoft stock. They must've plastered that path with patents thrice over.

Engineers are cold and dead inside, research shows


Re: *facepalms*

... let's see if a proper flamewar on units of measurement and/or the value of Welsh oppression measured in metric will prove once and for all that nerdy engineering people have just as strong feelings as everyone else !

Does the paper at the very least provide properly error-corrected measurements for emotional suppression in units of mmHg ?

'SHUT THE F**K UP!' The moment Linus Torvalds ruined a dev's year


Political correctness vs. honesty

Reading the comments, I can't help but wonder - what has happened to honest, outright, direct communication ? Can one no longer name a turd for what it is, and call out stuff that stinks for its smell ?

I don't get this sensitivity thing. Being bullied behind your back is far worse than being called names outright and face-to-face. Yes, conversations can turn into shoutfests, but with those, at least the release of anguish and emotion prevents the buildup of resentment and desire for revenge.

What usually happens, though, is that the project manager you p*ssed off will stop talking to you, but give devastating feedback to your boss' boss, and your boss, and nine months later you'll be told that for all your great work, you need to improve on your interpersonal/communication skills before you can "reach the next level" (read: no extra peanuts for you, monkey, and definitely no promotion to chimp). Of course, all feedback is confidential, so there's a good chance you won't even know who the guy was that badmouthed you.

Sod it. It's Monday, and I'm thinking of beer. Or read this: "Go, Linus !".


You can't assign anything to Torvalds, so for correctness, this has to be:

BOFH = Torvalds;

remember, C requires statements to be terminated by semicolons.

Craptastic analysis turns 2.8 zettabytes of Big Data into 2.8 ZB of FAIL


Re: The answer's in there somewhere

Don't know your perl operators ?

"." doesn't multiply, it concatenates. If anything, "6 x 9" is 666666666.

(is that in the 0.5% usefulness or in the 99.5% "stuff" ?)

What Compsci textbooks don't tell you: Real world code sucks


"textbook perfect code" ...

... is rarely found in textbooks, from my experience.

Why is that so ? Because textbook code is, by virtue of its objective - teaching coding - too simple and too small for the issues associated with paid-for / commercial software development to pop up. It avoids the issues encountered in developing software that aren't strictly part of the technical act of coding itself.

Amongst those, in no particular order:

Teamwork - the absolute necessity to work with other programmers, and with project managers, business requirements, your boss and her/his budget - is usually absent from textbooks. If no framework for this exists in the software project at all, it'll end up with interesting "warts" - think of massive code diffs in series of subsequent commits for no other reason than two developers having set their IDEs to "auto-indent" with different rules. Or imagine the odd little key component in an otherwise-pure-Java project pulling in Jython plus a bunch of 50 Python modules, including a few binary components written in C++, because one developer got away with writing a 50-line piece in Python using those modules, since that was so much simpler and faster than the 500 lines it'd have taken in Java. Think about the "write to spec" warts that occur because specs and tests are done first to "validate known-to-be-valid input only", the code is written that way, and half a year later someone using it is terribly surprised about the security holes blown into it by obviously-invalid input ... never spec'ed, never tested, never requested ... and no two-way communication anywhere between requirements mgmt, technical architects, project managers, testers and coders. And where in a textbook have you ever seen an example so big that you _couldn't_ implement it on your own in a month (Knuth's level 3.5+ exercises notwithstanding) ?

Complexity - textbooks are often extremely brief on this one, and C. A. R. Hoare's "The Emperor's Old Clothes" is quoted nowhere near as often as it should be. Abstraction, encapsulation, specialization, generalization and other "layering" patterns are often covered extensively from the "how to do it" perspective, but warnings about the dangers of excessive layering, respectively guidelines on "good usage" (do you really need to subclass if all you want is an object instance that differs from the default-constructed one in a single property ?) - or, to put it differently, "when not to use it" - are often absent. The coverage of interfacing with pre-existing libraries is often scant. As a consequence, have a bunch of new interface layers here, a new generalization there counteracted by an additional specialization elsewhere, and multiple sequences of conversions resulting in a Chinese-whispers chain of data transfers. Every layer on its own might be a great, well-written, perfectly documented and admired piece of code ... but the combination screams at you just as much as the result on the floor of a night drinking great wine after a Michelin-starred ten-course meal.

Libraries - face it, you're not going to write everything from scratch. But which Java programmer knows even a third of the Java 7 classes by name ? Which C++ programmer can use a third of Boost without the manual ? Which C/UNIX programmer knows how to use a third of the Linux syscalls ? And those are only _standard_ libraries - not to mention the vast worlds of things like Perl's or Python's component archives. Effective programming often means effective research into which existing library, already in use elsewhere in the project, provides the functionality you've been considering reimplementing from scratch. Which book teaches this ?

Legacy - textbooks like to advertise the latest-and-greatest; of course it's a necessity of work life to keep your skills up, and to follow changes and enhancements in the technologies touching your area of expertise, but even more important is developing "code archaeology skills". That is, go backward in time and check out what was considered good practice in 1990, what was cutting edge in 1995, which code patterns and styles were considered best in 2000. Go to the library and read a book on J2EE from 2002 as well as the newest one on Java 7, then grab a 1992 and a 2012 Stroustrup. Read and compare the Documentation/ subdirectory of the Linux kernel sources in the 2.2 and 3.4 versions. Much of this reading will trigger "duh" moments of the same sort you'll inevitably encounter when you start to look at existing code. Nonetheless, textbooks (or, even more so, lessons/lectures) that emphasize how mistakes are identified, rectified and avoided in future projects are barely existent - else F. Brooks' "The Mythical Man-Month" wouldn't still be so popular nearly 40 years after it was written.

Evolution - code (and I object to adherents of "software design" here) is never intelligently designed. In the beginning it's created, and from that point on it evolves. It grows warts, protrusions, cancers that hinder its adaptability to the tasks it's set, just as much as new nimble limbs and telepathic powers to help it. A wart for one usecase can be the essential sixth finger for another. Code veterinary can be a dirty job, and the experience of humility involved in dragging the newborn out past the cr*p of the parent is not something you're prepared for by reading about it, because books don't smell. Often enough there'll be the revolting ethical challenge of becoming a Code Frankenstein as well - when you stitch a few dead protrusions, dug out of a code graveyard by someone, onto the already-mistreated beast. It's like textbooks that tell you about the fun of making babies but not about changing their nappies or dealing with teenagers having a fit.

All of these one can learn to live with; none of these are necessarily perpetuated in saecula saeculorum.

Thing is, "textbook perfect" isn't the same as "beautiful", and that isn't the same as "best for the usecase". The grimy bit about software development is that a really beautiful solution can be built from frankensteinish code - and no textbook I've seen will prepare you to develop the skills, as well as the thick skin, to be able to do this.

(I still like coding, but sometimes I think I should've become a vet)

Slideshow: A History of Intel x86 in 20 CPUs


To Intel - give us the chip art references !

Like the Smithsonian collection does. You guys must be sitting on a secret list of your own chip design easter eggs, how about 'fessing up a little there ?


Re: But why did it take until the 386

Educatedly guessing there, but it might be that the 386 was the first one manufactured at structure widths on the order of visible-light wavelengths (i.e. not significantly more than a micron). That'd give color effects, because the structures then work like diffraction gratings, and the whole die looks like areas of color. Larger structure sizes don't cause this effect, at least not at near-perpendicular angles of incidence, so they'd look largely grey - apart from the intrinsic coloring of the materials used.

If you look closely enough, you'll notice some parts look reddish on the 4004/8008 (probably copper contacts), and the 286 one has that little red coil-like structure on the right edge. I'd contend all these pictures are in color.
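To put a back-of-the-envelope number on that guess (my arithmetic, not Intel's data): a grating of pitch d throws the first-order maximum of wavelength λ to sin θ = λ/d, so micron-scale structures spread visible light across tens of degrees, while the coarser structures of earlier chips barely deflect it:

```python
import math

def first_order_angle_deg(pitch_nm: float, wavelength_nm: float) -> float:
    """Angle of the m=1 diffraction maximum: sin(theta) = wavelength / pitch."""
    return math.degrees(math.asin(wavelength_nm / pitch_nm))

# ~1 micron structures (386-era guess) fan blue..red out over ~27..44 degrees:
for wavelength, color in [(450, "blue"), (550, "green"), (700, "red")]:
    print(f"{color}: {first_order_angle_deg(1000, wavelength):.1f} deg")

# ~10 micron structures (4004-era guess) deflect the same light by only
# ~2.6..4 degrees - no strong angular color separation, hence the grey look.
```

The feature sizes are illustrative assumptions; the point is just how steeply the color fan-out grows once the pitch approaches the wavelength.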

The GPL self-destruct mechanism that is killing Linux


Re: Nein! Nein! Nein! Nein! Plan 9!

configure / autoconf doesn't make me wonder - it makes me curse, swear and use sewer language of the worst kind. Nuking it from orbit is too kind a death for it.

It's not a tool, it's a non-tool. Full agreement with the *BSD ranter there - no one bothers understanding autoconf input / setting it up properly; it gets copied-from-somewhere and hacked-to-compile; if "development toolkits" provide/create autoconf files for you, they're usually such that they check-and-test for the world and kitchen sink, plus the ability to use the food waste shredder both ways.

The result is that most autoconf'ed sources these days achieve the opposite of autoconf's intent. Instead of configuring / compiling on many UN*X systems, you're lucky today if the stuff still compiles when you try a different Linux distro than the one the autocr*p setup was created on.

It had its reasons in 1992, but the UN*X wars are over; these days, if your Makefile is fine on Linux it's likely to be fine on Solaris or the *BSDs as well. Why bother with autoconf ? Usually one of: "because we've always done it that way, because we've never done it otherwise, and by the way, who are you to tell us !"

Nokia earnings pain masks intact war chest, brewing counterattack


Amazing how much Symbian must be left after the torching ...

... given they sold 2.9M WinPhone Lumias but 6.3M smartphones in total, does that really mean more than 50% of what Nokia sells as "smartphones" are still Symbian devices ?

Wow. I'm truly impressed. All the backstabbing, burying-(half-)alive, torching, butchering, burning, throwing-off-platforms etc. of Symbian, but the zombie just won't go away.

Somehow, other ex-Symbian licensees like Samsung, Sony, LG, ... didn't need two years to get their product lines over to Android. Wishing all (ex-)Nokianites good luck!

How Nokia managed to drive its in-house Linux train off the rails


Re: From the article

That man was born to rant. Granted, there might be differences hidden in the sheer amount of drivel; I usually tire before I find them, though. But then, Tomi Ahonen writes like a real Nokian (Nokianiac ? Nokiate ?)

Oracle nudges Sparc T5s back out to 2013


Re: Take no bets

When Larry says "just double it", consider that doubling the length of a yacht won't double its speed, but will more than double its cost.

It all depends on what exactly you double, and there are many ways of doubling "performance". As there are even more benchmarketeering ways of measuring doubled performance.

Sun sailed that course for many years with the idea of "sum(more cores) > sum(fast cores)". It seems Larry got converted - as if, when there's not much else to show, you at least show you can double "it"; just find a suitable "it".

I don't doubt there's some market for these; it's just shrinking, not growing. What you can do these days with a $20k x86-based server, you couldn't do with a cluster of ten Sun E10ks fifteen years ago. "High-end" computing is becoming a commodity, and that's not a trend which will reverse any time soon.

Solaris, on the other hand ... give me more. Site licenses for Solaris, for example, would be a great way of knocking the crimson-headwear peddlers out in places. Relay that to Larry, if you would ? Something like: we'd be happy to more than double our use of Solaris - though if, and only if, we can cap the license costs ...


Re: Take no bets

well, "you don't get that from either Intel or IBM" - not quite so. IBM maybe, but Intel's roadmaps, as far ahead as one of their tick-tocks, are usually very well published (by Intel, in fact) and much talked about. Who cares that you can't buy Haswell-based Xeons till mid-2014 ? Everyone knows they're coming; the instruction set enhancements, throughput/latency figures for the CPU, and some chipset details are out there, not just in leaked NDA presentation slides but pretty much all over the tech press.

On the other hand, maybe you meant "Itanium" when you said "Intel" ?

Hobbyist star-gazer cops amazing eyeful of Jupiter's space ball

Thumb Up

... exactly.

... when my brother-in-law used to ask me "how much exactly, again, did you spend on that astroboffinry stuff ?", I'd only ask him back "how much exactly are you putting into your Harley-Davidson fund ?"

I don't think my astro-spending could've gotten me a Harley quite yet, nor an Audi TT instead of a VW Golf, or a luxury Mauritius holiday, or even, gasp, a pint a day for a decade. Still, I've had a lot of enjoyable nights out - of the other kind. Everyone to their own.

Agility without anxiety


Re: Agile development works well ... if done properly

Why is it that whenever the topic of "Agile development" comes up, someone feels compelled to state that it needs to be done properly ?

The answer, I guess, must be that doing it properly is a) hard and b) anything but obvious ?

Or maybe it's that it's much easier to specify "Agile is not ..." than "Agile is ..." ?

Now Apple wants Samsung S III, Galaxy Notes off the shelves too

Thumb Down

Re: The only way to be sure...

whenever someone comes up with this "prodded with a finger [ from orbit ]" thing, the image of the opening scene for "The last remake of Beau Geste" is conjured up with my mind. Is it just me ?

(Icon for obvious reasons, at least once you've seen the scene ... not downvoting anything here)

Samsung whips out Galaxy Note II, cam-phone with proper zoom lens


Re: WTF?

If anything, they're "copying" (shall I say: "improve on", because the Samsung thingy does the photography stuff right, good lens, optical zoom) the form factor of the Nokia PureView / N808. I'd be quite interested in an image quality comparison between the two.

Oracle knew about critical Java flaws since April


<quote>(protip: Java is an ex-SUN asset)</quote>

Need to correct you there. Java is an ex-Sun liability. It might've been an asset for Oracle and/or IBM. Never really for Sun ...

Nokia's WinPho 8 double date announced


Re: Just two?

you mean a good dirty dozen like they used to have in their S60/Symbian days ? One "flagship" and some 25 satellites of odd colors, shapes and sizes ? And possibly fries with it ?

I'm not so sure that Nokia did itself a favor with that "variety" (rather, varieté).

Sometimes, less is more. Even if they're not going all the way to Fruitycorp-style product line clarity. The "dozens and dozens" Nokia still has in their S40 lines.

Why women won't apply for IT jobs


Re: Interesting

I've done both - applied to some jobs where I've met all the criteria, and applied to some where I didn't. In cases where I got the job, I ended utterly bored out whenever I seemed "the perfect fit", but if there was an element of the unknown in it, I enjoyed and stayed in the job for many years.

Can only speak for myself there, but I guess the boost from someone else saying "we believe you can do it" is better for my motivation and willingness-to-strive-for-it than the intrinsic "I believe I can do it" thing. Whether this externally/internally-induced motivation thing is gender-biased I don't know, though.

Giant super-laser passes 500 TRILLION watts


Re: Pulsed power...

Otto Hahn got his Nobel prize (on the discovery of uranium fission) for chemistry not physics. "Nuclear chemistry", alchemy's finest hour. Well, until these "elementary" particle physicists came along and got all the spotlight ;-)

It costs $450 in marketing to make someone buy a $49 Nokia Lumia


Re: Only hope

You mean like use the toilet paper to mop up a little mess, the wellies to remain clean if the mess is a bit deeper, and the gear to climb up phone mast once the mess level rise significantly ?

By those standards, right now, Nokia is probably three quarters up the phone mast. Toilet paper and wellies surely have outlasted their usefulness for Nokia. Seems a bit like Nokia's leadership have dreamt about building Icarus wings for too long ... and while the mess levels are still rising, the parching Sun is melting those winglets Nokia hoped to take off with.

HP started then spiked HP-UX on x86 project


Re: Matt B...

As an ex-Sun(shiner) I take offense at the statement that the Solaris/Itanium port _failed_.

It ran just fine in the lab. Noone wanted to have it, not a single then-Sun and then-prospective-Sun customer asked for it (very different from "canning" Solaris/x86 a few years later).

Sun simply decided that there'd be nothing to sell, so it never was pricelisted / made available.

CPU and RAM hogs overstaying their welcome? Here's a fix


Re: So let me get this right ...

... are you saying they've recreated a shell version of sysadmin doom ?

Moody's downgrades Nokia to near-junk status


Re: I'm actually long on Nokia stock

So are you or aren't you ?

You start your post with "I'm actually long on Nokia stock", but finish it with "Too early to jump in but I'm watching closely". Sounds like either a very quick reaction to a turn in the markets, like someone having gotten cold feet over the writing of a forum post, or maybe just an enormous amount of confusion about Nokia taking its toll.

Anyway, good luck, you'll need it.