* Posts by Hoagiebot

66 publicly visible posts • joined 26 Jul 2011


Zilog to end standalone sales of the legendary Z80 CPU


Re: Not fade away...

Sadly, all Z180 and Z182 CPUs are being discontinued at the exact same time as the Z84C00 (Z80) family. Here are the EOL/Last Buy notification letters from Zilog for them:



FYI: The Z280 was discontinued in the mid-1990's, so that ship sailed long ago.

Version 100 of the MIT Lisp Machine software recovered


Re: The Forgotten Fifth Generation

First of all, there was a computer from the Japanese ICOT "Fifth Generation Computer Systems (FGCS)" project that was commercially sold. Mitsubishi sold a couple versions of the ICOT "PSI" ("Personal Sequential Inference") computer. Its operating system was written in a language called "KL0", which was a derivative of Prolog that was expanded so that it could also be used as a kernel development language. (That is where the "KL" in the name came from-- it was "Kernel Language 0". As an aside, there was also a KL1 language that was concurrent so that it could run on massively parallel ICOT research inference machines known as "PIM"s.) The KL0 language's code execution was sped up by being run on a custom microcoded accelerator chip in the system designed specifically to execute KL0. This custom processor was more or less an extended "Warren Abstract Machine (WAM)" implemented in hardware. Anyway, the Mitsubishi PSI was targeted specifically towards natural language processing and automatic document language translation, and was sold to Japanese multinational companies for that purpose. If I remember correctly, only a few hundred Mitsubishi PSI's were sold, so they were not common machines.

With that said, the history portion of Liam Proven's article was an absolute mess. The rise of the specialty LISP and Prolog machines of the 1980's was a result of the keen interest in expert systems at the time. Expert systems were considered to be bleeding-edge A.I. in those days (and a third of Fortune 500 companies were using them), but if you gave the expert systems more than a few hundred rules to follow in their decision trees they became painfully slow at returning answers. Thus several research efforts around the world started up to find ways to speed them up so that they could make inferences from exponentially more rules, and so be used in broader and broader knowledge domains and be ever more flexible, usable, and powerful. This research led to the development of the aforementioned Warren Abstract Machine and Prolog compilers based on it at the University of Edinburgh, *many* custom LISP and Prolog accelerator chips from many different universities, and the Japanese FGCS project, which hoped to both use hardware accelerators *and* run them in parallel for an even bigger speedup.

What killed specialty hardware systems like the Symbolics LISP machine and the Mitsubishi PSI was Moore's Law and bigger better-funded companies making huge advances with developing high-performance RISC CPU's. UNIX workstations in particular took over this space because UNIX systems were very general-purpose and could be used for lots of things, not just specialty A.I. applications, and by the early 90's Sun SPARCstations and Digital DECstations with their SPARC and MIPS RISC CPU's, respectively, had become *faster* at running Prolog and LISP than the specialty machines with the hardware LISP and Prolog accelerators were. (There is a great PDF report written by DEC about this here: https://www.info.ucl.ac.be/~pvr/PRL-TR-36.pdf ) What were you going to spend your tens of thousands of dollars on if you were a company? Buying a specialty hardware Prolog or LISP machine or a far more useful and flexible UNIX workstation that could also run LISP or Prolog faster than the specialty machine? It was a no-brainer decision. Even the Japanese FGCS project saw that it was futile to keep producing their custom inference machines when competing against the ever-rising performance of SPARC CPU's and created a UNIX KL1 compiler so that they could start porting their inference machine software over to SPARC machines to continue their research. So it wasn't so much a "MIT/Stanford" style of programming vs. a "New Jersey" style of programming war like the author tried to portray it as. Instead, it was that better funded large CPU companies were able to increase the performance of their CPU's faster than university researchers and tiny niche workstation companies could increase the performance of their custom single-use special purpose chips. 
Just like the Commodore Amiga and its custom chipset, the Symbolics LISP machine and the Mitsubishi PSI had the performance edge in their particular niche at first, but their specialty custom nature and disadvantage in R&D funding eventually allowed more standardized, general-purpose, and better-funded platforms to steamroll over them and crush them in the market.

Seth Rogen bags Woz role in Sorkin's Steve Jobs biopic


There was already an absolutely excellent bio-pic on Steve Jobs and Steve Wozniak (with a little bit of Bill Gates thrown in for good measure), and that was the made-for-TV movie "Pirates of Silicon Valley" that was released by Turner Network Television in 1999. I enjoyed it so much that I bought it on DVD and still watch it regularly. You can see the Wikipedia page for the film here:


Noah Wyle and Joey Slotnick did excellent portrayals of Jobs and Wozniak respectively in the film, and despite being only a made-for-TV movie it was very well done and very compelling to watch. I say just hunt down a copy of this film from somewhere and call it a day-- I can't imagine any film involving Seth Rogen in any capacity as being any better.

Ghastly! Yahoo! Groups! gripes! grip! grumpy! gremlin! grumblers!


Trying to moderate spammers in Yahoo! groups now is abysmal!

I myself just experienced the sheer horror and lunacy of the new Yahoo! Groups user-interface first-hand. For nearly a decade now I have been the moderator of a small special-interest Yahoo! group. The Yahoo! Group that I moderate, while fairly active many years ago, has been completely dead since 2006 or so. Still, I refuse to look into shuttering the group completely because there are some interesting discussions, files, and other such stuff uploaded there, and I would like to leave that available for any members still remaining that might want to access some of that stuff someday. It's almost a nostalgia thing from a different era, you know? Since the group is for the most part completely dead, I don't check it as much as I should. In fact, I don't even log into the Yahoo! account that I moderate the group with very much anymore-- I pretty much only log into it just often enough to keep Yahoo! from closing it.

With a long-dead Yahoo! group it is easy to become completely complacent about it, which to my discredit, I allowed myself to do. Well, I logged into the group earlier tonight, and despite the fact that all new members to the group are supposed to be moderated until I approve them to be able to post content, some spammers still managed to get into the group a few months ago somehow, and absolutely *trashed* the entire Yahoo! group on me. Yeah I know, it's my fault for not checking how the group was doing sooner, but as I said, after years of seeing nothing but tumbleweeds there you grow complacent. Anyway, in a relatively short amount of time the spam accounts had posted several hundred new files to the group, new photo albums, dozens of new links, and worst of all, set up a daily recurring "event," which of course only contained a message on how to get to some spam site, that then posted daily reminders to the group every single day for months, choking out the group's "conversation" section with dozens upon dozens of spammy reminder messages. This is where the insanity of the new Yahoo! Groups user-interface comes in, because removing all of the spam content and then the spammers themselves became a tedious chore of monumental proportions.

To explain what I mean, do you know how in most web-mail user-interfaces you can select several e-mail messages at a time, usually through check boxes, and then perform the same action on them simultaneously, such as deleting them all? Considering that you have been able to do that sort of thing in web-mail user-interfaces for at least a decade and a half now, if not longer, you would think that such a feature would be an obvious one to have in any web application where you have to manage a lot of messages. Sadly, whoever designed Yahoo! Groups "Neo" is apparently oblivious to the obvious, because I found myself having to select every single spam message one at a time and then delete them one at a time, then repeat that process for the spam links, and then again for the spam files... It took me almost two hours to get rid of all of that cruft! The only thing that I was able to do in one go was delete the daily recurring spammer-created "event" all at once, which was nice because I was about to lose my ever-loving mind at that point!

I mean really, why can't I select a whole bunch of messages, or files, or spam links, or photo albums, or what have you, and delete them all with one click of my mouse? Why do I have to go through them all one at a time? It's such a miserable and unproductive site design that it's absolutely painful! There is only one thing that I can say which must be painfully obvious to everyone by now (well, besides the fact that I am apparently not that great of a group moderator, that is), and that's that whoever redesigned the Yahoo! Groups site must never have actually *used* a Yahoo! group before in their lives! The usability of the new site design is so unbelievably poor it's unfathomable! The horribleness is simply beyond words! There is no easy way to quickly do *anything* that you need to do as a moderator to maintain a functioning group! All I can ask at this point is what in the heck were the people at Yahoo! thinking? Why have 2012 and 2013 become the years that user-interface design for just about everything has gone off the rails? When did the inmates start running the asylum? I just can't understand how so many organizations (Yahoo!, Microsoft, Ubuntu, The GNOME Project, et al.) could all lose the plot when it comes to UI design so quickly and completely! It's as if the people at these companies simply hate us, and want to make us all suffer for their pleasure or something! Either that or they are all hiring trendy hipsters to rewrite their UI's while simultaneously kicking sensible people out the door! I really hope that Yahoo! backpedals on their new Groups UI design at least to some degree, because it is an absolutely abysmal user experience as it currently stands!

MacBook Air now uses PCIe flash... but who'd Apple buy it from?


It would make a great conversation piece a couple decades from now...

One of these brand new Mac Pros would look absolutely fantastic sitting on my shelf next to my NeXT Cube workstation. However, based on the astronomical cost of new Macs and the fact that I was never a MacOS fan, I think that I will happily wait some 19 years to buy one from a vintage computer festival for relative peanuts, just like I did with the Cube...

More than half of Windows 8 users just treat it like Windows 7


Re: re:screen res

I was well aware of the display "downscaling" registry trick, but I can't consider it to be a serious long-term solution. Have you actually tried to work on a screen that has downscaling enabled for any significant length of time? It's not a pleasant experience! Downscaling a 1024x768-resolution image onto a 1024x600-hardware-resolution screen squishes the image and makes it fuzzy. It's fine to have downscaling enabled for a few minutes as a way to open a Windows Store app just long enough to configure its settings, such as setting your location in the Windows Store UI weather app so that the "live tile" displays the correct weather for where you are, but there is no way that you can use downscaling for any length of time, because it makes all of the text on the screen fuzzy to the point that smaller font sizes become significantly more difficult to read. Because of that, I would hardly call enabling downscaling a truly viable work-around-- it is much more like an emergency MacGyver-like solution that you can get away with for a short period of time when you find yourself in a pinch.

Since getting my location set in the weather app so that I can see my local weather info in the live tile was the only "Windows Store" UI-related thing that I ever felt truly compelled to do, I just stay in my netbook's 1024x600 native resolution and use the traditional desktop nearly the entire time. Heck, if Windows 8 still supported sidebar gadgets like Windows 7 and Vista did I would place my weather gadget on the desktop and never have a compelling reason to switch back over to viewing the TIFKAM side of things again.


I have Windows 8 Pro installed on a little Intel Atom-powered Acer Aspire One netbook, and the desktop side of Windows works great on the thing. However, due to the netbook having a maximum screen resolution of 1024x600, which is below The Interface Formerly Known As Metro (TIFKAM)'s minimum screen resolution requirement of 1024x768, absolutely no Windows Store app will load. None. Nada. And you know what? I really don't feel like I am missing out on a thing.

Microsoft reveals Xbox One, the console that can read your heartbeat


Re: What a terrible name

"the best that these geniuses could come up with was calling it "The Windows Store User-Interface"

Citation needed? What I read said it was "Modern UI" or "Windows 8 UI".

Microsoft calling the user-interface formerly known as Metro "Modern UI" was extremely short-lived, and abandoned extremely quickly in favor of calling it the "Windows Store" UI because it runs "Windows Store" apps. As far as a citation to back my assertion up goes, The Register covered the name change for starters back in September. Check out this article here:

One more try: Metro apps are now 'Windows Store' apps


If that's not good enough of a resource for you check out Microsoft's "Windows Store Apps Dev Center Page" here: http://msdn.microsoft.com/en-us/windows/apps/jj679957 All throughout the page the apps are referred to as "Windows Store Apps."


Re: What a terrible name

Sadly, this is the best that you can expect from Microsoft's branding people. I mean really, could you think of a group of people that do any less work? As best as I can figure, the conversations around their office must go something like this:

"Oh gee, what should we call the follow-up to Windows 7?"

"How about Windows 8?"

"Brilliant! Lets go collect out unjustly over-sized paychecks!"

"What should we call our new tablet PC?"

"I don't know. I don't feel like doing any work today. Lets just steal the name from one of our other products and then call it that."

"Well, our large Microsoft Surface tables that we sell to hotels are pretty cool. Let's name our tablet after that!"

"Brilliant! Now we can go drink some lattes, smoke some expensive cigars, and play a round of golf to congratulate ourselves!"

And when these people finally did find themselves faced with some real work after they had to rename the "Metro" user-interface at breakneck speed due to some international trademark concerns, the best that these geniuses could come up with was calling it "The Windows Store User-Interface." Yeah, that really rolls off of the tongue, doesn't it? So it doesn't shock me in the least that they would call the new Xbox console the "Xbox One," despite the fact that it is the third generation of Xbox console and that the name would be potentially confusing. In fact, we should be happy that they didn't name the new Xbox the "Microsoft Game Console 2014," because naming so many Microsoft products that way is how they get out of doing so much work (or even having to suffer the nuisance of thinking).

Deep inside Intel's new ARM killer: Silvermont


If only there could be "Silvermont"-based netbooks

I currently have a little Acer netbook that has an Intel Atom N455 "Pineview" microprocessor in it, and I love it. Understandably, the little netbook is no processing powerhouse, but then again I don't need it to be. Instead, I like its light weight, small size, portability, and adequately-sized keyboard. It's great for taking notes at developer user-group meetings, it's satisfactory for browsing the web, and when I am working out of a hotel room I can easily type e-mails, write short documents, or edit Excel spreadsheets on it-- all without having to lug around a much larger and heavier laptop, and without having to pay out hundreds upon hundreds of dollars to buy an expensive "Ultrabook."

Why do I bring this up? Well, I just can't help but think about what could have been. With how much I like my trusty little netbook now, I would absolutely *love* to buy one sporting one of these new "Silvermont" Atoms. Think about it-- if they could produce a netbook with more than three times the performance of the current ones and still somehow keep its cost within the typical netbook range of $250, that would be a real winner as far as I'm concerned. Unfortunately, that will almost certainly never happen. To the best of my knowledge all of the current big PC manufacturers have completely discontinued their netbook lines, and the netbook form factor has been eagerly declared "dead" again and again and again by the media. To think that I would have to shell out some serious $$$ to buy an overpriced Intel-based tablet or "Ultrabook" to experience the benefits of these new chips in a portable form factor is just depressing.

'We invented Windows 8 Tiles in the 1990s', says firm suing Microsoft

Thumb Up


Wow! That image that you linked to is great! The resemblance between the old AOL start page and the Windows 8 Start Screen is rather striking! Even sadder still is that I actually remember having to use a version of AOL like that way back in the 90's, as it was the only local ISP that still supported the old hand-me-down Windows For Workgroups 3.11 machine that I used as a high school student during that time. Every other ISP demanded Windows 95 as a minimum, and that would have been a bit much for the ol' 16MHz 386SX with 16MB of RAM to handle!

Nobody knows what to call Microsoft's ex-Metro UI


Re: 'research and development'

'research and development' Is all that profit should be spent on. (Especially when you've got a cash pile the size of Microsoft to cover rainy days)

I don't know about that, especially since it seems like Microsoft is going to have a lot of rainy days ahead of them that they'll need that cash pile to cover. I mean, let's face it-- corporations are stubbornly sticking with Windows XP or are at best in the middle of their upgrade cycle with Windows 7, so I doubt that Windows 8 and the programs being released with it, such as Office 2013, are going to enjoy much corporate love. Add to that Windows 8's mixed reviews, the slowing growth of the PC market, and strong competition from Apple, Android, and even Windows 7 in the consumer market, and I have a feeling that Microsoft's "rainy day fund" is going to start shrinking rather rapidly...

Disney buys Lucasfilm, new Star Wars trilogy planned


Re: Hallowe'en

I for one have long wished to see actual Star Wars movies get made from the two early 1990's Dark Horse Star Wars trade paperbacks "Dark Empire" and "Dark Empire II." They were written by Tom Veitch, and they actually spun off of the events from the Admiral Thrawn Trilogy. The Admiral Thrawn Trilogy was great, but I loved the two Dark Empire trade paperbacks even more. They featured a cloned Emperor Palpatine going on a galactic rampage with several new super-weapons, including the World Devastator machines, the Eclipse-class Super Star Destroyers, and the Galaxy Gun. And if that wasn't enough, you got to see Jedi Master Luke Skywalker get turned to the Dark Side and take his father's place at the side of the Emperor, and Boba Fett once again chase Han Solo, this time through the winding neon-colored labyrinths of the smuggler's moon Nar Shaddaa. I used to be quite the Star Wars-obsessed geek in my youth, at least until the Prequel Trilogy came out, tarnishing things for me and causing me to move on to other geeky pursuits, but even so the two Dark Empire trade paperbacks are the two works from the Expanded Universe that I remember the most fondly. I realize that even with Star Wars having a new owner I will probably never get to see those trade paperbacks turned into films, but you never know.

Users grumble after Adobe cancels Acrobat X Suite


That's some expensive software, even for Adobe!

Amusing typo alert! To quote the article:

"An upgrade to eLearning suite 6.1 from from Acrobat X Suite is US$599, and a full version of the former suite is US41799, according to this Adobe page."

While in this day and age I almost wouldn't be surprised to see Adobe charging a price as ridiculously high as $41,799 for one of their bloated software suites, I think that what happened here is that Mr. Sharwood neglected to hold down his "shift" key on his keyboard when he went to type a dollar sign, and thus typed a "4" instead. So in other words, "US41799" probably should have been typed as "US$1799." So attention to the appropriate El Reg editor-- you need to correct your "APAC Editor," because apparently on this particular article he didn't do quite enough editing! :D

Edgy penguins test-fly Ubuntu's Quantal Quetzal


Re: No mention of Kubuntu?

If so - then I'll be off to a proper mainstream KDE based distribution. Any suggestions?

Personally, I have always been a huge fan of the Pardus distribution, which is KDE4-based. You can find more information about Pardus and download it at: http://www.pardus.org.tr/en

Firefox 15 offers fewer leaks, more frags


Re: Is the browser really the best "app engine?"

"Yeah, it is. As I understand it, making the browser into an OS is a way to break Microsoft's desktop OS monopoly."

Really? A web browser is the best way to "break Microsoft's desktop OS monopoly?" And what are you going to run this Microsoft-slaying web browser on top of? Windows? Windows is the OS that some 90% of the Web's users are going to be running any desktop browser on, so Microsoft's really losing out big there! LOL!

Geez man, if you're going to bring up wanting to break Microsoft's dominance of the desktop OS market, how about arguing to try to combat it with another desktop OS? You know, so that you're actually trying to fight a battleship with another battleship instead of trying to fight a battleship with only a small rowboat that has been painted "battleship gray?" You see, back in 1991 this guy named Linus Torvalds created this thing called the Linux Kernel, and with the help of some great tools, window managers, and other bits from various other teams and people there are quite a lot of nice desktop OS's that are out there based on it now. You know, like Linux Mint, Fedora, Debian, Mageia, Pardus, etc. You don't need to try to force the entire role and functionality of Linux onto a web browser. In fact you shouldn't. Let the web browser load web pages and the OS act like an OS-- that's what they're there for! Trying to make what once was essentially just a remote document renderer into something that can run interactive real-time 3D games, have local storage API's, and even run native code when NaCl is used is really redundant, and the Web probably isn't the most ideal service to try to piggyback all of that increased functionality onto, especially when so much of it can be done so much more efficiently and effectively through purpose-built native applications.


Is the browser really the best "app engine?"

While it was interesting to watch that video demo of BananaBread running in Firefox 15, I couldn't help but pause for a moment as ask myself why this kind of stuff is being done at all. Or maybe the more correct question to ask in this situation is should it really be done at all? Before you make up your minds and decide to down-vote me, please hear me out.

If you look back at what HyperText Markup Language was originally designed to be, it was supposed to be an easy way to create formatted documents that linked to one another on the Internet. In other words, it was supposed to be a more flexible competitor to the Gopher protocol. It was very simple, easy to learn, and just about anyone could figure out how to do it. That's why back in the 90's so many people were able to create their own Angelfire, Tripod, Geocities, etc. web pages-- because the barrier to entry was extremely low. I mean, think about it-- even your grandparents could figure out that you could start a new paragraph with "p" tags, make your text bold with "b" tags, create a heading with "h1," etc. But somewhere along the line someone forgot that HTML documents were supposed to be documents, and came up with the idea of "web applications." Soon, in the name of making these web applications act more like native desktop applications, we started seeing technologies spring up such as JavaScript, VBScript, ActiveX, Java Applets, Flash, and more.

This slow and gradual perversion of what web pages were supposed to be has continued over the past decade or so, and now we are getting to the point where we have WebGL and Canvas allowing us to create amazing full-blown games in browsers, Google's NaCl plugin that allows you to actually run native code in browsers, and the "Boot to Gecko" and "Chrome OS" projects that are trying to reshape the browser into a computer's entire operating system! Has anyone ever stopped for a moment and thought to themselves that maybe they have taken this "let's run everything in the browser" idea a bit too far? If you're getting to the point where your web application needs to run native code in the browser using the NaCl plugin, you probably should have just written the entire application as a native application anyway, because you're even losing HTML5's one strength, cross-platform support, once you start using native code in your web app! This wasn't what the browser was supposed to be-- it has been slowly repurposed from an application that was designed to deliver formatted documents to now practically being an operating system running on top of your operating system. And all of this technological advancement was achieved through kludge built on top of kludge built on top of kludge.

The most ironic thing of all is that the more that the W3C and the WHATWG try to make the Web act like a platform for native applications, the more that they are losing potential users to actual native applications. Smart phones and tablets are steadily becoming the most used computing devices, right? And what do they have on them for everything? Apps. If you want to check your e-mail you don't visit your webmail account using your smart phone's browser-- you use the phone's e-mail app. Do you use your phone's browser to visit youtube.com to watch videos? Heck no, you use the phone's YouTube app. Do you want to read the New York Times? How about check Facebook? There are apps for that too! Pretty much the only time you use your smart phone's browser for anything is during those rare times when there actually isn't already an "app for that." Web applications, due to the lowest-common-denominator approach that they take so that they can be (in theory anyway) cross-platform, will always be only a second-best solution compared to purpose-designed native apps. With the traffic of the entire World Wide Web starting to decline in favor of native Internet-connected apps, it almost makes me wonder if it is wise to still try to turn the Web into something that mimics these full-blown native "apps" when, by their very design, Web apps will always be outperformed by their dedicated native counterparts.

Look, I am no Luddite. I am in no way saying that we should bring the Web back to the days of 1997 with HTML 3.2. As a long-time Web user, and someone who has done both server-side and client-side web development professionally, I can definitely say that I have benefited a lot from many of the advancements made to browsers over the years. But when I read about advances such as WebGL, the NaCl plug-in, and other such technologies that essentially turn the browser into more and more of a full-blown virtual machine with a mini OS, I can't help but wonder if all of that new functionality could have been better implemented on a platform that was actually intended to perform those functions, instead of coercing what was supposed to be a document viewer into doing it. You know, like creating a whole new cross-operating-system platform designed from the start for these kinds of applications, instead of continually tacking more and more features and API's onto a legacy document markup language and the application that was meant to render that markup language. Then again, I guess that that was what Sun Microsystems was originally trying to do with Java and the Java Virtual Machine back in the 90's, and apparently that didn't work out so well either, or else we wouldn't have felt the need to create full-blown Web applications complete with local storage, Web Workers, and real-time 3D with HTML5 in the browser to begin with-- we would all be running Internet-connected purpose-built Java apps instead.

In any case, I'm not saying it's wrong or right to add capabilities such as WebGL into the browser. I am personally still very much divided on the idea myself, and I can see both opportunities and pitfalls from the additions of such features. All I am saying is that we are getting to the point where we are trying to do so much in the browser now that we should start asking ourselves a little bit more frequently about why what we are trying to do needs to be done in the browser, and if it actually should be done in the browser at all. After all, there are already huge swaths of people that are using add-ons like NoScript and disabling Flash, Java, Silverlight, and Shockwave as it is to cut down on some of the Web's current in-your-face feature clutter and unwanted interactivity. To these people, Canvas animations, WebGL animations, and HTML5 videos are going to become that much more annoying stuff that they are just going to have to disable and block. Just because these new features are being defined by the W3C doesn't mean that they won't be used by advertisers to annoying ends like their past proprietary counterparts!

Troubled! Yahoo! tries! to! keep! staff! sweet! with! free! nosh!


I personally hope that Yahoo! does find a way to survive with most of its current web services intact, as I like having a third viable choice for web services such as e-mail, maps with directions, aggregated movie theater show times, special-interest discussion groups (e.g. Yahoo! Groups), etc. beyond just Google's and Microsoft's competing offerings. While Yahoo! may not be the great force in the Web that it once was, there are still a few niche things that I feel it does better, so I really want to see it stick around.

Groupon loses second top sales bod in a week


"Groupon is losing its US sales boss Lee Brown, who followed other senior execs who scarpered off earlier this year."

Like rats deserting a sinking ship...

Akihabara unplugged: Tokyo's electric town falls flat

Thumb Down

Re: Yes, we know.

"I can't understand why Japanese nerds are into manga. I'd think think comics of any sort too juvenile to be of interest to a nerd."

A "troll" icon would have been appropriate for your post if you weren't hiding behind the "Anonymous Coward" mask. First of all, individual nerds like what they happen to like. Some people are huge into Star Trek, others are big on Doctor Who, some build and launch model rockets, some enjoy video games, and then there are some who enjoy reading comic books and/or Japanese manga. Calling someone else's past time "juvenile" is throwing stones while living in a glass house, as I am sure that there are things that you're interested in that might not be seen as worthy pursuits by everyone else around you. You may not have to like what other people choose to spend their free time doing, but don't go badmouthing other peoples' interests either.

With that said, while in the West comic books and cartoons are often (unfortunately) seen as works that should be made primarily for children, in Japan that is not the case. In Japan, comic books and animation are seen as a medium, not a genre, and as a result you can have any kind of story, ranging from light-hearted stories meant for small children all the way up to dark and serious stories meant strictly for adults, placed in animated or comic book form. So calling the reading of all manga "juvenile" is painting with an overly broad brush, because many popular manga titles in Japan are specifically created for and read by adults. And besides, I don't know why you have determined that reading comics and manga has to be an exclusive pursuit. There is nothing that says that someone who enjoys reading comics or manga couldn't also be an avid BMX cyclist and/or enjoy working with technology as well.

Microsoft's new retro-flavoured logo channels Channel 4


Re: Yellow.

Easy! The yellow represents the yellow smiley-face logo from Microsoft's disastrous past product Microsoft Bob, because Microsoft is repeating its past mistake and trying to cater to only the lowest common denominator of its users once again!

(Other than that, the only other yellow-color branded Microsoft product that I can think of off of the top of my head was the cute little fox logo from Microsoft Visual FoxPro. Hmm, they said at the time in 2007 when they end-of-lifed Visual FoxPro that they were doing so because it was COM-based and they didn't want to redevelop it for .Net. Now that the new WinRT application architecture is essentially COM-based again, I wonder what their manufactured excuse would be for not further developing that product now... Well, besides the fact that they already killed it that is.)

Pixar open sources production animation code, patents


Re: Pixar will miss Jobs like a fish misses a suntan

Now don't get me wrong-- I am by no means a Steve Jobs fan or apologist. In fact, I can't stand Apple products and have owned computers from just about every other company except Apple over the years because of it, including some from some of the more exotic companies such as Sun, SGI, and even DEC VAXstations. With that said, I don't think that you are giving Steve Jobs credit where credit is due. Did Steve Jobs found Pixar? I would argue against it. Did Steve Jobs create any of Pixar's software? No. Did Steve Jobs try to turn Pixar into a hardware business to sell Pixar Image Computers, almost to the detriment of the company? Yes. However, guess what Steve Jobs also didn't do-- he didn't run Pixar into the ground.

Now, at first this may not seem like all that big of a deal. A CEO of a company shouldn't be running their company into the ground after all. But now take a step back and look at the state of the technology industry today, where you have Steve Ballmer and his personal little "Wormtongue" Steven Sinofsky doing everything that they possibly can to run Microsoft into the ground, and how Léo Apotheker nearly ran Hewlett-Packard into the ground, and how Stephen Elop is trying to run Nokia into the ground, and how the last half-dozen or so CEO's of Yahoo! each brought Yahoo! a little closer to death, etc. Sadly, somewhere over the years the ability to not run a company into the ground has become an exceptional and rare skill for a corporate CEO.

As much as I hate to say it, sometimes knowing when to leave well enough alone at a company can be just as important as directing a company for a CEO, and Steve Jobs seemed to know when to leave well enough alone at Pixar. (Either that, or Steve Jobs was just so busy running NeXT, Inc. into the ground during the 90's that Pixar escaped his attention long enough to become successful despite him! LOL!). Either way, Pixar ended up flourishing while Steve Jobs was at the helm of the company, so you have to at least give him some credit for that. If it was another Steve (such as of the Ballmer, Sinofsky, or Elop variety) that ended up buying Pixar from George Lucas all of those years ago, Pixar might have ended up as nothing more than another little technical footnote in the history of the CG industry, such as Dave Poole's revolutionary company, Foonly, Inc., before it.

Microsoft: It was never 'Metro,' it was always 'Modern UI'


Re: Naff: adjective. See "Microsoft"

Wait just a second there... are you telling me that so.cl is still around? I had only first learned about its existence at a local Windows User Group meeting a couple of months ago, but it was so unremarkable and unimpressive that I had literally already completely forgotten about it since then. Up until reading your comment above I had never seen it mentioned again either, keeping me blissfully ignorant of its continued existence. Thanks for ruining that.



Re: Armadillo?

FYI: The text introduction at the beginning of the NASA video clearly states that this lander was partially manufactured and assembled by Armadillo Aerospace.


I will have to remember this the next time one of my little hobby electronics projects go awry-- even things designed and controlled by the brilliant minds at NASA fail sometimes.

In any case, while I found this video fairly amusing, I sincerely hope that the U.S. Congressmen don't get the wrong idea from it. As NASA stated, it is far better for problems to crop up during the prototype phase here on Earth when it is still early in the program instead of having these problems crop up in the atmosphere of Mars years later after billions of dollars have been spent. Even so, I could see a U.S. Congressman seeing this video and interpreting it as yet more "wasted" taxpayer money literally going up in smoke, and then slashing NASA's budget even further to the bone. NASA's one of the shining examples of the good that the United States can do, so I would hate to see anything that would cause NASA to get crippled even further than it already has been by the U.S. government.

Microsoft: It's not Metro, it's Windows 8


Even the dead work harder...

Man, I wish that I worked for Microsoft's marketing department-- from the looks of it, it appears that Microsoft's brand marketers do even less work on a yearly basis than the Porsche 911 body-style designers do at Porsche!!! I mean seriously, I don't think that Microsoft's branding team have performed any service requiring anything even close to the firing of a single brain synapse in years. For example, calling the new ARM version of Windows 8 "Windows RT" when both the ARM and x86 versions of Windows use the WinRT platform was anything but brilliant. Naming their new tablet PC the "Surface" when they already had a touch-based UI table computer called the "Surface" was even worse, and now this whole "Windows 8 style" term for the UI of Windows Phone 7, Windows Phone 7.5, Xbox 360, and Windows 8 is potentially the most confusing naming blunder of all. I swear, if these jokers had to name a car they would call it the Microsoft "Drive Shaft," and then call its gauges and steering wheel (e.g. its "user interface") "Horse and Buggy Style," even though it doesn't involve a horse or have reins! *face palm*

Party like it's 1999: CDE Unix desktop REBORN

Thumb Up

CDE is 20% cooler with ponies!

Check out the photos of the Sun SPARCstation 2 running the Solaris 7 CDE desktop in this article:

"Rainbow Dash Pulls a Kevin Flynn" - Equestria Daily


Rainbow Dash can make anything look good, even CDE! :)


Full disclosure for those of you who are interested: I am the same "Hoagiebot" that is mentioned in the Equestria Daily article above, and the SPARCstation 2 that was photographed in the article is from my exhibit at last year's Vintage Computer Festival Midwest (VCF-MW). I had spruced the machine's CDE desktop up with the custom MLP:FiM "Rainbow Dash" X PictMap backdrops to make it more eye-catching and a better conversation piece for the festival's attendees. The other items shown in the photo are Sun Microsystems-related in nature, and were also part of my little exhibit. The plush toy dog and cheetah were actually promotional items handed out by Sun Microsystems at some of their conferences back during the 1990's. The plush dog was based on a real Greater Swiss Mountain Dog named "Network" the dog, which was Sun's animal mascot through the late 1990's and a real-life pet of Sun co-founder and then-CEO Scott McNealy. The plush cheetah was handed out to promote the Sun UltraSPARC III microprocessor, which had the development codename of "cheetah." The blue mat that the two plush toys are sitting on is actually a Sun Microsystems Field Service Kit that I own. The second Sun SPARCstation that you can partially see in the far right of the photos is an even older Sun SPARCstation 1.

It's a shame that Oracle no longer produces Sun workstations-- you can't really put technicolored cartoon ponies on database appliances! What fun is that! *sigh*

Why women won't apply for IT jobs

Thumb Up

Re: well

@stanimir: The "The Gender Equality Paradox in Norway" documentary film that you linked to was very interesting and enlightening. Thank you for posting the link.


Re: I must of missed

"The US has a quite anti-intellectual culture and we have the elevation of athletes and things like 'Revenge of the Nerds'. It's little wonder that only dedicated geeks enter ANY sort of engineering or tech field. It doesn't help that most Engineering programs don't try to retain people but try to actively discourage them."

The seemingly anti-intellectual culture in the U.S. is something that I actually think about quite often, especially since it didn't always seem to be that way. I almost wonder if the resentment held towards science and technology by many Americans today has to do with our society's failure to reach the lofty goals that we had set for ourselves back in the 1960's. For example, in 1967, Time magazine declared the entire generation of people that were "twenty-five and under" their "Person of the Year," and wrote the following about them:

“He is the man who will land on the moon, cure cancer and the common cold, lay out blight-proof, smog-free cities, enrich the underdeveloped world, and, no doubt, write finis to poverty and war.”

Let's see how much of that we have accomplished since then:

Land on the Moon: Yes! However we haven't had a manned spaceflight to the moon since 1972. Bummer.

Cure Cancer: We have much better treatments for cancer than we did in the 1960's, with many of them extremely successful, but by saying "cure" in the article I think that Time was looking for something much better than that, such as a fool-proof prevention of all cancers. Unfortunately, we have not found anything close to that yet.

Cure the common cold: No

Lay out blight-proof, smog-free cities: No

Enrich the underdeveloped world: We could have done a whole lot more than we did

Write finis to poverty and war: Not even close

I mean think about it-- in the 1960's everyone was talking about how wonderful the "Space Age" was going to be, with mile-high cities made of gleaming steel and glass, jetpacks that you could fly yourself to work with, supersonic airliners that could fly us across the Atlantic in less than 3 hours, pills that would allow you to not have to sleep, doubling your productivity, a supply of nuclear power that would be so inexpensive that electricity would become a basic human right, robot servants, giant laboratories where scientists could permanently live on the bottom of the ocean, or in orbit, or on the moon... I mean people really believed that we could achieve that sort of stuff back then, and that science would eventually make almost anything possible.

It was that hope that inspired young people to join the scientific and technical fields during those days. Back then everyone wanted to be an Astronaut, a rocket scientist, a biologist working on cracking interspecies communication with dolphins, or a researcher looking for the next disease-eradicating miracle pill. Everyone wanted to be the next "Jonny Quest," and use super-science to save the world. But when it turned out that achieving those lofty goals wouldn't be so easy, and that reaching them would take more hard work, funding, and time than anyone had ever imagined, I think that somewhere along the line, as a culture, everyone started to become disillusioned. People began to stop viewing science as our great savior and the scientists as the great heroes who were going to bring those technological miracles into being. Pretty soon people stopped doing things for the good of mankind and started doing things only based on how much money they could make doing it. Businessmen and politicians started looking for short-term research projects that could lead directly to a quick financial return, often neglecting the fact that sometimes scientific research done for research's sake can eventually lead indirectly towards greater, more important discoveries. Think, for example, how NASA's support for the then-fledgling semiconductor industry in the 1960's helped indirectly lead to the "Information Age" of the 1990's. Could solid-state computers have evolved to where they are now without the Apollo program's help? Sure, but it would have taken a lot longer if the research and funding required only became available through market forces.

Nowadays there is no need to want to be an Astronaut anymore because there is no manned space program. The pharmaceutical industry, whether fairly or not, is now viewed by most people as valuing their financial bottom line more highly than trying to rid humanity of plagues and disease. Our great supersonic airliners ceased their operations in 1983 (the Tu-144) and 2003 (the Concorde), respectively. Our great advances in Information Technology are now being used against us by corporations to profile every detail of our lives and deliver targeted advertising to us. And worst of all, we actually live in a world where we have to tell our children how we used to be able to send people to the moon and how we used to have airliners that traveled faster than sound. And those few young people out there who still want to get a degree in a scientific or technical field often find that the great research and achievements that their universities are responsible for are overshadowed by how good their varsity football and basketball teams are. Even when these students graduate their futures aren't certain-- with the short-term "how will this boost our next fiscal quarter results" kind of thinking that corporations and shareholders have today, many companies have moved much of their R&D to labs overseas where PhD's can be hired more cheaply, leaving the brilliant young technical minds here to get a job where they are treated like cattle in a cubicle farm at best, or left only being able to get a job at Wal*Mart or McDonalds (or no job at all) at worst. And all the while our politicians wonder why our students' math and science scores are slipping and people care more about becoming "famous for being famous" than actually achieving something that has any social worth.

To sum up my thoughts, I think that this is in many ways a top-down problem. I believe that more of our youth (including young women) would get excited about science and technology if there was something huge going on to get excited about. The U.S. Space program inspired a massive increased interest across all of the sciences during the 1960's-- there must be something along those lines, or perhaps several things, that either government or private enterprise could use to inspire the youth of today in much the same way. I think that if young people are given a compelling reason to "shoot for the stars" in math and science that they will put in the extra effort to do so on their own, and those drooping math and science scores would fix themselves.

Hams: We're good in a disaster – UK Radio Society boss


Re: Breaker, breaker....

@JohnG: I realize from your icon that you're trolling, but I'll quickly feed the troll anyway for anyone here who may not actually know the difference between Amateur Radio and CB. As a licensed U.S. amateur radio operator, I can only speak knowledgeably about how things are done here in the U.S. In our case, there is a huge difference between Citizens Band (CB) radio and amateur radio. Without getting too specific, CB radio is for the most part limited to 40 voice channels on a single band (the 11m band), and 4 watts of transmitting power on AM (or 12 watts if you are transmitting using single sideband). CB radio can be used by anyone in the U.S. without needing to pass any kind of exams or earn a license, and while there are rules regarding its use, frankly the enforcement of those rules is pretty lacking, so the CB radio landscape is very much a disorganized free-for-all.

Amateur radio operators, in contrast, can operate on several different radio bands. In the U.S. there are amateur radio bands ranging from the 160m band at the lowest reaches of the shortwave portion of the spectrum to bands located all the way up in the microwave portion of the spectrum. Depending on the frequency band and the privileges of the class of license that an amateur radio operator holds, a U.S. amateur radio operator may transmit using up to 1500-watts of power, which is more than powerful enough to literally bounce radio signals off of the moon and back. With all of that power however comes much greater responsibility, which is why amateur radio operators are tested thoroughly on radio law, radio theory, basic electronics, and other related topics, and are granted licenses by their respective governments. Unlike with CB radio, the rules of the Amateur Service are strongly enforced, and in addition to that are self-policed by amateur radio operators themselves as well.

It is partially because amateur radio operators have capable radio gear and partly because of our training and discipline that we become such a valuable fall-back communications asset during times of national disaster or emergency. In the U.S., amateur radio operators even hold what they call a "field day," where they run their amateur radio equipment in remote areas off of batteries and generator sets to practice operating during emergency conditions when mains power isn't available. U.S. amateur radio operators also maintain dedicated "ARES" groups, which are amateur radio operators formally trained in providing public service and emergency communications and sponsored by the ARRL. I myself am a trained severe weather spotter and volunteer my services to my local National Weather Service office during severe weather outbreaks. I do this by being part of an amateur radio "severe weather net," where dozens of hams, spread across the affected area with mobile radios, maintain contact with each other and the National Weather Service and provide "eyes on the ground" to report severe weather events such as tornadic activity, large hail, damaging winds, fallen trees, and flash floods. I am sure that the U.S. is not unique in these efforts, and that UK hams probably have similar services set up as well.

So amateur radio operators truly are useful to have around as a backup emergency communications service. While there are a few "strange creatures" involved in amateur radio, the vast majority of us are very civic-minded and willing to help our communities, and when other more conventional forms of communication fail our gear and people can still get the job done and the message out. And as one AC above me already mentioned, since amateur radio operators are already trained and buy their own gear, we don't cost the government a thing when they need to use us. So whether you think that we're all a bunch of bearded weirdos or not, like the article states we're good to have around.

Digg, deep in the hole, sells self for $500K


To quote the CEO of Digg, Matt Williams, from the article:

"We wanted to find a way to take Digg back to its startup roots."

By making a buyout deal that retains absolutely *zero* of Digg's current staff? That's not taking the company back to its "startup roots," that's hitting the damn reset button!

Since you could argue that a company's employees, the people who make a company actually work, are one of its most valuable assets, and Betaworks is keeping none of them, Betaworks must be buying Digg primarily for its brand-name recognition only, which is a pretty darn sad statement about the value of Digg. If Betaworks was interested in any of Digg's underlying technology, you would think that they would retain at least some of Digg's remaining engineers and developers, because those are the people who understand how Digg's platform works and could help Betaworks incorporate it into their own web offerings.

On a personal side note, these kinds of stories always make me feel kind of down. I was never a Digg user, so I have no attachment to Digg itself, but it is always kind of sad to see any tech company that was once considered to be "the next big thing" or an industry darling do a catastrophic tumble from grace, lay off most or all of its staff, and then get picked apart by the vultures like this. We've seen it so many times before: Infocom, Commodore, Silicon Graphics, Thinking Machines, DEC, Midway Games, SiCortex, and so on. It makes you feel bad for those poor run-of-the-mill employees who lose their jobs (especially when the company's execs get a golden parachute at the same time), and it makes you wonder what "could have been" if things at the company were run better, the company's innovations came to market quicker, or external circumstances had been better. Yahoo! and RIM had better take note of this and shape up, because if they don't I have a sinking feeling that in a few months I am going to be writing these same sort of lamenting comments all over again, only next time around it's going to be about them.

Early verdict on Intel Ultrabook™ push: FAIL


Apparently, Ultrabooks are at least 5-times "deader" than netbooks...

Everybody in the media seems to "know" that the netbook computer is dead. PC World claims that they're dead, PC Magazine claims that they're dead, Forbes claims that they're dead, Gizmodo claims that they're dead, ZDNet claims that they're dead, etc. However, based on the current sales statistics released by Canalys, the IT analysis firm that many of the above-mentioned media outlets sourced to make their netbook death claims, over 5.3-million netbooks were sold worldwide in the first quarter of 2012. And that sales figure is for just one quarter of 2012 mind you. If the IDC analyst that was quoted in this El Reg article is correct, and PC makers only end up seeing around 1-million Ultrabook sales by the year's end, there are only two ways that you can look at it: either Ultrabooks are at least 5-times "deader" than netbooks are, or maybe netbooks are not quite as dead as the media and the tablet floggers would like you to believe. Either way, things don't really seem to be looking all that positive for Intel-based Ultrabooks.
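For what it's worth, the arithmetic behind that comparison is trivial to check. Here is a quick sketch using only the two figures quoted above (the Canalys Q1 2012 netbook number and IDC's projected full-year Ultrabook number):

```python
# Figures quoted in the comment: Canalys netbook sales for Q1 2012 versus
# the IDC analyst's projected Ultrabook sales for all of 2012.
netbooks_q1_2012 = 5_300_000   # netbook units sold in a single quarter
ultrabooks_2012 = 1_000_000    # projected Ultrabook units for the whole year

ratio = netbooks_q1_2012 / ultrabooks_2012
print(f"Netbooks outsold Ultrabooks {ratio:.1f}-to-1")  # prints: Netbooks outsold Ultrabooks 5.3-to-1
```

And note that the ratio is really lopsided in netbooks' favor, since one quarter of netbook sales is being compared against a full year of projected Ultrabook sales.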

Android games console scheme nets $2.5m


Re: Just wait until the lads hear about this...

King Gameboy III! LOL! Now I know what I will name my next pet cat!

Disable Gadgets NOW says Redmond


Re: the moaning bandwagon must be close to capacity...

"That is of course assuming MS won't simply finish them off in a next "security" update."

That is my fear exactly, ShelLuser. I am one of those people here who actually liked Windows Sidebar Gadgets, and I still have four of them running in my sidebar as I type this. Heck, I even bought a couple of books about programming Windows Sidebar Gadgets so that I could create a few of my own. Sidebar Gadgets can be really handy as long as you can find gadgets that suit your purposes and appeal to your personal sense of taste.

Now that Microsoft has decided to become hell-bent on getting rid of the little gadgets and have labeled them as a "security risk," they very well may end up posting an "Important" update during the next patch Tuesday that eliminates Windows Sidebar Gadgets from Windows Vista and 7 automatically, without asking the user first. And how will most Windows users even know that an "Important" update will remove their gadgets before it has already happened? First of all, many users use Microsoft's recommended setting of having automatic updates. Should Microsoft push a gadget-killing update out, these people will just turn on their PC one morning and find that their gadgets are mysteriously gone.

Even people like myself that like to review updates before installing them may still inadvertently lose their Sidebar Gadgets should such an update go out, since so many Windows updates are only generically described as:

"A security issue has been identified that could allow an unauthenticated remote attacker to compromise your system and gain access to information. You can help protect your system by installing this update from Microsoft."

How many "Important" updates with similar descriptions were sent out during this last patch Tuesday? Five? Unless you start reading the associated knowledge base articles for every patch from now on, you could still easily let a gadget-killer update through. And while I am capable of reading these knowledge base articles if I have to, several members of my family all use gadgets on their Windows laptops, and they'll get plastered by such an update for sure leaving me to have to clean up the mess and try to get their gadgets back. I sincerely hope that Microsoft leaves Windows Sidebar Gadgets alone-- I don't want Microsoft to take the easy way out and not bother to fix the flaws in the gadget platform and just kill them outright because Steven Sinofsky has suddenly decided that they are passé!

New electronic labels squeal to spare you from food poisoning


Re: power source

FYI: The lemons and potatoes used in those childhood science experiments were not generating the electric charge themselves-- the purpose that they served was to be the ion bridge (the electrolyte) between the anode and the cathode that you were using. It is the electrochemical reaction between the metals used for the anode and cathode that actually produces the electrical potential difference (the voltage of the lemon or potato "battery").

Source: http://en.wikipedia.org/wiki/Lemon_battery
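To put a rough number on that, here is a little sketch using textbook standard reduction potentials for the usual zinc-nail-and-copper-coin pairing (the values below are my own illustration, not something taken from the comment or the Wikipedia article, and a real fruit cell delivers less because conditions are far from standard):

```python
# Standard reduction potentials (volts) for the electrodes typically used
# in a "lemon battery" demo. The fruit only supplies the electrolyte; the
# voltage comes from the difference between the two metals.
E_reduction = {
    "Cu2+/Cu": 0.34,   # copper cathode (e.g. a copper coin)
    "Zn2+/Zn": -0.76,  # zinc anode (e.g. a galvanized nail)
}

def cell_emf(cathode: str, anode: str) -> float:
    """Ideal cell EMF: E(cathode) - E(anode)."""
    return E_reduction[cathode] - E_reduction[anode]

print(f"Ideal Zn/Cu cell: {cell_emf('Cu2+/Cu', 'Zn2+/Zn'):.2f} V")  # prints: Ideal Zn/Cu cell: 1.10 V
```

Swap the metals and the voltage changes; swap the lemon for a potato and it largely doesn't-- which is the whole point.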

Ten... pieces of tat for Apple fanboys


Why couldn't they have chosen a computer company that's cooler than Apple?

Why couldn't they make this kind of cool clothing and merchandise based off of the logos and products of computer companies that actually were cool, like Cray Research Inc. and Thinking Machines Corporation? That's the kind of stuff that I would want to wear! And to heck with those cheesy-looking "iOS app refrigerator magnets." How about making a refrigerator that looks just like a full-size Thinking Machines CM-5 massively parallel supercomputer? (You can see what a CM-5 looks like here: http://people.csail.mit.edu/bradley/cm5/ ) Now that, my friends, would truly be awesome!

Mystery buyer scoops working Apple 1 at auction


Re: I don't get it.

They're buying it because it is a piece of computing history. The Apple I may have been a very humble beginning, but it was still the starting point of what is now a massive computing empire-- the birth-product of Apple Inc. Computer enthusiasts may want to own an Apple I in their collection for the very same reason that an automotive enthusiast may want to own a Ford Model T-- because they are products that changed the landscape of their respective industries and led to the rapid growth of their producing companies.

While I will never have the kind of money to be able to afford purchasing a real Apple I, back when I was in college I bought a kit that allowed me to solder together a working replica of an Apple I. The replica that I built is completely compatible with real Apple I's, and can actually run real Apple I software if you can find any. The kit that I built is called the "Replica I", and while I bought and built mine several years ago a later revision of the kit is still being sold. The Replica I kit is produced by a small company called Briel Computers.

I was going to link to the company's website here to help anyone interested with finding these kits, but I don't want to risk potentially running afoul of The Register's house rules against spamming. As a result, if you're interested in one of these kits you will just have to use your favorite search engine to find the company's website yourself. The cost of the Replica I kit is about 0.0004% of the price of the real one that just sold at auction, so you can have the fun of building it yourself and messing around with it however you want without having to worry about destroying a rare and valuable museum piece.

Microsoft corrects itself: 'We expect fewer people to use Windows 8'


"We’ve just passed the 500 million licenses sold mark for Windows 7, which represents half a billion PCs that could be upgraded to Windows 8 on the day it ships. That represents the single biggest platform opportunity available to developers."

There Microsoft goes again, happily spouting that any machine that can run Windows 7 can also effortlessly be updated to run Windows 8. Unfortunately, that is simply not the case. Apparently Store Partner Program Manager Ted Dworkin must not be taking into account all of the millions of Windows 7 Starter edition netbooks that are out there with 1024x600 resolution screens. They can run Windows 7 just fine, but if you install Windows 8 onto them they will not be able to launch a single Metro app due to the inflexibility of Microsoft's seemingly arbitrarily defined Metro 1024x768 minimum screen resolution requirement. The best that a poor netbook user can do in that case is alter a value in the Windows registry to enable screen downscaling, but then you have to put up with a distorted and very blurry display just so that you can click on that Metro weather app.

I bring this up because it is a real personal thorn in my side since I own one of the before-mentioned Windows 7 Starter netbooks. I installed Windows 8 Consumer Preview onto it thinking that it would go as smoothly as Microsoft hinted that it would only to later get the rude awakening that Metro apps refused to work. I realize that it has become popular lately to try to marginalize the millions of netbook users that are still out there and pretend that they just aren't as important as all other Windows customers, but don't outright lie either and claim that all 500-million Windows 7 PCs that are out there are all capable of flawlessly running Windows 8. They aren't.
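For anyone else stuck on a 1024x600 panel: the registry tweak alluded to above is, as far as I know, the one that gets passed around on netbook forums-- flipping a DWORD called Display1_DownScalingSupported from 0 to 1. Treat the fragment below as an illustration rather than a recipe (the {GUID}\0000 path is deliberately left generic, since the value lives under driver-specific subkeys that differ per machine):

```
Windows Registry Editor Version 5.00

; Illustrative only: in practice you search regedit for every instance of
; "Display1_DownScalingSupported", change each one from 0 to 1, and reboot.
; The actual {GUID}\0000 subkey depends on your display driver.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{GUID}\0000]
"Display1_DownScalingSupported"=dword:00000001
```

After a reboot the display settings offer 1024x768 as a downscaled mode-- which, as I said above, looks distorted and blurry, but at least lets Metro apps launch.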

Toshiba America says no to new netbooks

Thumb Up

Re: When Ultrabooks are the same price as netbooks...

Man, you really hit the nail on the head right there, James 51-- I couldn't agree with you more! While I don't think that I would ever want to replace a full-sized laptop or a desktop machine with a netbook for any kind of serious development work or content creation, my little Acer Aspire One netbook is great for taking notes and following along with programming examples at programming seminars, IT conferences, and computer user group meetings. It's just so small, light, and convenient to take with me, and its keyboard keys are big enough for my sausage-fingers to easily type with. And even better, netbooks are cheap, and they really do provide good performance "bang for the buck" considering how inexpensive that they are. Just as you stated, if the more powerful Ultrabooks were being sold at less expensive netbook prices (say, $250), I would definitely replace my netbook with one, but until that day comes my beloved netbook is just fine!

Ten... Star Wars videogame classics

Thumb Up

Re: Empire Strikes Back on the Atari 2600

I second your praise for "Empire Strikes Back" on the Atari VCS (2600). I still distinctly remember begging my mother to buy me that game when I first saw it on a "clearance" shelf at a Sears outlet store back in the 1980's. To my father's dismay my mother caved in and bought it for me, and it became one of my all-time favorite games for that console. Man, I played the heck out of that game while I was growing up! Some of my friends and I were also extremely addicted to X-Wing for MS-DOS and its two expansion packs, "Imperial Pursuit" and "B-Wing." And I can't even tell you how many quarters I pumped into the Atari Star Wars arcade game over the years, especially once I discovered that there was one that was located at the local late-night Mexican restaurant that was a couple blocks from my college dorm room!

Yeah, Star Wars was the best, at least until the late 1990's came and George Lucas decided that he was going to create "special" editions, prequels, and generally just screw with everything to drain the pockets of us fans. I wouldn't have minded so much if he had created some *good* movies to do it with, but alas we got Greedo shooting first and Episodes I through III. *sigh*

2011 sets new record for counterfeit electronics


Don't forget the absolutely horrible epidemic of "capacitor plague" that struck the computer and electronics industry about a decade ago due to many Taiwan-produced electrolytic capacitors that had a botched formula for their electrolyte. I mention this as a massive example of "counterfeit" parts causing problems instead of just "defective" parts causing problems because the electrolyte formula was originally proprietary to a Japanese company, but then was stolen and delivered to the Taiwanese manufacturers in question through industrial espionage. Unfortunately, the worker that stole the electrolyte formula copied it down incorrectly, so the millions of knock-off capacitors that the Taiwanese manufacturers produced weren't stable and leaked or exploded after a relatively short amount of time. Dell alone lost around $300-million identifying, repairing, and replacing computer components that contained these bad capacitors. In fact, I personally still have a pile of early 2000's-era computer equipment ranging from computer motherboards to terminal servers with blown or leaking electrolytic capacitors in them that I still need to re-cap and repair if I ever want to get some of them to work again.

I have also heard first-hand tales of the plague of counterfeit steel bolts being produced in places like China. They are labeled as being made of a particular strength and quality of steel when in reality they are made from much poorer and weaker steels. It has gotten so bad that many organizations now have to take the time (and money) to test all of the bolts that they buy to make sure that they actually meet the specifications that they are labeled with, to keep these potentially dangerous counterfeit bolts from being accidentally used in their products.

This counterfeit part problem can be potentially very serious depending on what kind of system the part is installed into and how off-spec the part is. There are real reasons for organizations such as the military, the aerospace industry, and even consumer electronics manufacturers to be extremely concerned-- should the device that fails due to counterfeit parts be particularly important, it could mean the loss of a large amount of time and money, or in some cases possibly even lives.

Ballmer says 500 MILLION 'users' to 'have' Windows 8 in 2013


Why is MSFT forcing a "False Dilemma" with Windows 8?

I just don't understand why Microsoft is suddenly so against letting users customize their Windows experience, to the point that they can no longer use Windows the way that they want to. Why does Windows Aero Glass *have* to be completely removed from the desktop side of Windows 8? I can understand Microsoft's motivation to make the desktop side of Windows 8 look more consistent with the Metro side, because currently the Windows 8 Consumer Preview looks like exactly what it really is-- an operating system with multiple personality disorder. And I can also see how the simplified new Metro-esque desktop look would benefit battery life and system performance, because it would take so many fewer resources to render onto the screen. With that said, however, I still don't see why customers shouldn't be given the option to switch back to Aero Glass if they want to, or even the Windows 2000-esque "Classic" look.

I for one always really loved the look of Aero Glass. I personally thought that it looked fantastic. I have Windows 8 Consumer Preview running on my netbook right now, and I have an orange-tinted Aero Glass theme which color-coordinates nicely with my current primarily yellow- and orange-colored desktop wallpaper image. It's aesthetically pleasing to me, and the performance of Windows 8 Consumer Preview's Aero desktop has always been just fine on my little single-core Atom-powered netbook. At the same time, however, I realize (even though Microsoft doesn't seem to) that what I find aesthetically pleasing isn't necessarily going to be true for everyone else-- some people may prefer the "Classic" Windows desktop look of Windows 2000, others may prefer the "Luna" desktop theme of Windows XP, and still others may actually prefer the brand new upcoming Metro-esque desktop look. People should be able to choose the desktop appearance that they like the best and go with it. And while a sparse, simple, low-on-resources, blast-to-the-early-1990's-past Metro-esque desktop interface may make a lot of sense if you are trying to squeeze all of the performance and battery life that you can out of an ARM-based tablet, when you are using a multi-core powerhouse desktop machine with one or more high-end graphics cards installed in it and running off of wall power, I doubt that the bit of extra resources needed to render the Aero Glass desktop in this day and age would really be of that much consequence. I swear, Microsoft is repeating the mistakes of the past and taking a page out of Henry Ford's playbook, where he famously said in 1909, "Any customer can have a car painted any color that he wants so long as it is black."

Microsoft would really make a lot more people happy if they would just allow generous levels of user customization back into the Windows user interface. They should really know by now that one size doesn't necessarily fit all, especially when Windows 8 is going to be running on everything from touch-tablets to netbooks to laptops to desktops with multiple screens. With that in mind, people should be able to choose whether they want Metro or the Windows Desktop as their default desktop. They should be able to choose what kind of chrome their desktop has. They should be able to choose whether they want "ribbons" or classic menus. They should be able to choose whether they want a "Start" button, and whether that Start button leads to a Windows "Classic", Windows XP, or Windows 7-style Start menu. That way everyone can be made at least somewhat happy, everyone could use Windows the way they are familiar with, and everyone can benefit from the enhancements that have been made underneath Windows 8's hood without having to sacrifice their user experience to get there. This solution just seems so brain-dead simple to me, as it allows everyone to use the options that work best for them.

Why in the world is Microsoft so hell-bent on making things in Windows 8 work only one way and then forcibly shoving it down our throats? Just what do they have to gain by being so inflexible, and saying that their way is "right" and that all dissenting opinions are "wrong?" Sinofsky and others are acting like there is no other direction to go but their new vision, like they have burned their ships behind them and there is no turning back. But we all know that this is a "false dilemma" logical fallacy that is simply not true-- the Windows 8 Developer Preview could still have its Start Button re-enabled with a registry setting change, and the Windows 8 Consumer Preview still has an Aero desktop. If Microsoft could dike that stuff out in such a short amount of time, then I would think that they should also be able to add it back in and make using it an option. For crying out loud, Microsoft, give your customers some customization options and let us choose how we want to work! You can make all of your new GUI developments in Windows 8 the default settings if you would like, but allow us to switch back whatever we don't happen to want. What do you care if we turn Aero or the Start Button back on after we buy Windows 8? At that point we have already *bought* Windows 8, so you already have our money. How we configure and use Windows 8 after that point should be our choice. Operating systems are supposed to make our lives easier-- not frustrate or hinder us!

Crytek: Schemes to strike second-hand games biz 'awesome'


This could eventually kill retro-gaming

Like many of you here probably have, I grew up with the Atari VCS and the original Nintendo Entertainment System. However, as fortunate as I was to have those two game systems, I had many friends who had other systems, such as the Intellivision, the TurboGrafx-16, the Atari Jaguar, the Sega Genesis, etc. I would go over to my friends' houses and play games with them on those other systems, and I would find myself loving a particular game and wishing that I could have that other game system just so that I could play it. However, at the time when those game systems were new they were also pricey, and after my parents had just bought me my NES how could I go to them and suddenly ask for a Sega Master System, an Atari 7800, and a TurboGrafx-16 too? I couldn't-- we didn't have that kind of money, and my having the gall to ask for something like that would have sent them through the roof!

Luckily, thanks to the second-hand game market I can now try all of those games and systems that I was never able to own when I was younger. Even as late as the early 2000's my house was an Xbox house, which meant that I never got to play any of the neat-looking games that were only available on the Nintendo GameCube or the Sony PlayStation 2. As a result, when I saw a used Nintendo GameCube with a controller on sale for $15 on a dealer's table at a classic gaming convention some months ago, I jumped at the chance to pick it up. I had been wanting to try both Star Fox Adventures and Star Fox: Assault for about a decade by that point, and now I finally had the chance to do so without killing my bank account in the process. If there were no second-hand gaming market, how could I ever do this?

The sad fact is that if the game console creators and the game software developers find a way to block the sale of second-hand games, eventually the pastime of retro-gaming will cease to exist for future gaming consoles. Imagine if they had pulled this kind of crap with all of the classic Atari, Nintendo, Sega, Sony, etc. systems 15 to 30 years ago. Nobody would be able to pick them up and play them now! Entire generations of classic games, from the original Missile Command to the very first Super Mario Bros., would be lost and unplayable to new generations of gamers. That's what is going to happen if they implement anti-second-hand market measures now-- in 15 to 30 years' time no one will be able to play the favorite games of today any longer, because the second-hand market for them will have ceased to exist.

I realize that today's game console makers and game development companies only care about what their bottom line is going to be during the next fiscal quarter, but I think that they will be hobbling interest in gaming in the long run if they kill the second-hand market. How will they get new generations of people hooked on gaming if they kill off the cheap "gateway drug" that is second-hand games and second-hand systems? Someone playing a second-hand game or console may love it so much that they will eventually shell out for the current-generation product. The game companies ought to think a little bit harder about that, as they may find that all of the cost cutting, price raising, anti-second-hand market, and anti-piracy tactics in the world will be for naught if they erode their own customer base in the process.

Beefy Fedora could use a dash of miracle whip


Re: Unified File System Layout

Thanks, Mangobrain-- that page that you linked to provided some interesting reading. Even more interesting was something that that page linked to, which was a comment written by Rob Landley discussing the original historical justification for the /bin vs. /usr/bin split, which can be found here:


Apparently, all of this trouble with /bin vs. /usr/bin started when Ken Thompson and Dennis Ritchie ported the original UNIX operating system (which they had developed on a DEC PDP-7 two years earlier) to a DEC PDP-11 in 1971. The PDP-11 had a pair of 1.5MB RK05 disk packs for storage. Originally, they had the operating system's file system stored entirely on just one of these RK05 disk packs, until it grew too large to fit. So they split the file system across a second disk pack, which, as you might have guessed, was mounted as "/usr". They called it "/usr" because they initially moved all of the user directories there; however, they soon replicated all of the OS directories, such as /bin, /sbin, /lib, /tmp, etc., over there as well because the first disk pack was out of space. Soon they were running out of space on the second disk pack too, so they acquired a third RK05. Once again they initially moved the user directories to the new disk pack, mounting it as "/home". This allowed the operating system to then fill the first two disk packs ("/" and "/usr"), with all of the operating system directories duplicated on both of them.

So if this comment by Rob Landley is correct, the only historical reason for having the file system split between "/bin" and "/usr/bin" and other such duplications was the small disk capacities of the PDP-11 that UNIX was developed on 40 years ago. The file-system split was never actually meant to serve all of the different purposes that some Linux distros use it for today. All of those rules about what should be included in "/" versus what should be included in "/usr" were apparently manufactured by third parties and standards bodies over the years, diverged between these various groups over time, and thus are now largely incompatible and distro-dependent.

Originally coming from the Microsoft Windows world, when I first started messing around with a version of Sun Solaris twelve years ago I couldn't make any sense of the layout of the UNIX file system: why some programs wanted to be installed in /opt, other programs wanted to be installed in /usr/bin, still other things wanted to be in /usr/sbin, why there was both a /bin and a /usr/bin, etc. Now it seems that the reason none of it made much sense was that there really *was* little sense behind it by that time. It all came about from a file-system layout that was split due to 1970's hardware constraints that made sense at the time yet no longer generally applied, and yet developers still kept the convention in place, inventing new and sometimes arbitrary reasons to justify installing their files in whichever one of those directories their personal philosophy considered to be the "Right One."

Also interesting is the fact that Fedora isn't the first UNIX-like operating system to attempt the "/"-"/usr" file system unification-- apparently Oracle did it first with Solaris 11. My Sun SPARC boxes are all still running Solaris 10, so I was unaware of that development in the new version. Hey, you learn something new every day! :)
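As a quick, hedged illustration of that unification (my own sketch, not from the linked article): on a distro that has completed the "/"-"/usr" merge, the old top-level directories typically survive only as compatibility symlinks into /usr, which a few lines of Python can reveal. The exact layout varies by distro, so treat the paths below as assumptions about a typical merged system.

```python
import os

# On a usr-merged system (e.g. recent Fedora), /bin, /sbin, and /lib
# are typically symlinks into /usr; on an unmerged system they are
# real directories that duplicate part of /usr's contents.
for path in ("/bin", "/sbin", "/lib"):
    if os.path.islink(path):
        print(f"{path} -> {os.path.realpath(path)} (merged)")
    elif os.path.isdir(path):
        print(f"{path} is a real directory (split layout)")
    else:
        print(f"{path} not present on this system")
```

On a merged system you would expect the "(merged)" lines; on an older split layout, real directories.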

Republicans shoot down proposed ban on Facebook login boss-snoop


I looked at the final vote results for the amendment, and just as I expected my local Congresswoman voted in the "noes" column. Had I known that this amendment was going to be voted on so quickly I would have mailed her a letter or at least sent her an e-mail informing her of my desire for her to support it. Not that it would have made a difference though-- I don't think my local Congresswoman and I see eye-to-eye on much of anything. I wrote her a letter once to support a pro-Net Neutrality bill many years ago, and she actually wrote me a letter back to explain to me how wrong I was. Naturally, for that reason and many others I have voted for her opponent every time that she has come up for reelection, but for whatever reason my distaste for how she votes seems to be in the minority opinion in my district, as she still gets reelected every time. *sigh*

Senators chime in on employers’ Facebook snooping


These two politicians may be "cashing into what seems popular" at the moment, as you put it, but I'm actually pretty glad that there are two U.S. Senators looking into this issue. It's about time that I hear about members of Congress doing something that is actually in my best interest for a change. In fact, I may have to write Sen. Chuck Schumer and Sen. Richard Blumenthal a letter of support to thank them for their efforts, if for nothing else than to help remind them that this cause is popular so that they keep on pursuing it.

Americans resort to padlocking their dumb meters


Radio transceivers unhealthy?

I have to wonder how many of the people who claim that they are getting sick from having these SmartMeter radio transceivers around also have mobile phones, 802.11b/g/n wireless routers, and cordless landline phones and yet have never experienced any so-called health problems with them.

Intel next-gen netbook chip to sport Ivy Bridge graphics


Re: Spec sounds unlikely

When I got a brand new Acer Aspire one D255E netbook the other week I decided to replace the copy of Windows 7 Starter Edition that came on it with Windows 8 Consumer Preview to try it out. The Windows 8 operating system seemed to run on the netbook just fine at first, at least until I tried to run a Metro app. When I did, I promptly got the message "This app can't open. The screen resolution is too low for this app to run," displayed on my screen.

As you have probably already guessed, the Acer Aspire one netbook that I got came with a 1024x600 native resolution screen, so none of the Metro apps would open due to that 1024x768 minimum resolution requirement that you mentioned. What blows my mind, however, is that Microsoft would choose 1024x768 as their minimum screen resolution in the first place when there are literally *millions* of netbooks out there with 1024x600 screens. I mean really, are those extra 168 pixels in the vertical dimension so important that they are worth alienating the literally millions of netbook PCs that could otherwise run Windows 8 just fine?
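To put that complaint in concrete terms, here is a trivial sketch of the check that the Metro runtime effectively performs (my own illustration using the 1024x768 minimum from the error message-- not Microsoft's actual code or API):

```python
METRO_MIN = (1024, 768)   # minimum resolution reported by the error message

def metro_can_run(width, height, minimum=METRO_MIN):
    """Return True if a screen meets the Metro app minimum resolution."""
    return width >= minimum[0] and height >= minimum[1]

netbook = (1024, 600)     # typical netbook panel
print(metro_can_run(*netbook))                   # False
print(METRO_MIN[1] - netbook[1], "rows short")   # 168 rows short
```

The netbook fails the check by exactly those 168 vertical rows, which is why the app refuses to launch rather than, say, scrolling.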

With all of the mass-media scare artists out there wildly proclaiming that "the PC is dying," you would think that Microsoft would want Windows 8 to be able to run on as much hardware as possible, including the Acer netbook that I bought two weeks ago. Or does Microsoft expect that all netbooks should just be stuck with Windows 7 Starter edition forever? Sure, netbooks are supposedly "dying" as well, but according to figures from the International Data Corporation, a "global provider of market intelligence, advisory services, and events for the information technology, telecommunications and consumer technology markets," over *4-million* netbooks were sold in 2011 in the United States alone. So yes, netbook sales are declining from their 2009 levels, but there are still a heck of a lot of them being sold-- more than I would think Microsoft would be willing to just abandon like this.

For gosh sakes Microsoft-- either make the minimum resolution for Metro apps 1024x600, or at the very least allow those Metro apps to run on a 1024x600 screen with a vertical scrollbar so that I can scroll down to see the unseen 168 pixels at the bottom of the screen. Don't make the Metro apps not work at all, or force me to do a registry hack to enable blurry and distorted "display down scaling" just to see them! Having no Metro apps in Windows 8 just makes Windows 8 into "Windows 7 Pointlessly Annoying Edition!"

Newt Gingrich wants Moon to be 51st US state


I for one was extremely upset by the retirement of the U.S. Space Shuttle without a viable replacement reusable orbiter already developed, tested, and sitting on a Cape Canaveral launch pad ready to go, but there is a big difference between rallying for the restarting of the U.S. manned space program and *this.* I am all for being optimistic, but how can Newt possibly think that the U.S. could even get to the moon in 8 years, let alone have a permanent base there, after NASA has already been slashed, burned, and gutted so badly and our national budget deficit is so high that we have no money left to fund such an endeavor? Private enterprises aren't going to fund such a venture unless there is some serious money to be made to make up for all of the risk involved, and I seriously doubt that the materials that the moon happens to be made of are worth that kind of investment and uncertainty from a business standpoint. Either Newt is completely out of his mind, or he is really trying to blow some serious smoke up the asses of out-of-work Florida voters.

As an American, I am pretty embarrassed by both Newt and the impression that he is giving to the rest of the world right now with these kind of statements.