* Posts by RLWatkins

229 posts • joined 14 Feb 2013


Photonic processor can classify millions of images faster than you can blink


Useful, but let's don't work ourselves into a frenzy

If this is what I think it is, an optical version of a neural net built from a block of glass fibres, it has to be hard-wired.

That means it serves one specific purpose. At low levels this isn't all that bad: we have hardwired edge and area detection in our own eyes, providing additional information derived from a field of pixels which is useful for recognizing what we see.

Scaling that up won't be one of those square-of-the-number-of-pixels things since that sort of recognition depends on nearby pixels rather than correlating information from distant parts of the visual field.
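That scaling argument can be sketched in a few lines. This is a hypothetical toy detector, not the optical device itself: because each output value depends only on a fixed local neighbourhood, the work grows linearly with the number of pixels, not with the square of it.

```python
import numpy as np

def edge_magnitude(img):
    """Hard-wired local edge detector: absolute differences of
    neighbouring pixels. Each output depends only on a fixed
    neighbourhood, so total cost is O(number of pixels)."""
    gx = np.abs(np.diff(img, axis=1))  # horizontal gradient
    gy = np.abs(np.diff(img, axis=0))  # vertical gradient
    return gx[:-1, :] + gy[:, :-1]     # combine on the common grid

img = np.zeros((8, 8))
img[:, 4:] = 1.0                 # a vertical step edge
edges = edge_magnitude(img)
assert edges[:, 3].sum() > 0     # response at the step
assert edges[:, 0].sum() == 0    # silence in the flat region
```

Doubling the image area simply doubles the number of neighbourhood operations, which is why this kind of front-end processing is a natural fit for fixed hardware.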

So far, so good.

The problem comes with hard-wiring the higher levels of recognition.

You want to recognize numbers and letters? Roman letters? Cyrillic? Arabic? Telugu?

You want to recognize faces? Whose faces? Maybe worth hard-wiring to recognize specific features, but beyond that not so much.

So at some point someone has either to create a *programmable* optical neural net to sit atop this device, or to use the device to enhance the input to more conventional software.

Sounds useful, but this isn't the be-all and end-all of image recognition. Recognize it for what it is, to wit a useful step forward, but let's don't get all dizzy over it.

US Space Force unit to monitor region beyond Earth's geosynchronous orbit


Re: For those [who laughed at Trump]

We laughed at Trump, or more accurately winced, because he "created" a name and then did *nothing else*, save for applying it to existing, ongoing activities which were not under his purview. He then showed no further interest in the matter, which has been typical of him in the 40-odd years he's been on my radar.

Nothing wrong with having a "Space Force". Lots wrong with giving Trump credit for it.

New Chinese exascale supercomputer runs 'brain-scale AI'


"Brain-scale AI"

Remember: when you see that phrase, it's time to go read something else.

The hype never ends.

CompSci boffins claim they can recreate missing lines in log files


No, no they can't.

This reminds me of the image enhancement we see on TV: pure BS. Once one throws information away, one can't then conjure it back up from thin air.

That's the long and short of it. Everything else is just handwaving.

'Automate or die!' Gartner reckons most biz apps will be developed via low-code by the people who use them


This is hilarious.

Gartner conflating "cloud" with "AI" and saying that "it will write our code in the future" is typical.

And it's ignorant. And it's sad.

But their principal market consists of businessmen, who want to commoditize programming, just as they've tried to commoditize engineering, medicine, etc. They love to hear this stuff, even though the results have always been disastrous.

If they'd told businessmen, "You'll have to actually pay people to do critical and intellectually demanding professional work, and learn to live with that," they'd have lost most of their audience.

Billion-dollar US broadband bonanza awaits Biden's blessing – what you need to know


We already paid for that...

... several times over in fact. US telcos have been collecting a "broadband tax" for a couple of decades, amounting to over a half-trillion dollars, to fulfil their commitment to "bring broadband to all Americans". They convinced the head of the FCC, an AT&T lobbyist, that GSM was "broadband" and that they therefore had finished the job, without actually doing it at all.

So... we're going to pay for that all over again? How many times, I wonder, before we actually get it?

Microsoft vows to make its Surface laptops, Xbox kit easier to fix by 2022


I have a broken Surface...

... brought to me by a neighbor who asked, "Why has the screen detached itself from the tablet? Can you fix it?"

Mind you, the screen is still attached, but it is hovering about 6mm above the rest of the unit. Why? The battery blew up like a balloon. And no, I can't fix it. It is held together entirely by glue.

The only reason the screen didn't crack is that the battery heated up so much while failing, or inflating, that it melted some of that glue. Now that it's cooled off? The glue has solidified again so there is no way to get the thing apart to fix it.

Do you suppose they'll replace all these units? Not on your life.

You know the saying: Hell is full, and guilt-ridden engineers are walking the Earth.

Microsoft's problem child, Windows 11, is here. Will you run it? Can you run it? Do you even WANT to run it?


More accurately, Microsoft has launched a new shell for the NT kernel today...

... along with some configuration changes intended to encourage users to purchase new hardware.

There is less here than meets the eye.

Honda to build reusable rockets to sling sats, but won't scare SpaceX


Look closely at that VTOL...

... and note that the center of lift is way, waaay behind the center of mass. Good thing Honda has much better engineers than they do artists.

All that said, they do have damn' good engineers. I expect them to take away a good chunk of other commercial lift companies' business.

And you know? That probably isn't a bad thing.

'Quantum computer algorithms are linear algebra, probabilities. This is not something that we do a good job of teaching our kids'


Arrrgh! When did we stop teaching science to science writers?

(1) The process using "linear algebra, probabilities" is quantum annealing. That isn't the only way quantum computers are built, although it is a very useful one. Quantum computer programs for, say, factoring use entirely different techniques.

(2) A quantum bit cannot store multiple arbitrary classical bits of information. Saying, "Oooh, ahh, we can store much more information here because it's quantum bits" represents a fundamental misunderstanding of the technology.

We may have made a mistake by calling these things "computers". It's as if we still called modern computers "difference engines". It limits our thinking so severely as to cause many people to completely miss the point.

Samsung is planning to reverse-engineer the human brain on to a chip


Ha ha ha ha ha ha....

... ha ha, ha ha, haha!

I'm so very sorry, but this is one of the stupidest and most blatant "Hey, let's all pay attention to ME!" press releases I've seen this year.

Not only do we not have the technology to do this, we don't even have the technology to design and manufacture any technology which might even remotely approach the necessary capability. Not even close.

How on earth can anyone take this seriously?

As Google sets burial date for legacy Chrome Extensions, fears for ad-blockers grow


I don't have an ad-blocker....

... but rather a HOSTS file which routes hundreds of ad servers to a blackhole address.

Works great. About to translate the thing to use on Linux and Android. I have to wonder how much of the OS that's going to kill, but I do have a couple of sacrificial devices to try it out on.
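For reference, entries in such a HOSTS file conventionally map each unwanted hostname to an unroutable address. The hostnames below are made-up placeholders, not real ad servers:

```
# Hypothetical ad-blocking HOSTS entries: name lookups for these
# hosts resolve to (or, going nowhere. ads.example-tracker.com telemetry.example-metrics.net banners.example-cdn.org
```

The same file format works on Windows, Linux, and (with root) Android, which is what makes the translation straightforward.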

Happy days.

Oh, and by the way, Hell is full and ad executives are walking the earth.

Astronomers detect burps of interstellar cannibal from 480 million light years away



So some stars engulf nearby white dwarfs intentionally?

I want to know more about this.

GitHub merges 'useless garbage' says Linus Torvalds as new NTFS support added to Linux kernel 5.15


Using git is like driving a car without a dashboard.

Sure, one can attach a vice-grip to the steering wheel stem to steer it. One can memorize the colors of the wires so one will know which ones to short out to start the car or turn on the windshield wipers. Without pedals one can operate the throttle by pulling a string. (More difficult to operate the brakes that way, but that's why dashless, pedalless cars have hand brakes.)

Sure, one can do all these things.

When we're all old we can tell our grandchildren, "When I was young, we didn't just have to walk to and from school uphill in the snow both ways while fighting off rabid bears, we also had to use git for source control, and we liked it." Bah!

30 years of Linux: OS was successful because of how it was licensed, says Red Hat


Re: Linux is not an OS

To those who voted this comment down:

An OS consists of a kernel (or executive), a shell, and a collection of system utilities.

This is the original definition of the term, which hasn't lost validity just because marketroids are unable to grasp the details of technology.

Linux-based OSs consist of a Linux kernel, one (or more) of a variety of shells, and a bunch of utilities mostly from GNU.

Samsung: We will remotely brick smart TVs looted from our warehouse


Samsung also disables handset cameras when you root the handset...

... and demands calendar and contact access to change the background image on the home screen. The litany of such offences grows with each passing month. It's so sad; I used to be a staunch proponent of their gear. Lately? It would have to be free... and even then I'd install a 3rd-party ROM before I used the damn' thing.

US boffins: We're close to fusion ignition in the lab – as seen in stars and thermonuclear weapons


A big pile of batteries...

... discharging in 1/10 nanosecond.

That's actually pretty impressive, although I'm unsure how useful it is.

Your Computer Is On Fire, but it will take much more than this book to put it out


This is hilarious.

Been writing software for fifty years, getting paid for it for about 45; I don't see a single flaw in this narrative. Needless to say, I'll be reading this one. Always did enjoy a freewheeling and humorous account of current history.

BTW, 'AI would be the "most profound technology" that humanity will ever develop' is a great prediction for the far future, about on par with predictions of reversing ageing, or of tapping vacuum potential for free energy. How very prescient. [yawn]

8 years ago another billionaire ploughed millions into space to harvest solar power and beam it back down to Earth


This is a stupid, stupid, stupid idea.

I know it's been said here already, but I'll say it again.

The people who are pushing this are doing so because it sounds so nifty. But given the cost of getting a big solar array into orbit, and going up there to do maintenance, repair or expansion on it, it is far cheaper just to put the damn' thing on the ground, where people actually need the power, and where the storage for nighttime or peak-load demand can be sited.

I cannot believe that people are even still discussing this boondoggle.

Chinese state media describes gaming as 'spiritual opium' that stunts education and destroys families


Granted, for a fraction of gamers they're right...

... in asserting that some seek the consequence-free world of gaming, which is in some respects like real life, but less difficult and less threatening.

I've met people like that.

But they constitute a tiny fraction of people who play video games. The rest of them are just fine.

Isn't it just like the CCP to pick out a corner case, then use it to condemn an entire population? You'd almost think they were Republicans.

Google: Linux kernel and its toolchains are underinvested by at least 100 engineers


Please stop calling computer programmers "engineers".

Notwithstanding everyone's deeply held desire to please Google by Talking the Google Talk, most programmers are not engineers.

Engineers know how to plan and cost jobs. Engineers identify and address points of failure, figure MTBFs. Engineers document the steps from problem to solution to implementation. Engineers must have a thorough understanding of the science which underpins their disciplines.

For the half-century or so that I've been writing code, I've met maybe three programmers who were trained as engineers, and maybe twice that many not so trained but who understand the principles and practices of engineering.

That's nine or ten out of many hundreds. Programmers are, by and large, not engineers.

LOL ;-) UK govt 2 pay £39m 4 txt msgs 4 less thn 2 yrs


Re: Seems like a good contract to have

Simple enough. There are myriad companies which, given a document and a mailing list, will print copies of the document, stuff them into envelopes, add postage, and mail them.

I've worked for one. It isn't expensive, especially at large volume.

Happy 60th, Sinclair Radionics: We'll remember you for your revolutionary calculators and crap watches


I missed, and miss, the QL.

And that was a shame. At the time, the 68000 line of CPUs could emulate a PC faster than a PC would run. I saw a lot of potential, but it just didn't catch on.

These days one can use a handset to do what once required a desktop computer: mirror it to a TV, pair a Bluetooth keyboard and mouse, and you can do word processing or coding or watch movies in a hotel somewhere on I-40.

The QL would have given us similar capability, at least similar for its time, back in the early 1980s, only it didn't catch on so the software and storage didn't develop.

It would have been a game-changer. Qué lástima.

Windows 11: What we like and don't like about Microsoft's operating system so far


"old Windows"

Old Windows conformed to a set of GUI guidelines called Common User Access (CUA), which built on ideas in development since Xerox first invented the GUI, back in the early 1970s.

I'm a big fan of CUA, because a CUA-compliant GUI makes it immediately obvious how to operate the shell and any CUA-compliant software which it might run.

KDE, for example, is still CUA-compliant. My mom, a Windows user, and 79 at the time, sat down in front of a KDE laptop and was able to just start doing what she needed to do with no confusion and no wasted time.

I'm not a big fan of "modernization" which entails eliminating UI cues and making the UI more cryptic. The worst example I can think of was the Win8 "feature" of opening the "start panel" by moving the mouse cursor to one corner of the screen.

I'll refrain from listing the hundred-odd other problems of the "modern, aesthetic UI". None of this new stuff is, as an old math teacher of mine once said, "intuitively obvious".

NEC to move its IT into Azure and give staff – all 110,000 of ’em – a cloudy Windows desktop


What is the opposite of "kaizen" in Japanese?

Azure is after all the world leader... in downtime.

Google killed desktop Drive and replaced it with two apps. Now it’s killing those, and Drive for desktop is returning


About that "guided flow"....

I like it. The name sounds modern, even cutting edge. As with "synergy" or "cloud malware", I have a lot of trouble deciding exactly WTF it is they're actually describing. But by damn', it sounds cool. I'll just bet that nobody can do without it.

Seriously, about the only use I have for products such as Google Drive is as a place to park friends' photos so they don't have to pay for the Web space to store them. Aside from that, their "productivity"-related offerings mostly serve as a stop-gap until people can set up real mail servers, NAS, etc.

They're OK, they work well enough, but the continuous state of flux in which they exist costs them a few bonus points.

Boffins find an 'actionable clock' hiding in your blood, ticking away to your death


"Actionable clock"?

In business jargon, "actionable" means... pretty much whatever the speaker wants it to mean.

What does it mean in medical jargon? That doctors can bill for it? That isn't entirely clear. (Yes, I'm in the US.)

Pull your Western Digital My Book Live NAS off the internet now if you value your files


Perhaps I misunderstood the original purpose of this Internet-accessible NAS...

... but it was my impression that it was for *sharing* data, similarly to Dropbox, and not for primary, critical storage, so that its getting wiped would be more of a minor inconvenience until it was refreshed from primary sources.

What you need to know about Microsoft Windows 11: It will run Android apps


KDE? Android apps?

Well, at least they didn't copy GNOME 3 again, but instead this time copied a shell which people actually like.

As for Android apps, it's called an "emulator". I've been running them on Win7, started... what? Eight or nine years ago? Always worked just fine.

Chinese web giant Baidu unveils Level 4 robo-taxi that costs $75k to make


I'll believe this when I see it actually happen.

Yet another press release from a group of people famous for disseminating rainbow-and-unicorn colored vapor whenever they need to deflect attention from something else... and five years later when you ask them, "Well where is {whatever}?" they reply, "Oh, we didn't do that after all."

Seriously, hadn't they already done this, between mastering genetic engineering and creating the first fully successful magnetically contained fusion reactor?

Price-capped broadband on hold for New York State after judge rules telcos would 'suffer unrecoverable losses'


In the town where my mom lives, someone proposed community broadband WiFi.

Part of the rationale was that if everyone had access to the Internet, they could eliminate a lot of paperwork and personal contact with town employees.

They did a hard estimate of how much it would cost per subscriber, to wit about half of what their two "competing" ISPs were charging, even taking into account that some people too poor to pay taxes would also be using it.

Then someone, whose identity remains obscure to this day, launched a public relations campaign designed to outrage the citizens that poor people would get, for free, what they were paying for with their taxes.

It worked. Council shelved the plan.

Over the next two years, broadband charges roughly doubled.

'Universal Processor' startup Tachyum unveils full-system Prodigy emulator ahead of sampling later this year


"Human brain scale AI."

The term "artificial intelligence", well understood by computer-science types, has been used for years to mislead laymen into believing that we're within a hair's breadth of building machines which are superior to human beings in understanding and creativity.

It's a lie.

We're not even close to envisioning the architecture of a machine which can out-think a person, let alone building one.

I call BS on "human brain scale AI", and note that there's still room in Transmeta's grave for a few more IT grifters.

Google says its artificial intelligence is faster and better than humans at laying out chips for artificial intelligence


Remember Microsoft's "Dot Net" trademark...

... which around 2000 or so they liked so much that they attached it to every last product, to the point where it became utterly meaningless?

"You can use your Dot Net menu to run your Dot Net report from your Dot Net database on your Dot Net server to view on your Dot Net system with your Dot Net spreadsheet...." (While drinking your Dot Net coffee at your Dot Net desk, etc.)

The popular press and their symbiotes, the marketroids, have done the same thing to the term "artificial intelligence". "AI" never actually meant very much, but these days it means nothing other than "Let us remind you to pay more attention to what we're selling here."

Graph databases to map AI in massive exercise in meta-understanding



I'm sorry, but a headline like that reminds me of listening to a glib twelve-year-old who knows nothing about math or physics trying to explain string theory to a crowd of his high-school dropout relatives. It's catchy, but one struggles to find any actual meaning in it.

Google to revive RSS support in Chrome for Android



Why would they do that? Why would anyone?

Microsoft loves Linux – as in, it loves Linux users running Linux desktop apps on Windows PCs


Tried it. Not so great.

Whatever kernel they're using runs OK on Windows, but anything which requires X falls down and dies pretty often. Tried running the Evolution client on an otherwise vanilla, bespoke Win 10 Pro box. Not so great.

You're better off running something like an LTS SUSE release, and virtualizing Windows in VBox. That works great.

Greenland's elections just bolstered China's tech world domination plan


"a happy strategic accident that it has never used to make life hard for other nations"


Years ago they put lanthanide mining out of business in two or three other countries by selling theirs at or below cost, "cost" in China being less than elsewhere because they enforce no environmental or labor protection laws.

And while this may have changed in the interim, they then put a policy in place that they would not export them as raw materials, but would sell to any and all as much finished product as anyone wanted to buy.

I'd hoped the US would counter with its own, similar industrial policy, as the stuff is a goldmine. But our leadership happily caved.

Crikey. It's been in the news over the years. It isn't much of a secret.

Yep, the 'Who owns Linux?' case is back from the dead



For the record, when Linus first released the Linux operating system he said it was a PC port of Minix, which is in fact what it looked like.

Does anyone know whether IBM or SCO owned, or claimed to own, Minix? Sounds to me as if it was a clean-room implementation of a POSIX-compliant OS, which means neither has a horse in that race whatever they might claim in front of a judge.

China's top chip company speaks of massive silicon shortage felt around the globe


I can't help but laugh...

... at the phrase "silicon shortage". I know what they're saying, sure, but still, silicon is the second most common element in the Earth's crust.

Boffins revisit the Antikythera Mechanism and assert it’s no longer Greek to them


Re: For those who are interested in such mechanisms

Found some:




Re: For those who are interested in such mechanisms

Piling on epicycles to duplicate complex curves is, mathematically, similar to using Fourier to fit a collection of sine curves to a function.

I've seen epicycle demos which, with increasing accuracy, duplicated what looked like closed Hilbert curves. They didn't look like much until you got five or six epicycles, but with enough of them the results were pretty surprising.

One shouldn't wonder that some ancient mathematically oriented tinkerer might stumble across the principle.
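The epicycle/Fourier analogy can be sketched numerically. This is a hypothetical illustration, not the Antikythera maths itself: treat a closed curve as complex samples, and keep only the n largest DFT terms, each of which corresponds to one epicycle.

```python
import numpy as np

def epicycle_approx(points, n_epicycles):
    """Approximate a closed curve (complex samples) using only the
    n largest-magnitude DFT terms -- i.e. n 'epicycles'."""
    coeffs = np.fft.fft(points) / len(points)
    idx = np.argsort(np.abs(coeffs))[::-1][:n_epicycles]
    mask = np.zeros_like(coeffs)
    mask[idx] = coeffs[idx]                 # keep the n biggest circles
    return np.fft.ifft(mask * len(points))  # resynthesize the curve

# A square-ish closed curve as complex points.
t = np.linspace(0, 2 * np.pi, 256, endpoint=False)
square = np.sign(np.cos(t)) + 1j * np.sign(np.sin(t))

err5 = np.linalg.norm(epicycle_approx(square, 5) - square)
err50 = np.linalg.norm(epicycle_approx(square, 50) - square)
assert err50 < err5  # more epicycles, better fit
```

Five epicycles give a wobbly blob; fifty give a recognizable square, which matches the "didn't look like much until five or six" experience above.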

Starlink's latent China crisis could spark a whole new world of warcraft


You don't really shoot them "down". You shoot them, but they turn into smaller pieces and stay in orbit.

Likely someone *would* consider that an act of war, as the smaller pieces will themselves destroy lots of other satellites.

So it appears some of you really don't want us to use the word 'hacker' when we really mean 'criminal'


A car thief who called himself an "automotive engineer"...

... would be laughed at by all and sundry, with some justification.

So I'm baffled that the typical script-kiddie who calls himself a "hacker" passes muster by that same press.

Stoll, the guy who tracked down some of the first computer criminals, called them "crackers". Let's just stick with that.

FortressIQ just comes out and says it: To really understand business processes, feed your staff's screen activity to an AI


No, not "... an AI...."

"To really understand business processes, feed your staff's screen activity to *OUR* AI."

There. Just needed a bit of proofreading.

Supermicro spy chips, the sequel: It really, really happened, and with bad BIOS and more, insists Bloomberg


"If you can think of it, there are bad guys already doing it."

I realize I'm quoting fiction there, but save for the fantasy genres, good fiction stands on its plausibility.

And since we (US) have done it, why assume, or worse, hope, that China wouldn't? After all, they learned the trick from gaffed Cisco routers, among other things, then they learned how to use, and then used, that very backdoor themselves.

Like Apple Computer Co. so frequently says, "Everyone wants to be us." And we don't stint on the lessons.

Machine-learning model creates creepiest Doctor Who images yet – by scanning the brain of a super fan



The brain's V1 area is pretty much a map of the person's visual field. There is some distortion, which assists the brain in compensating for rotation of the field, but otherwise there's a part of the brain which reproduces a picture of what the eyes see.

Being able to read images from V1 is ongoing research, and this is an outgrowth of that.

If you can put a MEG helmet on a subject, you can also point a camera at what they're looking at, so it's not like it's extracting secrets from the human brain. It can't read memories, it can't read visualisations, it can't read non-visual thoughts.

Not much scary about it. Pretty nifty, actually. May wind up helping some blind people.


Re: Someone with access to an MRI machine has misunderstood machine learning again...

It won't be able to *predict* anything, and it can't read minds. The brain's V1 area is a distorted but otherwise pretty much 1:1 map of the person's visual field. Nothing odd about being able to extract images from it, and people are attempting just that. This is an outgrowth of that research.

Workflow biz ServiceNow ServiceWows itself by beating Q4 guidance and posting hefty top line growth of 31% for FY2020


Remember when Larry Ellison sold Oracle...

... not to technical staff, but to executives, by repeating the meaningless mantra "Oracle puts all your enterprise's information at your fingertips"?

The tactic still works.

(Recall also that the original Oracle was a temple where one went to seek answers from the gods.)

The UK's first industrial contribution to the ISS: An end to sneakernet for spacefarers


"Cape Canaveral Space Force Station"?!

Most of us knew when the President (TM) started bafflegabbing about "creating a space force" that he was just engaged in his usual sort of con job.

A year later, when they'd done nothing about it, they started adding the words "Space Force" to existing programs and facilities in a classic Jesus Is Coming, Look Busy move.

I'm ashamed to hear the term. I'm even more ashamed to hear it attached to the venerable Cape Canaveral.

GitLab removes its 'starter' tier: Users must either pay 5x more or lose features


Re: Self host git

Most of us don't *need* most of that.

And for those who do, there are a host of free solutions which can run, self-hosted, right alongside a self-hosted git server.



Biting the hand that feeds IT © 1998–2022