> protection against hurting myself
[grin] Yeah. At full tilt this CPU consumes a bit more than 1/3 horsepower worth of electricity.
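For scale, a quick back-of-the-envelope (taking one mechanical horsepower as roughly 746 W, so a third of one is about 250 W):

```python
# Rough conversion: how many watts is "a bit more than 1/3 horsepower"?
HP_WATTS = 745.7            # one mechanical horsepower, in watts

cpu_watts = HP_WATTS / 3    # the CPU's claimed draw, as a lower bound
print(round(cpu_watts))     # prints 249, i.e. roughly a 250 W part
```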
If I want one of these, I want an experienced technician to build it.
Sure, ARM makes good chips, but judging by their specs it's difficult to see how they'll "dominate" any "irrelevant" AMD chips, which presently outperform them.
Moreover, anyone who thinks an Apple handset or tablet outperforms "most laptops" hasn't seen a modern laptop in a while.
Tell us the truth: is this post satire?
Mono didn't "migrate to .net core".
Microsoft doesn't own Mono. Microsoft appears to have cloned Mono to make .NET Core.
They've been trying to stuff that genie back into the lamp ever since Novell started the Mono project, and having failed to do so they decided that *claiming* to have done so is just as good.
"Trump continues to reason", inaccurate though its literal meaning may be, is convenient shorthand for what is actually happening: Trump makes the theatrical but unsubstantive decisions, while the substantive ones are made by whoever is actually in charge, passed along by his aides as "suggestions". So while mindless hand-puppets may get things right now and then, there's no real way of learning how or why the decisions were actually made. Don't you love transparency in government.
Actual intelligence would recognize the image and fill in the details from memory.
However, an actual intelligence seeing a face which it couldn't recognize, never having seen it before, would do no better than this.
As much as we'd love to extract from an image details which just aren't there, and aren't anywhere else, it can't be done.
Another promise, one which was never believable to begin with, broken.
Stop connecting all your damn' "Things" to the Internet.
Have yourself a network-of-things instead, monitor it with your own computer, and if you need to ask your computer what or how they're doing, then do that using secure communications via the Internet.
Try to remember that the people who decided we should call networks-of-things "The Internet of Things" are all data-collection companies who'd *love* for you to connect all your things to the Internet so they can snoop on them.
Don't let them get away with it. Hell is already full....
In a sense that's true, or the reverse is true, as courts in the US have ruled, over and over, the same thing they ruled in the EU: you can copyright code, but you can't copyright behavior.
I'm baffled by this.
What SAS Institute has done is to rest on their collective laurels for a couple of decades, then complain when someone else did what they'd done.
What makes it even more puzzling is that there are open-source stats packages which *also* do the same thing, although they are not quite so polished.
You know the saying: Hell is full, and the dead are walking Research Triangle Park.
... are as sure a sign as any that huge corporations are not the ones which create jobs.
They're selling into a saturated market, so the only way for them to "grow", to continually increase profits, is to shed employees, like IBM, or to get people to pay them again and again for things they already bought, like Microsoft, or both, like US telecoms.
... when Starlink and its (now defunct) competitors started launching what will eventually be hundreds of satellites?
And what made the US FCC competent to manage policy regarding space debris? (Or at the moment, to manage any damn' thing at all?)
And did anyone bother asking any of the other dozen or so countries which also have equipment in orbit?
And... is Hell full and the dead are walking the Earth? Seems like it.
... in which one feature of their world was "junklight", the reflection of sunlight from billions of pieces of orbiting junk, which brightened the night sky.
Then again, that one was a million light-years from the nearest star, so they didn't do a lot of astronomy. Here things are different.
The GUIs for Win95, as well as earlier versions of Windows, the Mac and the Amiga, were based on something called the Common User Access standard, which has been around for more than three decades.
It was devised specifically to make it obvious to the user how to operate a graphical user interface. It evolved from part of IBM's System Application Architecture standard, which they began to devise when Xerox first created a GUI.
People worked on it and refined it for decades, with the end-result that it was possible to figure out at a glance how to operate GUI-controlled software.
But you're in favor of abandoning decades of user-experience work in favor of "slick, modern" incomprehensible user interface designs?
Well, everybody has the right to be wrong. Exercise it proudly.
I like the hat. It's quaint.
As for the rest, yeah, Milstar is due an overhaul. Good.
And as for Space Force, well sure, it's a white elephant, and discredited by its very origin, but I suppose one can keep the notion alive by slapping its name onto something which was already planned for years and would have been deployed anyway.
Xamarin hasn't been part of Microsoft for 20 years; it hasn't even existed for 20 years. It started about a decade ago with a port of Mono for Android, and some Visual Studio plugins to allow coding on a Wintel dev box.
They did *act* like Microsoft: When they released the second version of their dev kit they disabled the first one, stating that one had to download the new one in order to keep using it, then when one fired it up it announced that one had to purchase a second license.
And Microsoft did finally buy Xamarin, but that was only a few years ago. More on that below....
I know all this. I was there. I used Mono for Android at the time for a few projects. I watched all this happen.
Next, Mono doesn't belong to Xamarin, and it doesn't belong to Microsoft. When the EU refused to continue to use Microsoft's programming products unless they made C# an open standard, Novell promptly assembled a team and made an open-source version: Mono.
Microsoft would *like* for you to believe that when they bought Xamarin they bought Mono, indeed they tried to buy Novell when it became evident that the project would succeed. They've been trying to stuff that genie back into the lamp since about 2005.
Finally, Xamarin's Mono for Android rapidly evolved into a bloated and nearly useless programming platform. They may since have fixed that, indeed it looks as if they're now using .NET Core instead of Mono, but they lost a lot of traction in the process.
Someone has been treating Microsoft press releases as if they were actual, recorded history, which they are not. Sad, that.
... it might actually be a good thing if Xerox bought the mortal remains of the Hewlett Packard corporation.
Xerox used to run the premier CS / IT research lab, PARC, the equal in my estimation of the late and lamented Bell Labs. Hewlett Packard used to be one of the, if not *the*, premier electrical engineering outfit in the world.
If one phoenix could arise out of the ashes of two, the world might be at least a tiny bit better off.
... but I am thoroughly sick of hearing about the "Internet of things". An internetwork *connects other networks* to one another. If your things talk to each other, it is a *network of things*.
And 99.9% of the time, despite what various rapacious data harvest... erm, vendors keep telling us, there is no good reason, none, to connect your network of things to the Internet, and a whole compendium of reasons not to.
I can't even believe that people are still debating this. This is like installing a video camera in your bedroom, a monitor on a light pole on the nearest street-corner, then debating about how to maintain your privacy.
Is hell full again? Are those the dead I see, walking the streets? And these people vote....
... in its usual way of generating profits. After all, although in technical terms it is now the world's largest corporation, in fundamental terms the company is way, way, waaay down the list.
Why is this, one might ask? Well, despite its very small market share Apple reaps an enormous profit from the relatively few devices it does sell. (Read: they're incredibly overpriced.)
And how might COVID be profitable for Apple? The illusion of scarcity it creates stiffens the demand curve, allowing them to raise prices even further on what are already fairly cheap-to-make devices with already large price tags.
It'll be interesting to see how this works out for them.
Mom is running Win7 w/ good anti-virus... until I set up a Suse box for her.
She had a Win8 VM on her Win7 box so she could show her friends how to do things with their new computers, doesn't regard Win10 as worth the learning curve. She's 85 and just can no longer be bothered with Microsoft's we-changed-the-UI-beyond-recognition-to-encourage-you-to-buy-it-again nonsense.
I'm 62, have been programming for fifty years, and am starting to feel the same way. I'm virtualizing my Win7 dev box and Win10 test box for customer-related work.
For day-to-day use? Who needs them.
This goes back to Windows for Workgroups, a.k.a. Win 3.11.
A time bomb installed in the software killed its network functionality in late 1995, some time after the intended release date of Win95, but before the actual (delayed) release date of Win95. Tweaking MemMaker disabled it and allowed the (crude) network stack in Win 3.11 to start working again.
How do I know? I found it myself, and worked out a workaround (a MemMaker change), which I then deployed for several customers whose networks had mysteriously all failed on the same day. Others found pretty much the same thing, confirmation that we weren't having a bad dream.
My (now air-gapped) Win7 dev box hasn't had this problem. I wonder if not installing last month's patches has anything to do with that?
This is a principle to which Microsoft has adhered since the 1990s. Surely it is no surprise.
We learned this back in the Windows 3 days, when many a patch diskette would break WordPerfect and Quattro. (Yes, Windows "updates" once were distributed on diskette.) I had several customers switch over to Office, which at the time was utter crap, just to avoid that problem.
They learned that it worked, and that they could get away with it. Every time someone sued them for it they'd just switch tactics.
Of course the last few patches are going to bork Microsoft products which they want you to replace. It happens often, and has for years.
... but didn't get much credit for it. Nerds invented the electronics industry, and made out pretty well. Nerds invented information technology, and a bunch of them wound up taking money to the bank in wheelbarrows. Nerds are busy inventing the genetic and biomedical engineering industry, and stand to profit tidily.
Nerds rule. Psychologists just need to get over that, and make the move from denial to acceptance.
We have an American Medical Association. We have an American Bar Association. We have *five* huge associations serving the interests of the advertising business? Five?
And they're telling legislators that their interests are more important than those of their constituents?
Try to recall, amidst all the hoopla surrounding the irreproducibility of psychological experiments, the one which everyone has been able to reproduce time and again: showing that if one lies to people often enough then even the fairly intelligent ones start to treat those lies as facts.
This is how propa... erm, advertising works.
And we have a veritable army of people defending the rights of those who do this.
And they're telling legislators that their need to invade our privacy so they can do so more effectively is more important than their constituents' desire to curtail this practice?
Wow. Hell is full, etc.
Oh, by the way, advertising is a *business*, not an industry. Industries *make* things. Industries make *things*. Let's give people dictionaries this holiday season.
The only thing that any government agency can accomplish using encryption "back-doors" is mass surveillance.
If one can convince a judge or magistrate that something shady is going on, they will issue an order for a wiretap. And you know something? Even with all this modern machinery, the modern version of a wiretap can capture communications at its source, prior to encryption.
No, the more government agencies ask for weak encryption, the more they strengthen the case that they should not be allowed to demand it. Its only use case is unlawful in most civilized states.
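A toy sketch of that point (invented names, a throwaway XOR "cipher", not any real product or protocol): a tap at the endpoint records the message before encryption ever happens, so an intercept at the source needs no weakness in the cipher at all.

```python
# Toy illustration only: an endpoint tap sees plaintext pre-encryption,
# so a lawful intercept at the source needs no backdoor in the cipher.

def xor_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Throwaway XOR cipher, for illustration; not real cryptography."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))

captured = []  # what the tap at the endpoint collects

def send(message: bytes, key: bytes) -> bytes:
    captured.append(message)           # tap: plaintext, before encryption
    return xor_encrypt(message, key)   # only ciphertext goes on the wire

key = bytes(range(1, 17))              # fixed demo key; never do this
wire = send(b"meet at noon", key)

assert wire != b"meet at noon"         # the wire carries ciphertext...
assert captured[0] == b"meet at noon"  # ...but the tap already has it all
```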
Did some nitwit executive stumble while trying to remember the word "silo"? And then, pulling something directly out of his a**, say to himself, "You know, that sounds pretty good. Impressive. Visual. Compelling. I think I'll keep it."
This is how the Marketroid Dialect of IT terminology evolves.
Reviving Windows RT? Reviving XAML? Continuing to pretend that Mono is a Microsoft product?
This is, what? The third, fourth iteration of Let's Run Windows On Something Other Than X86/X64-instruction set processors? When, do you suppose, will they abandon it this time?
XAML was a knockoff of, or maybe the bastard child of, GTK and Android's XML description language for UI assets, and it does not and never has allowed us lowly programmers to "converge" Web and desktop UI design. What, exactly, is it "converging" with these days?
Finally, Microsoft's attitude to the New Dawn Of the Multi-Platform CLR, to which Novell and the Mono team beat them by twelve years or so, seems to be "If we take credit often enough, people will begin to believe." Hey, it worked for Hermann Göring and Karl Rove, so why not?
Oh, and have I mentioned that Hell is full and the dead, now imbued with Breathless Hype, are walking the Earth? Yeah? OK then. Still, we need to be reminded now and then.
Another purpose for a $1,500 handset would be this:
If I could put it into a dock connected to a keyboard, mouse, two or three screens, Ethernet, a scanner and a couple of printers, use it as a fully functional desktop computer, then un-dock it, put it in my pocket and go, then it would be worth the price.
But it just doesn't have enough horsepower to rival a stock desktop box. So... no.
I don't know what prompted programmers, or worse "developers", to start calling themselves engineers. Pomposity, maybe? Insecurity? A desire to attain a higher station in life without expending the necessary effort? Who knows.
Out of all the hundreds of software people I've met in a half-century of programming, maybe three have a clue what engineering is about. The rest couldn't cost a job, identify a point of failure, or document the chain of decisions from problem to solution to design if their lives depended upon it.
It's OK if GitLab doesn't hire engineers from China. It doesn't hire engineers to begin with.
Marketing efforts notwithstanding, "hologram" isn't a synonym for "three-dimensional presentation" any more than "motorcycle" is a synonym for "any machine with wheels and a motor". I know that it sounds cool and futuristic, which marketroids love, but it means something specific.
> Maybe don't walk in front of a moving car, Tesla or not, anyway?
In a thread on this site about a self-driving car hitting a pedestrian I suggested that if she'd stepped right into the path of a nearby car driven by a *person* she'd have still gotten hit.
Got downvoted into oblivion. This is a rough neighborhood.
On the contrary, it is most suitable.
Given that we can put 1TB on a USB stick, I like the idea of carrying my computing environment in my pocket and being able to plug it into and boot it up on just about any box, anywhere, that I happen to encounter, then shut it down and unplug it from the CPU, keyboard and screen, and carry it away with me leaving no trace behind.
The more I think of it, the more I like the idea.
Biting the hand that feeds IT © 1998–2020