Crypto "industry"?
Industries *make* things. Industries make *things*.
There are other businesses which are not industries, but instead provide services.
Then there's crypto "currency". Not sure this is even a business.
This is the lesson of Facebook: start the cheating on a small scale, and ramp it up as you accumulate capital.
Holmes got greedy, and ramped up the cheating more quickly than she could build defenses against the victims.
Hate thieves, but we've let the US turn into a haven for them. The big ones are never called to account.
Whole problem with the new types of chatbots is that their training sets include those very sources.
Kind of a vicious circle, that.
Have you ever seen video feedback, from pointing a video camera at its monitor? It's pretty cool. Not so sure how cool this sort of semiotic feedback will be, though. Cryptic, yeah, but still garbage.
I've seen strings of words which are syntactic nonsense.
I've seen strings of words that make syntactically correct sentences, but semantically are nonsense.
I've seen strings of words that make sense semantically, but in reference to the real world are nonsense.
Tools like ChatGPT have reached that third level. But because they're constructing text from things people have already said, without regard to whether those people *knew what they were talking about*, a careful reader can still tell.
But remember that park ranger who said, "There is considerable overlap between the intelligence of the dumbest tourists and the smartest bears"? The same thing is true about the dumbest humans and the smartest chatbots.
This is ridiculous. If they are able to do that, anyone can make a copy of any copyrighted work, change some part of it, then claim that the rights asserted by the original copyright holder don't apply.
How on earth they decided to mix in trademark violations in what was an otherwise open-and-shut license dispute is beyond me. That seems to be confusing the issue for no good reason.
That's hilarious.
My Win10 box is a constant source of update-borne grief. A colleague observed, "I never had any problems. You must be doing something wrong." Then he discovered why: his company had never sent any patch updates to his computer.
What I did wrong? Allowing patch updates.
Makes sense, sort of.
They did indeed.
They sold lanthanides for less than cost for long enough to kill lanthanide mining in the US and Australia, which have large deposits. Then they announced that they would not export raw lanthanides but would sell finished products containing the metals to one and all.
US lanthanide mining continued to languish, last I checked, but the Australians said, "No, no, we'll start mining again."
This all played out years and years ago, so the US may have resumed lanthanide mining also. In any event, it backfired on them.
Same will happen with nickel and cobalt. Plenty of nickel and cobalt elsewhere. And manganese lies all over the ocean floor in the form of manganese nodules.
Tesla's engineers work in AutoCAD and the MCC Materials Handbook. Tesla's programmers work -- we'll agree to refer to repeated failure due to forced and unrealistic release dates as working -- in C / C++.
Given that most of Twitter's back-end data processing is also written in those languages, and that it uses some "AI", those Tesla programmers might have an inkling what they're looking at.
Not sure why they'd want to change it, though. I've no use for Twitter, but it works as well as something like that can. It isn't broken, merely useless.
Mentally read the post title in the Beavis and Butthead voice.
Now, that said, Fujitsu has "launched" computing-as-a-service, the very thing which service bureaus have been doing since the 1960s -- Anyone remember GE's Genie? -- and on it they run a linear programming solver to which they refer as a "non-quantum annealer".
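For anyone wondering what a classical "annealer" actually amounts to, here's a minimal simulated-annealing sketch in Python. It's purely illustrative: the toy cost function, neighbour move, and cooling schedule are my own assumptions, and it makes no claim about how Fujitsu's service is implemented.

```python
import math
import random

def anneal(cost, neighbour, state, t_start=10.0, t_end=1e-3, alpha=0.95, steps_per_t=100):
    """Generic simulated annealing: always accept improvements, accept worse
    states with probability exp(-delta/T), and cool the temperature geometrically."""
    current_cost = best_cost = cost(state)
    best = state
    t = t_start
    while t > t_end:
        for _ in range(steps_per_t):
            candidate = neighbour(state)
            delta = cost(candidate) - current_cost
            if delta <= 0 or random.random() < math.exp(-delta / t):
                state, current_cost = candidate, current_cost + delta
                if current_cost < best_cost:
                    best, best_cost = state, current_cost
        t *= alpha
    return best, best_cost

# Toy problem: choose x in {0,1}^4 to minimise a small quadratic cost.
Q = [[ 2, -1,  0,  0],
     [-1,  2, -1,  0],
     [ 0, -1,  2, -1],
     [ 0,  0, -1, -3]]

def cost(x):
    return sum(Q[i][j] * x[i] * x[j] for i in range(4) for j in range(4))

def neighbour(x):
    y = list(x)
    y[random.randrange(4)] ^= 1   # flip one bit
    return y

print(anneal(cost, neighbour, [0, 0, 0, 0]))
```

The whole trick is the accept-a-worse-move-with-probability-exp(-delta/T) step; cool slowly enough and you usually land in a good minimum. Nothing quantum required.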
Well, if it walks like a hey-look-at-me press release, and quacks like a hey-look-at-me press release, and smells like a hey-look-at-me press release, then it must be a hey-look-at-me press release.
Instead of spending the first eight or nine paragraphs saying the same things over and over, how about actually describing the thing?
"The command line includes an auto-complete feature.
"Each command launches a small shell window within the main window in which that command executes, allowing the user to compose another command, launching another small shell window within the main window, while the one above it continues execution."
It would have taken a few seconds to read that, as opposed to wading through a few hundred words of "It's new! It's different!" and watching a video.
(I hate videos. They save the presenter a small amount of effort, while wasting a lot of the audience's time.)
People are constantly discovering new minerals. So what?
*Everything* on the surface of the Moon has a tiny bit of 3He in it. So what?
3He is a terrible fusion fuel; it's difficult to fuse. Sure, fusing it doesn't make your reactor parts radioactive the way tritium fusion does, but if you can't get it to fuse, then so what?
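For anyone who wants the numbers behind that (standard textbook figures, not anything from the article), the two reactions usually compared are:

```latex
\begin{align*}
\mathrm{D} + \mathrm{T} &\;\rightarrow\; {}^{4}\mathrm{He}\,(3.5\ \mathrm{MeV}) + n\,(14.1\ \mathrm{MeV})\\
\mathrm{D} + {}^{3}\mathrm{He} &\;\rightarrow\; {}^{4}\mathrm{He}\,(3.6\ \mathrm{MeV}) + p\,(14.7\ \mathrm{MeV})
\end{align*}
```

The D-3He branch dumps its energy into charged particles rather than a fast neutron, which is the "doesn't activate the reactor" part; but its reaction rate only becomes competitive at temperatures several times higher than D-T needs, which is the "difficult to fuse" part.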
Don't get me wrong: it's neat that they brought samples back from the Moon. But this is a tempest in a teapot, a transparent effort to get a headline.
Yawn.
Great headline, but "crypto haven"?
Wasn't the whole point of cryptocurrencies that they were utterly decentralized, not reliant on specific computing hosts, and not subject to any central authority or geographic boundaries?
So now we have companies which endeavor to control crypto transactions, and places like "crypto havens", and my irony meter has burned out again.
This is typical of organizations which can afford to keep a flying wedge of lawyers on perpetual retainer: If the case goes their way they continue with it so it will become established precedent, i.e. case law, but if it looks as if they'll lose they settle out of court to prevent the loss becoming established case law.
How to prevent this? Record an out-of-court settlement as a loss for the defendant. That would level the playing-field just a bit.
Meanwhile, and yes, I know, I repeat myself, Hell is full and zombies in suits are walking the Earth. How's that working out for us, eh?
If this is what I think it is, an optical version of a neural net built from a block of glass fibres, it has to be hard-wired.
That means that it has to be hard-wired for a specific purpose. At low levels this isn't all that bad: we have hard-wired edge and area detection in our own eyes, providing additional information derived from a field of pixels which is useful for recognizing what we see.
Scaling that up won't be one of those square-of-the-number-of-pixels things since that sort of recognition depends on nearby pixels rather than correlating information from distant parts of the visual field.
So far, so good.
The problem comes with hard-wiring the higher levels of recognition.
You want to recognize numbers and letters? Roman letters? Cyrillic? Arabic? Telugu?
You want to recognize faces? Whose faces? Maybe worth hard-wiring to recognize specific features, but beyond that not so much.
So at some point someone has either to create a *programmable* optical neural net to sit atop this device, or to use the device to enhance the input to more conventional software.
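Purely as an illustration of that last option (my own sketch, nothing from the article): freeze the cheap, local, low-level feature extraction, and leave the task-specific recognition to ordinary, reprogrammable software.

```python
import numpy as np

# "Hard-wired" front end: a fixed edge-detection convolution, standing in for
# the optical layer that cannot be reprogrammed after manufacture.
SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T

def fixed_front_end(image: np.ndarray) -> np.ndarray:
    """Per-pixel edge magnitude; uses only nearby pixels, so the cost scales
    with the pixel count rather than with its square."""
    h, w = image.shape
    out = np.zeros((h - 2, w - 2))
    for y in range(h - 2):
        for x in range(w - 2):
            patch = image[y:y + 3, x:x + 3]
            gx = float((patch * SOBEL_X).sum())
            gy = float((patch * SOBEL_Y).sum())
            out[y, x] = (gx * gx + gy * gy) ** 0.5
    return out

# Programmable back end: any conventional classifier can sit on top of the
# fixed features -- this toy one just reports how "busy" the scene is.
def toy_classifier(features: np.ndarray) -> str:
    return "busy scene" if features.mean() > 1.0 else "mostly blank"

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frame = rng.random((8, 8))
    print(toy_classifier(fixed_front_end(frame)))
```

Swap toy_classifier for whatever alphabet or face model you like; the fixed front end never has to change.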
Sounds useful, but this isn't the be-all and end-all of image recognition. Recognize it for what it is, to wit a useful step forward, but let's don't get all dizzy over it.
We laughed at Trump, or more accurately winced, because he "created" a name and then did *nothing else*, save for applying it to existing and ongoing activities which were not under his purview, and then showed no further interest in the matter. That has been typical of him for the 40-odd years he's been on my radar.
Nothing wrong with having a "Space Force". Lots wrong with giving Trump credit for it.
Gartner conflating "cloud" with "AI" and saying that "it will write our code in the future" is typical.
And it's ignorant. And it's sad.
But their principal market consists of businessmen, who want to commoditize programming just as they've tried to commoditize engineering, medicine, etc. They love to hear this stuff, even though the results have always been disastrous.
If they'd told businessmen, "You'll have to actually pay people to do critical and intellectually demanding professional work, and learn to live with that," they'd have lost most of their audience.
... several times over in fact. US telcos have been collecting a "broadband tax" for a couple of decades, amounting to over a half-trillion dollars, to fulfil their commitment to "bring broadband to all Americans". They convinced the head of the FCC, an AT&T lobbyist, that GSM was "broadband" and that they therefore had finished the job, without actually doing it at all.
So... we're going to pay for that all over again? How many times, I wonder, before we actually get it?
... brought to me by a neighbor who asked, "Why has the screen detached itself from the tablet? Can you fix it?"
Mind you, the screen is still attached, but it is hovering about 6mm above the rest of the unit. Why? The battery blew up like a balloon. And no, I can't fix it. It is held together entirely by glue.
The only reason the screen didn't crack is that the battery heated up so much while failing, or inflating, that it melted some of that glue. Now that it's cooled off? The glue has solidified again so there is no way to get the thing apart to fix it.
Do you suppose they'll replace all these units? Not on your life.
You know the saying: Hell is full, and guilt-ridden engineers are walking the Earth.
... and note that the center of lift is way, waaay behind the center of mass. Good thing Honda has much better engineers than they do artists.
All that said, they do have damn' good engineers. I expect them to take away a good chunk of other commercial lift companies' business.
And you know? That probably isn't a bad thing.
(1) The process using "linear algebra probabilities" is quantum annealing. That isn't the only way quantum computers are built, although it is a very useful one. Quantum computer programs for, say, factoring use entirely different techniques.
(2) A quantum bit cannot store multiple arbitrary classical bits of information. Saying, "Oooh, ahh, we can store much more information here because it's quantum bits" represents a fundamental misunderstanding of the technology.
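Point (2) has a name, by the way: Holevo's theorem. However cleverly you prepare and measure n qubits, the classical information you can get back out is bounded by

```latex
I_{\mathrm{acc}} \;\le\; \chi \;\le\; \log_2 \dim \mathcal{H} \;=\; n \ \text{bits}
```

The exponentially large state space is real, but it is not an exponentially large filing cabinet you can read arbitrary data back out of.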
We may have made a mistake by calling these things "computers". It's as if we still called modern computers "difference engines". It limits our thinking so severely as to cause many people to completely miss the point.
... ha ha, ha ha, haha!
I'm so very sorry, but this is one of the stupidest and most blatant "Hey, let's all pay attention to ME!" press releases I've seen this year.
Not only do we not have the technology to do this, we don't even have the technology to design and manufacture any technology which might even remotely approach the necessary capability. Not even close.
How on earth can anyone take this seriously?
... but rather a HOSTS file which routes hundreds of ad servers to 0.0.0.0.
Works great. About to translate the thing to use on Linux and Android. I have to wonder how much of the OS that's going to kill, but I do have a couple of sacrificial devices to try it out on.
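For anyone who hasn't met the trick: the file is nothing more than address-then-hostname lines, in the same format on Windows, Linux and (rooted) Android. The hostnames below are placeholders, not a recommended blocklist.

```
# Windows: C:\Windows\System32\drivers\etc\hosts
# Linux / Android (root required): /etc/hosts
0.0.0.0  ads.example.com
0.0.0.0  tracker.example.net
```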
Happy days.
Oh, and by the way, Hell is full and ad executives are walking the Earth.
Sure, one can attach a vice-grip to the steering column to steer it. One can memorize the colors of the wires so one will know which ones to short out to start the car or turn on the windshield wipers. Without pedals, one can operate the throttle by pulling a string. (More difficult to operate the brakes that way, but that's why dashless, pedalless cars have hand brakes.)
Sure, one can do all these things.
When we're all old we can tell our grandchildren, "When I was young, we didn't just have to walk to and from school uphill in the snow both ways while fighting off rabid bears, we also had to use git for source control, and we liked it." Bah!
To those who voted this comment down:
An OS consists of a kernel (or executive), a shell, and a collection of system utilities.
This is the original definition of the term, which hasn't lost validity just because marketroids are unable to grasp the details of technology.
Linux-based OSs consist of a Linux kernel, one (or more) of a variety of shells, and a bunch of utilities mostly from GNU.
... and demands calendar and contact access to change the background image on the home screen. The litany of such offences grows with each passing month. It's so sad; I used to be a staunch proponent of their gear. Lately? It would have to be free... and even then I'd install a 3rd-party ROM before I used the damn' thing.
Been writing software for fifty years, getting paid for it for about 45; I don't see a single flaw in this narrative. Needless to say, I'll be reading this one. Always did enjoy a freewheeling and humorous account of current history.
BTW, 'AI would be the "most profound technology" that humanity will ever develop' is a great prediction for the far future, about on par with predictions of reversing ageing, or of tapping vacuum potential for free energy. How very prescient. [yawn]
I know it's been said here already, but I'll say it again.
The people who are pushing this are doing so because it sounds so nifty. But given the cost of getting a big solar array into orbit, and going up there to do maintenance, repair or expansion on it, it is far cheaper just to put the damn' thing on the ground, where people actually need the power, and where the storage for nighttime or peak-load demand can be sited.
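To put rough numbers on it (my own back-of-envelope; every figure below is an assumed round number, not a measurement):

```python
# Every number here is an assumed round figure, for illustration only.
launch_cost_per_kg = 2_000.0        # USD per kg to low orbit, roughly current commercial pricing
array_specific_power = 100.0        # W per kg for a space-rated array, an optimistic figure
ground_installed_cost_per_w = 1.0   # USD per W for utility-scale solar on the ground

launch_cost_per_w = launch_cost_per_kg / array_specific_power
print(f"Launch alone: ~${launch_cost_per_w:.0f}/W in orbit vs ~${ground_installed_cost_per_w:.0f}/W installed on the ground")
```

Yes, an array in orbit collects several times more energy per watt of capacity than one under clouds and nightfall, but nothing like the factor of twenty-plus needed to close that gap before you've even paid for the hardware, the maintenance trips, or the losses in beaming the power down.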
I cannot believe that people are even still discussing this boondoggle.
... in asserting that some seek the consequence-free world of gaming, which is in some respects like real life, but less difficult and less threatening.
I've met people like that.
But they constitute a tiny fraction of people who play video games. The rest of them are just fine.
Isn't it just like the CCP to pick out a corner case, then use it to condemn an entire population? You'd almost think they were Republicans.
Notwithstanding everyone's deeply held desire to please Google by Talking the Google Talk, most programmers are not engineers.
Engineers know how to plan and cost jobs. Engineers identify and address points of failure, figure MTBFs. Engineers document the steps from problem to solution to implementation. Engineers must have a thorough understanding of the science which underpins their disciplines.
For the half-century or so that I've been writing code, I've met maybe three programmers who were trained as engineers, and maybe twice that many not so trained but who understand the principles and practices of engineering.
That's nine or ten out of many hundreds. Programmers are, by and large, not engineers.
And that was a shame. At the time, the 68000 line of CPUs could emulate a PC faster than an actual PC ran. Saw a lot of potential, but it just didn't catch on.
These days one can use a handset to do what once required a desktop computer: mirror it to a TV, pair a Bluetooth keyboard and mouse, and you can do word processing or coding or watch movies in a hotel somewhere on I-40.
The QL would have given us similar capability, at least similar for its time, back in the early 1980s, only it didn't catch on so the software and storage didn't develop.
It would have been a game-changer. Qué lástima.
Old Windows conformed to a set of GUI guidelines called Common User Access (CUA), which drew on GUI work going back to Xerox's invention of the GUI in the early 1970s.
I'm a big fan of CUA, because a CUA-compliant GUI makes it immediately obvious how to operate the shell and any CUA-compliant software which it might run.
KDE, for example, is still CUA-compliant. My mom, a Windows user and 79 at the time, sat down in front of a KDE laptop and was able to just start doing what she needed to do, with no confusion and no wasted time.
I'm not a big fan of "modernization" which entails eliminating UI cues and making the UI more cryptic. The worst example I can think of was the Win8 "feature" whereby you opened the "start panel" by moving the mouse cursor to one corner of the screen.
I'll refrain from listing the hundred-odd other problems of the "modern, aesthetic UI". None of this new stuff is, as an old math teacher of mine once said, "intuitively obvious".
I like it. The name sounds modern, even cutting-edge. As with "synergy" or "cloud malware", I have a lot of trouble deciding exactly WTF it is they're actually describing. But by damn', it sounds cool. I'll just bet that nobody can do without it.
Seriously, about the only use I have for products such as Google Drive is a place to park friends' photos so they don't have to pay for the Web space to store them. Aside from that, mostly their "productivity"-related offerings serve as a stop-gap until people can set up real mail servers, NAS, etc.
They're OK, they work well enough, but the continuous state of flux in which they exist costs them a few bonus points.
... but it was my impression that it was for *sharing* data, similarly to Dropbox, and not for primary, critical storage, so that its getting wiped was more a minor inconvenience until it was refreshed from primary sources.
Yet another press release from a group of people famous for disseminating rainbow-and-unicorn colored vapor whenever they need to deflect attention from something else... and five years later when you ask them, "Well where is {whatever}?" they reply, "Oh, we didn't do that after all."
Seriously, hadn't they already done this, between mastering genetic engineering and creating the first fully successful magnetically contained fusion reactor?
Part of the rationale was that if everyone had access to the Internet, they could eliminate a lot of paperwork and personal contact with town employees.
They did a hard estimate of how much it would cost per subscriber, to wit about half of what their two "competing" ISPs were charging, even taking into account that some people too poor to pay taxes would also be using it.
Then someone, whose identity remains obscure to this day, launched a public relations campaign designed to outrage the citizens that poor people would get, for free, what they were paying for with their taxes.
It worked. Council shelved the plan.
Over the next two years, broadband charges roughly doubled.
The term "artificial intelligence", well understood by computer-science types, has been used for years to mislead laymen into believing that we're within a hair's breadth of building machines which are superior to human beings in understanding and creativity.
It's a lie.
We're not even close to envisioning the architecture of a machine which can out-think a person, let alone building one.
I call BS on "human brain scale AI", and note that there's still room in Transmeta's grave for a few more IT grifters.
... which around 2000 or so they liked so much that they attached it to every last product, to the point where it became utterly meaningless?
"You can use your Dot Net menu to run your Dot Net report from your Dot Net database on your Dot Net server to view on your Dot Net system with your Dot Net spreadsheet...." (While drinking your Dot Net coffee at your Dot Net desk, etc.)
The popular press and their symbiotes, the marketroids, have done the same thing to the term "artificial intelligence". "AI" never actually meant very much, but these days it means nothing other than "Let us remind you to pay more attention to what we're selling here."