Re: Innovative alternative anywhere?
An "Elon Musk approach"? You mean buy some company and pretend you invented whatever they're doing? Yeah, that could work.
Hey, if my truck is on the move with nothing in the bed, it's going somewhere to have something put in the bed.
But then my truck is 30 years old (so the "bound energy" of manufacturing has been well amortized) and gets around 20 mpg (US gallons, ~11.8 l/100 km), which is pretty good for a pickup anyway.
But, yes, there are a lot of large, inefficient, mostly-empty vehicles on the streets in the US. And elsewhere, of course, but it seems to be particularly prevalent here.
SciFi movies have clearly shown that reality coming.
Well, now, that is a compelling argument. Why, what predictions of the science-fiction-film genre have not come to pass? They're basically a window into the future.
Honestly, the things some people write.
how challenging San Francisco is to drive in
SF is "challenging"? I suppose compared to, oh, Phoenix, that might be true. But I've been many places that are a hell of a lot more challenging to drive in than San Francisco, and that's just in the US.
Waymo is in, what, SF, a small area (63 square miles) of LA, Austin, and Phoenix? (Not sure about that last one.) Last I checked, only one self-driving-vehicle vendor (Motional) was trying to take on Boston, and they were only doing a couple of routes. And Boston's reputation for difficulty is exaggerated.
Let's see how Waymo fares in, say, Michigan, with snow and distressed roads. Not that they currently operate anywhere there's likely to be snow, of course, so they've hardly solved that problem, have they. Or in many rural areas, where many people live on private roads which aren't well-mapped. How are Waymo cars at navigating Forest Service roads?
Waymo and the other autonomous-auto firms are cherry-picking the easy markets, not the hard ones. San Francisco is an easy market.
Not just cabs.
The last time I was in San Francisco (I think, unless I've forgotten some other trip), around a decade or so back, my wife and I went with private livery — a livery-licensed town car. It was, in fact, a Lincoln Town Car,1 immaculately maintained by its owner. I scheduled our airport trips ahead of time, which requires a bit of planning, but that meant there was no need to wait in the taxi queue or call for a taxi, much less deal with some accursed app.
Polite, door-to-door service in a reasonably luxurious vehicle. Sure, a Lincoln Town Car isn't a Jaguar, or a BMW or Mercedes (which is what I had for ground transport when I traveled to the UK); but it's pleasant and comfortable and has ample luggage space. Or was and had, since Ford stopped making them. But whatever.
And it wasn't much more expensive than the standard taxi rate from and to the airport.
As far as I'm concerned, Uber already enshittified hired cars. I'd be just as happy to see them driven out of business.
1Stock, not stretched. I'd rather walk than ride in a stretch limo. Horrible, tasteless monstrosities. And even worse these days now that they're being made out of SUVs.
Sure, and it'd be an even bigger breakthrough if someone invented a magic wand that prevented aneurysms altogether, or cured all heart disease, or eliminated cancer. Lots of things would be "bigger breakthroughs" than the actual breakthroughs we have. Imagining bigger breakthroughs isn't how progress works, though.
The Natural-Born Citizen clause is vague, inasmuch as it doesn't define what "natural-born" means; but no reputable Constitutional scholar doubts that it excludes Musk. (The main open question is whether it excludes people who were born outside the country but held US citizenship from birth, for the various reasons set out in 8 USC §1401.)
The current justices of the Supreme Court certainly have some contentious opinions, and I wouldn't trust Thomas or Alito on almost any question, but there's no way current SCOTUS would throw out the NBC. And even the justices appointed by Trump don't appear to particularly care for him or his ideas, and I wouldn't expect them to be fond of Musk either.
England and Wales, Shirley? Or was Wales just treated as part of England at the time? The "union of the three crowns" for James VI and I was Scotland, England, and Ireland, with no mention of Wales as being in any sense separate from England. Yeah, I know Wales and England were "unified" by Henry VIII back in the sixteenth century, but these days Wales is described as "a country of the United Kingdom".
And, of course, James VI and I referred to himself as the ruler of "Great Britain and Ireland", even if Scotland stubbornly insisted on continuing to be a separate country with the same king. So I suppose there were probably still people in 1640 referring to the whole island as "Great Britain" politically as well as physically, even if that was rather a gloss.
Ah, the history of the British Isles — so gloriously unkempt. Makes that of the US seem positively straightforward.
Annoying as the "just make sequels and reboots" trend in the film industry is, I understand it. The economics are pretty obvious: a large portion of the audience is making its spending decisions based on opportunity cost ("I want to see something I believe I'll enjoy, so I'll seek the sort of thing I enjoyed before"), cognitive load ("watching something unfamiliar requires cognitive resources that I may not feel like marshaling during my leisure time"), and psychological rewards like nostalgia ("I enjoyed some variant of this years ago"). And studios have increasingly conditioned much of the audience to expect big-budget noisemakers, so films are expensive to produce. And early buzz is important for studios' returns, which favors simple "popcorn flicks" that don't require much reflection from audiences, or time to read reviews, etc.
I mean, personally I'm not attracted to that sort of thing, but then I watch very few movies anyway. I am clearly not the target demographic.
And it's not like during cinema's "golden age" the studios weren't churning out masses of unoriginal Westerns and romances and gangster films and whatnot.
Arguably, for people who like film, it's good that there are studios making successful familiar, stupid sequels, because those provide the capital for studios to take a chance on the occasional interesting release.
the IBM SOFTWARE was "Sequel Server". And I REFUSE to allow IBM marketeers from the 80's/90's
IBM had a product in the '80s and '90s1 called "Sequel Server"? Do you have a citation for that? I don't remember one, and a search didn't turn anything up.
Are you thinking of Microsoft's "SQL Server"? That's often pronounced "sequel server", but it isn't named "Sequel Server", and it's not an IBM product.
1The apostrophe replaces the elided characters. It's "'80s", contracted from "1980s", not "80's".
The second season of Space: 1999 surely rivaled anything in Star Trek (at least the original series) for "very bizarre". And I say that with a certain measure of affection, or at least tolerance.
(These days I can barely get through half an hour of television at a time, except for a handful of specific titles that for whatever reason can hold my interest. But even there I have to make an effort. I can read for hours at a time, but with anything synchronous I quickly become restless.)
Eh? I admit I've paid little attention to Star Trek since, oh, maybe the first couple seasons of TNG, but the original series, at least, was notable for two types of UIs on the Enterprise: voice assistant ("Computer, analyze the data on this slab of plastic") and panel of Big Glowing Buttons which would emit sparks at the slightest provocation. Neither of those were common with MS-DOS machines. Or with IBM's DOS for the S/360, for that matter.
Ah, Roku, the firm that patented injecting advertisements into the stream. Yeah, no.
I expect in not so many years I'll just give up watching television entirely. There's some good content to be found, but it's just not worth it anymore.
Transformer models are significantly different from SLP networks, in quite a few ways. Claiming they're the same thing is a vapid argument, frankly. Even deep convolutional stacks are very different from SLPs (or other single-layer networks, such as single-layer RNNs or CNNs or SAMs or what have you), and transformers are quite a bit different from deep convolutional stacks.
I am well on record here as disliking LLMs and gen-AI in general, and as questioning the AI/AGI claims of their fans. But ignoring major technical details and dismissing the research does that side of the argument no favors; it just shows that argument from ignorance is always possible.
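For anyone who wants the difference made concrete, here's a toy numpy sketch (random weights, toy dimensions, purely illustrative; nobody's production code). An SLP is one affine map plus a pointwise nonlinearity; even a single transformer block adds learned Q/K/V projections, softmax attention that mixes information across positions, residual connections, and an MLP, and real models stack dozens of these blocks:

    import numpy as np

    rng = np.random.default_rng(0)
    n, d = 5, 8                              # toy sequence length and width
    X = rng.standard_normal((n, d))

    # Single-layer perceptron: one affine map plus a pointwise nonlinearity.
    # Each output row depends only on its own input row.
    W, b = rng.standard_normal((d, d)), np.zeros(d)
    slp = np.maximum(X @ W + b, 0.0)

    # One transformer block (LayerNorm omitted for brevity): learned Q/K/V
    # projections, softmax attention mixing information ACROSS positions,
    # residual connections, and a two-layer MLP.
    Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    s = Q @ K.T / np.sqrt(d)                 # pairwise position scores
    A = np.exp(s - s.max(-1, keepdims=True))
    A /= A.sum(-1, keepdims=True)            # softmax over positions
    h = X + A @ V                            # residual + attention-mixed values
    W1, W2 = rng.standard_normal((d, 4*d)), rng.standard_normal((4*d, d))
    out = h + np.maximum(h @ W1, 0.0) @ W2   # residual + MLP

Same input and output shapes; wildly different function classes.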
My, what impressively foolish and turgid prose you produce.
Do you actually know anything about transformers, ANNs, other ML models? It certainly doesn't show.
Of course it's true that the standard Reg commentariat line of "it's not intelligence" is vapid flag-waving — I've yet to see anyone making that claim illuminate it with a usable definition of "intelligence", and few show any familiarity with the research in transformer (or diffusion) models, or even much understanding of ANN stacks. But neither are "I use LLMs and they're great" nor "you're soaking in it" persuasive arguments; they're barely arguments at all.
And, in fact, most smartphones do not have a Google Tensor chip — that's a proprietary SoC that only appeared with the Pixel 6. Most Android phones are not made by Google, and a shocking 0% of iPhones are. And the TPU in the first couple of generations of the Tensor SoC was not impressive; Google didn't even start making "AI" claims about the Tensor SoC until G3 in the Pixel 8. (And their claim for the G3 was "run more than twice as many ... models", which is both underwhelming and amusingly vague.)
The Tensor SoC's TPU is suitable, and used, for running relatively small models for things like text-to-speech and speech-to-text, basic still-image processing, background removal for live video, and so on. It's not doing inference on a billion-parameter transformer. There are technical arguments (though I've yet to see a terribly good one) for calling frontier LLMs "AI", but those do not apply to little embedded TPUs.
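The arithmetic is easy enough to check yourself; a quick sketch, with round numbers of my own choosing:

    # Rough memory footprint for inference, weights only (illustrative figures).
    params = 1_000_000_000                    # a "billion-parameter" model
    for bytes_per_weight, fmt in [(4, "fp32"), (2, "fp16"), (1, "int8")]:
        gib = params * bytes_per_weight / 2**30
        print(f"{fmt}: ~{gib:.1f} GiB for the weights alone")
    # fp32: ~3.7 GiB; fp16: ~1.9 GiB; int8: ~0.9 GiB -- and that's before
    # activations, KV cache, the OS, or anything else the phone is doing.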
Hell, the Thinkpad I bought used (refurbished) from Newegg a few years back does just fine at playing the handful of games I want. Those are undemanding — the most graphically intensive is probably the Spyro trilogy, which were PS1 games originally — but then that's rather the point, isn't it? "Gaming" means different things to different people.
I don't see the appeal, myself. But then I also don't see the appeal of HD, much less 4K and its successors.
And lord knows I'd love it if I could force all sound to mono, with proper mixing. Stupid Dolby 5.1 encoding is everywhere, and with just the set's built-in speakers dialog is often drowned out by the SFX and incidental music that directors insist on cramming onto the soundtrack.
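The fold-down itself isn't hard. Here's a toy numpy sketch using the usual ITU-style coefficients (illustrative, not gospel; the centre/dialogue channel is weighted up, and the LFE dropped, as most downmix recommendations suggest):

    import numpy as np

    def downmix_51_to_mono(ch):
        """ch: dict of equal-length arrays keyed FL, FR, FC, LFE, SL, SR.
        ITU-style fold-down to stereo, then averaged to mono; LFE omitted."""
        c = 0.7071                            # -3 dB
        left  = ch["FL"] + c * ch["FC"] + c * ch["SL"]
        right = ch["FR"] + c * ch["FC"] + c * ch["SR"]
        mono = 0.5 * (left + right)           # dialogue (FC) stays prominent
        peak = np.max(np.abs(mono))
        return mono / peak if peak > 1.0 else mono   # crude clip guard

    channels = {k: np.zeros(48000) for k in ("FL", "FR", "FC", "LFE", "SL", "SR")}
    mono = downmix_51_to_mono(channels)       # one second of (silent) 48 kHz audio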
Yes, that's fine. But the decision here applies to the one-digital-copy-per-one-hard-copy program as well; IA have already agreed to a proposed settlement for the copyright violations (per the article); and now they're trying to establish a legal right to restore the paired-copy system.
Your feelings, frankly, are irrelevant to the question of whether the publishers should agree to license IA to resume the paired-copy lending, or some other mutually agreeable scheme. That is what IA's petition, addressed to the publishers, actually requests.
Personally, I'd like to see USC Title 17 amended to expressly permit paired-copy schemes like the one IA was using aside from the "Emergency" one they (unwisely, and in my opinion unnecessarily) tried during lockdown. I've never borrowed an ebook from IA, and I have no need to do so; I have discretionary income for purchasing books, and I have a public library. But IA is important and while there should be some consequence for breaking the law — and indeed there already has been — it needs to be proportionate and not excessively damaging.
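The paired-copy invariant itself is trivial to state: never more digital loans outstanding than physical copies held out of circulation. A toy sketch, names of my own invention:

    class PairedCopyLending:
        """Toy model of controlled digital lending: at most one outstanding
        digital loan per physical copy held out of circulation."""
        def __init__(self, physical_copies_held):
            self.owned = physical_copies_held
            self.loaned = 0

        def checkout(self):
            if self.loaned >= self.owned:
                raise RuntimeError("all paired copies on loan; join the wait list")
            self.loaned += 1

        def checkin(self):
            assert self.loaned > 0
            self.loaned -= 1

    shelf = PairedCopyLending(physical_copies_held=2)
    shelf.checkout(); shelf.checkout()   # both copies out
    # shelf.checkout()                   # would raise -- the very check the
                                         # "Emergency" library waived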
(And JFTR, I am also a published author, in my own small way, as are a number of my relatives and friends, and I have other IP. And I support authors in general, and several of them specifically, beyond simply purchasing their work.)
Why should the family benefit from something they didn't earn themselves?
That is the entire basis for the family arrangement. It is an economic system for sharing the resources possessed and accrued by individual members. Perhaps you are familiar with the concept of non-paid spouses and partners? Of children?
Really, that's just an astonishingly broken argument.
One problem is that the "digital age" has already seen some rather extensive updating of IP law, and much of that is a mess. Anyone who's ever had to argue DMCA exemptions in front of the Librarian of Congress — and I have a friend who has done that — knows that the current state of the law is bad in both senses: unfit for purpose, and counter-productive.
Now, if you were to argue that we should just roll back the DMCA and similar laws entirely, and return to the state of IP law just after Sony v. Universal, well, that I'd entertain.
And there's a problem with not allowing copyrights to be inherited ("copyright dies with the author"): it penalizes authors who die partway through their term of copyright relative to those who outlive it. And there are obvious hacks it would encourage, such as naming one's children or other potential heirs as co-authors.
It'd also destroy the value of computer-generated specialty works, which have existed for decades and serve a useful function in their limited market.
There's a lot of adobe construction around here, and I mentally wince whenever I think of the term, thanks to association with this dreadful firm.
PostScript was pretty good. Display PostScript was an interesting idea, if not really successful. PDF was ... well, broken in a bunch of ways, but useful. Acrobat, in its various forms over the years, started bad and steadily got worse. I'm fortunate not to have had to deal with the rest of Adobe's terrible decisions.
We already have AI-generated email being processed by AI recipients, with no humans in the loop; you can find multiple accounts of that sort of thing online. There's already significant use of AI in processing job applications, and in generating those applications for processing.
We've successfully created a new system for automating the waste of considerable resources. Productivity achieved!
Yeah, while we've seen some use-waste-heat-from-datacenter schemes, the economics have to be forced (subsidies, regulation) in all but very special cases. You could in theory build cold-climate datacenters within single-building settlements like Whittier, Alaska, but you'd have to convince people to live in them. I personally think it'd be interesting to live in that sort of data-arcology for maybe a few years, but I don't know that I'd want to do it permanently.
Technically, it's X that has to pay. And as someone pointed out above, they might file for reorganization (chapter 11) or liquidation (chapter 7) if things get bad enough. Apparently some legal judgements can be discharged in a personal chapter-7 bankruptcy, but I have no idea whether that's true for a corporate one, or if so whether it would be true for any judgements that come out of any of these cases, or how it would apply to any settlements, or if not discharged where such debts would rank in the list of creditors to receive payment.
While it'd be lovely if the court decided to make Musk personally liable for some of this, I don't see much chance of that happening.
But then a lot of his nominal wealth is in stock in his companies, so bankruptcies and loss of confidence and thus stock value could both hurt him quite a lot.
SCO never really innovated or improved Xenix/SCO Unix/OpenServer after they took it over.
I don't think that's fair. They essentially re-implemented Xenix on top of SVR3 and the iBCS ABI. With ODT they added X11, NFS, and other major features. They added in Merge 386 for running MS-DOS applications (though that was developed by Locus and had already been ported to some other UNIXes; it wasn't a SCO project); and then later incorporated Platinum's Merge 4 with Win95 support.
In subsequent years they merged in a number of features from SVR4.
Even before the Calderafication, there was talk at Real SCO of merging OpenServer and Unixware. That didn't actually happen until the SCO Group body-snatcher era, though.
Sure, this is good work and all, but at US R1 universities, doctoral students typically publish original work, present at conferences, etc. In some technical fields, a doctoral dissertation is often basically a collection of a few published papers lead-authored by the candidate.
That's the Stephen Hawking version
I'm not sure if this is meant to be a joke, but if it was a serious attribution, it's incorrect — or, at any rate, Hawking was only one in a long line to use the phrase. (Not sure where my copy of Brief History is. Also, "recently famous"? That book was published 36 years ago.)
The earliest publication of "turtles all the way down" cited in Wikiquote is from the mid-nineteenth century. The "rocks all the way down" you refer to may be the William James version, which was published later.
Yes, some actual data and analysis, from reputable sources, rather than "guy who, let's face it, is not particularly careful in his punctuation and capitalization, on some random Internet forum", might be a slightly stronger argument.
I'd also be interested, if someone wants to make this sort of claim, in a believable analysis showing motor-vehicle pollution is significantly more dangerous (to people, at the present moment) than other sorts of pollution. In places that are still using coal-fired power plants, for example, I'm not sure the cars would be my first worry. (Well, it depends on how many cars there are in my immediate environment — but that's rather my point. Absolute claims that "car traffic and fumes" or "modern food" are the greatest dangers to human health tend to fall apart when the range of conditions under which people live is taken into account.)
Microwave ovens are faraday cages, which are very well understood, and do not leak radio waves unless faulty.
I haven't checked OP's links (because, ugh, why would you), but I assumed they were related to anecdotes of putting small living creatures in microwave ovens and then activating them. (The ovens, not the creatures.)
For the record, I'm not agreeing with the OP — I understand signal strength and SAR — but I think some people may have misunderstood the bit about microwave ovens. Not that it supports his argument at all, of course. (Even a small microwave oven typically uses around 600 W. If someone's phone was emitting a 600 W signal, I'd be rather concerned about the severe injuries they'd sustain as the phone combusted vigorously.)
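For scale, a quick sketch (transmit-power figures from memory, ballpark only):

    # Ballpark comparison of RF power levels (illustrative figures).
    magnetron_w = 600.0    # small oven's RF output into the cavity
    gsm_peak_w  = 2.0      # GSM class-4 handset peak transmit power
    lte_max_w   = 0.2      # LTE handset maximum (23 dBm); averages are lower
    print(f"oven vs GSM peak:    {magnetron_w / gsm_peak_w:,.0f}x")   # ~300x
    print(f"oven vs LTE maximum: {magnetron_w / lte_max_w:,.0f}x")    # ~3,000x
    # And the oven's output is deliberately confined to a resonant cavity,
    # while the phone power-controls down to the minimum the cell will accept.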