Probably also worth mentioning that they're trying to spool up a small nuclear reactor program for use in VA and export to other states.
Safer than coal, at least.
128 publicly visible posts • joined 25 Jan 2019
iOS has tried to do this as well; it fights you hard to keep you from seeing the filesystem structure. They claim it's for 'security' (which surely means 'so you can't jailbreak it'), but it only really makes sense if you understand it as a philosophical choice.
Makes it impossible to use iPhones/iPads to get any serious work done.
The 'turn on and go' and 'jump into basic' thesis seems to be completely covered by RISC OS, right? Boots instantly, hit a key to reach BBC BASIC, runs on Pi.
Weird that I searched all the comments and only one person mentioned it. Checking the Raspberry Pi site, I also see they've removed it from their downloads page and from the list in the Imager app. Perhaps that'd be the place to start.
Ultimately, the point is that you can base your entire company infrastructure on an open source software tool or library, but there's no obligation for the maintainers to actually fix it when it's broken. If there's a zero day, they aren't obligated to do anything about it, especially not on anyone else's expected timeline. It is the perfect example of 'a lack of planning on your part does not constitute an emergency on my part'.
The people who are advocating for this kind of financial support for open source developers to maintain the software recognize this and are trying to provide an incentive, not because of a moral obligation but because to not do so leaves them totally at the mercy of the maintainers, who may consider that maintenance a low life priority. If that software is mission critical for your business, it's a smart business decision to convince those devs that it should be a much higher life priority for them, perhaps even their full time job.
Apple was working on computer-grade ARM CPUs for a long time before they switched off of Intel, and might never have made the switch had Intel not been riding a diminishing price/performance curve into oblivion with no turnaround in sight. Sure, Apple functionally has infinite money right now, but it's not as if Qualcomm can't finance parallel RISC-V development as a moderately short-term drop-in insurance policy if ARM tries to pull something similar.
RISC-V's issues at the moment, in comparison to ARM, probably have more to do with the swift development of newer implementation standards, to the point where the newest performant RV chips are already either falling behind or failing to live up to expectations. See the SpacemiT K1/M1 RVA22 situation. It doesn't help that most RISC-V implementers/manufacturers at this point are somewhere on the level of Rockchip, MediaTek or Allwinner in the ARM world. Building actually compliant and performant RV SoCs should not be a problem for larger manufacturers who are able to take their time getting product to market.
Chrome is particularly bad about this, which means that every web thing that relies entirely on Google's frameworks like Steam and those stupid Electron containers may have problems (even Edge because MS have absolutely no foresight), but much like the folks in the comments telling people to switch to Linux: Firefox still works on Win7 without issue.
I recently switched to Floorp, a Firefox fork with a silly name and an obsession with privacy. All the updates, none of the weird nonsense Mozilla is trying to sneak in now that Google has totally abandoned any pretense towards privacy and user control and Mozilla thinks they can get away with it.
As much as I appreciate and support what Ocean Cleanup is doing, this is the only realistic solution. Plastics are not recyclable and never were, and no amount of innovation will make them sustainable.
https://www.theguardian.com/us-news/article/2024/sep/09/us-voters-distrust-plastics-manufacturers-claims
"Research published earlier this year found that plastic producers have known for decades that plastic recycling is too cumbersome and expensive to ever become a feasible waste management solution, but promoted it to the public anyway."
"In fact, just 5% of plastic waste generated by US households in 2021 was recycled, one study found."
This figure is not a result of people not sending plastic to be recycled; it is the result of plastic being expensive to recycle at best, if not literally impossible to reuse in new products after the first generation.
SCO never really innovated or improved Xenix/SCO Unix/OpenServer after they took it over. They even wound up purchasing Unixware from Novell and doing very little with that other than maintenance. The driver situation was particularly appalling.
The reason is that they were only interested in massive corporate distribution and the hefty maintenance contracts that came with it. By the 90s, Xenix and its descendants provided the point-of-sale backbones for a lot of retailers and lots of commodity small-to-medium business servers. However, they put no thought into potential competition and just assumed that customer lock-in would ensure at least a couple more decades of easy money, just as it had up until that point. They did not think that companies would ever be willing to consider using Linux, even as companies like VA Linux and Red Hat started selling and supporting hardware for the growing market and IBM began spooling up programs to investigate the potential.
So within a few years of their full takeover of the license for Xenix they were on the back foot, then absolutely on the skids, then reduced to desperate lawsuits accusing Linux of intellectual property theft.
Well said. There is one very particular thing that would have wound up being the MSX's Achilles' heel had the standard reached the sort of market penetration that Microsoft hoped (comparable to IBM compatibility - remember that this was not just a hardware standard but a software/OS standard designed by Microsoft and ASCII Corp).
The MSX memory map was left relatively free and open in order to accommodate varying amounts of RAM across model lines and allow compatibility between low-end and high-end systems. Many of the very earliest systems had 8K of RAM, which was pretty much useless, but it kept costs down, allowed some BASIC programming and let you play cartridge-based games. However, since the MSX standard didn't hit the market until 1983, RAM was cheap and these low-end systems didn't really appeal to anyone; dictating a minimum of 32K, if not 64K, would have allowed them to nail down a standardized memory map.
The result of this flexibility is that different regions (and occasionally different companies within the same region) wound up settling on different memory maps. Now, the MSX standard made allowances for this, and properly written software generally shouldn't have had problems, but by and large what really happened was that software makers just assumed that all MSX machines used the same map as the dominant manufacturer in the region. This meant that, unlike MS-DOS software, a significant amount of MSX software simply assumed the RAM was at a certain 'Slot' in the internal mapper and then crapped out when it turned out something else was mapped there. At the time this would have been inexplicable from a consumer standpoint and likely resulted in complaints to the software company, who probably would have just dropped the product and any future plans for the platform.
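The difference between the two approaches can be sketched in a toy model. The machine names and slot numbers below are made up for illustration; real MSX slot selection goes through the primary slot register and BIOS routines, not a Python dictionary.

```python
# Toy model of the MSX slot problem: two machines that both follow the
# standard but put their RAM in different primary slots.
MACHINES = {
    "region_a_msx": 3,  # RAM in primary slot 3 (the dominant vendor's layout)
    "region_b_msx": 2,  # same standard, different mapping
}

def naive_load(machine: str) -> bool:
    """Software that hardcodes the dominant vendor's RAM slot."""
    assumed_ram_slot = 3
    # On the 'wrong' machine this write lands on ROM or nothing: crash/garbage.
    return MACHINES[machine] == assumed_ram_slot

def conforming_load(machine: str) -> bool:
    """Properly written software: probe each slot for writable memory."""
    for slot in range(4):
        # Stands in for a real write/read-back test on each primary slot.
        if MACHINES[machine] == slot:
            return True
    return False
```

The hardcoded version works fine on the dominant vendor's hardware and fails everywhere else, which is exactly the pattern described above.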
This gets even more fun when you take into account the other big highlight of the MSX standard, and the reason for the international popularity that wound up highlighting this problem - non-roman character sets. Since one of the main companies involved in the original standard (and the biggest market) was Japanese, they implemented a fairly thorough Kanji input system. However this wasn't quite standardized, so when they made Korean and Arabic MSX systems, things also broke in new and interesting ways thanks to the memory mapping.
My MSX2+ is a Korean one that I imported a few years back when someone found a cache of them in an abandoned school. Great specs for the system, but despite having 128k it has a very odd memory map that means that software is a crap shoot.
There's a more in-depth write-up here: https://www.msx.org/wiki/MSX_compatibility_problems
The MSX still has a massive following in the Spanish retro-computing community and a significant fanbase in the Russian retro community.
I am firmly convinced that changes like this are the product of middle managers desperate to justify their jobs, which should in truth never be anything more than maintenance. Maintaining something that works fine doesn't get you a bonus or a promotion in a dysfunctional company (something that absolutely should get you paid and respected in a company that isn't poorly managed). Breaking something that works fine, in a flashy way, will net you that bonus and recognition.
The problem is that when the company/person the code is lifted from looks for someone to sue, the buck stops with the user of the LLM.
So your theoretical engineer could get the blame and get hung out to dry, but if you use an LLM and it steals code, you're that engineer and you probably don't even realize it.
AI code is unreliable on both a functional and a legal level.
When I started getting into building hobbyist computers a few years back, the CP/M version in ROMWBW was the only real way to go, so I have several such systems sitting around. I don't have much use for them (the building was the fun part, fighting with CF card formatting was not) and there seem to be far more options now, but I still keep a little Easy_Z80 plugged in via USB-Serial just to poke at it now and then.
In the US it is. Bette Midler vs Ford Motor Co. set the precedent, Tom Waits vs Frito-Lay expanded it, and it's been established law ever since. OpenAI just did it to a woman who had the stones to sue Disney and force them into a settlement, who participated fully in the recent WGA/SAG strike where AI and these fakes were a huge issue, and whose statement indicates that she is more interested in setting legal precedent than in any monetary outcome.
The 6502 won in the US consumer market, the Z80 won in most of the rest of the world. MSX, CPC, Spectrum and the dozens of clones. The 6502 was chosen as the basis for designs prior to the point where the Z80 became cheaper, hence it was retained for the major American brands for years as they kept updating their late 70s designs. The rest of the world came a bit later and chose the Z80 because it was more cost-effective and they didn't have the baggage (also the eastern bloc countries figured out how to clone it, which is why they took to the Speccy so much).
Tandy/Radio Shack could have helped make the Z80 more ubiquitous in the US, but instead they threw compatibility to the wind and just picked whatever chip struck their fancy for every new model. Hence, the Dragon.
Z80 remained the processor of choice for business computing, but that's a whole other market really and CP/M was all that mattered there.
It'll be interesting to see how heavily they lean into the RISC-V side of things going forward, since MIPS seemed to hit a wall the last time Loongson tried to push their own CPUs.
Given RISC-V's open architecture it may be more workable, but also given how wildly configurable RISC-V can be in implementation it could wind up being an overly complex boondoggle.
He is truly on the forefront of the sector of the entertainment industry that could be replaced with an LLM and some editors, since his movies and TV shows appear to be made using the mad libs methodology.
Open up a previous script, either his or someone else's, and swap some nouns and verbs. If it makes sense, shoot it and send it straight to streaming. The production already costs too little to catch the eye of a Zaslav looking to cancel it for the tax write-off.
They exist, but are only marginally faster than stovetop kettles and considerably more expensive than the chunk of aluminum that comes with the set of cheap pots and pans you get when you move in to your first apartment/home. Kettles also don't get used as much in general.
More common in recent years if you accept those Keurig machines with a separate hot water spigot as an electric kettle.
I have an electric kettle in the US, purchased a few years ago when I lived in a very small apartment with an ancient electric stove (rusting and older than I) and didn't want to leave the traditional aluminum kettle on the hob 24/7/365 but wanted to be able to boil water quickly. The electric kettle gets water boiling faster than the hob and wastes far less radiant energy, since the stovetop kettle is incredibly inefficient. Even so, it's rare to find them outside of hotel rooms because they're comparatively expensive, given that the vast majority of the time when one needs to boil water it's for cooking, not a beverage or cup noodle.
As a matter of fact, my uncle gave me a ridiculous copper-gilded stovetop kettle for Christmas last month that he'd picked up at some discount store like Costco or Sam's Club. Didn't have the heart to tell him I went electric years ago and had to clear some space in a cabinet for it to rest in for a few years before it goes to charity.
I make my tea the NATO way except in the cup because nuts to your teapots, and I use Twinings because all of the available alternatives are awful.
I didn't really experience the era firsthand, but you (or someone else) may know: why have I read people hating on OpenLook? I've used it on Solaris 2, along with DECwindows on VAXstations and Alphas and Motif on a variety of systems, and I grew up using early Gnome and KDE, and of all of them, OpenLook was the nicest and most coherent GUI on *nix in the 90s. Was it really awful under the hood or something?
YouTubers are already angry that Google won't provide them with the metric data that would show whether this is exactly what's happening. Since they make their money on watch time, an AI that jumps through various videos to hit the highlights means they get fewer meaningful (i.e. profitable) views.
"Not a bug, but a feature."
This is how they make their money: it's the tiny piecemeal search results that add up into the huge tonnage of search 'grain' they sell to people. Except unlike with physical grain, search engines never let you actually see the grain or know anything about it other than general figures, so the buyer never learns that it's 90% useless gravel. Some people have realized that you can even make money with the gravel, others just keep buying it for fear of missing out, but only the most unsavvy people would actually be fooled by it.
It's something YouTubers have begun complaining about, because when you're that focused on getting your metrics right to make sure you're not getting screwed, it becomes obvious when you're missing the truly necessary data to do so.
Hah, interesting that you found the Q68. I purchased one from Derek a bit more than a year ago after a few years in the queue; it's presumably one of the only ones in the US. He's a gent, takes time to build batches, but he does it for the community. The Q68 is a solid, extremely compact little black metal box, uses PS/2 ports for keyboard/mouse and VGA, though it plays happily with a good HDMI converter.
I purchased it out of curiosity, since the QL never made it to the US. I haven't delved too deeply into it, but it's fairly obvious that it just didn't have a specialty that would have commanded loyalty the way the Amiga's multimedia capabilities or the ST's rock-solid MIDI timing did. Add the flaws in the original hardware design, and if it weren't for the extremely basic architecture it really wouldn't have had much reason to be kept alive by the community. Since it was a simple enough design built from off-the-shelf parts, it was easy to clone and re-implement in ways that weren't really possible with Amigas based on specialty chips.