Posts by RachelG
89 publicly visible posts • joined 2 Sep 2009
Oracle Java police start knocking on Fortune 200's doors for first time
i mean, why? why run it when you can run openjdk, or temurin, or azul, or microsoft, or whatever... Oracle's Java build isn't any different. The only reason to run it is if you actually need access to their support, so why run it if you're not already paying for that?
Asahi's Fedora remix dazzles and baffles on Apple Silicon
Re: Why?
my feeling on the relative speed of linux on the same hardware is not necessarily up to date. it was true when I ran a hackintosh. :-) and it was true on that G4 Mini.
opencore, for keeping at least Intel machines running on the latest macOS (until the latest doesn't even support Intel), is something I look forward to trying when my last Intel mac falls out of official support - otoh i might well be sticking linux on it then for preference. i might even do it sooner though: I run PVR software that needs linux on x64 (v4l driver support needed), it failed to work reliably when that linux was running in a virtual machine, and the old PC tower is just too noisy to leave running these days.
macports is a name i haven't heard in a long time. the cool kids use Homebrew these days. first thing i do on a new mac. (I lie: second thing, after updates) ;-)
Re: Why?
It was when I ran the same version of Linux Mint on both my G4 Mac Mini and my then-already-not-current (32-bit) Raspberry Pi, and the latter completely stomped on the former, that I finally, *finally* felt it was time to let the G4 Mini go. Found a copy of Leopard to put on it and *sold* it even then, someone paid actual money for it! Those things are just immortal.
Re: Why?
I'm mostly with you on the "Why?" - they got me a long time ago Winston. ;-) And for the Linux I do need, Fedora 39 running in VMWare is very nice indeed already.
But a couple of counterpoints nonetheless.
Typically, in the past, Linux has been faster than macOS on the same hardware. If that matters to you, and once hardware support is complete, there's that.
More importantly maybe: Linux is what you traditionally put on a Mac when Apple don't want to support it with software updates any more. And in my own experience as far back as PPC, they can find themselves being Linux systems far longer than they were ever Macs... because yes, they're just made that well, they go on forever. Especially the Mini. We don't yet know how long Apple are planning to support Apple Silicon macs for: They're still selling the M1 MacBook Air *new* on the site for a start, so it's not going to be for a few years yet. But when it happens, it's good to know Linux will be there, keeping these machines useful and out of landfill for a few more years yet.
Saving a loved one from a document disaster
Re: Imperrfect
it might also have been that the people whose jobs involved lots of typing were more likely, back then, to have actually trained as typists, and therefore avoided RSI through better technique. (e.g. the tops of your hands should be level enough that you could balance coins on them, with your wrists not resting on anything.) Also this was literally the era of IBM mechanical keyboards, when IBM Model Ms were new and had competitors.
I was a young coder in the civil service at the time and I was warned not to make it too obvious I could touch-type, because a) I'd be given more of that kind of work and b) the secretaries wouldn't like it. hell with that. hell with those days.
Is it broken yet? Is it? Is it? Ooh that means I can buy a sparkly, new but otherwise hard-to-justify replacement!
Re: 16GB should be enough for anybody...
nah it doesn't. I'm using a 5K screen, and slum it with a 4K second screen next to it only because I can't *get* a second 5K screen... For coding.
I have a colleague who works slouched over a laptop screen using vi. He seems to like it that way. It's perverted is what it is...
(el reg doesn't let us add emojis. Imagine them. I can't be arsed to turn off the auto-replace that puts them in when I type them old-skool)

16GB should be enough for anybody...
Let's be honest, a lot of us have been in this for long enough that the old reflex of always trying to get the maximum spec at the time of purchase is still hard to shake off.
But actually... I had a 2013 Haswell system with 16GB that's now DEAD. Its replacement also has just 16GB and is under no strain. I had a 2015 Macbook Pro with 16GB that's long since been packed off to someone through Ebay, and the XPS13 that replaced it has 16GB too and is... also not exactly bursting. And I can still do anything I need to do in 16GB except maybe run a couple of generously specced virtual machines in my iMac which... I hardly ever need to do, and your options that way in M1-land are limited anyway.
I think we may have really hit the moment where ... dare I say it... unless you have specialist needs 16GB actually is enough and has been for years and years.
(Ability to add multiple monitors though, *that's* holding me back from M1ness...)
I'm not even going to make the point about a given amount of memory in an M1 behaving like more than it would on other architectures. Maybe it does, because of the reduced need to copy data around the place, but frankly 16GB has been enough in macOS, Windows and Linux on Intel/AMD systems for years now. If you really do need more, you probably have specialised needs.
... or you could maybe stand to close/bookmark a few chrome tabs...
Can a 21.5-inch iMac beat the latest-and-greatest M1 model in performance? Kinda
Leopards...
It's even worse than you said: Leopard was the last PowerPC MacOS X. Snow Leopard was Intel-only. (The Intel switch happened during the Tiger period.) My first and last G4 Mac Mini got an upgrade to Leopard, and after that it was a Linux box. (Which I sold only a couple of years ago for actual money, which was shocking, I thought...)
That said, the Mac OS X release cycle then was every *two* years, not every year. So the first Intel iMac arrived in 2006 and the first Intel-only Mac OS X was 2009. The same gap equates to two or three more MacOS releases for Intel, which doesn't sound *quite* so bad...
Facebook finally finds something it thinks is truly objectionable and needs to be taken offline: Apple
Linux 5.10 to make Year 2038 problem the Year 2486 problem
We won't leave you hanging any longer: Tool strips freeze-inducing bugs from Java bytecode while in production
COBOL-coding volunteers sought as slammed mainframes slow New Jersey's coronavirus response
COBOL coders came out of retirement to help fix the Y2K bug.
But that work was more than twenty years ago - they were already coming out of retirement to do it back then. A lot of them probably just aren't around any more.
Technical debt always comes due, but it can be paid in installments. Maybe "if it ain't broke don't fix it" isn't such a great idea after all.
Built to last: Time to dispose of the disposable, unrepairable brick
Bada Bing, bada bork: Windows 10 is not happy, and Microsoft's search engine has something to do with it
Cheapskate Brits appear to love their Poundland MVNOs as UK's big four snubbed in survey again
Good thing about GiffGaff: Very easy/low commitment to give it a try.
Even better: When you do, you find out very quickly that the bandwidth available to data is absolutely appalling. IIRC less than a tenth of what I get through Three at the same location/time.
So, you don't waste a lot of time and commitment finding that out. Although you will be spammed with offers for some time afterwards, thanks to that hour or so of experimentation.
Micron's new 9300 SSDs are bigger, faster and simpler... which is nice
Microsoft: New icons, new drivers, AI! Everything is awesome!
Microsoft slips ads into Windows 10 Mail client – then U-turns so hard, it warps fabric of reality
Upgraders rejoice! The 2018 Mac Mini heralds a return to memory slots!
Re: I really don't care that much about upgradability....
Time Machine backup will be completely adequate for that, and it doesn't even have to be to a *fast* external drive.
And that would be the main answer to whether to make RAM or SSD internally upgradable: there are plenty of options for external storage, from your basic USB-attached drive to a 10Gbps USB-C Gen 2 drive or a Thunderbolt 3 RAID array.
But you can't do that with RAM. It has to be on the main board. So making *that* upgradable is the one that really matters, if it must be one or the other.
I do think they could have put in an M.2 socket for an NVMe card. But I can't bring myself to care all that much.
Counterpoint: My XPS 13 came with a 500GB SSD. It's split into two partitions, for Windows and Linux. Both of them are still well under half full. If/when I do buy one of these Mac Minis it's likely to be with the 256GB SSD option and that's only because I think I *might* be running VMWare Fusion on it one day.
You need to be editing 4K video in quantity to start really needing huge amounts of *really fast* storage. And in that case you'll probably want something external anyway.
it looks like the space was used instead for cooling - that "hulking great fan". That seems to be one reason for sticking with this form factor rather than going smaller (the other being easy swap-in-ability when replacing previous models in datacentres and other tight spaces). I'm hoping for a machine with the highest CPU option that won't be throttled back to oblivion if you actually ask it to do some work for more than a few seconds at a time. It's looking hopeful, but I'd like a bit more corroboration than one reviewer's informal ffmpeg test. Almost ready to fork out. It looks like the Mac I've wanted for some time.
Apple WWDC: There's no way iOS and macOS will fully merge as one
Does my boom look big in this? New universe measurements bewilder boffins
*Wakes up in Chrome's post-adblockalyptic landscape* Wow, hardly anything's changed!
The eagle has been grounded: Dutch anti-drone squadron retired
Intel stuffs extra cores into latest mobile Series U Core i5 and i7 chips
Software dev bombshell: Programmers who use spaces earn MORE than those who use tabs
What about spaces *and* tabs?
Yes, there *is* such an indenting style. The Java style guide used to mandate it in ye olden days (not any more): tabstops set to 8, but indent by four. So one indent is four spaces, two indents is one tab, three indents is a tab plus four spaces, four indents is two tabs, and so on.
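To make that concrete, here's a made-up little snippet laid out that way, assuming the editor's tabstop is 8. Literal tabs won't survive a comment box, so the comments spell out what the leading whitespace at each level would really be:

public class MixedIndentDemo {
    public static void main(String[] args) {     // 1 indent  = four spaces (column 4)
        if (args.length > 0) {                    // 2 indents = one tab (column 8)
            for (String arg : args) {             // 3 indents = one tab + four spaces (column 12)
                System.out.println(arg);          // 4 indents = two tabs (column 16)
            }
        }
    }
}

Open that in anything with tabstops set to 4 and the indentation collapses instantly, which is most of the reason it deserves the hate.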
No, I don't use it, not now I have a choice. I have nothing but hate for it. Most editors can't even support it for both viewing and typing. But some nutters seem to actually prefer it.
Me? Used to be all-tabs, now all-spaces, but it doesn't seem to have much impacted my salary.
She cannae take it, Captain Kirk! USS Zumwalt breaks down
'Leave EU means...' WHAT?! Britons ask Google after results declared
Adblock wins in court again – this time against German newspaper
The Nano-NAS market is now a femto-flop being eaten by the cloud
AdBlock Plus, websites draft peace deal so ads can bypass blockade
LogMeIn adds emergency break-in feature to LastPass
Re: AgileBits
@JimmyPage then what's the difference between [1Password] and LastPass?
Well, the one that clinched it for me was not having the vault stored on their servers, which removes the "emergency access" questions above (although it also means not having that feature, of course). I'd ideally have it in my ownCloud but make do with it in Dropbox for the sake of syncing to my iOS devices.
No Linux support in 1Password, though. I use Macs now, but this is one reason I can no longer say I stick to platform-agnostic software. (The other being Ulysses.)
Doctor Who: The Hybrid finally reveals itself in the epic Heaven Sent
Chips can kill: Official
Re: What about that 'High Fructose Corn Syrup' then?
Actually I think there's been recent research showing it might be the artificial sweeteners contributing to T2 diabetes in obese people. After all (and speaking from experience) obese people *do* make continual efforts to lose weight, and one of those efforts is to use stuff with artificial sweeteners.
Nothing about any *specific* low-calorie artificial sweetener; rather the general effect of consuming something which your body is initially fooled into thinking is sugar, triggering a release of insulin to process said sugar - which never arrives in the gut. And then we wonder why our blood-sugar regulation goes on the blink...
If you want something sweet, probably best to have something with normal cane sugar in it, or honey or somesuch. Just not too often.
Multiple fondling on the MIGHTY 12-INCH iOS 9 SLAB — so, so close now
Force your hand: Apple 13-inch MacBook Pro with Retina Display
Bite my shiny metal Ask: Java for OS X crapware storm brewing
I am a Java dev, albeit mainly server-side. But this pisses me off so much. I mean, client-side Java has *enough* problems with end-user acceptance already, much of it ill-informed (about as valid as refusing to use Windows now because of a bad experience with Windows ME) but nevertheless there, without *this* too. It's really, really unhelpful. I can't believe (especially after the Lenovo debacle) that they really get enough from it to be worth the reputational damage. No-one, but no-one, wants the bloody Ask toolbar!
I think if I were developing something for the desktop now, I'd be looking at the packager tooling: create it as a standalone app that bundles a minimal JRE for its own use, work hard at seamless OS integration, and quietly not bother the user with even the knowledge that it's Java. But if you're going to do that, you're probably better off going native anyway.
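(For what it's worth, on a current JDK the bundle-a-minimal-runtime part of that is roughly what jlink does these days; a rough sketch only, with the module list and output path invented for illustration:)

import java.util.spi.ToolProvider;

// Rough sketch: build a trimmed-down private runtime for an app to ship with.
// Module list and output path are invented for illustration.
public class BuildMinimalRuntime {
    public static void main(String[] args) {
        ToolProvider jlink = ToolProvider.findFirst("jlink")
                .orElseThrow(() -> new IllegalStateException("jlink not available on this JDK"));
        int rc = jlink.run(System.out, System.err,
                "--add-modules", "java.base,java.desktop", // only what the app actually needs
                "--strip-debug",
                "--no-header-files",
                "--no-man-pages",
                "--output", "build/minimal-runtime");      // the private runtime to bundle
        System.exit(rc);
    }
}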
BTW if you install the JDK rather than the JRE from Java.com, no Ask toolbar. And there's a setting in the Java control panel to turn off future prompts to install Ask, but it's really not good enough. They reserve the worst experience for the poor bloody end user.
Just WHY is the FBI so sure North Korea hacked Sony? NSA: *BLUSH*
Google+ architect: What was so great about Reader anyway?

The fact that it was *just* an RSS reader, without all sorts of irrelevant social media crap, and could be used seamlessly across many devices.
Trying to get used to Feedly for the last 24 hours. It's so... bloody... annoying...
But I think we're reminded of the saying: If something's free, you're not the customer. You're the product.
Review: The ultimate Chromebook challenge
Re: Multiple accounts?
Haven't used this latest version of chromeos, though i doubt it's worse. In the original, when you turn on, or log out, you find yourself at a login screen where you use your google login. So switching is a matter of logging out, then logging into the other. I expect current chromeos would be the same, but don't actually know that.
Which means having both active on the screen at the same time probably isn't going to work.
Latest exoplanet discovery is a virtual CLONE of Earth
Review: Apple Mac Mini 2012
agree, as a multiple-mac user myself; the mac mini under my tv is running linux, but only because it's spare and capable. i wouldn't be buying a new one just to be a media player.
Although I did once, running Plex; but I was feeling richer than I was in reality, and it didn't work out (aforementioned HD3000 bug wrt 23.976Hz) so that machine ended up on different work anyway, and I a little wiser. :-}
Re: If you want cheap
any old thing these days can do 1080p, including a raspberry pi; the difficulty is in dealing well with interlaced material. Ideally you want temporal/spatial deinterlacing, for which the GT520 is needed. The 320M in my mac mini can't do it, only getting as far as VDPAU bob (although to my eyes there's not enough difference to be worth the extra outlay).
Needed if you watch a lot of BBC HD output through it. :-)
the 2009 mac mini (which looks identical) has an nvidia 320M in it and as such runs XBMC on Linux through HDMI just perfectly; although thanks to their odd EFI implementation it's a bit of a fight getting the initial linux installer to boot.
Buying a media player outright now, I'd rather get a Zotac ID80, with an onboard GT520 - rubbish for gaming apparently, but therefore cheap and perfect for media-playing. Even if cost isn't an issue, these Intel HDx000 integrated GPUs seem to have been a bit of a disaster for media-player-type use. The HD3000 models couldn't even lock to 23.976Hz for movies; I don't know if they finally fixed that for HD4000, but shipping with this HDMI bug doesn't give confidence that they're paying attention to that use-case.
I think the current crop of mac minis are great, but not as media players. They're fine desktop machines if you want to choose your own monitor, and great little servers.
Half of us have old phones STUFFED in our drawers

the two old iphones i have in a drawer are dead, nonfunctional in different ways that have resisted repair. (One was dropped on a slate floor: the computer still works but the screen is dead. The other had a dead mute switch, which I tried to treat by jailbreaking it to replace the switch with a software one, but I ended up bricking it.)
would they be interested in those?
Nvidia heralds Steam for Linux debut with 'double-speed' drivers
not a gamer - but i always presumed the game-players' fps figures were with respect to some benchmark example of gameplay; and thus that if a card could play *that* at hundreds of fps, it could play newer stuff with far more detail at the monitor's actual refresh rate with ease.
Is that actually so, or am I making the fallacy of assuming a logical explanation when it's probably as arbitrary and variable a unit as women's dress sizes?