Re: "we understand..."
That sums it up fairly accurately.
I was a rabid OS/2 user back in the mid to late 90s until Linux wooed me and eventually swept me off my feet. It was, indeed, the better DOS and Windows that IBM promised and was a literal godsend for my use case of managing a mid-size Novell network over both ARCnet and TCNS with Windows clients aplenty. I'm aware that many banks and ATMs used OS/2 in its various incarnations until the mid 2000s but is it still being used in a professional capacity today at a volume worthy of keeping alive? Being stuck at 32 bit (which was hailed as a revolution at the time) severely limits its use today except for hobbyists like me that will buy a copy. That said, I'm glad OS/2 is still around and getting the love it deserves.
Judging by my two 20-something daughters I'm inclined to agree. Whenever I call them to verify that their vocal cords are in fact functional I receive a disapproving tone of voice that suggests "Jeez Daddy, you're so old". My teenage son, on the other hand, refuses to communicate with anyone unless it's through TikTok or Snapchat.
Agreed, some of my favorite games are produced by independent studios. The last AAA game I bought was Forza Horizon 4 and it took ages to sort out that mess. Benchmarks ran just fine but benchmarks didn't show the horrendous input lag that made it unplayable or Microsoft's cavalier attitude in fixing it. Never again.
The caveat "1.7x faster than its predecessor, when using its frame generated DLSS 3.0" is typical Nvidia marketing behavior and an apples-to-oranges comparison of gen-vs-gen performance. Of course DLSS runs faster when turned on, that's the whole point of it, if the games support it. No mention of 3060 Ti vs 4060 Ti non-DLSS performance gains, coupled with comparing it to a card two generations behind it, makes me even more skeptical of their claims. A 10-15% actual gain is much more likely but doesn't sound as sexy in the marketing material.
>> goodbye, nVidia
You might want to explore Intel's Arc A750/A770 as the drivers are maturing week by week. The just-released 6.2 kernel has Arc support baked in, so no need for twiddling or running Ubuntu to make it work properly, and the price/performance ratio beats anything nVidia or AMD is offering. Everything I've read is very promising and Arc might do very well indeed on Linux.
I worked in a NetBSD shop around the time of the Dragonfly split and through the years I've heard many variations of the argument (sometimes in raised voices) about which approach works best. Consensus seemed to be that both do, and since Net and Dragonfly (mostly) peacefully coexist I don't believe there is a "right" or "wrong" approach. As Liam said above, choice is good.
I'm embarrassed to admit it's a matter of pure convenience on our part. It is much easier to say "Alexa, play white noise" than it is to muck around with Bluetooth and plugging in yet another device to save the battery. Trust me, we have tried myriad ways and this is the easiest and best solution for the both of us. The app has a ton of settings to sculpt the sound to your liking, as it's not easy to get two people to agree on what constitutes "white noise" as opposed to just "noise".
We have a number of Alexa devices in the house and we use them just as the study suggests: music, reminders and the occasional argument settler. The only Alexa service we pay for is the advanced white noise app that we use for sleep and it is worth its weight ($2.99 a month) in gold. I'm willing to bet Amazon will increase the cost of Prime for Alexa devices over and above the cost of regular Prime service to keep the accountants happy. When this happens we will bin the lot and go back to earplugs.
I was using the iPad as a reference, obviously they are for completely separate markets. What exactly are you going to be filling that larger storage with short of media? It certainly won't be touch native applications. I'm no Apple fanboy (I champion all things Linux and have for years) but the value of Apple's massive ecosystem can't be overstated. Adding a mouse or other input device such as a keyboard would make it far easier to use normal Linux apps but now we are getting away from being a 'tablet' and closer to Microsoft's Surface. At $400+ this is outside the realm of 'nice to have plaything', at least for me. I wish Juno the best of luck but I don't see a compelling market niche for it.
..for a no-name, non-upgradable Chinese tablet with a slow, low-end processor. I'm all for more Linux-based appliances, but Apple's base iPad is $100 cheaper, boasts far higher specs, has an OS designed from the ground up to be touch native and a vast ecosystem. Tablets are not flying off the shelves these days and I'm struggling to see what market niche this would fill other than developers working on touch-native applications.
>>Decision-by-committee is the worst death by a thousand cuts
Sounds like you have been subjected to one too many "product planning" or "customer focus" meetings, as have I. Herding Jell-O cats is easier than getting a firm decision by a group of people with conflicting interests.
I'm thinking more or less along similar lines: what happened to the push for efficiency? When AMD announced their Ryzen 7000 series and the large boost in TDP I thought it was a misprint at first and now Intel is doing the same thing. So much for efficient and cool running CPUs and their less costly cooling requirements. It's no wonder ARM is making inroads in the data center, small though they may be.
>> I'm not interested in sharing the minutia of my life with strangers.
That, in a nutshell, is why I've stayed away from almost all forms of social media. If I wanted to know what you ate last night or what film you watched I would ask; I don't need to read about it the next day. Years ago B3TA did a "show us your useless inventions" Photoshop challenge and one of the winners was Twitter-enabled bog roll. "I'm wiping my ass with Tesco Ultra Soft" is not too far removed from the drivel that populates most Facebook or Twitter feeds.
More like Talky Microwave. I have a mid-range micro that announces every key press with a shrill and incredibly loud beep with no option of turning it off. Why? Ostensibly it's for the visually impaired, but a closer inspection reveals that to be false: the number pad is perfectly smooth (so no tactile feedback) and the tone is the same for every key press (so no audible differentiation). The best I can come up with is the micro has taken it upon itself to inform the entire household that Dad is back from the bar and reheating that last piece of pizza at 2 am.
I remember using A/UX 3.x back in the '90s. It was lovely: the stability of Unix (with some BSD goodies) paired with the Apple Finder grafted on top. It's a shame Apple never continued development and restricted its use to NuBus-equipped 68k Macs; it might have replaced the mess that was classic MacOS with something far more modern.
Back in the day I asked one of our Dectites (the DuPont name for those brave unfortunate souls) a technical question and he referred me, with some reverence I might add, to the VAX/VMS documentation library holding forth in its own vast storeroom. Had to supply my own breadcrumbs to find my way back. Never did find the answer.