An ex-coworker once riffed on O'Reilly ...
76 posts • joined 28 Sep 2011
Honestly, other than more rows in an Excel spreadsheet, it would be really difficult for me to name an "improvement" to any of the Office Suite products since 1997. Not that I've been able to use them for a while (had to abandon those old 32-bit Windows apps after ... XP?) so maybe it's just nostalgia, but they pretty much lost me when they introduced "the ribbon" (2007).
In fact, I was just looking for the link to a radio snippet about the hominid adaptation that allows us to metabolize (yes "Z" or "Zed" -- I'm a Yank) alcohol, allowing hunter gatherers to digest fruits & berries past their prime. Instead I found an equally interesting snippet that explains why many of my E. Asian friends get flushed when they try to keep up: https://www.livescience.com/61845-evolution-may-decrease-alcohol-tolerance.html.
This story reminded me of the Harley-Hampton twins (Devonte & Deante), talented young high-school (American-) football players. Word on the street was that they were offered a scholarship to play for the local Catholic school rival to our public (state-run) school, where my kids went. They wanted Devonte to cut his dreads. The public school, OTOH, told the boys they'd take them, "as-is". Sold! Those boys took us to two state championships in their varsity years. They gave it their all. They left nothing on the field. I understand they went on to (so far) success in the NCAA (university level league). The Catholic school has since dropped to a lower division.
... you'll find that in addition to an accentuated reliance on the likes of Amazon for goods you might normally just pop out to the local big box for, the coronavirus has inexplicably ushered in a dramatic rise in the incidence of car-jacking, or "grand theft auto", the taking of a vehicle using the threat of those deadly weapons which are so ubiquitous in the US. I have to think that jacking an Amazon delivery would be a bonus for one of these thugs.
Apologies to Winston Churchill. I used to hate Salesforce, till I tried some of the alternatives... With #slack, it's rather the opposite. After years of complaining about chat systems (for 90% of all use-cases I still prefer email to getting a real-time interrupt that says "Joe is typing..." ... still typing ...; plus, I long ago drank David Allen's Getting-Things-Done "the fewer inboxes the better" kool-aid) I have come to hate #slack much less than the others.
Not that there's any ageism in our field but ... a few years back I read a post (likely on LI, but I can't be sure; memory's the 2nd thing to go, you know) about how to write a resume that doesn't divulge your age. Losing the double spaces was number 1. Along with omitting a snail-mail address and of course the date of your University graduation.
Those of us who have been doing this for a while will also leave off the side job we had in school overclocking Apple IIs, and other ancient history. Done correctly, they won't distinguish yours from the CV of someone whose career began where your reverse-chron CV ends.
By the time you show up in that suit & tie you haven't worn since your niece's 2nd wedding they will hopefully be invested enough in your skills to look beyond your attire and the grey beard.
I thought that was strictly a MacBook thing?! I've owned or been assigned (by my employer) dozens of Windows lappies over the years (not to mention Linux ones, but they were generally loaded with Windows when I got them), and the only time I've had this issue was on one of the two MacBooks I've carted around during that same time period.
.... after high-level languages came along to run the same 3GL across hardware from multiple vendors, came the rise of the operating systems. Where once the PHB might have asked "But will it run on our Burroughs ( / Honeywell / ....)?", the question later became, "Do they make a compiler / runtime for ... CP/M ... MS-DOS ... Windows or Unix ... Linux?" This article made me think of that, though; I'd forgotten how locked in we once were to specific hardware vendors. Now if only I could find a version of the audible.com client that runs on my Dad's old Windows phone so I could share this book with him....
>> for most people the OS they use is pretty irrelevant.
This is probably truer than it once was, but still not quite.
Having made the switch from Windows 10 to High Sierra within the last year I have to say there was a definite learning curve, and this from a guy who has been using bash (and ksh, tcsh, csh, sh) for 30 years now. If I logged my wife or kids (3 Windows users and one Chromebook) into my MacBook Pro I doubt they could do much with it. YMMV.
Only been using Linux since the late 1990's; before that I used commercial Unices. That said, there are *still* things that are just easier in Windows: Streaming audio & video. Tax software (in the US most of us spend a weekend a year, or pay someone to, trying to keep as much of our hard-earned income as possible, since it's not like the Guv'ment actually provides any services in exchange for it; we're on our own here for health insurance, university education for the kids, transport, etc.). Building and using a toy database. (This one shocks me; even Oracle used to be easier to install and run on Linux than Windows, but recent experience with 18 seems to have completely flipped that.)
So at home my Windows 10 desktop gets at least as much use as my Ubuntu box, and of late, my MacBook Pro even more, though since I'm new to it there's a certain Bright Shiny Object appeal there.
YMMV, but for me it's different tools for different jobs.
I almost upvoted you until I considered the disrespect our Mango Mussolini has shown *science*. And journalism. And democracy ....
I don't think respect is earned. I grant a modicum of respect to everyone I meet. I give them the benefit of the doubt. Disrespect is earned, and the Don has been earning it since the beginning of his public life, and continues to do so with every speech, every tweet.
It's not 1 April, so I don't get it. Things Palm did well:
1) Run hundreds of free or nearly free apps.
If it's an Android, this should do that.
2) Keep a handy copy of all your immediately essential data (contacts, calendar, to-do...) *offline*.
3) Sync copies of that data to a "real computer" (desktop, laptop) with nothing more expensive or tethered than an occasional serial connection.
Really? Z80-based Unix? A 32-bit O/S on an 8-bit chip? I seem to remember that Xenix (when it came along) was a 16-bit port, and that Zilog did make more powerful chips based on the venerable Z80, but they were never quite as popular. All the Z80s I ever touched ran CP/M, and I'd have saved a month or so of beer money for a Unix that ran on them.
No, I remember well that you could take down certain SPARCstations by unplugging the keyboard. Weird, but true. I was working for a trading firm where the Traders were in fact quite smart, and did have access to the glassed in room with the raised floors, so nothing about this story sounds at all improbable to me (except the duly noted anachronism of the buzzword 'DevOps'). About that same time, working late I accidentally did an rm -rf ./src in the wrong place, deleting all the source code staged for our current build. As I frantically called my boss (for visibility), I remembered that our NAS took frequent snapshots, and was able to restore the company's tens of millions of lines of proprietary source code, and kicked off a validating build that came up clean. This was way before "git" so source control systems were centralized, and our developers were not overly dedicated to committing code frequently, so this no doubt saved my job, if not the company. Ah, the good old days...
Have you actually seen / held the iPad at Best Buy? If I had a nickel for every time Best Buy had a great deal on a fondle slab that didn't exist in the inventory of my state or the 6 US states east of me (wanted one for Mother's Day, and was willing to pick it up en route) I'd have 5 cents; because after that their ads went straight to the recycle bin. No, they would not give me a raincheck.
Goal setting for example: my twin brother & I decide over a pint that we both need to lose weight. I (Leisure Larry) start eating better, exercise a lot more, drink fewer pints. My twin (Donny Disciplined) goes with an elaborate plan to lose X by date Y, and 2X by Y+7, etc. Who ends up losing more weight? Conventional wisdom says that Donny bests me in terms of both weight and time, but has anyone done the controlled experiment? What if his goal was too low, and I exceed it for not having set one at all?
The goal of agility after all is not to build software more cheaply, or produce software of higher quality as the article seems to imply. It is simply to adapt to a world in which requirements change more quickly than software can be developed using prior (e.g. waterfall) methodologies. Given perfect requirements you can build higher quality software more cheaply using waterfall than agile. I was surprised to hear none other than Alistair Cockburn admit as much a year or so back at a Groupon lunch & learn (Geekfest). Good luck getting perfect requirements.
I'm old enough to remember waterfall, and UUP (née RUP) with its 125 pre-coding artifacts, and a number of other pre-agile methodologies, but I have been working with agile teams (various flavors) for a good dozen years or so, and for most projects I would refuse to go back. It's not perfect, and it can be light on design & documentation, but you deliver something sooner rather than later, and that something ends up being more useful to more users, in my experience.
I don't need a controlled study to prove it to me.
My own relationship with perl is complicated. I used it for everything that was too laborious to do in C (and later java) for a good dozen years or so. I used to ask the python evangelists, "What can I do in a .py that I can't do in a .pl..." just to watch their enthusiasm dim as they skulked away.
Then something changed. Perl 6 was supposed to happen, but didn't. I had to begrudgingly admit that I could read almost anyone's python code but had a hard time making sense of my own 5-year-old perl hacks.
And bugs! Constructs I'd been using for years suddenly stopped working, on Linux even, not just Windows & Mac.
So perl is like that old Honda that used to be my go-to way to get lots of places, and suddenly it's no longer as reliable, and I am starting to covet my neighbor's Hyundai.
There's actually something called https://www.enterprisedb.com/ which provides an Oracle (the flagship DB, not the company :-)) compatibility layer between the application and its data. I've only played with it, but the idea is that you could build your app on free PostgreSQL, limiting yourself to only the features also supported by the big O, and if / when you need to "upgrade" you migrate your data and you're good to go. I'm surprised by how little uptake / attention this seems to get. I doubt that many user orgs actually get to the migration phase, which may be the point of it.
As for the old Sun product line, I have fond memories of Solaris / SPARC / Java and the rest of their product line. It was great to be one of their customers. Somewhat less so for Oracle, I'm afraid.
Remember when Sun Microsystems first rolled out Tomcat (er, uh, Catalina) as a "reference implementation", a wheel they expected others to reinvent? Remember how, since it was plenty good enough for its time, most of us just used Tomcat?
Same thing here. Docker completely owns the container space, so theirs will be the reference implementation for the spec.
... but also at math; he being a Russian, me being a Yank and all but um....
“Twenty years from now our children will look at us driving cars and be baffled,” he said. “Look at the death rates from driving. Now, when a computer driver makes a mistake it’s big news, but look at how many die on the roads due to human drivers.”
While I do think self-crashing cars are inevitable, given the infinitesimal number of them currently on the roads, the fact that they're already making (sometimes fatal) mistakes does leave the impression that we're off to a rather bad start with them.
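The raw-counts vs. per-mile point behind both posts can be sketched with some back-of-the-envelope arithmetic. All figures here are illustrative assumptions, not real statistics:

```python
# Hypothetical comparison of fatality rates per mile driven.
# Every number below is an assumption for illustration only.
human_deaths = 36_000            # assumed annual US road deaths (order of magnitude)
human_miles = 3_200_000_000_000  # assumed annual US vehicle-miles travelled
av_deaths = 1                    # assumed fatal self-driving incidents in a year
av_miles = 10_000_000            # assumed self-driving miles in that same year

human_rate = human_deaths / human_miles  # deaths per mile, human drivers
av_rate = av_deaths / av_miles           # deaths per mile, autonomous vehicles

# With these assumed figures the per-mile AV rate is the worse one:
# raw counts flatter the tiny AV fleet, per-mile rates do not.
assert av_rate > human_rate
```

Which side of that inequality the real numbers land on is exactly what the "infinitesimal number of them on the roads" caveat is about.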
>> What's your point? There is Java, Java 2, Java 3, ..., Java 8; C++ 3, C++ 11, C++ 14, C++ 17.
Yes, but by & large they have backward compatibility. At least java does. I haven't touched C++ in years, but as has been amply pointed out in this thread (I'm paraphrasing a bit), "You can write K&R C in any language" (including C++, last time I checked), at least with a few gcc flags and ignoring the warnings. This is *not* true of Python (or for that matter, my own favorite scripting language perl, though in fairness I'm guessing that the vast majority of all perl ever run in production anywhere was perl 5.x).
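A minimal sketch of the kind of break I mean, runnable under Python 3:

```python
# The Python 2 statement  print "hello"  is a SyntaxError in Python 3,
# and even code that still parses can silently change meaning:
#   Python 2:  3 / 2  ==  1    (floor division on ints)
#   Python 3:  3 / 2  ==  1.5  (true division)
result = 3 / 2
assert result == 1.5  # only true on Python 3

# Python 3 moved the old integer behaviour to the // operator:
floored = 3 // 2
assert floored == 1
```

Same source text, different answer depending on the interpreter version, which is precisely the backward-compatibility gap Java mostly avoids.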
Then what's the point? If self-crashing mode requires the driver to be able to take control at any point, then why pay the big bucks for it?
I don't even like cruise control for this very reason: Some fool pulls out into the left lane (typically without signalling a lane change -- I digress) doing five under the limit when I'm doing five over, and tapping the brake disengages the bloody thing.
I don't see the point.
You mean the internet is supposed to be even slower today than yesterday!? I was WFH yesterday in Chicago, and every single time I hit a URL for the first time I initially got a DNS failure, followed a second or so later by resolution. Please tell me El Reg is on vacation in Singapore or somewhere the date is the 13th, and the *Day of slowness* (which I am only just hearing of!) happened YESTERDAY!
I was making a living in IT for at least five years before anyone ever made a pointing device part of my standard hardware setup, so .... I'm pretty comfortable with a keyboard.
Apple, MS & others have been shoving pointing devices & GUIs down my throat for so long that I've finally gotten to *like* them! Then along come the phone and the tablet and "gestures", which originally just caused browsers to scroll oddly if I was too far right on the touch pad ... (better stick to that keyboard after all) and you know what? I got used to them too! In the meantime, they brought multi-tasking to the masses by introducing the notion of windowed interfaces which could be minimized and brought up side by side on the same screen, and...
Then Windows 8 comes along trying to capitalize on the fact that the Apple juggernaut had two completely different interfaces (iOS & MacOS X) even as they were rapidly converging their hardware offerings. Unfortunately, by trying to be all things to all folks (<Alt>-C & start typing! OR touch the far right side of your screen, OR...) they failed everyone. (Metro, of course, puts paid to the whole multi-tasking-as-multi-windowing paradigm as well.)
That said, my Win 8 notebook (never upgraded to 8.1, but configured to be desktop-centric: I literally never see the Metro interface) is starting to flake out, and rather than get a Win 10 replacement I bought my first MacBook Pro. It's got a decent bash shell, a nice responsive UI, ... what's not to like (other than Safari, but it served to download Chrome & Firefox quite well)!?
Now, if only it had a touchscreen....
... for what I always call Deep Sh*t Nine, ... they don't "Boldly go ..." anywhere. It kind of ruined it for me. Also, they have money. Both Kirk & Picard pontificate often about the nastiness of societies where money makes the world go 'round, even as Riker goes off to play poker and Paramount kept milking the cash cow that ST became. DS9 might have been a very nice series on its own, but it's the black sheep of the ST family, IMHO.
> The USA grew into a surveillance state under Obama, ...
As you mention yourself, W. and the neocons did this. The wars in Iraq & Afghanistan, the "Patriot" bill, the whole nine yards. Has it gotten worse? Yes, gradually and over time, but I stopped flying "for pleasure" well before 2008, since it just wasn't pleasurable anymore. Now I only fly for work, and only when absolutely impractical to do otherwise. "Sorry boss, to save 3 hrs of humiliation by the TSA goons, I'm driving to NY so I'll need an extra day and lodging on either side of the trip -- oh, and I'll be taking Friday's meeting from the car...."
Your argument is equivalent to blaming Truman for our involvement in WWII.
If you check the TIOBE index I think you'll find that java peaked in 2004, a good four years before any of us ever heard of Android. My own experience (wrote C in 1984, started switching to java in '96, switched completely in 2000) confirms this; YMMV.
Lost in the religious wars here is the number of programming concepts java "democratized" (dumbed down, if you're a cynic): multi-threading & concurrency, object-orientation, safer memory-indirection, inter-process communication & networking, design patterns, embedded documentation [javadocs], generated unit-tests [junit]. I for one never really adopted C++ because it was easier to keep writing structs & functions than learn about classes & methods; Java didn't give me those options. With all this built in to a cross-platform language (no #ifdef LINUX in java), not to mention runtime memory management, bounds-checking and the like, there are of course associated costs. Nothing is free(). :-)
The language is nothing if not resilient, having morphed from Oak's set-top language, thru webtone "java-stations", thru the darling of the WWW, then the place we put business logic when we abandoned client-server, now REST services and [Android] mobile apps.
If I had it to do over, I don't think I'd have abandoned java for PHP or Ruby or ... whatever the next fad may be.
Biting the hand that feeds IT © 1998–2021