Re: I hate to say it, as I don't like the way they work...
In fairness, they do often change things that worked fine before.
692 publicly visible posts • joined 3 Aug 2020
I can do you one better: I have a Windows 98 machine running an SSD. That's through a SATA->IDE bridge of course, so max throughput is limited, but the improvement in random R/W is just silly. It's such a nice OS to use under those conditions, and you can really wring out the Duron.
Of course you don't get the clicky-click experience of a cheap old HDD, but those things are failing so often I don't even bother installing them.
To me it implies significance, but not subservience. My primary objective is more important than my secondary objective, but that doesn't mean my primary objective ACTUALLY CONTROLS my secondary objective. Further, primary may simply mean "first line" (primary defensive structures, primary power system, primary safety system...). It does not de facto mean "one thing controls the other".
Some of those are utterly insane. Since when does the word "native" have primarily negative connotations? They are just looking for press (which they got).
I understand wanting to get rid of Master/Slave, even though I disagree with doing so. What I have noticed is that generally anybody trying to remove those terms also bans any other terms which indicate dominance and subservience, and they wind up with things like "primary and secondary". But...in computing quite often one machine or piece of software IS dominant over something else. That's what we want to indicate, and the term "peer" does not capture it.
What this reveals is that the ban is not on a word, but an idea. Attempting to ban an idea in social discourse is already unacceptable IMO. Attempting to ban an idea in technical discourse where the idea is essential to the subject is just dense.
In a recent networked application that I wrote, I used the terms lord and serf. Is that acceptable?
Read your John Stuart Mill (I know, he's like wallpaper paste, but it's important stuff). One underlying principle of western democracy is that the government cannot impose consequences for non-violent action, but private individuals can. This means that ethics and cultural norms can still exist, but they are driven by the people as a group instead of by a central authority (which can still serve as a moderating force to prevent ground-up enforcement of morals from being too violent or otherwise extreme).
That said, when we have business entities more powerful than governments, it is my view that they must be treated as the central authority and not as "private individuals". But that's not strictly related to this article.
> but things like drag and drop to copy paste aren't present.
Which is pretty funny when you consider that, in the 80s, copy-paste was considered such a significant advancement that Computer Chronicles did an entire episode on it (I think they called it integrated software or something, but it was literally just copy-paste between applications).
So, we are really making progress :eyeroll:
I'm sure Dell would love for everybody to think that a 4 year old laptop "needs to be refreshed". And yet, the CPU in my 8 year old laptop still performs on par with or better than most mobile offerings. The computer I use to connect to work is perfectly satisfactory with a low-end Sandy Bridge part.
So why exactly does my mother need to replace her laptop?
A friend who is a safety engineer gave me an answer I find very believable: the copper industry had a lot of power at that time. The lower the voltage you use, the thicker the cable you need to handle the higher current draw. If we ran on 220V, everybody could use less copper. That wouldn't line Mr. Shiny's pockets, so it wouldn't be acceptable.
I was very curious about this, since it could apply in some unexpected situations. I went to the EULA on my work machine and found this:
Remote access - No more than once every 90 days, you may designate a single user who physically uses the licensed device as the licensed user. The licensed user may access the licensed device from another device using remote access technologies. Other users, at different times, may access the licensed device from another device using remote access technologies, but only on devices separately licensed to run the same or higher edition of this software.
What I am inferring is that multi-user RDP is only a problem if you are
A: Using Linux or MacOS to remotely access the machine or
B: Using something less than Windows 10 Pro to access the machine
Does that jibe with your understanding?
And he says:
"Honestly if the processor is the reason people are getting the phone they're getting the wrong device...The article also does not mention that we're also getting double the ram and, higher resolution screen and cameras, and a few other minor things. So a cheaper processor means we get more other stuff."
I, on the other hand, have backed the f(x)tec Pro1-X, so if/when our devices arrive we can have some kind of niche phone based combat (though really, they are different devices for different purposes).
The removal of Wi-Fi 6 is unfortunate, since that seems like a feature that people will wish they had in 3 years.
Similar case; I recently nabbed a full Duron motherboard from a thrift store with CPU/cooler/RAM. Hooked it all up, worked first time, very excited. After a few minutes, it dies. I think I'm lucky; this was pretty early for thermal cutoffs.
Pump effect my arse, that thermal paste was completely dried out.
I can confirm. I actually have more time playing with IRIX than using Linux, and usually when I search a problem I find some trivially easy solution for Linux (which just makes it harder to find that olde style UNIX solution :p).
In fact, usually when I'm searching for system command or C function references, I wind up on pages for Linux. Usually works just fine.
Complaints about X11 being over-complex and difficult to work with go back to the 90s. Even at that time, there was a general feeling that X11 should be torn down and built again. Many wished that NeWS had been open-sourced so that it could have been the dominant UNIX-like windowing system. Hence, I think we are well past the point where the problem with X11 is "not enough people working on it".
I'm sure that Windows does do stuff under the hood, mainly with converting to different formats instead of requiring you to provide each format yourself. Good, that improves interoperability with other applications and reduces headaches for every single developer who ever touches the system.
And as to being multi-user, if X11 can't handle giving each user session its own instance of the clipboard system, then what horrible things is IT doing underneath?
I recently built an application that interfaces a Windows clipboard with an X11 clipboard.
Windows side - "push contents to clipboard, pull contents from clipboard".
X11 side - "Send a request to the window that owns the selection, wait on a condition variable for the message to come in on the X11 message handling thread, read the data length and type from that message, tell the other window to delete it, request the data again, wait on the condition variable again, now you may read the data (but don't forget to delete it). Oh, hope that the other program doesn't crash or change selections partway through this process."
So...
I'm curious, how does this differ from the copyright law surrounding an ISA? Everybody seems to agree that in order to implement x86 or x86_64, you have to licence them from Intel and AMD respectively. AFAIK, an ISA is really just a very precise listing of very tiny functions. So what's the difference?
In addition to the excellent rebuttal made above, observe that security through obscurity does work AS AN ADDED LAYER in the context of other mitigations.
"I finally figured out their nonstandard encryption algorithm, but it requires just as much compute time to break as AES-256. Two weeks of my life wasted!"
So now, not only is your data secure, but it wasted two weeks of some poor cracker's life ;)
As a consumer, if a fixed-cost item says "as a subscription", I walk the other way.
I mean, ok. If Photoshop was sold as a subscription dirt-cheap, that might be ok because it provides a means of staying up-to-date for the same price as buying every few years. Hell, if you use a lot of creative cloud it's a really good deal. But if it were cheap, Adobe wouldn't be making bank. So nuts to them.
The same applies to almost every subscription-for-fixed-cost-good I have ever seen. It's only cheaper if you were already on the upgrade treadmill.
No, I'm not bitter. Just learning Krita 0_o
It's almost like...
the same people who create large parts of the internet...
and who fund larger parts of the internet...
and are funded BY large parts of the internet...
should not also be running your gateway to the internet.
I have never understood why somebody would choose to use chrome. It's like hiring a burglar to install your locks.
I am currently learning C++. I have wished for this. For example, I only just learned why Visual Studio keeps trying to replace #define with constexpr.
Though that said, part of my code also needs to compile in C++98. This really highlights one of the strengths of the language; if you write a C++ program that just uses POSIX and the STL, you sure do have a lot of valid build-targets out there.
I have an alternative, maybe naive view (I was just coming into existence when you were debugging the 6502).
I don't think I have ever been aggressively chastised for the aftermath of helping somebody fix their computer. Perhaps this is because for most of my life, computers have been commonplace and my family has a better core understanding of how they work (even if they still need help). In fact, my experience is the opposite. How many times have I said "Mom, I'm only here for a week, please give me your bloody laptop so I can look at XXX". I'm either pointing out problems they don't notice, or insisting that they let me fix the thing.
The people in life have given me a lot, often things I find tedious or don't know how to do. I spent a lot of my time doing something I found fun, and turned it into something useful. If I can repay them with those skills, then I am getting a very good deal. I also often learn things, because their systems and needs are different from mine.
Certainly I get frustrated (mostly when ̶m̶y̶ ̶m̶o̶t̶h̶e̶r̶ people ask my technical opinion, ignore me, and then wind up in exactly the situation I was trying to avoid, or when ̶s̶h̶e̶ they repeatedly make the same mistake and delete all of their DUCKING photos without a backup AGAIN), but that's not limited to computers.
From my experience writing official documentation for a C library, another possibility is that the exact circumstances under which the files are removed from memory are more complex than anybody wanted to explain.
There is always a push to KISS. Judicious use of the word "typically" is often the compromise between giving the full story and saying something untrue. "Typically, when you XXX from XXX the XXX will be deleted and replaced with the XXX from XXX".
Ideally we would have enough information in play for the user to take an educated guess at the true ruleset, but it's hard to tell your editor "I included this sentence so that the reader can solve an exciting puzzle in search of deeper knowledge!"