Re: Stupid UI
Point noted.
Apple had the trash icon. It seemed to work, and people understood it.
OS/2 had the shredder. Not very forgiving, but you knew damn well what you were doing when you dragged something to it. I think there were add-ons you could download so it'd drip animated blood.
Recycling? I remember thinking they'd tried to stretch the analogy too far. Too open to interpretation. But that was around the height of the era when Microsoft assumed everybody thought like them and would use a computer like them...
The significant factor here should be that there was a much higher correlation of acceptance with reputation than there was with code quality.
Even if code quality isn’t being measured perfectly, it is a metric that is much better than no metric. You may be able to argue particular data points (but damn few, if PMD’s reputation is even halfway correct), but the trends are inescapable. If you’ve got a better metric, feel free to repeat the study using yours.
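For what it's worth, the comparison I have in mind is nothing exotic. A sketch of the sort of thing, where the CSV and the column names (an accepted flag, a reputation score, a PMD violation count) are my own invention and not the study's actual data:

    # Hypothetical sketch: how strongly does acceptance track reputation,
    # versus a code-quality proxy such as PMD findings per patch?
    # The file and column names are made up for illustration.
    import pandas as pd
    from scipy.stats import spearmanr

    patches = pd.read_csv("patches.csv")  # columns: accepted, reputation, pmd_violations

    r_rep, p_rep = spearmanr(patches["accepted"], patches["reputation"])
    r_qual, p_qual = spearmanr(patches["accepted"], patches["pmd_violations"])

    print(f"acceptance vs reputation:   rho={r_rep:+.2f} (p={p_rep:.3g})")
    print(f"acceptance vs PMD findings: rho={r_qual:+.2f} (p={p_qual:.3g})")

If the first rho dwarfs the second, reputation is doing the gatekeeping, which is exactly the trend the study is pointing at.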
Okay, I'm lucky enough not to have ever found this out first hand, but doesn't shooting a bullet 'proof' vest ruin the vest, at least at the point of impact and maybe the surrounding area?
I mean, great, this not inexpensive piece of kit would have saved you had you been accidentally shot while out hunting or whatever. Not now, though. But at least you'll have confidence in the next one you buy.
I also don't get why, if you absolutely have to test it, you have to put a human being inside it. A watermelon would do nicely. But that's been covered above.
All are interchangeable around these parts, and while "ice box" is by far the least common, using it isn't likely to raise an eyebrow. I haven't noticed any regionalisms, such as with "soda" vs "pop". I haven't been watching for them either. Personally I suppose I'm mostly a "fridge" guy.
You have to make it clear that you cannot and will not physically stop them from searching, but that any search they do perform will be without your consent. Whether or not that helps is another matter, but at least you're making it clear they have to come up with probable cause, and hopefully it doesn't make you any more likely to get roughed up. "We searched because they said we could" is far too often justification enough.
(I'm not a lawyer, though.)
Perhaps the article is taking the wrong slant. Everybody knows (or should) what ECC's limitations are. These guys are saying they've figured out a way to (eventually) breach those limitations. It would involve digging into their paper (can't be arsed) to see if this technique could be used to identify memory locations that are vulnerable, AND worthy of exploit (are you flipping bits in a sys call table, or in a bitmapped image), AND can be successfully changed by a precise number of bits to a much more desirable value, from the attacker's point of view.
It sounds like you could, after a week or so, achieve some minor data corruption. If you're really lucky, that corruption might cause another process to die. Super lucky, you might get a kernel panic. Super one-in-a-1-with-many-zeros-after-it chance lucky, you might be able to use it to run malicious code or gain permissions.
Personally, I would think the odds are significantly higher that the whole computer would be stolen in an Ocean's 11-style robbery. Or obliterated by a meteorite.
In subdivisions in America (most suburbs), the local municipality typically owns the road, the sidewalk, and some of the land beyond the sidewalk. You, as the adjacent property owner, may be responsible for sidewalk cleanup after snowstorms or whatnot, depending on local ordinances. Most people seem to think they own everything up to the sidewalk or even the street, when they do not; a quick look at the plats will confirm it. It's not just a right-of-way.
That said, the presence of rights-of-way to allow people access to their own property is an established legal concept (heck, there's one on my land which my neighbors use daily, and I don't begrudge them). This guy was a complete prat. I'd also recommend steering clear of his lawyers, who seemed more interested in billing him hours than offering him the legal advice that he had a snowball's chance.
Right. I'm not going to bother to examine their business model just to form an opinion of it, but I wholeheartedly support the general goal of helping ordinary computer users (i.e. nobody here) get more out of their machines.
And if I had both Windows and MacOS workalike interfaces, I know which one I'd charge money for.
I know enough about machines that if one were to beg for mercy, I'd know that it was simply programmed to do that. Still, the novelty of the situation would make me pause, because hey, that's not normal. To ascribe 'empathy' to my actions would be a mistake.
And not just because I'm a sociopathic bastard in general. This time.
The solution seems quite simple: fill the onboard ethernet port with glue, and drop in a non-Intel network card. My understanding is that the ME network interface is only exposed through network interfaces provided by Intel chipsets (and only certain ones at that?). Laptop users may have to resort to using an external dongle if they can't replace the built-in Intel wifi.
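If you want to see what you're actually dealing with before reaching for the glue, here's a rough Linux-only sketch (sysfs paths as on a typical modern distro; 0x8086 being Intel's PCI vendor ID) that flags which of your network interfaces sit on Intel silicon:

    # Rough Linux-only sketch: list network interfaces and flag those backed
    # by Intel PCI silicon (vendor ID 0x8086). Paths assume a typical sysfs layout.
    import os

    INTEL_VENDOR = "0x8086"

    for iface in sorted(os.listdir("/sys/class/net")):
        vendor_path = f"/sys/class/net/{iface}/device/vendor"
        if not os.path.exists(vendor_path):   # loopback, USB dongles, virtual devices
            print(f"{iface}: no PCI vendor info (virtual or USB?)")
            continue
        with open(vendor_path) as f:
            vendor = f.read().strip()
        tag = "Intel - the kind ME is said to reach" if vendor == INTEL_VENDOR else "non-Intel"
        print(f"{iface}: vendor {vendor} ({tag})")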
Does it ... disable speculative execution? Change how kernel code is cached? Does it do F- all? What kind of performance penalties might I see if I install it?
It clearly doesn't eliminate the problem, or OS patches wouldn't be necessary. Does it improve the performance of the OS workarounds? The microcode is listed as being applicable to a huge range of processors. Is there a breakdown of what it tweaks for each processor type/family?
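What I can do, at least on Linux, is ask the kernel what it thinks is mitigated once everything is installed; it won't tell you what the blob itself changes, but it's a before/after data point. Kernels from roughly 4.15 onward expose this under sysfs:

    # Linux-only sketch: print the kernel's own assessment of each
    # speculative-execution vulnerability and the mitigation in effect.
    import glob
    import os

    for path in sorted(glob.glob("/sys/devices/system/cpu/vulnerabilities/*")):
        with open(path) as f:
            status = f.read().strip()
        print(f"{os.path.basename(path):<20} {status}")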
Up until this moment I always felt a bit paranoid when I disabled items such as "Allow Firefox to send technical and interaction data to Mozilla" in my programs. But it turns out if you disable that, then the 'studies' are also disabled. Who can I say "told you so" to?
(Probably only a matter of time before that switch becomes "advisory" only...)
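For the curious: as far as I know those two checkboxes map to prefs you can read straight out of prefs.js, so you can check without trusting the UI. A throwaway sketch, with the profile path obviously a placeholder for your own:

    # Sketch: grep a Firefox profile's prefs.js for the telemetry and Studies
    # switches. The profile path is a placeholder; the pref names are, to the
    # best of my knowledge, the ones behind those two checkboxes.
    # Prefs still at their default values won't appear in prefs.js at all.
    import re
    from pathlib import Path

    PREFS = Path("~/.mozilla/firefox/XXXXXXXX.default/prefs.js").expanduser()
    WATCH = ("datareporting.healthreport.uploadEnabled",  # "send technical and interaction data"
             "app.shield.optoutstudies.enabled")          # the Studies switch

    for line in PREFS.read_text().splitlines():
        m = re.match(r'user_pref\("([^"]+)",\s*(.+)\);', line)
        if m and m.group(1) in WATCH:
            print(f"{m.group(1)} = {m.group(2)}")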
My understanding (correct me if I'm wrong) is that these problems appeared when Intel switched over from a version of ME based on ThreadX to a competing one based on Minix. (The underlying OS isn't to blame, of course; it's just what the crap code happened to be written to run on.) The current version was chosen for flashy new features, in hindsight at the expense of real security. ME version 11, if I recall, was the first Minix one. The switch from an OS very much focused on embedded systems to a more generalized one probably reflects the two teams' proficiencies, and the wrong team lost.
TL;DR - they lowered the bar for their internal developers, and got the expected lower quality internal developers.
Will they, at long last, dispel those unexplained absences of information that have deprived us of the satisfaction of knowing the full Star Wars story?
... what were Boba's political leanings with regards to the Empire?
... did he have any hobbies? Did he perhaps play an instrument?
... which brand of toothpaste did Boba prefer?
... what was his inseam?
... how might he have fared against the Ewoks, who outwitted many genetically equal Storm Troopers? ... What if he were shrunk down to Ewok size? If his gear scaled with him? If it stayed human sized? ... What if he stayed the same size, but his gear shrank to Ewok size?
I seem to recall Douglas Adams having a few things to say about fans' tendencies to want to belabor every detail of a fictional character to death...
For many jobs, an employer doesn't want creativity. They want a replaceable cog, to replace the previous cog that burned out/fell into the mixing vat/somehow made it to retirement age. Even if creativity could be a benefit, that's a crapshoot far outweighed by the dependability promised by ML (whether delivered or not). Hence, there will always be the incentive to automate it.
It might even work, from the manager's perspective. Sure, if the ML is discriminatory some worthy candidates get binned, but who cares so long as it produces enough (blond, blue-eyed male) cogs to keep things churning?
Well, we as a society care, or should. ML right now serves to detect patterns and, in the context of this article and discussion, reinforce them. Long term, this is bad for everybody not in the historically favored group, which is bad for everybody who IS in the historically favored group. A truly rational ML system needs to be able to determine which trends are important ("is literate") and which are specious ("is a member of X ethnic group"). Direct discrimination is easy to squash - tell the ML to ignore which gender box is checked or what the surname is. But indirect discrimination isn't so easy.
Even if there is a fundamental disparity in the abilities of different groups of people (redheads, in your example), when evaluating an individual you must acknowledge that they may be an outlier. Being PC is a lazy solution. Being decidedly non-PC is just as lazy. I would argue that PC-ness shouldn't be a consideration; find the best person for the job (I'm still speaking in an employment context, obviously). ML will by design maintain the status quo, perhaps with greater efficiency. Do we want to change the status quo (I think we do)? Do we want to use ML? If the answer to both these questions is yes, we've got a lot more work to do on ML before it's ready to pass judgment upon us.
ML is absolutely crap at differentiating between cause and correlation. If you train it on data that contains a discriminatory bias, it will learn that bias. That is fundamental to its very design. Unfortunately, such biases abound in most real-world data. A relative dearth of female engineers in the field may lead ML meant to evaluate new hires to turn away women, perpetuating the cycle. (Please save the debate about whether we need more female engineers or not. I think we can all agree that even if she is an exception, there could be a woman candidate more than qualified for such a job, and if your ML turns her away it's discriminatory and has done both her and your company a great disservice.)
Until ML can ask "why" about the correlations it finds, we're going to have to evaluate human beings as the individuals they are and not rely on shortcuts. Problem is, clearly nobody wants to have to do that.
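If anyone doubts how easily the indirect kind creeps in, here's a toy sketch on entirely made-up data: the historical labels are biased against one group, the protected attribute is never shown to the model, yet a correlated proxy feature carries the bias straight through.

    # Toy sketch, synthetic data only: biased labels plus an innocent-looking
    # proxy feature reproduce the bias even though 'group' is never a model input.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 20_000

    group = rng.integers(0, 2, n)            # 0 = group A, 1 = group B (protected attribute)
    skill = rng.normal(0, 1, n)              # the thing we actually care about
    proxy = group + rng.normal(0, 0.5, n)    # e.g. postcode/hobby/school, correlated with group

    # Historical hiring decisions: driven by skill, but penalising group B.
    hired = (skill - 1.0 * group + rng.normal(0, 0.5, n)) > 0

    X = np.column_stack([skill, proxy])      # note: 'group' itself is NOT a feature
    model = LogisticRegression().fit(X, hired)

    for g in (0, 1):
        rate = model.predict(X[group == g]).mean()
        print(f"predicted hire rate, group {'A' if g == 0 else 'B'}: {rate:.2%}")

Both groups have identical skill distributions here, yet the model still "hires" group B at a noticeably lower rate, because the proxy told it everything the missing gender box would have.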
Left-pondian here, so I won't be participating. But in general, this is exactly the kind of thing that would get me engaged - give me a chance to stroke the ol' ego and dangle something shiny in front of me at the same time. I heartily encourage more companies to spend their advertising dollars this way.
(I learned Java under similar circumstances and have no regrets even though all I got out of it was a t-shirt ('participant'). Oh, and later on a career.)