The Ghost Of Clippy
I freaking hate software that thinks it's smarter than you are, especially if it doesn't make it easy to turn that functionality off.
660 posts • joined 29 Aug 2010
Why are there no hardware fail-safes physically hardwired into these things? Like a thermistor hardwired to a relay that cuts the power to the print head if it exceeds, say, 270 °C? Or a physical fuse that blows when it draws too many amps?
Come on guys, this is undergrad stuff. Literally. My computer science BSc included a module on engineering ethics, in which we covered the infamous Therac-25 radiotherapy machines, whose wonky software allowed them to operate in modes that cooked the patient with high-energy electron beams. Previous machines had included hardware interlocks and fuses that blew when the machine was activated in a dangerous configuration, but they'd been removed in the 25, which depended entirely on software for safety. Said software was full of bugs that the previous machines' hardware interlocks had covered up, but those interlocks no longer existed, and people died as a result.
He's going to fight any challenge tooth and nail. Trump hates "elites" (by which he means people with an education, not people who own a skyscraper in New York and environmentally dubious golf courses in Scotland; for some reason that doesn't count as elite as far as Trump is concerned), and he hates non-Americans, especially the non-Caucasian variety. This policy allows him to attack both kinds of people at once.
I recall hearing a story once that Mehdi Ali had posted a resume online that included the claim that he "oversaw a major operational turnaround at Commodore International".
For those not in the know, the "major operational turnaround" in question was going from a mildly profitable going concern to a memory. Yes, he was the guy in charge when Commodore went bankrupt. All the evidence points to him being an asset stripper, and said bankruptcy was a direct consequence of that.
Damn, if students don't get drunk then where will the Daily Mail get the photos that prove students are all debauched hedonistic brats who spend all their time puking in the street instead of studying? I mean, they already had to recycle images from 2017 for their annual "students suck" Freshers' week story in 2018.
The only thing I used Bootcamp for was playing games, so that's not a big loss.
What is more worrying, however, is losing x86 virtualisation. I make a lot of use of both VirtualBox and Docker for my job, and losing the ability to run x86 stuff in a VM is a much bigger blow than losing Bootcamp is.
One of the major motivations for me finally giving Windows the boot and switching to Mac (aside from Apple going Intel and Vista sucking) was seeing the accessibility tools in OS X Tiger in action. They provided similar features in their phone OS at a time when Android treated accessibility as an afterthought (and still does, honestly). As a visually impaired user I really appreciated the effort, and I find that the Apple accessibility tools are still some of the best out there.
It sounds like Apple dropped the ball here by overpromising on what they could do with voice recognition tech (they really should have known better; voice recognition is still far from a solved problem), and not having a way to use a call with a voice command is definitely a big oversight. But the thing is, Apple are at least trying to make their products accessible above and beyond the minimum requirements mandated by law.
I honestly wish other tech companies were failing at accessibility the way Apple are failing at it, because Apple's failures are at least not down to complete apathy. As somebody who needs to use the tech they've supplied, all I can say is that I appreciate the effort.
This is what happens when you treat education as an afterthought for generations, where learning to recite from the textbook and tick the right box on a multiple choice test is more important than learning how to think critically, evaluate sources, and apply logic and reason.
America is a lost cause, but make no mistake, unless we fix our education system we're headed the same way. We're a couple of generations away from also being a nation of bleach drinkers, at best.
Catmull-Clark subdivision surfaces are probably the big one. The technique recursively splits each polygon of a coarse mesh into smaller ones, repositioning the vertices so the faces blend smoothly into each other and converge on a smooth limit surface.
This allows a model to appear to be high-polygon when it is in fact relatively simple and low-polygon.
What's more, the technique is open and as such is widely used. Blender includes it as a modifier, and its user guide includes some nice example images of subdiv in action.
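For the curious, one Catmull-Clark step is short enough to sketch in plain Python. This is a minimal, unoptimised sketch for closed (watertight) meshes only; boundary handling is omitted, and the function name and data layout are my own:

```python
from collections import defaultdict

def catmull_clark(verts, faces):
    """One Catmull-Clark subdivision step on a closed polygonal mesh.

    verts -- list of (x, y, z) tuples
    faces -- list of tuples of vertex indices
    Returns (new_verts, new_faces); every output face is a quad.
    """
    def avg(pts):
        return tuple(sum(c) / len(pts) for c in zip(*pts))

    # 1. Face points: the centroid of each face.
    face_pts = [avg([verts[i] for i in f]) for f in faces]

    # Build adjacency: faces per edge, faces and edges per vertex.
    edge_faces = defaultdict(list)
    vert_faces = defaultdict(list)
    vert_edges = defaultdict(set)
    for fi, f in enumerate(faces):
        for k in range(len(f)):
            a, b = f[k], f[(k + 1) % len(f)]
            e = (min(a, b), max(a, b))
            edge_faces[e].append(fi)
            vert_edges[a].add(e)
            vert_edges[b].add(e)
        for v in f:
            vert_faces[v].append(fi)

    # 2. Edge points: average of the edge's two endpoints and the
    #    two adjacent face points.
    edge_pts = {e: avg([verts[e[0]], verts[e[1]]] +
                       [face_pts[fi] for fi in fs])
                for e, fs in edge_faces.items()}

    # 3. Moved original vertices: (F + 2R + (n - 3) P) / n, where F is
    #    the average of the adjacent face points, R the average of the
    #    incident edge midpoints, P the old position, n the valence.
    moved = []
    for v, p in enumerate(verts):
        n = len(vert_faces[v])
        F = avg([face_pts[fi] for fi in vert_faces[v]])
        R = avg([avg([verts[a], verts[b]]) for a, b in vert_edges[v]])
        moved.append(tuple((f + 2 * r + (n - 3) * c) / n
                           for f, r, c in zip(F, R, p)))

    # Assemble: moved verts first, then edge points, then face points.
    new_verts = moved + list(edge_pts.values()) + face_pts
    edge_idx = {e: len(moved) + i for i, e in enumerate(edge_pts)}
    face_base = len(moved) + len(edge_pts)

    # Each original n-gon becomes n quads: corner, next edge point,
    # face point, previous edge point.
    new_faces = []
    for fi, f in enumerate(faces):
        for k in range(len(f)):
            v = f[k]
            e_prev = tuple(sorted((f[k - 1], v)))
            e_next = tuple(sorted((v, f[(k + 1) % len(f)])))
            new_faces.append((v, edge_idx[e_next],
                              face_base + fi, edge_idx[e_prev]))
    return new_verts, new_faces

# Example: one step on a cube (8 verts, 6 quads).
verts = [(x, y, z) for x in (-1, 1) for y in (-1, 1) for z in (-1, 1)]
faces = [(0, 1, 3, 2), (4, 6, 7, 5), (0, 4, 5, 1),
         (2, 3, 7, 6), (0, 2, 6, 4), (1, 5, 7, 3)]
nv, nf = catmull_clark(verts, faces)
print(len(nv), len(nf))  # 26 vertices, 24 quads
```

Note how even one step rounds the cube off: the corners get pulled in towards the centre, which is exactly the smoothing effect the post above describes.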
Really? My 6s+ manages that just fine. It can go 2 days before hitting 50% (which I consider the level where a recharge is necessary) with light use. If I really push it I can stretch to 3. Of course it depends on not making a lot of calls or hammering YouTube but if you stick to listening to music on the device itself for entertainment it's perfectly doable.
As for non-physical SIMs I seem to remember Apple pushing really hard for these but the carriers digging their heels in. I'm sure Apple would love to kill the SIM tray but until the carriers relent it probably won't ever happen.
Doesn't matter if it's Windows or a Mac, there's only so much you can do to defend users against their own utter stupidity. It was a really bad decision putting macros in MS Office but even then it does warn you when you open a file with macros in it. If you still run it anyway then anything that happens next is on your own head.
Eh, between "cloud", subscriptions, crappy data security, and a slew of gimmicky new "features" while bugs that have existed for years go unfixed, I've ditched Photoshop in favour of Clip Studio Pro anyway. It's far cheaper and better adapted to an illustrator's needs than Photoshop is these days.
If you can find a tool that does the job you need well enough to ditch an Adobe product I suggest you do so. If Adobe lose enough market share maybe they'll start treating their customers as people again instead of as cash cows.
What do you mean a resurgence? SQL injection never went away. Just look at questions people on Stack Overflow are asking related to database querying from an external program/script. At least 90% of the people asking on that topic are building their queries by concatenating user input into query strings. It's like prepared statements don't even exist for most developers.
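To spell out the difference for anyone who hasn't seen it: here is a minimal sketch using Python's built-in sqlite3 module (the table and the injection string are made up for illustration), showing the concatenation anti-pattern next to a prepared statement with a bound parameter:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

# A classic injection payload supplied as "user input".
user_input = "alice' OR '1'='1"

# Vulnerable: concatenating user input straight into the query string.
# The payload rewrites the WHERE clause and matches every row.
injected = conn.execute(
    "SELECT name FROM users WHERE name = '" + user_input + "'").fetchall()

# Safe: a prepared statement with a bound parameter. The input is
# treated purely as data, never parsed as SQL.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)).fetchall()

print(injected)  # [('alice',)] -- the injection matched the whole table
print(safe)      # [] -- no user is literally named "alice' OR '1'='1"
```

The safe version is also no harder to write than the concatenated one, which is what makes the Stack Overflow situation so depressing.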
Because as far as I'm concerned it's the perfect field test of anti-net-neutrality advocates' claims in the wild. If everything is flowers and rainbows then fine, no problem. If it's not (which given the long and inglorious history of the relationship between American-branded capitalism and its vict^H^H^H^H customers seems far more likely to me) then it will hopefully serve as a cautionary tale to the rest of the world who can then avoid going down that route. Who cares what happens to a bunch of Americans in the meantime? They're just the lab rats as far as I'm concerned. It might even teach them a valuable lesson about just how powerful and dangerous a vote really can be in the hands of the ignorant.
He asked if he needed to disclose. The reply was no. Therefore he didn't disclose. He was told one thing but they actually meant another. That doesn't strike me as very fair.
Having said that I don't think I'd feel the same way had it been something more serious like a violent assault or organised criminal activity or something.
"Some hackers are bored, have the time to spare, and welcome a challenge, and hardened targets are as ostentatious as open doors to them"
No question that there are people like that out there, but the era when the hacker in it for the challenge was the norm is long past. The vast majority of hacking these days is done for profit: installing malware, spamvertising, spreading ransomware, and so on. For this breed of hacker the value of hacking a system is inversely proportional to the effort required. You might never be able to slam the door shut, but you can make it tough enough to open that it isn't worth the while of the hacking-as-a-business brigade.
"No, we can't accept that software WILL be vulnerable because that ALSO means we must accept that all software must be COMPLETELY vulnerable"
Like it or not, that's the reality we live in right now.
Pretending it's not so will not change the fact that it is. Everything is vulnerable. Of course every care should be taken to avoid coding practices that lead to vulnerabilities, and of course every vulnerability that is unearthed should be fixed, but while we pretend that software isn't all vulnerable we won't design systems to resist attack. Once we accept that all software is vulnerable, we'll start applying better practices, such as compartmentalising systems so that when somebody finds a way into a part of a system they shouldn't have access to, the damage is contained and the attacker is prevented from gaining further access.
It's like an ant colony that defends its perimeter heavily with warrior ants, but if you get past them you have unfettered access to the queen, the food stores, the nursery, and so on.
The problem is software is complex and with the best will in the world a non-trivial system is always going to contain bugs. You could take every possible precaution in your development process to avoid security holes and still end up with one exploitable bug in the system that may go unnoticed for years. Is it fair to toss people in jail for that? Especially given that most developers are fundamentally creative by nature and struggle to think in the same way a fundamentally destructively-minded hacker would and might not notice that the fantastic new feature they've just implemented could be hijacked and used for nefarious purposes?
No, it's better to simply accept the fact that all software is going to be buggy to some extent and have mitigations in place to limit the damage that said bugs are capable of causing by compartmentalising systems so a compromise in module A doesn't allow you to cause further damage by manipulating the behaviour of module B.
I'd say that's debatable. Nest might be the best-known IoT brand, but it's stagnated for years, and from what I hear Google are no longer taking it very seriously, in spite of the ludicrous amount of resources they poured into it.
Biting the hand that feeds IT © 1998–2020