
unsafe at any speed
or, in this case, with ANY executable format (ELF or EXE)
yes, it really was cheap; I got an inexpensive 'pre-N' wifi router a few years ago. It's got some quirks, for sure, but I didn't think it could be THAT insecure.
fortunately, not one of its ports touches the intarweb. Not only is it behind a proper firewall, its IPv6 addresses are statically assigned and all incoming IPv6 traffic to its IP ranges is BLOCKED.
I've been considering getting a new one, though, and running something I can configure myself, turn off IPv6 routing on the LAN side, etc. [because I manage that with OTHER things]. I actually have to plug the WAN port into the LAN port and monkey with it a bit to keep it from trying to take over all IPv6 routing on the network. Fortunately THAT workaround "works" but yeah. flaky. However, in its current state, I don't need to buy another one (yet) and wifi works throughout the house [router on one side, client on opposite side of the house >50 feet away and through several walls].
So as far as wifi operations go, it's not bad.
I also disable things like UPnP, wifi admin, and other security CRATERS that are typically "left on" by average users. But having a possible LAN back door and some pre-defined admin keys is potentially really bad...
"It is based around the concept of 'significant market power'. Do you have the power to distort the market or not."
Well, I don't think it should be ILLEGAL to "have the power". Wielding it unfairly, however, SHOULD be punished as hard as possible.
"Would be nice if we could get a citation for that number..."
you're absolutely right for two separate reasons:
a) you want to get your facts right, when you're being challenged
b) you don't want to ruin credibility by spreading 'fake news'
"Readability and understandability of code is not just an issue of code formatting: is it?"
actually, it is. high level management, "dive in without seeing it before" contractors, and people who don't want to read piles of docs before getting something done, prefer "readable code".
The most readable style of all is Allman Style. It has a lot of white space in it, which means you can clearly see where the boundaries are. It works best if you enforce curly braces around blocks, like this:
if (something)
{
    do_this();
}
even though a lot of people might be tempted to:
if(something) do_this();
The first example is MORE READABLE. A coder might not like it because it "takes up too much space on the screen" but too bad. For someone skimming code [not reading every! single! line! and! detail!] it's a LOT easier to see things this way. It's *EFFICIENT* in other words, for reasons not obvious to the K&R fascists nor to the hard-tab nazis. Oh yeah, no hard tabs either. Then your tab settings won't affect what it looks like...
/me wants to be able to view it with 'less' and have everything line up EXACTLY! THE! SAME! as it does in an IDE, or a simple editor like 'nano', or a GUI editor like 'pluma', or something like vi, or whatever.
graphic being 'style nazi' alert this time
"it would be so much better if they'd just use some { }"
yeah but if THAT happened, then a bunch of K&R extremists would put '{' on the same line as the control statement, and use syntax like "} else {" and it would drive ME (even more) insane, requiring me to re-format the code just so that I could read it... and YES, I do that. A *LOT* [even with JAVA code, take THAT you K&R extremists!]
I deliberately re-do the IDE settings in the Android IDE to look like Allman style. Then I set it as the project default. Then I reformat things as I go over stuff, or if I'm particularly frustrated, auto-reformat everything [once the right settings are in place]. I also get rid of the hard tabs: spaces only.
Anyway, those who aren't familiar:
https://en.wikipedia.org/wiki/Indent_style#Allman_style
And, in many ways, Python kinda reminds me of Allman Style. So yeah, let's leave it as-is.
[lousy coding practices and ABuse of "objects" and "signals" and member functions 3 miles deep and everything promiscuously playing with every other object's stuff are bad enough, but at least the pure language syntax is readable from a scoping perspective - the CRAP code itself can be dealt with in the usual manner. And I shouldn't need to reformat it in order to read it]
"Python is growing fast mainly because it is heavily used in a lot of fast growing fields, including web applications, high performance computing, analysing big data sets, machine learning, system management, etc., etc "
not sure I like where this is going. And not because Python is a BAD language [it is, in fact, a GOOD language]. Python is just NOT SUITED for anything "high performance".
When you use some of those libraries, they're coded VERY inefficiently. There was one specific example I ran into, either in matplotlib or numpy. It was GROSSLY inefficient the way it was implemented. I basically re-wrote code to avoid using it as much as possible, and made a significant speedup just from that. I can't remember what it was (sorry) but I remember doing this. I shaved 15 or 20 seconds off of a file upload operation JUST doing that. Then I shaved OVER A MINUTE off by using the external C program. What WAS taking 2 minutes (or more) is now taking around 10 seconds [because I didn't want to re-write ALL of the python code in C, but if I had, it would probably take less than 2 seconds].
Now, for generating charts, other than matplotlib's hideous API, it's "convenient". Sort of. OK, maybe not, and I might have to switch to something else like Cairo, but still...
In any case, watch your CPU utilization. If you write Python code and are charged by cloud services for CPU utilization, you'll save a LOT of cash by going with a C language utility instead, at least for the parts that do the number crunching, "high performance" work, etc.
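If you want actual numbers before rewriting anything, the stdlib 'timeit' module is enough. This is just a generic sketch (NOT the matplotlib/numpy case above - I don't recall the exact call) comparing a pure-Python number-crunching loop with the same arithmetic done in numpy's C code:

import timeit

# hypothetical comparison: a Python-level loop vs. the same work vectorised.
# same answer, very different CPU bill.
setup = "import numpy as np; data = list(range(1_000_000)); arr = np.arange(1_000_000)"
py_loop = timeit.timeit("sum(x * x for x in data)", setup=setup, number=10)
np_vec = timeit.timeit("int((arr * arr).sum())", setup=setup, number=10)
print(f"pure python: {py_loop:.3f}s   numpy: {np_vec:.3f}s")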
"as far as I'm concerned the new cool way of doing things by cobbling an app together using 15 different libraries from all across the Internet isn't the superior way of doing things."
I made a similar *kind* of point back in the '90s, regarding Visual Basic, and "cobbling together" an application by using VB with a bunch of plugins, modules, 3rd party components/libraries, etc.
And at the same time, rather than mangling my perfectly good C++ code to work with some 3rd party graphics library, I wrote some simple Windows GDI-based algorithms to create 3D looking bars and "did it right" so my bars looked better than their bars in a side by side comparison. And it took less time. yeah, so much for 3rd party library "make it fit" and having to pay a license fee...
[and I'd thumb-up your post except you said favorable things about ".Not" - ".Not" was one of the biggest reasons I shifted away from windows coding]
"A lot of elitist dinosaurs evident in these comments"
get off my lawn, you young whippersnapper! [heh]
Seriously, don't use an interpreted programming language for ANYTHING that requires performance, especially one that has built-in garbage collection and "duck typing", regardless of whether it does just-in-time compilation to byte code or not.
And more often than not, "object oriented" is _HIGHLY_ overrated [especially when it comes to system performance]. This goes TRIPLE for VMs and shared hosting... "oh but it doesn't matter because CPUs are so fast these days, and memory so cheap" until you try and run a bunch o' stuff simultaneously in multiple VMs or shared hosts, and then you find out what kind of impact inefficient code has. Yeah.
I wouldn't write a massive system in Python. I'm having to FIX one, at the moment (uses Django) and it's seriously in need of a MASSIVE re-write. Fortunately, I quickly figured out how to invoke an external program [written in C], and that solved MOST of the performance issues.
Yeah, I _do_ keep saying that about Python invoking a C program. It _is_ 30 times faster this way, with that one specific example. And it makes a very valid point that I want to express to as many people as possible: Do NOT attempt to force Python into a 'C' shaped hole. You'll go bat-guano insane trying. Use a language that's more suited to the task, and Python as the glue if you want to [for which it works very, very well, in my opinion].
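For anyone wondering what the 'glue' part looks like, it's about this much code (a minimal sketch - './crunch' and its arguments are made-up stand-ins for whatever C program does the real work):

import subprocess

# hand the heavy lifting to an external C program, then read back
# its exit code and its stdout
result = subprocess.run(
    ["./crunch", "--input", "upload.dat"],
    capture_output=True,   # collect stdout/stderr instead of inheriting them
    text=True,             # decode the output to str
)

print("exit code:", result.returncode)
print("output:", result.stdout.strip())

That's the whole trick. Check the return code, parse the stdout, carry on in Python.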
but writing a massive system in Python? I'm old enough to remember how *CRAPPY* RSTS/E was, and it was written in COMPILED BASIC. So, "NO" to 'massive system in Python'. That is, if you want PERFORMANCE out of it.
'children' icon because, youngins these days...
"I would read this as an indication that the language is simply being used for purposes it is not a good fit, and by people who think they can wield it without learning it first."
that's a distinct possibility. I only hit 'Stack Overflow' when I'm trying to get past a particular problem without spending WAY too much time "learning" - I'd rather be SOLVING, thanks. And that's the point. To me Python is "yet another lingo" that I have to deal with because "customer decided" so there you go.
And given my experience with fixing existing python code, which looks like it was written by undergraduate students that learned to code in JAVA first, or worse, C-pound... given my experience with having to fix THAT code, I end up just writing an external C program to do the REAL work, then hit the search engines looking for "how do I run an external program from within python and return the result code and/or stdout back". And then I find out the magic code lines, and "make it so".
And _EVERYTHING_ _RUNS_ _BETTER_ _AFTERWARDS_! [amazing, right?]
But yeah, hacking out a working solution, especially without "paying the dues" with all of that UNNECESSARY reading of documentation (from beginning to end) first, is what us hacker problem-solvers do best.
yeah, the world really IS "results driven"
But as for using Python, it has its uses most certainly, but is _FAR_ from being a "panacea language" especially because it is SO inefficient for SO many things... [30 times faster when done in C in one specific case].
I just can't believe the hysteria that some people will go through, because, FUD.
"The Russians are coming, the Russians are coming!"
"Eemeargencie. Eemeargencie. Everybody get from striiit."
[ok I can't remember the details THAT well, did anyone NOT get that reference? Maybe I missed something...]
next thing, maybe quote Bill Murray from the Ghostbusters movie. "Cats and dogs, living together" etc.
"It's worth flagging up that US S.S. numbers are not the analogue of UK National Insurance numbers"
their TRUE usage is as "taxpayer identification numbers". There are similar numbers for corporations. You can't legally work without one, because your income is reported to the IRS using "that number".
aside from the fact that "Social Security" is in itself a misnomer, oxymoron, etc. - there is NO security, and it's not "social" at all - it's a tax collection number.
"They claim they utilise non-linearity in the microphone/electronics. I suppose it is possible for a loud modulated ultrasonic sound to be demodulated by the non-linearity making the phone 'hear' and audio signal that isn't really there."
that would be the 'heterodyne' effect.
https://en.wikipedia.org/wiki/Heterodyne
and also there's this, for digital sampling:
https://en.wikipedia.org/wiki/Nyquist%E2%80%93Shannon_sampling_theorem
The Nyquist frequency (half the sample rate) is the maximum frequency an analog-to-digital converter can faithfully capture; anything above it folds back as an "alias" artifact instead of a usable signal. Knowing the Nyquist frequency of the phone's ADC would give you the ability to generate targeted aliases, and thereby an actual in-band signal, because of the digital sampling itself. Normally an A/D will have a low-pass filter in front of it to prevent this, however, unless it was designed by a complete idiot or someone who was trying to make it "as cheap as possible".
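Quick illustration of that fold-back with made-up numbers (numpy assumed, a 48 kHz ADC assumed, and no anti-alias filter in front of it):

import numpy as np

fs = 48_000                    # assumed ADC sample rate -> Nyquist is 24 kHz
f_tone = 47_000                # ultrasonic tone, well above Nyquist

n = np.arange(fs)              # one second of samples
samples = np.cos(2 * np.pi * f_tone * n / fs)

spectrum = np.abs(np.fft.rfft(samples))
freqs = np.fft.rfftfreq(len(samples), 1 / fs)
print(freqs[np.argmax(spectrum)])   # -> 1000.0: the 47 kHz tone aliases to 1 kHz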
Note that "nonlinearity" is one of the requirements for heterodyning to work...
also should mention this:
https://en.wikipedia.org/wiki/Intermodulation
"Only extremely high-end amplifiers have good slew rate symmetry between the positive half and negative half of the waveform"
a simple fix might be to put a low-pass filter on the microphone input...
but it wouldn't stop a signal that's based on heterodyne effect between two ultrasonic signals. In fact, using a phased array, you could shoot the signal that way for quite some distance...
/me points out that ANY nonlinearity will create the 'beat' frequency, sometimes known as "intermodulation distortion". So send two ultrasonic signals for which the difference in frequency is "the desired signal". Or get REALLY creative (and highly directional) and use a multi-emitter phased array.
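You can watch the 'beat' drop out of the ultrasonics with a few lines of numpy. This is only a toy model of the nonlinearity (a small squared term), not anything from the actual research:

import numpy as np

fs = 192_000                    # sample rate high enough to carry the ultrasonics
t = np.arange(fs) / fs          # one second
f1, f2 = 40_000, 41_000         # two ultrasonic carriers, 1 kHz apart

x = np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)
y = x + 0.1 * x**2              # crude nonlinear mic/preamp: adds a squared term

spectrum = np.abs(np.fft.rfft(y))
freqs = np.fft.rfftfreq(len(y), 1 / fs)

audible = (freqs > 20) & (freqs < 20_000)   # ignore DC and the ultrasonic region
print(freqs[audible][np.argmax(spectrum[audible])])   # -> 1000.0, the difference tone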
"The bad is that they seem to have aimed to create a febrile atmosphere in the US and that effort appears to have succeeded"
a better way to put that, is "creating chaos". But it was like that ALREADY. The Demo-Rats and OBAKA saw to THAT... [and 'Establishment' Republo-Crats as well with their all-too-empty promises].
OK Russian counterintelligence has been accused of using destabilizing tactics before, and maybe they DID do some of that, but it was just tossing a few extra logs on an already ROARING fire.
Everyone knows the U.S. electorate is really pissed off at the moment. But only the leftist ideologues (including the lame-stream media) blame Trump or Russia for it. We put Trump in the White House to FIX IT and the *ANGER* was already there, LONG before he announced his candidacy. The reasons for the electorate being pissed off go back to the Tea Party and things like OBAKA"care" and being called a RACIST for just disagreeing, getting socialism jammed up our as... down our throats, along with an artificially stagnant economy, high unemployment [that was statistically reclassified so it didn't look bad], high 'social service' dependency, and a general sense of frustration that NOTHING was being done about it [except to make it WORSE]. AND we were being told to "get used to the NEW NORMAL".
In other words, it was a "slow boil" ready to erupt into a raging VOLCANO. I can't imagine what would've happened, had Mrs. Clinton won, but it would have been *UGLY*.
(and now the howler monkeys will call their friends to downvote me en masse, the intarweb equivalent of throwing poo. *kisses* to my fan club!)
"I do not see Fartbook claiming to be innocent, but they clearly regret being naive."
I bet what they regret the MOST is that THEY didn't think this technique up FIRST...
(or maybe they did, and this is an attempt at obfuscating everything, blame it all on Russia, etc.)
I think it's a 'Silly Valley Syndrome' of some kind.
Silly Valley companies are becoming too arrogant, too big for their britches. They have (in the past) engaged in immoral practices DELIBERATELY designed to keep wages lower [no-poach agreements covering each other's employees], FIRED PEOPLE over political views (Mozilla, Google, ...), meddled in government WAY too much through PACs and contributions, yotta yotta yotta, and now THIS bit: a kind of 'moving target' contract used on their own sales employees to deny them what was promised, because they're being CHEAP.
I used to live in the Silly Valley area, decades ago. I'm glad I left.
Now it seems they've formed a kind of "social bubble", filled with people "just like them", within which they've lost touch with the rest of the world (like Washington D.C., Brussels, ...) and *FEEL* as if it's "just ok" to engage in what everybody ELSE sees as heinous, disreputable conduct, just because they CAN [or so they think... er, *FEEL*].
"No-one pronounces 'Oracle' as 'OR acle' or 'whore' as 'horr' so it doesn't work."
huh? my brain just froze for a minute... illogical... illogical... obligatory Star Trek reference in progress... illogical... illogical...
'whOracle' works for me. yeah, whatever.
(I still can't figure out how NOT to pronounce them 'OR acle' and 'horr')
"So even in the (probably unlikely) event that Oracle goes titsup (which I definitely see happening some time in the future) then the good stuff (Java, ZFS, Solaris) wouldn't have to go to total waste."
No, they'll be sold off as independent 'business groups' to others, like Google, Micro-shaft, and IBM. Precedents have already been set, for a very, very, long time.
"There's always a bigger fish" (fun Star Wars quote from Episode I)
In any case, "Big Data" is highly overrated. It's a fair bet that IBM's 'Watson' division wants Oracle's data business.
Icon, because, FreeBSD
"It seems really stupid to piss off the very people that bring money into the business via sales."
Yes, it DOES.
And that 'keep working until you pay off your debt' stuff reminds me of an old song:
"Ya move 16 tons, what'da get? Another day older, and deeper in debt! St. Peter don't ya call me cause I can't go, I owe my soul to the company store..."
(Tennessee Ernie Ford and Sam Cooke both did this one, I think)
Oracle needs to re-organize their sales management, and make sure their sales force gets PAID, ON TIME, before all of the good sales people stop working for them.
Silly Valley and their anti-honest-business attitudes. Maybe they should just move to another state where average wages are lower because LIVING EXPENSES (and taxes, etc.) aren't INSANE...
not sunspots, just solar activity - i.e. if the sun puts out MORE light+heat+other_radiation then it kinda tends to WARM the PLANET, ya know?
because, regardless of your model, Mr. Sun will be the heat source.
a big 'thumbs down' for the 'wanker' pejorative. It's as bad as 'denier'.
well, as true as it is that humans aren't DIRECTLY warming the earth, the assumption of the warmist 'CO2' model goes kinda like this: CO2 is assumed to be a greenhouse gas; that is, it absorbs infrared energy, converting it to heat, and then keeps that heat from escaping the earth at night (by radiation).
So, since the sun is still involved in the process when humans [allegedly] cause the warming, having the sun go out tomorrow would be an unrelated problem. That's where that specific logic breaks down.
ON THE OTHER HAND... since the model centers ENTIRELY on the idea that CO2 is a greenhouse gas, then I like to focus my arguments on THAT little problem... because CO2 is a *REALLY* *LOUSY* greenhouse gas! It only works for temperatures below about -50F, or above about 130 or 140F, and even THEN, you have to go pretty far outside of the extremes to get 100% absorption.
When the earth radiates its heat away at night, it does so in the infrared range. "Black body radiation" puts out most of its energy at a peak wavelength that corresponds to the body's temperature, and that peak has a simple calculation. In short, those thermometers you use with lasers in them simply measure the IR emission and 'guess' the temperature based on that. And they're pretty good.
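The "simple calculation" is Wien's displacement law: peak wavelength = 2.898e-3 metre-kelvins divided by the absolute temperature. A quick sketch (plain Python, temperatures in Fahrenheit to match the numbers above):

# Wien's displacement law: lambda_peak = b / T
b = 2.898e-3                     # Wien's displacement constant, metre-kelvins

for t_f in (-50, 59, 104):       # a few surface temperatures in degrees Fahrenheit
    t_k = (t_f - 32) * 5 / 9 + 273.15
    peak_um = b / t_k * 1e6      # peak emission wavelength, micrometres
    print(f"{t_f:>4} F -> peak emission around {peak_um:.1f} um")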
So in short, if CO2 doesn't really absorb ANY of earth's emitted radiation between -50F and 100-something F, then CO2 isn't doing DIDDLY SQUAT to affect world temperatures, now is it?
Similarly for solar radiation, it may actually have a BLOCKING effect rather than a warming effect [and the ground absorbs the heat way better than the atmosphere does... so it's not that effective at all, is it? Consider a cloudy day, for example: water absorbs nearly ALL of the infrared, hint hint, being 100 or more times as effective a greenhouse gas as CO2, in my opinion]
In any case, what would happen if the sun went out tomorrow? We'd be totally SCREWED.
"This issue really revolves around energy, the energy needed to run modern civilization. Like it or not, the vast bulk of that energy comes from carbon burning right now."
if it weren't coming from carbon burning, then you'd STILL see GRIPES from various enviro-wacko groups, along with all of the "religious claims" and chicken-little fear mongering that goes with it.
a) Nuclear energy - not a carbon in its footprint, yet NOBODY seems to be asking for more of it. why not?
b) Fusion energy - if we had it working [instead of just working on 'research' - you get what you pay for!] there'd be SOME kind of griping going on about THAT, too.
c) wind farms kill eagles and condors, and take some of the energy out of the wind, affecting "something" if you search hard enough to find it.
d) dams for hydro power flood the landscape and it's no longer "pristine". oh well.
e) black solar panels heat up from solar energy, causing localized "hotter" weather if there are enough of them collecting sunlight in a given area, kinda like the "concrete jungle" effect.
In short, no matter WHAT kind of "sustainability" you have, SOMEONE is going to gripe about it, throw a tantrum, get a bunch of activist types to join their cause, make a CRAPLOAD of noise, and generally disrupt society and try to force EVERYONE (except themselves) into a 3rd world quality of life.
"Personally I can chip that a lake I hike to every so often had a glacier all the way down to the shore in the summer. A glacier, not just snowpack. 20 years later, that glacier never reaches it anymore."
this deserves separate commentary.
1900: cold
1935: hot
1970: cold
2005: hot
2040: cold <--- your glacier should reappear by then
also consider that some effects are 90 degrees out of phase with this cycle, quite possibly your glacier being one of them. The reason is 'heat up rate' vs temperature. Yeah, it's a calculus thing. It's also why late summer is hotter than early summer, even though days are longer in early summer.
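Here's the "calculus thing" as a toy sketch (numpy assumed; nothing to do with actual climate data, it just shows that when something responds to the RATE of heating, its peak lags the forcing by a quarter cycle):

import numpy as np

t = np.linspace(0, 2 * np.pi, 100_000)
forcing = np.sin(t)                               # the driving cycle
response = np.cumsum(forcing) * (t[1] - t[0])     # crude integral: responds to the rate

print(t[np.argmax(forcing)] / np.pi)    # forcing peaks near 0.5 pi
print(t[np.argmax(response)] / np.pi)   # response peaks near 1.0 pi - 90 degrees later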
Anyone else have a better explanation? Mine's based on recorded temperature data in the northern hemisphere... and one projected estimate based on the apparent cyclic trend
"We're experiencing gradually increasing climate events in many parts of the world"
No, we are not. Check out some of the data on storms of the early 1900s, for example.
http://www.longrangeweather.com/global_temperatures.htm
You said "Irma is showing up as pretty much the strongest Caribbean storm on record." Recent hurricanes are NOT necessarily "the strongest on record". Consider the 'labor day hurricane' of 1935
Additionally, humans cannot cause "global climate change" in any significant amount from burning fossil fuels. You would have to completely ignore 1) chemical equilibrium between atmosphere and ocean, 2) biological equilibrium involving algae and other plants, 3) precipitation of CO2 as carbonates on the ocean floor, 4) geothermal effects on measured CO2 in key locations, and 5) the actual IR absorption spectrum of CO2 in order to come up with "a computer model" to show out of control climate change. However, REAL science in the REAL world WILL INCLUDE all of those things in the analysis, and as such, would easily demonstrate that CO2 from human activity does NOT cause global climate change in any significant amount, even any MEASURABLE amount.
If climate is changing, it is because of the earth, the sun, or something similar. Humans couldn't do it even if we WANTED to.
ack on "let's put a skeptic in charge".
<quote from article>
Meanwhile, on NASA's own website you can find data showing human civilization has likely had a profound effect on our world's climate.
</quote from article>
And _THIS_ is why we want A SKEPTIC in charge, and NOT an ACTIVIST!
[because the idea of man-made climate change is PURE BULLCRAP, and I can easily prove it, and have done so on multiple occasions already, so I'll spare this article's responses from me repeating myself again]
than Micro-shaft shipping "Windows Lame" with Surface, and oh-by-the-way it'll cost you another $49 toll to run things that aren't CRapps from "the Store".
Except, maybe, a SUBSCRIPTION version of Win-10-nic, for which you MUST pay annual money to be able to run Win32 appLICATIONS or something YOU wrote yourself.
<sarcasm>
Good. Job.
</sarcasm>
[NOT self-fulfilling the 'Linux Prophecy' from the first post, heh]
minimal eyestrain comes from black text on #FFFFE8 background.
This is based on its similarity to PAPER, and the fact that blue light depletes the orange pigment in the macula, resulting in macular degeneration. It's why you should never use 'cool white' or 'sunlight' lamps in your workspace, but only "soft white" [which is like an incandescent light].
now, if El Reg could fix the font size in the edit box for comments [it's TOO FRICKING SMALL] I'd make fewer mistakes and wouldn't have to re-edit everything I type all of the time... but if I hit alt+ to zoom, then the document text is way too freaking big. cannot win...
I'm seriously wondering what is meant by a "flat" UI.
simple explanation: all 3D effects are eliminated. The buttons in the upper right corner have degraded into a plain underscore, box and X with no effects around them to indicate they're buttons. And they're farther apart so that fat fingers can mash them more easily on a touch screen.
Additionally, the borders around various 'boxes' are gone, including THE WINDOW ITSELF. If there is a border, it's a 1-pixel-wide line, barely distinguishable from the content surrounding it. White-on-white is NOT uncommon. Usually the UI item has a different color, but not always. Sometimes a button has no border or color, and is merely text placed "wherever".
So in short, the old visual cues of a 3D-looking border, combined with a standard color scheme, are gone. They were typically replaced with BRIGHT WHITE [which is hard on the eyes; they need to use #FFFFE8 instead], with sometimes grey or other "hard to see" color text on the UI elements, and NO BORDERS [or in a limited case, a light grey border on white, which is also difficult to see].
I'm not blind, but I'm old. My eyes don't easily distinguish certain colors with similar luminosity, at least not without it causing them to hurt a lot. They look like a "blur" to me. And a lot of the "bright bluish on white" combinations that Micro-shaft seems to like so much are EQUALLY HARD TO SEE. And this is what you get a LOT of in Win-10-nic. It hurts 'old eyes'. [ultimately it will cause eye strain on YOUNG eyes, so those young whipper-snappers better get a clue, or get glasses at an earlier age].
If the new would-be trend setters had even bothered to read old facts they'd have ignored them because, you know: OLD!
(mega-thumbs up for that)
You are SO right! The millennial generation decided "it's OUR turn, now" and, like something straight out of Arthur C. Clarke's "Superiority", they went on to re-invent *EVERYTHING* and "do it THEIR way".
with predictable results.
"IMHO the rot set in when someone decided that what works best on a low-res 4-inch phone screen would also be the right UI styling for desktops, laptops, TVs, and every other human-computer interface under the sun"
That 'someone' would be THIS person.
What does surprise me is that "creatives" just follow the trend like zombies when it's obvious it's a step backwards.
> You don't spend much time with "creatives", do you? Mindless trend-following - it's what they do. Well, 99% of them, anyway, to be fair.
yeah, like "academic arrogance" at its WORST.
I bet that MOST truly creative people spend time CREATING, and it's their "creative" BOSSES that tell them to "do it that way". At least, that's how _I_ see it. (In effect, they're rewarded for 'bad behavior' and therefore are encouraged to propagate it).
"Just imagine having a room with an invisible touch sensitive sensor instead of a clearly visible light switch."
I know EXACTLY what you mean.
I'm currently working on a project that has 'an interface' that requires users to do something specific with it, in order to control the device (while simultaneously rejecting 'accidental' activation of the user interface). However, the amount of feedback that could previously be given to the users [to help them 'get it right'] was inadequate. I recently fixed this, after making some architectural changes to support it (priorities, priorities), so that the feedback is a LOT more intuitive (and gives more information to the user to help him use the interface). NOW it's much easier to use, and hopefully will be acceptable to the end-users. That's the goal (usability).
So yeah, it's extremely important for the UI to be as intuitive as possible. Otherwise people will have a hard time using it, and are likely to complain or "just not use it".
And, naturally, anything I have influence over in web design, icons, etc. WILL contain more 3D skeuomorphic elements and veer away from the 2D FLATSO and 'hamburger' menus. And I'm passing a link to the article along to "the powers that be" to help justify it.
Slower is NOT better. FASTER is _ALWAYS_ better!
Besides, the article clearly points out that eyeball focus went to things that weren't "productive", like looking longer at titles [maybe to recognize it as a title?] or buttons or other things that are NOT content, apparently to recognize them for what they are. You know, those thermal plots near the center of the page...
So this "extra time on the page" is just inefficiency, and is NOT an indicator that your page is better, or more important, or more interesting. It's just HARDER TO READ when it's 2D FLATSO FLUGLY, vs Elegant 3D Skeuomorphic. (if you ask me, I want EFFICIENT design so I can get more done).
As for the article itself:
THANK YOU THANK YOU THANK YOU THANK YOU THANK YOU!!!
It's ABOUT DAMN TIME that the TRUTH about this 2D vs 3D came out, and I have instinctively HATED the 2D FLATSO since it was excreted from the evil bowels of Redmond's "force the world to change" department. I particularly blame THIS person, though it's possible that not ALL of it was her fault, directly. Sinofsky was merely a convenient scapegoat, and being male, easier to fire.
"Except when they run out of generation capacity and need to lose some load."
this is the core of the problem: 'running out of generation capacity'. This is the 21st century. A shortage of generating capacity should NEVER happen. It seems CONTRIVED to me [in order to avoid doing things the RIGHT way, maybe, like what happened in Cali-Fornicate-You with Grey-out Davis, for example]
So yeah I've heard this song before. It sucked then, and it sucks NOW.
If they do the smart meters correctly, it would work like this: when capacity requirements are such that it costs more to generate electricity [because peaker plants come online], then the cost should simply vary based on demand and the actual cost of production. And they would LET YOU KNOW when the price goes up, maybe an indicator you could remote-install on certain outlets and switches, or something.
If instead they're being used to DROP YOUR POWER, like you aren't smart enough to turn a few appliances off and wait until after-peak, then it's ANTI-FREEDOM.
But warning you when the price goes up might be a necessary part to making this work... and I don't see ANYONE out there trying to mitigate the STICKER SHOCK that could result when your electric bill arrives, if the smart meters are simply used for 'Time Of Use' billing.
Last week we had a nice heat wave in Cali-fornicate-you, probably the last one of the year. The usual 'flex alert' warnings were out there. I didn't run certain appliances and kept my A/C and fans blasting, with the internal house temperature hovering in the 80F range [above their 78 degree imposed "limit"] because I don't have a really powerful A/C but it's good enough to keep things 'liveable'. And I expect a higher power bill. But I want to use the power, so I should be able to, right? But yeah, you can just ask people to "not do certain things" and as a general rule, they won't.
But if people would simply build more "peaker" generators, close to where the peak demand will occur, this wouldn't be a problem any more. Not at all. Those are usually gas turbines or big diesel-style reciprocating engines, running on natural gas. Start 'em up at a moment's notice, run for a few hours, and shut 'em down. A bit expensive but it keeps the lights on. Extra cost is passed to the customer via variable rates. Nobody has to 'cut back', everybody's lights and air conditioners stay on. THAT is how modern society SHOULD be, not 3rd world "oh crap oh crap oh crap everybody shut things off, the 19th century power grid can't handle it." And then doing 'rolling blackouts' without warning...
"Pretty sure their lips move when they're reading, and that they use their finger to keep track of where they are in a sentence."
I should go all SJW regarding the 'having fun at the expense of people with dyslexia' but I can't stomach the thought of actually going through with it...
So let's just pretend I did, as a part of "illustrating absurdity by being absurd", while I pray to the porcelain god 'Ralph' and his son 'Barpholomew', and the car he drives, 'Buick'.