What are the odds on TCF using James Damore's memo in their defence?
105 posts • joined 3 Apr 2008
Of course those with £299 to spend on a hair dryer don't understand that measuring the air temperature won't actually STOP the singe.
The control of air temperature will do that; and that doesn't change nearly quickly enough to be beyond the control of a £29 hair dryer.
Dyson mostly invents new stuff to be considered important. Brilliant marketing that lightens the wallets of the shallow thinkers, ready to blow their dough on fashion and folly.
Platters on a long, "horizontal" spindle into the "depth" of the form factor. It should go without saying that the spindle would be simply supported at both ends.
Also several head actuators to access groups of platters on the same spindle.
The "imbalance" problem is a load of bollocks. Give the design task to a proper mechanical engineer.
HDD effective transfer rates are affected by spindle speed, actuator seek and settling times; and contention where requests for data encounter a physical conflict for resources; be that a head assembly to get to a particular group of platters or data in different parts of a platter.
Lots of small platters reduces the probability of platter-based contention. Many head assemblies to access disparate data on different platters reduces head contention. Spinning the platters as fast as possible (>15krpm) substantially reduces the latency of access, returning data earlier and therefore the probability of encountering requests that lead to contention. Likewise; using small head assemblies reduces their individual inertia, facilitating faster seeks and shorter settling times.
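The latency arithmetic is easy to check. A minimal sketch (the 7200 rpm comparison drive is my own illustrative example, not from the post):

```python
# Average rotational latency is the time for half a revolution.
def avg_latency_ms(rpm: float) -> float:
    """Average rotational latency in milliseconds at a given spindle speed."""
    return (60_000.0 / rpm) / 2.0

lat_15k = avg_latency_ms(15_000)  # 2.0 ms at 15 krpm
lat_7k2 = avg_latency_ms(7_200)   # ~4.17 ms for a common desktop drive
```

Halving the wait for data to come under the head is exactly the "returning data earlier" effect described above.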
While track buffers should be retained in addition to the request and data queue space, high-level storage management such as the balancing of data across the platter groups should be left server-side.
The potential throughput rate for e.g. 12 platters with ca 22 data surfaces spinning at 15krpm calls for better than SAS-4. (160 MB/sec*22)
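A back-of-the-envelope check of that claim, taking the 160 MB/sec per surface and 22 surfaces from above; the ~2.4 GB/s usable SAS-4 figure is my assumption for illustration:

```python
# Aggregate sustained throughput of 22 data surfaces at 160 MB/s each,
# compared against an assumed ~2400 MB/s of usable SAS-4 bandwidth
# (~22.5 Gbit/s line rate after encoding overhead -- an assumption).
SURFACES = 22
MB_PER_S_PER_SURFACE = 160
SAS4_USABLE_MB_PER_S = 2400

aggregate_mb_per_s = SURFACES * MB_PER_S_PER_SURFACE  # 3520 MB/s
needs_better_than_sas4 = aggregate_mb_per_s > SAS4_USABLE_MB_PER_S
```

On those numbers the platters could stream roughly half as much again as the link can carry.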
"(more Oxygen from the air, less Nitrogen from the fuel)"
The Nitrogen comes from the air. Air is about 78% Nitrogen.
When combustion temperatures are high due to "excess air" producing a fast burn, the N2 molecules can "split" and become "adopted" by the residual oxygen (O2). It's the fuel-saving, high-efficiency lean burn that is usual for diesel engine combustion at low to moderate loads that makes them more likely to produce nitrogen oxides.
There are a bunch of ways of preventing or at least reducing the emission of NOx. An earlier method was to force the engine to breathe its own exhaust gases (exhaust gas recirculation, "EGR") which'd still have enough oxygen to burn the fuel; but with the oxygen reduced, the combustion is slower, peak temperatures are lower and NOx is less likely to be produced. Of course some fresh air is required because the available oxygen eventually gets used up. Efficiency is lower and the specific power at that engine load is lower; but it's only necessary to use it under light to moderate engine loads; idle to gentle cruising.
That technology's not unique to diesel engines. It's also applicable to spark ignition engines.
There are parallels with "driving" of cars. The ones headed for "autonomous" control are likely to produce the same situation; with all those on board becoming payload in a confused missile.
Autonomous vehicles fail to cope in certain circumstances and a human is required to jump in; to act quickly and correctly to respond to the situation. Absent sufficient situational awareness, the human reactions will tend to be counter-productive, useless or disastrous.
Another situation where a pilot's "extra-curricular" experience saved hundreds of lives is the "Gimli Glider" incident; where pilot's experience with a sailplane gave him the skill and understanding to "slip" a very much larger glider to approach a runway that was too close for a normal approach.
"Hold on, wasn't the whole EU superstate your lot's big clever idea?"
My lot? You can't blame that on Australians!
You have only yourselves to blame for permitting the tyrannies to blossom around you. Ask the government to do more for you and it will take more, including freedom.
@ Bronek Kozicki
I understood. Years ago. The fact that you don't agree with me doesn't mean that I don't understand.
The few times that I've visited London, the air was pretty clean. That was when traffic flowed freely and there was no congestion charge. (1997 and 1999). Then, to create a need for a congestion charge, Red Ken and others dedicated the roads to the dirty diesel buses and dirty diesel taxis; then de-synchronised the traffic lights, just to make sure that traffic would be stopped.
Poor air quality, engineered by nincompoop do-gooders is one of the least of reasons why I would never move to London. As an Australian, I'd be unwelcome and immediately suspected of stealing British jobs. I've had "the lecture" at Eefrow a couple of times where they didn't even have the courtesy to respond to my "Good Morning" in kind. Add to that that I'd be paying virtually a million bucks in rent for a dog-box above a "restaurant" that's probably cooking stray cats and dogs at 3 in the morning …
If you'd done any research, you'd have discovered that the following are the only times that I've been "intimate" with a VW Polo.
But then it seems easier for you to enjoy your prejudice and have an opinion without any rational basis.
Biodiesel cannot be used in modern diesel engines because it lacks the lubricity required for the high pressure injection systems. Why does nobody RTFM?
As to scalability, here's something that I wrote earlier: https://contrary2belief.wordpress.com/2012/04/02/deep-fried-qantas/
"However then you're in the situation of arguing that laws aren't valid unless you can show the crime directly hurt someone, and that's a very slippery slope."
I did not and do not seek to excuse the violation of regulation.
There are already penalty provisions per vehicle without the necessity of a law suit. The statutory fine is up to $3750 per vehicle; in addition to making every vehicle actually comply with the rules.
When King Ælfred of Wessex laid down the Domboc, he defined a consistent penalty for each type of offence. This meant the end of arbitrary, sometimes vindictive penalties of what was to become Great Britain. It's the Rule of Law.
"Slippery slope" is btw a logical fallacy.
If there is no physical evidence required for prosecution; simply an allegation of a possibility of harm in the indefinite future, then you've built yourself a happy tyranny.
"MIT attributes 59 premature deaths in the US from the illegal extra emissions from VW/Audi diesel cars."
But then; there could be as few as 10 people (out of a population of hundreds of millions) whose lives may be 10 years shorter; allegedly due to the (actually unknown) increment in VW emissions.
And they worked it out using a sacred computer model. No measurements. No autopsies. No messing about doing anything in the real world; they plugged in a set of EXTREME VALUES obtained from a poorly documented other source (ICCT), added their own and EPA estimates, to produce the "results"; with a 95% confidence interval.
"The team had developed GEOS-Chem as a way to provide “rapid response” to evolving policy conversations." i.e. a way of shutting down the conversation. As does "peer reviewed study"; often used to prop up bad science. 80% of peer-reviewed medical "science" is retracted because it's found to be wrong.
Retrieving the MIT paper, it seems that VW's fix of the emissions controls would be creating new and wonderful life because, whereas the defeat device has a median estimate of 59 earlier deaths, the fix will avert a median of 130 early deaths. http://iopscience.iop.org/article/10.1088/1748-9326/10/11/114005/pdf It's in the abstract.
The report concludes: 'Finally, we note that while the 18 September 2015 EPA letter to VW cites ozone exposure resulting from the excess NOx emissions as a concern, we find that 87% of deaths are due to fine particulate matter exposure, with 13% due to ozone."
If you need to be told again about the hokey estimates of deaths from particulates("PM2.5 mortality impacts consistent with EPA practice."): http://junkscience.com/2016/01/how-stupid-is-air-pollution-science/
It may surprise/distress some people to know that emissions limits aren't set even by gazing into a digital crystal ball. Nobody tries to WORK OUT how much may be emitted by a vehicle. Limits are set arbitrarily and politically by the regulators with little if any consultation with engineers and people who understand the technology; nor with medical professionals actually skilled in the statistics of epidemiology to set specific air quality targets.
They can't tell if tighter limits will have a social benefit. It doesn't seem to be the case in the German cities that've banned "filthy cars".
This analysis referred to one vehicle only; one which had been "chipped" for performance by an after-market performance tuning company. Nevertheless, the researcher considered that that would have no effect on emissions. (It's on one of the presentation slides.)
Further, it analyses a VW Sharan; not marketed under any brand in the USA and already equipped with a selective reduction catalyst.
The cost of such a sub-system is not insignificant. There's not just a catalyst, but also a reservoir, pump and heater for the reduction agent, and an exhaust gas sensor to minimise ammonia slip from injecting too much reduction agent. The heater is essential because the reduction agent; urea solution in water; freezes at about -11°C and the engine isn't "allowed" to be operated until the reduction agent can flow. Sensors, etc. are hooked back to the engine management system and to the instrument panel. The cost of this sub-system is of the order of a third of the cost of the "entry level" diesel VWs and is therefore deployed on up-market models; which also have bigger and more powerful engines.
The test-cycle detection results in the selective reduction agent being used to its full potential. On the road, the engine management system conserves reduction agent. It's expensive stuff if you buy it at a VW Pharmacy. (Very much cheaper from heavy vehicle maintenance depots; but you need to get a nozzle to fit.)
VW's advertised fix for the diesel engines in Germany is reprogramming of the 2-litre units and a small change to the intake ahead of the air flow meter in addition to re-programming on the 1.6-litre engines.
The fix for Australia, where the cars don't have to meet Euro 6 (yet) and only the 2.0-litre engines are imported, is a reprogramming to remove the defeat device.
Yeah… that's right. I must be a shill.
All other manufacturers' vehicles also do worse, on-road.
You claim that there are injuries. How many are attributable to the "defeat device" in VW vehicles?
Do you have testing information of emissions from the cars after the defeat device is properly disabled? Without measurement, one cannot make any rational assessment; and then it's only based on test cycle operations; nothing to do with the real world.
Are you aware that NOx emissions according to the WLTC (measured by the ADAC) are already up to 15 times higher than allowed by Euro 6; with the "dirtiest" VW Group vehicle (Audi A8 3.0 TDI) being 3 times higher? Jeep Renegades are 10 times higher. FIAT might be next for the chop. And there are many VW Group vehicles below the Euro 6 limit.
EPA doesn't seem to think particulates are dangerous; otherwise it would be contravening the Nuremberg Code by supporting research that involved people deliberately inhaling air containing "hazardous" particulate matter without first obtaining their informed consent.
There are many sources of NOx and particulates other than cars; many natural sources (ambient NOx levels in the open are only half anthropogenic) and many other sources from human activities in cities.
German cities which have restrictions on the vehicles allowed to enter, based on their emissions class, don't have cleaner air than similar cities without such restrictions. i.e. the cars make no measurable difference. https://contrary2belief.wordpress.com/2012/01/27/264/
You're not meant to take theconsternation.net literally. It's satire.
There are, nevertheless, only 4 inventions in the article. There's a comprehensive (unpublished) bibliography of the facts on which the story is based.
If, as pundits claim, VW is entertaining the idea of adding another catalytic converter to the smaller engined, higher-volume cars, then that will not only damage VW, but will also hurt the cars' owners because the vehicles will be more complicated and expensive to run.
I'm surprised that other carmakers are just standing around, quietly watching while the EPA and other government agencies conduct lawfare and punishment-by-process on the VW Group. Don't they understand that any one of them is next in line?
Uncle Sam is demented. Insane.
viz. e.g. §85: "found on-road NOx emissions from two 2.0L VW light duty diesel vehicles … were significantly higher than the applicable emission standards established by EPA regulations."
That would and indeed DOES apply to all cars with internal combustion engines. WHY PICK ON VW? It is a nonsense to compare test cycle emissions with on-road emissions. Nobody operates their vehicles according to the test cycle. Test cycles at best allow for comparisons between vehicles.
EPA has also conveniently "forgotten" that it agreed with carmakers in the late 1990's to set standards according to what is reasonably achievable with affordable technologies.
VW's actions were as much in the interests of its customers; maintaining affordability and driving pleasure; as well as protecting their own profitability.
Only regulations were injured.
The "PRAYER FOR RELIEF" is for egregious, vindictive penalties for breach of holy EPA scriptures. The amounts have clearly been pulled out of their collective @rses.
Other "remedies" include STUPID things like enjoining VW not to break the regulations again … a sure sign of insanity.
Perhaps VW should simply bug out. http://theconsternation.net/2016/01/08/scoop-volkswagen-group-of-america-shuts-itself-down/
I bought some M-Disc DVD media to archive data of a business being shuffled off. Tax records need only last 7 years (IIRC) so the 1000 years seems like a good bet.
The data were "engraved" in much the same way as any data DVD. Only "downside" is that they are not as easily labelled as conventional DVD-R(W) with no "printable" surface.
At 4.7GB, it won't store more than a couple of thousand decent-resolution snaps. It's not even enough for some people's snap-happy holidays where everything is photographed; just in case. Then to be sure. To be sure.
There's Blu-Ray media available as well; in 2 capacities (so far). Not all M-Disc drives will engrave those; but you'd fit tens of gigabytes onto each disc.
For pictures worth saving, the DVD size is probably an easy, affordable option.
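As a rough sanity check on the "couple of thousand snaps" figure (the ~2.5 MB average JPEG size is my own assumption, not from the post):

```python
# How many photos fit on a DVD-sized M-Disc, assuming an average
# decent-resolution JPEG of about 2.5 MB (illustrative figure only).
DISC_CAPACITY_GB = 4.7
AVG_PHOTO_MB = 2.5

photos_per_disc = int(DISC_CAPACITY_GB * 1000 / AVG_PHOTO_MB)  # ~1880
```

Bigger cameras or RAW files would shrink that figure dramatically; the Blu-ray variants exist for a reason.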
My oldest colour film negatives are off-colour; some a flaking pigment. Some of the older collection of family shots back to the 1960's is "infected" by spots. I'd rather leave the scanned films "uncorrected" other than for the obvious colour shifts. If somebody wants to subsequently re-touch digitally, that art's up to them.
By all means, put your pictures on your NAS, or whatever. Choose the ones to keep "forever" and back them up to M-Discs if they are worth preserving. Don't worry about how somebody might have difficulty recovering them in 750 years. Or even 50 years. You can only do what you do and it's better to back up now than after the NAS has taken a tumble off the shelf.
Physical colour films, unless stored carefully, will be largely useless after a century; monochrome's a bit more stable but don't count on it lasting more than a century under average storage conditions.
More to the point perhaps is the use for family histories; family trees including pictures, scans of certificates and other historic addenda. Never store family tree data only in proprietary database formats. Always save in at least two other "open" formats; preferably human-readable ones (e.g. XML-based). Take the time to produce PDFs of charts, etc.
Label the media. Put it in a protective case with documentation about the contents and the method used to create them. Store sensibly.
2 g/km sounds really terrible until you're stuck in a summer traffic jam with nothing to do but to pick your nose. At that point, a combustion engine is producing an infinite amount of NOx per kilometre travelled.
That's why cycle totals are relevant; not instantaneous ones.
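The point can be sketched with a toy drive cycle (segment figures invented purely for illustration):

```python
# Instantaneous g/km blows up as speed -> 0, while the cycle total
# stays finite. Segments are (duration_h, speed_km_h, nox_g_per_h).
segments = [
    (0.25, 0.0, 1.2),   # stuck in a jam: zero distance, non-zero NOx
    (0.50, 60.0, 6.0),  # cruising
]

total_g = sum(d * g for d, _, g in segments)   # grams emitted over the cycle
total_km = sum(d * v for d, v, _ in segments)  # distance covered

cycle_g_per_km = total_g / total_km            # finite, ~0.11 g/km here

def instantaneous_g_per_km(speed_km_h: float, g_per_h: float) -> float:
    """g/km at one instant; infinite when the vehicle is stationary."""
    return float("inf") if speed_km_h == 0 else g_per_h / speed_km_h
```

Dividing grams-per-hour by zero kilometres-per-hour is exactly the traffic-jam case above; only the cycle total means anything.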
As a contractor, I always "minute" verbal discussions and instructions along with my interpretation of what I am being asked to do; and if I object then I advise them of the potential consequences. This is simple to do nowadays and much less time consuming than it used to be 35 years ago. But no less onerous if your job is on the line.
Still; I'd rather lose my job than my professional reputation.
"I suppose that normal commercial systems vs. HPC systems is a bit like the difference between a Ford Transit and a Formula 1 car"
Yes indeed. The F1 cannot be built by strapping a bunch of Transit engines together; nor can you take the ubiquitous "white van man" and plant him/her into an F1 car and expect anything like a good result.
Or can we? We do have drivers moving "seamlessly" from a Dacia Sandero into a Bugatti Veyron. It's the Engineering of the latter that facilitates the transition. It's a vastly more sophisticated car yet is as simple to drive as an ordinary hatchback; even at much higher speeds — though not necessarily the top 10% of the performance envelope.
To carry the theme further; we have a road traffic environment with mostly much more powerful, faster cars than ever before and drivers are becoming less skilled and interested; yet the roads are far safer, cars are more efficient and (relatively) seldom need to be caressed by a magic monk in blue overalls. That "miracle" is the result of Engineering over the past 60 years.
In my initial posting, I focused on the inability/unwillingness of the industry to provide a platform in which those who are not au fait with the particular intricacies of a specific machine can still extract maximum performance. That's like Bugatti selling the Veyron and insisting that it be driven by Bugatti's own chauffeurs.
I don't make that comparison out of ignorance. I make it to highlight the shortfall in real Engineering in I.T.. Sure; they make the lights blink faster and the bits bang harder; but most don't seem to have tried to fill the need for machines that are truly useful to the people who can make productive use of them. The people who are the interface to the real world.
It's absurd to expect application "programmers" to write code differently for HPC than for a "conventional" environment. It's a distraction from the problems that they *should* be solving, is not a good use of their time and, more likely than not, will be done in a half-@rsed way; requiring it to be redone, over and over again, yielding results that are essentially irreproducible.
So the maximum petaflops of the HPC become irrelevant as they were all expended "quickly" producing a result that is useless.
The call by government to make the necessary Engineering happen is bound to be unproductive if not counter-productive. Commercial users of HPC must demand what they need from the manufacturers; and remain severely unhappy if they don't get it delivered. That is the only leverage that the free market can pull. If users need to form groups to produce common demands, then that wouldn't be the first time in history.
Let the boffins/buffoons at the manufacturers work out how to squeeze performance out of their own hardware because they are the ones who have intimate knowledge of its foibles. Make them provide a uniform application software environment that is portable and produces consistent results in application software.
"Current HPC … systems are very difficult to program, requiring careful measurement and tuning to get maximum performance on the targeted machine. Shifting a program to a new machine can require repeating much of this process, and it also requires making sure the new code gets the same results as the old code."
Could be that the people who built those machines and software environments are relative nincompoops. Or it could be that they're trying to maximise their "value add" by portraying something that is fundamentally simple as something complicated that only a guild of highly-paid specialists can undertake.
Dr Christopher Essex produced a nice video ("Believing in Six Impossible Things Before Breakfast, and Climate Models") on reproducibility of results and the tendency for uncontrolled models to introduce their own artifacts, leading the Gospel-Out believers to often become enchanted by the "results".
While there may be some "push" from employees if inadequate tools are provided by their employer, letting ("unwashed") employees bring their own won't solve the inherent problems.
It is essential that IT understands what tools are necessary for people to do their work effectively and efficiently. IT staff need to understand the underlying business processes and work towards introducing improvements that they can see with their knowledge of the technologies.
Shock! Horror! There is a lot of "push" from employees to bring their own devices so that they can do personal stuff at work using their employer's resources, including time.
The issue of being connected directly into the corporate network with "uncontrolled" devices is one that is seldom explored. While wired connections are "easily" dealt with by treating every one as "hostile", the wireless network is one presenting a plethora of new security challenges.
Is every wireless device assigned its own WPA key? I doubt that the employer has the infrastructure. If such a key is "permanent" and pre-shared, then the key goes with the employee's device anywhere that the device goes and connects -- beyond the control of I.T. management. Even if every device has a unique wireless access key, malware can collect vast quantities of data using promiscuous wireless operation, providing a platform for key cracking and subsequent tunnelling from the Internet into the corporate network.
Lobotomising BYODs of departing employees is closing the gate after the horse has bolted. Critical data are probably synched to computers at home or into their own "cloud".
Some businesses believe that they are too small a target to be "hacked". In reality, an Internet connection is a sufficient asset. The objective of hackers isn't generally to obtain intellectual property. (That's just a bonus.) The primary objective is to get a toehold from which they can launch further attacks so that they can get at real money, by whatever fraudulent means present themselves.
"Australian financial institutions have made sure punters aren't out of pocket, refunding them for fraudulent purchases."
Watch for the next bank fee hike.
As far as retailers are concerned, many "understand" that it's sufficient to have cameras record offences taking place. Then they let their insurers deal with the losses. Rising insurance premiums?
NB: The upcoming CCC conference is entitled "Not my department.". Entirely relevant.
A second "production" system would be the one that you have provisioned for disaster recovery (DR). That's the one in a separate building, with its own electricity supply, etc. Connected only by optic fibre or a directional wireless link.
How else does one cope with e.g. a physical catastrophe that might take days to weeks to resolve?
The DR system doesn't have to have 100% of the capabilities of the live system. It "only" needs to provide core business functions. You must be sure that it can, so you have to test it irregularly but at least several times a year; perhaps running a full "DR drill" once every year to 18 months if the plan isn't exercised out of necessity.
As for rogue RDBMS transactions, most engines support transaction logging. Those seeking to milk the last bits of performance out of the system may have turned it off or never enabled it. The performance hit from transaction logging, when properly configured, is a few percent. That is well worth it for protecting the integrity of enterprise data.
The vanilla procedure for recovery from a committed rogue transaction is to restore from the previous backup and then roll forward on the transactions up to the one which was rogue. It *may* be safe to roll forward on subsequent transactions but that is not the general case.
If you don't have a transaction log, then restoring from the previous backup requires "re-keying" all the data; which is more than likely not "possible" nowadays with EDI and web-presence. It may be possible to replay electronic transactions but their ORDER is significant in most EDI systems to preserve e.g. order numbers. Otherwise shipments, etc using such sequence numbers will become lost.
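The vanilla procedure can be sketched as follows (data model and log format are invented for illustration; a real RDBMS does this with its own log machinery, not application code):

```python
# Restore the last backup, then roll the transaction log forward,
# stopping just before the rogue transaction. Everything after the
# rogue one is discarded, because it may depend on the rogue state.
def recover(backup_state: dict, log: list, rogue_index: int) -> dict:
    state = dict(backup_state)      # start from the restored backup
    for i, (key, value) in enumerate(log):
        if i == rogue_index:
            break                   # do not replay the rogue transaction
        state[key] = value          # replay a committed transaction
    return state

backup = {"order_1001": "shipped"}
log = [
    ("order_1002", "packed"),
    ("order_1003", "CANCEL EVERYTHING"),  # the rogue transaction
    ("order_1004", "packed"),
]
recovered = recover(backup, log, rogue_index=1)
```

Note that the legitimate transaction after the rogue one is lost too; that's the price of not being able to prove it independent of the rogue state.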
A friend had a PC which had had problems since he got it. Prone to crashing, etc.
After a couple of years of torment, he asked me to check to see if I could get it going again for him to extract some of his old files.
The first thing that struck me as I took it out of the car at home was that it rattled quite a bit more than a PC should, even when turning it slowly. Once I had it on the table and the side dropped off, I gently rocked the case and isolated the rattle to the PSU. As it was never in warranty, I removed the PSU and cracked it open. Trying to discharge the caps to avoid an unpleasant shock, I noticed that the components moved a bit under the probe's pressure.
So out came the long-nosed pliers and I pulled on a component .. which came clear off the board. And so did the next. And the next. Releasing the PCB from the PSU chassis revealed that there was no solder at all. Electrical contact existed purely by pressure of component leads onto the copper of the PCB. Looks like it didn't get wet when sailing across the solder seas.
I'd cut through the QC sticker when opening the PSU case. Obviously, placement of that sticker had been quality controlled.
Replacing the PSU brought the machine up, running a rescue system without a problem. The machine was riddled with malware which was knobbled/removed before booting the installed OS. The installed OS turned out to be unlicenced and was probably a pirated version (being from a different region) so no updates/patches were possible, nor was, as a consequence, the installation of recent versions of anti-virus software.
My guess is that the vast majority of computer users don't actually need a word processor or spreadsheet. They probably don't appreciate that they are wasting time and effort. Nor do their lords and masters, by and large. They simply do things like that because they do things like that.
In fact, those systems can work against the efficient and effective management and collection of data as each piece of information becomes an orphaned island, soon forgotten and frequently backed up.
Word processors take time to learn... but most people aren't trained and/or get away with poor practices such as using spaces and blank lines to achieve the "look" that they want for a document.
Spreadsheets? Yeah ... a swill of unauditable calculations and undocumented formulae that blow up in the face of the third user down the "pass the spreadsheet bomb" chain.
Nope. The pervasive problem is one of managing data and business processes so that users can get on with doing work instead of futzing about and "programming" without; judging by the results; the training or the aptitude.
Get a proper ERP system with the necessary plugins configured for all your form letters, etc (it's called "CRM", but you don't have to use it just for customers) so that all significant data originates and is managed within the ERP system and its database(s). The outcome is that documents issued have a consistent look, with key information correlated to other relevant enterprise data. ANYBODY with appropriate access rights can then see e.g. correspondence against other activity with the external and internal parties.
Then PAY some COMPETENT experts to analyse the enterprise's objectives and business processes; make changes to tailor the ERP system to the enterprise and change business processes to make the most of what computers can do best.
The whole "office suite" thing for corporations reeks of the mentality of treating each document on its own and not part of the information trove for the enterprise. Paying Microsoft (or anybody else for that matter) a licence fee will do nothing to improve the operational efficiency of Freiburg Council. Their hopes won't be enough.
You don't seem to have the necessary command of the English language to understand the meaning of "denier" (not having declared what is being denied) and "conspiracy" when it is the products and the subsequent behaviour of an individual that is being criticised.
Like Richard Chirgwin, Professor Lewandowsky of the University of Western Australia seems to prefer infamy over obscurity; refusing to withdraw/hold the LewPaper from press in the face of the massive cockups in the design, conduct and analysis of the "experiment".
When even a cursory analysis of the raw data leads to a conclusion that contradicts the headline hypothesis, the Professor doesn't blink.
When further analysis shows that his hypothesis is dependent largely upon results tainted by obvious gaming of the survey which was conducted by soliciting responses largely from one demographic, the Professor doesn't blink.
When the questionnaires omit the vastly more popular conspiracy theories of e.g. big oil/fossil fuel industry supporting the "skeptics", the Professor doesn't blink.
Instead of addressing the tangible, substantial flaws identified not only by the "usual suspects" but by "believers" in CAGW, the Professor "defends" with insults and distractions.
It doesn't take a conspiracy for Lewandowsky to behave like Chirgwin or vice versa. Self-interest, arrogance, ignorance and a degree of sociopathy are sufficient.
Imagine if ...
... your web site(s) only had to support standards and not browsers.
Oh I remember: That's what (Sir) Tim Berners-Lee had in mind when he invented the mechanisms for the WWW in 1990.
Google not evil? Perhaps.
But it's certainly not doing any good by attacking browsers instead of encouraging compliance with standards.
+ bystanders will no longer call in emergencies, assuming that the crashed car has already called
+ provides an attack vector against emergency services as connections are inherently untrusted
+ GPRS (minimum) connectivity to connect to 112 call centres to post the XML object isn't ubiquitous
+ each vehicle will require a "slot" in a mobile cell and be constantly connected for rapid response
+ if the vehicle isn't constantly connected, it can take minutes to establish a connection
+ "constant" connection facilitates vehicle tracking
+ powering an electrical (radio transmission) device in a crashed vehicle can add to fire risk
Some of those things can add to the road toll.
"It wasn’t long after the Mac was installed that I was hooking up my first modem and watching the glowing green characters coming up on-screen almost as fast as I could read them."
Glowing GREEN characters on a Mac screen? IIRC, they didn't do colour until late in the 1980's; except with "hacks". Perhaps Chris had one of those fancy screen filters.
Download the full report. (http://www.ipcc-wg2.gov/SREX/images/uploads/SREX-All_FINAL.pdf) Not the gloss-over for political apparachiks.
It (cites elided) states:
Attribution of Impacts to Climate Change: Observations and Limitations
There is high confidence, based on high agreement and medium evidence, that economic losses from weather- and climate-related disasters have increased. A key question concerns whether trends in such losses, or losses from specific events, can be attributed to climate change. In this context, changes in losses over time need to be controlled for exposure and vulnerability. Most studies of long-term disaster loss records attribute these increases in losses to increasing exposure of people and assets in at-risk areas, and to underlying societal trends – demographic, economic, political, and social – that shape vulnerability to impacts. Some authors suggest that a (natural or anthropogenic) climate change signal can be found in the records of disaster losses, but their work is in the nature of reviews and commentary rather than empirical research.
The private industry press council and government ACMA can take action against real publishers.
Defamation is a crime under Australian law.
Civil action against those making defamatory comments online dates back to at least the late 1980s, with Usenet postings.
85% of the comments received by the enquiry run by The Fink et al were obviously the result of astro-turfing by Avaaz and GetUp!. GetUp! is funded heavily by the Australian Union movement; which also pulls the ALP's strings. Avaaz is a foreign political movement; supported by GetUp! and the likes of Soros.
The Fink's report was pulled together in a hurry. Apparently without any technical advice as to scale and feasibility. There's no legal way for the Australian authorities to track hits to blogs hosted overseas. (Maybe it's hidden in ACTA.) It is trivial, given the low threshold, for authorities to become overwhelmed with the number of web sites that require monitoring for "correctness".
The proposed mechanism to adjudicate transgressions is a panel of judges nominated by political masters. The judges nominate henchmen to do the nasty work. It's unclear whether this would sit within the judiciary, with formal injunctions allowed, or be a bureaucratic machine which doesn't itself suffer consequences for bad decisions. Those with "incorrect" blog/web pages have only two days to "correct" them and to publish an apology.
Political support of the report by the Greens isn't surprising. They're also in favour of "suspending democracy" and surrendering national sovereignty to save the planet.
openSUSE provides three sets of security policy settings: slut, secure and sheldon (officially Easy/Secure/Paranoid), in addition to custom settings. One of them is selected at installation time. Slutty is the default.
The presets aren't perfect for everything and the interpretation of "secure" varies from individual to individual. Defaults were initially too restrictive for NetworkManager to do its thing without superuser privileges. And it's also annoying for removable (USB) media, but the risks of mounting a strange filesystem automatically should be considered.
Networking is either "classic" or NetworkManager. The latter is appropriate for most mobile computers as it allows the user to choose the network connection.
The settings can be changed in the policykit files. That's how I got NetworkManager to connect to WLANs without the root password; just the one for my KDE Wallet, which stores the network passwords.
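As an illustration, on releases using polkit's LocalAuthority backend, a .pkla entry along these lines grants ordinary users the NetworkManager action without the root password. Treat the file path and the exact action name as assumptions to check against your openSUSE version.

```ini
; /etc/polkit-1/localauthority/50-local.d/10-network-manager.pkla (hypothetical path)
[Let members of group "users" modify system connections]
Identity=unix-group:users
Action=org.freedesktop.NetworkManager.settings.modify.system
ResultAny=no
ResultInactive=no
ResultActive=yes
```

After dropping a file like this in place, the polkit daemon picks it up without a reboot.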
Printing on openSUSE is by CUPS. The system can be configured to listen for CUPS servers on the network; whichever network is connected. The appropriate printer can be chosen by the user in an application dialogue.
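For the record, the listening behaviour comes down to a couple of directives in cupsd.conf; these are the pre-CUPS-1.6 browsing directives and may differ on newer releases.

```
# /etc/cups/cupsd.conf (excerpt) - accept printer announcements from the LAN
Browsing On
BrowseAllow all
```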
The user's timezone can be selected without root privileges. Users should not change the date-time on a *nix computer. If the network has been configured correctly, then NTP can be used to adjust the clock automatically.
A user chooses their timezone from the desktop environment settings. Pre-KDE4, the timezone could be selected directly as a "clock" option; in KDE4, it's selected from the Mac-like configuration thingy.
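Underneath, the per-user nature of the setting comes down to the TZ environment variable, which any process can set for itself without root. A quick sketch (zone name chosen arbitrarily):

```python
import os
import time

# Timezone selection is per-process/per-user: no root required, and the
# system clock (which stays in UTC) is untouched.
os.environ["TZ"] = "Australia/Sydney"
time.tzset()            # re-read TZ (POSIX systems only)
print(time.tzname)      # e.g. ('AEST', 'AEDT')
```

The desktop environment does essentially this on the user's behalf.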
I copied the quotes from O’Halloran into http://www.blablameter.com/index.php and it responded:
Your text: 748 characters, 114 words
Bullshit Index: 1.61
Congratulations, you managed to blow up our index scale from 0 to 1. It is highly unlikely that you will impress anybody else, but you did manage to impress us!
Drivel like "digital economy participation" is sufficient for me to insist that the utterer leave the premises immediately!
Most businesses in Australia, outside of the ISPs, rarely understand the concept of latency. Australia is about 50 milliseconds "wide" in latency. IMHO, NBN deliberately avoids talking about that: higher speeds don't mean reduced latency, and latency is what makes most things on the Internet appear to be "slow".
Nor that the promised higher speeds of the NBN won't extend across the whole country; let alone internationally.
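The point is easy to put numbers on with a back-of-envelope model; the figures below are illustrative assumptions, not NBN data.

```python
# Time to fetch a small web object: round trips dominate, not bandwidth.
def fetch_time(size_bytes, bandwidth_bps, rtt_s, round_trips=4):
    # round_trips loosely covers DNS, TCP handshake and HTTP request/response
    return round_trips * rtt_s + size_bytes * 8 / bandwidth_bps

adsl = fetch_time(20_000, 8e6, 0.050)     # 20 kB object, 8 Mb/s ADSL2+, 50 ms RTT
fibre = fetch_time(20_000, 100e6, 0.050)  # same object, 100 Mb/s NBN-class link
print(f"{adsl:.3f}s vs {fibre:.3f}s")     # 0.220s vs 0.202s: RTT dominates
```

An order of magnitude more bandwidth buys less than ten per cent on a small transfer; only shortening the round trips would make it feel faster.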
The survey only shows the depth of ignorance about the NBN.
1.61 is interesting because it is about the magnitude of the cost blowout anticipated for the NBN, which most subscribers won't see as anything better than what they had before, at the same price. Some will get LOWER speeds when changing from ADSL2+ to the entry-level NBN.
Meanwhile, taxpayers are forced to pay to build another quango monopoly, which controls the flow of all electronic data throughout the country, except that on private radio links.
BlaBlaMeter is a handy device that can be used by anybody; even those who've broken the BS Meter with which they were equipped in childhood.
DRAM technology keeps moving too; trying to keep up with the feeding frenzy of processors.
Computer architecture didn't make the expected leap back to "core storage" in 2008; probably because funds dried up. In 2008, it was expected that NAND would soon plug into system boards; extending NUMA space in another dimension.
The other obstacle is of course that the paradigm shift is so substantial that the industry can't cope. Architects of hardware, systems and software are stuck up to their necks in a world where mass storage is on a different medium and needs to be fetched from cards/tape/disc/... and there's little hope that the non-technical industry participants will recognize the fundamental change, because they don't, for the most part, have an inkling about how stuff works.
Many simply can't get their minds around the fact that there is now the potential for "mass storage" to be directly addressable. It is what VM "hoped for", but now that it's deliverable at (near) full force, most don't understand how to use more than a smidgen of its potential.
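mmap(2) is the nearest conventional analogue: map a file and plain loads and stores reach it, with no read()/write() in sight. A minimal sketch, with a temp file standing in for byte-addressable NAND:

```python
import mmap
import os
import tempfile

# Map a file so that it is directly addressable, the way byte-addressable
# storage on the memory bus would be; the temp file is a stand-in for NAND.
fd, path = tempfile.mkstemp()
os.ftruncate(fd, 4096)              # one page of "storage"
with mmap.mmap(fd, 4096) as buf:
    buf[0:5] = b"hello"             # a plain store, not a write() call
    buf.flush()                     # explicit persistence point
readback = open(path, "rb").read(5)
os.close(fd)
os.remove(path)
print(readback)                     # b'hello'
```

The awkward part, which the industry hasn't digested, is that "persistence point" line: once storage sits in the address space, durability becomes a cache-management problem rather than an I/O problem.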
Biting the hand that feeds IT © 1998–2022