Re: pent-up pressures
"Does the Fuckhead-in-Chief understand that if he somehow does manage to shut down Twatter, that he will effectively cut off his own oxygen supply, and that his head will explode.."
From your lips to [someone's] ears...
You are correct that racism is not reserved only for white people (as the Japanese Shōwa era proved).
But, in America at least, it's not the white people dying because of that prejudice.
"And Windows folk seem to put up with an awful lot so they can use a particular set of applications to do particular jobs irrespective of whether those jobs could be done more easily, or be done at ALL, by other means."
And you've proved MY point. You FOSS-heads think that you have reasonable replacements for just about everything that, almost, anyone needs to accomplish on their computers.
Because the hundreds of thousands of industry-specific Windows-only applications, plus support for hundreds of thousands of Windows-only hardware devices, plus the inability of FOSS programs to meet the complete functionality level of the Adobe suite (for just one example), never seem to matter to you guys.
And you'll *continue* to NEVER GET IT.
"It never ceases to amaze me what Windows users will put up with rather than switch."
I know, right?
Just because we have those inconvenient APPLICATIONS that we need to run, you'd think we would willingly ditch them all so that we can proudly say online that we switched.
We're fools, I say, fools!!
The bits of Microsoft that make $$$ don't care what OS you are using as long as you pay your subscription
Exactly. With respect, I'm tired of hearing that Microsoft, somehow, pulls a magic hypnosis session on anyone and everyone who bothers to look, and that people buy/stay with Microsoft only because of the brainwashing.
People and industries stay with Microsoft products because the products get work done. Be it Windows, Office 365 or other systems, "The Year of the Linux Desktop" will *NEVER* happen until someone in FOSS stops concentrating exclusively on OS quality and brings the entire application ecosystem up to a level where not a single compromise in productivity needs to be made by switching over.
That is the ONLY measurement that companies, and professionals using systems worldwide, care about: the fact that a piece of FOSS software is free is meaningless if it doesn't provide an efficient workflow to get their work accomplished.
Go on to YouTube and listen to professionals talk about their jobs, people!!! They *don't* say "Oh, this free software is going to save me money and I'm helping support freedom!"; they ALL say "my workflow", "output quality", "difficulty in switching and impact on my productivity", "quality of tools", "features to get my job done"...etc etc etc.
It's all they care about: Time is Money. And until FOSS delivers every bit of productivity that the quality paid solutions do, almost nobody is going to switch. DaVinci Resolve has done it: it offers an extremely powerful video system that is now putting Adobe Premiere in the boxing ring, and on the ropes, but (sadly) it seems to be an outlier. GIMP doesn't compare to Photoshop, forget the dreams - no hard-core paid imaging professional falls back on GIMP as their primary editor the way they do Photoshop. No industry is going to drop Microsoft unless, and until, every bit of productivity software that they depend upon daily is up and running on Linux, with absolutely no compromises in either performance OR support - because businesses only care about getting things done, not about spending lots of time making a switch work.
It didn't used to be this way. In the '70s we believed we were headed for 'enlightenment'; we believed we were working for some type of future where things would be great.
Then Reagan came in. And then Chernobyl happened, the USSR went bankrupt and collapsed, but Americans were told that the collapse was due exclusively to our political and economic pressures. America "won". And if capitalism "won", then more capitalism; more personal, economic and social Darwinism (greed). "Greed is good", to use the exact and direct quote.
So America has sold its soul for MONEY. Everything is OK as long as you can personally justify it, and it's PERFECT if you're making money doing it.
We've become...morally corrupt. And the constructs that are supposed to uphold a greater morality, up to and including religion, have been either enslaved or silenced by propaganda ("socialism!"), fear, or just outright selling out for that almighty Dollar.
There are still people of conscience in this society. But all too often they stay home instead of voting that conscience, and greed continues to win. So we've gotten the government, the system, that we (continue to) pay for.
This. The lawsuit is ridiculous, taking the stance that "You sold $2bn worth of chips, but didn't precisely tell [Wall Street] which type of customers bought which type of product."
(Directed to the plaintiffs)
So let me get this straight: nVidia never lied, in any fashion, about how many units it sold in total. You aren't arguing that point. What you ARE arguing is that nVidia didn't divide out markets to your satisfaction, because you wanted to hedge your bets in only one direction, that being yours, and if nVidia wasn't doing precisely what YOU thought they should be doing, you might have second-guessed how much play money you expected your bets to double down on.
Here's the thing: no company is *required* to split out market analysis for each product sale. Companies deny sales details, customer or product specifics, all the time. They do it because scum investors want to play roulette with only the parts of a company that make them happy - the Carl Icahn scenario, where a company isn't doing well unless it's doing things MY way.
So you played only because you wanted to see numbers that made you, exclusively, happy. Damn the fundamentals of the business, are they trendy NOW, I want instant returns. I'll bail if things don't go my way, so make sure they either go my way *or* you give me advanced warning (market analysis, even to the point of future projections).
My pinky-sized fiddle is playing for you, dear.
"However, it does involve quite a bit of work to access the data,"
Just the opposite! Did anybody beyond myself actually read the white paper? I'll quote said paper:
"All the attacker needs is 5 minutes alone with the computer, a screwdriver, and some easily portable hardware."
Or did you watch the video??
5 minutes, you're DONE. The attack is almost completely automated: lift the back, attach an attack computer, reprogram, attach a Thunderbolt attack device, done.
The researchers even provided 2 tools they developed to assist the crack!
This isn't even difficult. And a *lot* of people use Sleep mode whilst traveling; for example, using the computer whilst awaiting boarding in the airport, then going into Sleep mode to awaken back on the plane.
It's quite serious if you take it from the perspective of state-run secrets, confiscations or political actions: if you are a political dissident, or arrested even on a trumped-up charge, and you thought that your password/encrypted data could not be used against you because they never will be able to get to it, this changes that outcome.
Corporate espionage on a stolen, yet encrypted, laptop? Done. Don't agree with our political agenda, get refused boarding of your plane, and we go fishing for "evidence" on your locked laptop? Done. Even a simple confiscation of an attorney's, or an accountant's, records, with just a twinkle of a suspicion of fraud, when the prosecutor can't get the courts to agree to a full warrant? Done, and done (and fight about it in the courts, but that's later, hopefully after that nice "guilty" verdict that sounds great in the news and on your resume as you run for higher office).
This is horrible news for anyone who used BIOS passwords or encryption in the first place.
"James Damore, the one-time Google developer who infamously suggested his bosses' diversity rules made it impossible to voice some opinions, has dropped his lawsuit against the internet titan.
Well, on this he certainly wasn't wrong."
Yes he was, and the labor board has declared such.
The issue that [certain] people wish to avoid is the fact of the first part, and we'll repeat it here again,
"James Damore, the one-time Google developer who infamously suggested his bosses' diversity rules made it impossible to voice some opinions".
The flat fact is that Google HAS the right to create a [diversity] rule that "makes it impossible to voice some opinions", if said opinion violates the rule, as Google is PRIVATE. Your 'freedom of speech' as guaranteed by law only applies against <u>government</u> censorship. For example, a Non-Disclosure Agreement in your employment contract certainly muzzles your freedom of speech, but since it is a private agreement between two legal entities it is fully legal and allowed by law.
Google made a policy that states that, essentially, no statements regarding diversity which may possibly lead to disagreements, arguments or social upheaval within the business, will be allowed. Damore not only agreed to abide by this rule during his acceptance of his employment contract, he de facto agrees to abide by the agreement by continuing his employment voluntarily whilst Google continues to maintain the policy. If he had problems with the policy, feeling that it was too 'PC' for his, well, "conservative values", then he should have either quit, or not taken the contract in the first place with such a "progressive" thinking company.
Failing that, Damore should have sought approval from the PTB regarding the topic, prior to posting, as he knew he was openly challenging the rule. Again, a rule Damore agreed to follow during his hiring.
People need to understand the LAW. Freedom of speech is only guaranteed in the public realm, but companies are private. You are not, and have never been, free to proclaim anything you want against the boss whilst employed with a company and not face possible repercussions (you're free to mouth off, but don't whinge when you get kicked out on your butt).
Damore shot his mouth off against a policy that exists to prevent friction between workers. And was fired for it. Even if you agree with his beliefs (not the point), he violated the agreed-to rule. Google had the right to decide his fate, and out he went, as his continued presence would create the very friction that the rule tries to reduce. And Google [has] that right because they, legally, have the right to make policies regarding speech that regards your work.
Sad, for some people who think otherwise, but very, very true.
The context of his statement was vague to me, I thought he might be talking about the plaintiff and a side reference to the fact that lawyers often take the biggest chunk of a class action settlement. But it seems most people are applying the appellation of "bully" to Apple, so this has both cleared up my confusion regarding the direction of the statement and gotten my upvote - I'm quite OK with Apple being claimed as a bully. :-)
I've been crushing permissions on all my apps, anywhere I can, for quite a while now. IMHO there has never been a good reason to allow my browser access to the microphone, my location, my contacts, et al on my phone, and no reason for same on my desktops, so "Off" they go. Yes, that includes Google Play Services. And Firefox on Windows. Plus many, many more.
I keep trying to tell everyone "Only the paranoid survive", but I believe I have grown tired of the repetition.
Exactly. You will never find a (de-restricted) photo of a submarine's reactor room, but here is a photo of a scale model of one.
Nothing but a super-shielded reactor, pumps, pressurizers, pipes and heat exchangers. Really an unimpressive, yet subconsciously frightening, sight - you know what's on the other side of that metal will kill you outright (at that proximity).
I've toured decommissioned nuclear submarines. The reactor room was behind a porthole and the only thing you saw was...just as described here. Big, big, big machinery and cylinders, just painted, a big mechanical room, that's all.
Which goes along with my point: "You can't come close to a nuclear reactor in a warship, they are not only contained in compartments that are sealed from 'normal' entry during operations". As I said, "normal" entry and "can't come close during operations". Even in U.S. warships you can gain access to the reactor, you just don't do it whilst the reactor is 'critical' (operational).
So Red October's scene being the reactor? Double-no.
That shot from The Hunt for Red October is *not* in the reactor room, those tubes are the exterior surfaces of the launch tubes for the intercontinental nuclear missiles that the submarine, as a "boomer"-type, holds. That area is nicknamed the "Forest" by submariners, as the tubes tower over you like giant trees.
You can't come close to a nuclear reactor in a warship, they are not only contained in compartments that are sealed from 'normal' entry during operations, the reactor is so heavily shielded that they just resemble massive 'lumps' that take up a huge amount of space. For example
On a submarine, only the Forest has the wide open spaces that are displayed in the scene from Red October.
The reason a company purchases MacBooks is because of residual value; if they lease or trade up often, the residual value can indeed lower costs for the upgrade.
An IT department buying for the longer term purchases either ThinkPads or Dell Latitudes, because unlike a MacBook they can actually be both serviced and upgraded. And serviced or upgraded in-house, to boot. A faulty out-of-warranty MacBook can become a very expensive thing, while a faulty ThinkPad gets you on eBay for an easily affordable replacement SKU.
In my own experience, I have yet to see an executive (not in the public relations or advertising departments) be thrilled with the prospect of spending money on ads. It's a cost of business, without any guarantee on returns, and they all hate it, seeing it only as a necessary evil (one they wish they could just avoid from the outset).
I think you are assuming that Adobe needs to spend the time, effort and money to port their products to open platforms.
From their balance sheets, their "fixed" support of Windows and Mac apparently does them just fine. Adobe seemingly prints money. Spending hundreds of thousands of dollars, or more, worth of manhours, from the actual port, to testing, and then on to technical support, in order to acquire the relatively small market share that FOSS desktops inhabit, I guess doesn't make much business sense to them.
I was just going to refer to that video myself, AC, but if you watched it you would have learned the answers that you sought.
The 90% productivity is an estimate from the workers themselves, acquired after trying out the alternatives in order to produce the video itself. So this is highly experienced video producers using a product and stating that yes, it works for me, but I [feel] I can't do "about 10%" of what I need to do.
But the most important factor brought up by that video, the most relevant in regards to Inkscape and one that is constantly dismissed however much I try to remind people, is INTEGRATION.
When you [buy] the Adobe suite you get an integrated solution system, one that not only tries to retain a similar look and feel to one another but also dynamically links apps together, PLUS allows easy collaboration between designers in different locations.
In other words, if you are in InDesign, or Premiere, you can quickly open, embed *and* call to re-edit Photoshop and Illustrator files directly from the Id/Pr UI. Once you switch a product away, you as a designer will have to go back into the originating non-Adobe app to do things like adjust layer visibility or make any other adjustment, save, go back into your design app, and relink to update.
We did that years ago, this is 2020. I don't want to have to go back to 2000.
Plus, if you want/need to share your work with a collaborator, you can send the file directly, without any possible conversion loss, because pretty much every visual pro uses the Adobe suite. Share a .PSD or .AI file, heck even the .INDD or .PRPROJ file, and you can be pretty damn sure that your collaborator can open it. Because they are almost certainly using Adobe products themselves.
If you, or they, aren't using Adobe, then you'll have to convert. If you even can, in the cases of project files. And conversion can lose embedded things like UI settings, histories, maybe even objects. So you'll spend the time converting only to hope that your recipient will get everything you're hoping they will.
It's like the frustration of a Mac user sending an iWork text file to (any) office filled with PCs, and expecting the world to be able to read it (yes, I've been a victim of this). Because they can, after all.
So the world sticks with Adobe...because all (the rest of the world) sticks with Adobe. Yes it's a Catch-22 but that's the life we've been forced to lead.
So any other developer, unless their product is so special that it carves out its own niche (Premiere vs. Resolve), automatically faces a mountain that they pretty much can't climb, at least in professional production circles.
According to the linked review, the MintBox's nVidia 1660 GPU overheated in Windows 10:
"it was able to render the [iRacing] game with almost any in-game settings, but if you asked too much of it, the temperature could get close to 90C, at which stage the NVIDIA card would cut the signal to the HDMI port."
"I found that NVIDIA HDMI port to be sensitive to high frequencies also. It didn’t like pushing signals at 120 FPS. I don’t know if it’s a limitation of the GTX 1660 Ti, if it’s due to heat dissipation or if it’s a decision made by NVIDIA. In any case it only affects the HDMI port. The DisplayPort continues to work without any issues at these temperatures and at 120Hz."
"My overall impression on the performance of the MintBox 3 is that it’s overpowered. It has so much power in there I’m not sure we’ll ever use it… and when asked to do things completely over the top (like playing a sim in VR with HDR and max settings) it bottlenecks on heat more so than on performance."
So does the MintBox make a good workstation? From the sounds of it, no. An excellent compact desktop? Yes, sure. According to the review you can use DisplayPort to avoid shutdown of the video output, but that doesn't address the GPU temps a single bit - running an nVidia card over 90C for a significant length of time (from historical evidence) usually doesn't end well -_-
We will have to await the results of tests on the NUC9VXQNX to see if it can stand up to the heat (pun intended) of a workstation workout.
So Mozilla is rolling out an integrated password manager.
On all my Firefox installations, desktop and mobile, password autofill is borked. Sometimes I get complete autofill; sometimes the fields refuse to be filled in at all; or, more often, I must supply the username first before Firefox decides to complete the rest. Sometimes things work, sometimes they don't.
How about you fix that first, Mozilla, you damn idiots, instead of worrying about integrating yet ANOTHER "feature" we may or may not want??!!
So, fundamentally, your defence of the proposed sale is that *some* people are making money in the .org realm, so why shouldn't the registry be taken private, its operations hidden from public view behind not only corporate firewalls but layered shell companies, and prices raised to make untold millions off anyone, and everyone, who uses said registry, all to permit the "free market" to have its reign?
Hmmm. And you'll wonder why you have so many downvotes, so quickly accumulating.
Again, that's your opinion and you have the right to have it. But millions of listeners apparently think otherwise, because they are [still] listening to the band; Metallica is still "successful" based upon that qualification.
You are looking, listening actually, to their music based upon your expectation that they have a sound that you prefer, that being the way they sounded in their youth. They aren't young any more. They sound different now, that's them. Almost no artist ever stays still, from Picasso to Beethoven to Metallica, they all change. That's what time does to people.
If they are (still) commercially successful then their music is hitting a nerve with some people...but not, necessarily, classic Metallica fans. But time never goes back, only forward, and if what they do today no longer agrees with you, it may be time to ask yourself if you wish to continue looking at their work. I've done the same with some of the greatest influences and loves of my life; what it was no longer represents what it is, and I simply had to let go because I considered the new greatly inferior to the old (hint: a great sci-fi pop culture classic).
It is what it is; if I continued to hang on to the past I'd only be bitter on how bad I consider the new stuff, and simultaneously only end up sounding like a miserable old fart :p
We all love some of the work that artists do, yet do not like others. That's the nature of art. Time, plus the artists as well as ourselves, do not stand still. It is, well, foolish IMHO to even try. I can completely understand you not liking anything the band is currently doing...but, it may be time to just let go.
It's not just "reinventing their sound". As the article, and many comments here, show, many listeners of a band's music never seem to move on, but the artists *do*. Artists are human and they age, and as they age their viewpoints and attitudes change. They can't stay angry young men (and women) forever, and both what they want to say in their art, and how they want to say it, grows different with time.
But fans never want to hear that. Many fans want the same sound forever; many fans look to a particular sound of their favorite bands as a way to relive, nay revive, their youth. But that's poison to any artist worthy of that appellation: artists grow. And to grow is to change.
Metallica is doing just fine, in both their business and their music. They are no longer angry young men, being solidly in their middle age now, and they wish to say things in new and different ways. I'm sorry if that hurts people who think they must sound the same until the end of their days. There are artists who I loved in earlier years but no longer feel a connection to; there are artists who I never paid any attention to in my younger years that I now appreciate. Respect what you loved of them in the past but honor just that: their *past*. If what they do today doesn't jibe with you, then honor that too, and allow the artist the right to grow into whatever they feel they have the right to be.
I am pretty sure that the argument is that the article is *no* longer relevant, as the charges were reversed (dropped), but Google's search results fail to reflect that fact, with no update or link to this additional information. A dismissed issue is no issue at all; I would personally believe that Google's responsibility is based upon the ranking of the now-outdated article (i.e. if Google places the accusation article highly but places the correction articles poorly).
Then, as noted below by another commenter, you did it WRONG! And, not only THAT, you INSISTED on impressing your WRONG opinion on everyone else!!
The standard is NOT a simple single space; the typographic standard is an EXTENDED single space.
And since pretty much nothing but typesetting programs handles the extended space, we, the users, put in double spaces.
Unless you programmed it to replace the double with the extended single, you messed up, and your typesetting looked terrible *and* only managed to annoy people like me who were forced to (try to) read your poorly-set type.
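For anyone who does control the pipeline, the replacement described above is trivial to script. Here's a minimal sketch in Python (my own illustration, not anyone's production typesetting code), assuming the em space (U+2003) as one plausible stand-in for the wider typographic sentence space:

```python
import re

def widen_sentence_spacing(text: str) -> str:
    """Replace a typist's double space after ., ! or ? with a single
    em space (U+2003), approximating the extended typographic
    sentence space. Single spaces are left untouched."""
    return re.sub(r"([.!?]) {2,}", "\\1\u2003", text)

print(widen_sentence_spacing("First sentence.  Second sentence."))
```

A real typesetting program would of course make this decision per font and per context, but the point stands: the conversion is mechanical, so there's no excuse for just mangling the typist's double spaces.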
"Use a proper program to typeset the text"
And therein lies the problem: IMHO what both Microsoft and the other single-space proponents are suggesting is that you consciously modify your active typing technique to match what you perceive your output device is doing.
Using a word processor that auto-adds end of sentence spaces? Single space key.
Using a monospaced font? Double space.
Using email? Double space.
Using a well-programmed proportional font? Single space, but only if your program is actively acknowledging the auto extra end-of-sentence space.
This is madness. First figure out what default your program follows, then figure out how said program interprets the font of your choosing, then examine the results and modify as necessary.
Just double space the damn thing. That's been the standard since the typewriter keyboard was invented.
For me it's entirely a stylistic choice: I really dislike serif fonts as they, IMHO, imply an old-fashioned ideology. I personally prefer sans serif in all communications, paper or electronic. Block formatting as well; when combined, serif and semi-block, it just yells "1970s!" to me.
I was taught how to type in grade school, over 40 years ago. In later schooling I was the head assistant to the entire Secretarial Studies department, and the most advanced IT student of the entire school.
Double space. End of story. Microsoft can go fsck itself if it believes that proper form is to do otherwise.
But that, making CSS compatible with wide gamut, in no way solves your stated problem. Color reproduction accuracy is in absolutely no way guaranteed just because your customer's monitor is wide gamut and your new, shiny web site has the new CSS implemented. Breadth of ability is in no way, shape or form linked to the accuracy of said ability; only if the monitor has color calibration, performed frequently with an accurate sensor and software, can you believe in decent color reproduction accuracy.
It was not uncommon for older British cars to have a mixture of metric and imperial, especially in designs that have their roots in the time before the metrication of the UK.
Some things still do: the Canadian-built Can-Am Spyder 3-wheelers use a mixture of American-built and Canadian-built parts.
And that mixture drives me crazy every time I work on it. If I had known this before purchase, I might have second-guessed that decision.
That is true. But my point, proven by the downvotes, is that, as usual, Apple fans think that an Apple solution is the magic bullet...to just about anything.
Intel was somehow holding Apple back from creating more "Magic!" products...by Intel's inability to make [their] 10nm process work. So Dell, HP, Acer, Asus, Razer, Microsoft (Surface), and more, were using the technology that Intel was capable of producing...but poor Apple's dream of fantasy products was impeded by evil Intel's lack of skill in getting their technology advances to work.
That's like saying that we'd be on Alpha Centauri...if only those idiots at NASA would get off their butts and get warp drive into production. Or that, if that evil cabal of Tesla/Panasonic wasn't constantly pushing back their supercapacitor products, we'd have electric cars that went 900 miles and recharged in 5 minutes.
It may be *possible* that Intel was just biding their time on 10nm, to extract maximum profits from 14nm while they had no competition; if Intel quickly rolls out 10nm to address AMD, then that's a reasonable accusation. But to say that Intel was blocking Apple from creating better products is the epitome of fanboy stupidity - not only does everyone have to deal with developing at the same technology level, Intel and every other supplier/manufacturer has to put products out at the best skill level they can. If they can't create better...that's it. They couldn't create carbon fibre for airplanes in World War II, and Intel couldn't get their 10nm process up to par. That's the way technology works.
If Apple is so disappointed with Intel's performance then why don't they just switch to AMD?
AMD has an entire new series of chips out, and they are both well regarded and excellent performers. And doing so would avoid the entire new-platform/incompatible-application problem.
But fans would rather listen to the belief that it's all Intel's fault for 'making' Apple think about switching.
The only people not keeping up with developments, like AMD? Better look in that mirror.
"Intel has failed to deliver most of what they have promised in the last 5 years"
I would like to know what you mean by that statement. "Intel has failed...in the last 5 years" to deliver what, exactly?
Are you referring to Apple's failure to properly implement, and therefore failure to properly utilize, Intel microprocessors due to Apple's utter incompetence to design proper heat management systems into most of their recent product line?
Are you referring to the fact that, for at least the last two generations, MacBook Pros have physical thermal design limits below their CPU requirements? Meaning that MacBook Pros intentionally throttle the CPU during standard-use loads, thereby never allowing the CPU to achieve the processing level that the buyer paid for?
Do you mean the well-known utter failure of the thermal design of the trashcan Mac Pros? Meaning that Mac Pro buyers never achieved the full CPU processing power that they paid for?
Do you mean the utter failure of the thermal design of the latest MacBook Air, causing outright CPU failure due to a massive failure of proper airflow design?
The world would like to know.
We can certainly hope. From your mouth to their dumb ears.
It sounds negative, but I've grown tired of this country wasting millions upon millions of manhours, never mind millions upon millions of dollars, dragging these Neanderthal types out of the muck of their own creation.
Let Darwin have a bit of leash, I (sadly) have begun to believe; constantly saving these inbreds from themselves has only diluted the gene pool. Good pool upkeep *does* demand occasional chlorine treatments... :-p
People like you, with those typical knee-jerk responses, are the epitome of irrelevant. All you "Capitalism rules!" reply fools see are 2 answers, black or white, "great capitalism" or "evil communism". You people never understand shades of grey - a middle ground of mixed, practical solutions attained by knowledge, history, trial and error, and lessons learned.
But that's because you don't WANT to. You always want to paint black-and-white solutions so that you are always in the right, always the "hero". Since your way is purity and right, and you've only ever presented a dark, seemingly negative alternative, yours is, obviously, the only apparent choice to anyone and everyone listening.
Exactly, I agree. Flash haters are only viewing this through the lens of business outcomes; over the past 20 years there have been hundreds of THOUSANDS of artists, both paid and fan/independent, who have created Flash-based art. Now I've backed up an older version of the Flash installer to allow me to continue to access my legacy art trove, but future potential viewers may have a lot of difficulty accessing this equivalent of an online art museum.
Biting the hand that feeds IT © 1998–2020