Re: Yeah, right.
"they'll accept them graciously, thank them for them and take care of them anyway. "
... You know nothing about children.
"I'm thinking that Yahoo's core business is selling ad space."
I'm not so sure tbh. I think Yahoo! think that they're an ad space firm... but they're not very good at it. If it weren't for their (extremely fortuitous) Alibaba stake, I don't think anyone would have bothered to mention Yahoo over the past year or two.
"What is their "core business"? Even the search is contracted out.... so..?????"
I think that's Yahoo's big problem really. They have no idea. After losing the search engine wars, they've just kind of existed and lived off some fairly good share purchases. Which actually means Yahoo's business model is somewhere between 'hedge fund' and 'pension fund'.
It's always struck me as rather pathetic how desperately economists want to be treated as a 'hard' science. It's a social science, which is indeed a type of science; it's arguably tougher than the physical sciences (since generally speaking, once you're outside the quantum level, the objects physical sciences measure rarely decide to change their behaviour completely and en masse in a short space of time). But that lumps them in with Sociology and Anthropology and Psychology and Political Science, which economists want to look down on (because they tend to disagree with most of what economists say people are like). Suggesting that they're on equal footing with such lesser beings is a sure-fire way to wind up Econ grads.
As to the article... so, it doesn't matter that we're not spending money on science because other people are. That's definitely not an utterly idiotic answer, well done Tim. And the past 30 years of relentless deregulation and opening things up to competition has somehow coincided with average productivity growth falling enormously, with a particular correlation between the places where growth was worst and those industrialized countries which engaged in the most aggressive deregulation. Even if the bleeding edge has remained more or less at the same level of productivity as 30 years ago, that none the less suggests that there's an error in your argument that more competition will solve the problem.
"I propose the Ledswinger test, in which a machine qualifies for AI status only when it comes up with a coherent and realistic plan for resolving the complete mess that is the Middle East at lowest cost and minimum inconvenience"
It'd come back with '2 state solution' after about eight seconds.
Now waiting for Davedavedave to declare me antisemitic for that statement.
" The problem is that unless you know for sure all devices in your datacenter can handle that, it is better safe than sorry"
That's the real issue, I suspect. Every company has that one box built by hand in 1986 which is, for some reason, still vitally important but impossible to shift onto something newer. It's so inefficient that it's using half the power input of the DC to produce one thousandth of the compute, it requires a temperature 10C cooler than every other box on premises, and its OS is so ancient that you have to pay the one remaining man in Britain who knows how to use it a small fortune in retainer fees just to service it. But if it stopped working for 30 seconds, then the sky would fall in, so literally everything is designed to take it into account.
"Language is the transfer of information through communication. The sound-carried version we *primarily* use as humans is by far not the only one, nor, in fact, the most sophisticated in terms of efficiency and accuracy. Language as a system to communicate evolved from the first single-celled organisms up, so you can hardly call that a "human invention"."
Not really, no. Language and communication aren't synonymous, and single-celled organisms do not have 'languages'. Bee dancing is communication, but I don't know of a single linguist who would call it language either - a language needs to follow Saussure's clinical definition of the concept, and almost no animal communication does so (it's been argued to exist in higher mammals, like chimps or dolphins, but even that is pretty contentious. Biologists in the 1970s liked to say it does, but linguists, on the whole, didn't agree, and most modern biologists are increasingly coming round to the idea that describing animal communication as language is anthropomorphism rather than science).
In fact, the more that the structure of language is looked at, the more it seems plausible that its use for communication may be a fortunate by-product. Language - by which I mean all 'natural' languages - is actually structured very inefficiently for the transfer of information.
It seems to have evolved more around the articulation of complex abstract concepts than for actually talking to people - so language actually developed for thinking, and just happens to be something that can then be used in communication. It's a bit suppositional, but this might explain the 60,000 years or so between the evolution of anatomically modern humans, and the 'great leap forward' when they start making cave art etc.
In which case, yes, language is the first and greatest invention of mankind, as it allowed the transfer of knowledge between people. Then agriculture, which allowed the control of the food supply; then writing, which allowed the storing of knowledge externally; then we're probably onto the scientific method and the use of fossil fuels. Economic toys like limited liability and double-entry bookkeeping are not remotely important compared to these - they're of no value in themselves if they're extracted from the capitalist world system, which is why they weren't invented by the Romans or the Egyptians. Tim's amazement at how they were able to run an empire without accountants tells you more about Tim's assumptions than it does about ancient societies, tbh.
I work for an architect. Amazingly, the most important element in our business is the architectural design guys. I strongly suspect that they always will be, on account of the fact that they're where the money comes from. No matter how much we 'disrupt', unless we decide to jack in this whole 'drawing buildings' thing and become a cloud provider, we're still likely to be taking a back seat to the guys who do our core function.
Junking all our existing systems and replacing them might change that, very briefly, because no-one aside from IT would have the slightest idea how anything works. That might not be a positive move from the business's point of view, though.
Most of the rest of the article was similarly idiotic tripe; I just couldn't let this pass.
tbh, not for me either, and I'm on a windows system with an antivirus.
Hell, the first thing I did when I read that line was open up a Word doc and time it. About half a second from my finger hitting the mouse button to Word springing up, fully-loaded and ready to go. I don't think that that half-second is crushing my productivity. I'm not rushing to ditch Office 2013 and shunt to 2016 because of it, and I'm not really going to be appalled if I find that the half-second delay hasn't reduced to a half-picosecond delay in the new suite.
The author's attitude falls into the trap of not realizing that there's change that NEEDS to happen, and then there's change for the sake of it. Moving from a frame rate of 20 to a frame rate of 40 is a necessary optimization. Moving from a frame rate of 150 to 300 is not. Replacing an elderly, mission-critical system that you can no longer get the parts for is necessary. Replacing one that is still doing its job and can be repaired just because 'it's old' is not.
"How much liability can a producer shed using an EULA."
Nowhere near as much as they attempt to tbh; generally speaking, when a EULA is examined by a court, it's found to contain dozens of unlawful clauses and so can't actually be enforced. Steam ran into this in Europe; their original EULA rejected any sort of refund out-of-hand, which is flat-out illegal in the EU. Steam then climbed down. Similar things are probably the case for most others.
I'd be interested to see what happens to, say, Google if someone challenged their end-user agreement, since the amount of data they keep is probably far outside the boundaries of European legislation (hell, in Google's case there are successful cases where people have shown it's outside US law).
Except that cloud vendors aren't running AWS or Azure on a bunch of Dell boxes. Dell aren't gonna pick up a major amount of business from the big players in cloud, so yes, it's generally a threat to them - how much of a threat is almost certainly massively overstated, since there's just a lot of things the cloud offers no serious advantage for, but a company like Dell (which is basically just a distributor of commodity hardware) hasn't really got much hope of picking up big sales to MS or Amazon.
"The numbers of Macs being sold year on year is IMHO increasing."
May we have a recent citation on that, too, please?
Apple's own stats generally show Mac sales hovering around 5mil /quarter worldwide for the past 5 years... compared to >70 million Windows-based computers per quarter. The OS market share figures still basically have Apple's total users at roughly the same number as those still using basic, hated Win 8 (not 8.1).
That said, I agree that Apple probably won't attempt to merge iOS and OSX - there's no particularly good reason to do so, and frankly O/Ses for desktops and O/Ses for phones need to do different things. If IoT takes off, I'm going to have a processor sat in my toaster. That doesn't mean I really want it running Windows 10, though.
"If you're concerned with the tone of the message"
I suspect he was more concerned over the fact that the OP is talking absolute shash, tbh. Hyper-V is used in thousands of production environments successfully, and if you know Powershell then deployment and administration are considerably easier than vSphere (tho if you don't know PS, then stick to VMware). You don't even need to have it as part of a windows install, it can run as a standalone hypervisor.
Even if you do run it as part of full-fat Windows, MS have recognized that their patch-heavy approach is unpopular, and have been going out of their way to cut down on reboots for 2016 - and to reduce downtime generally. MS are currently claiming that uptime on a win 2016 box is going to be higher than an equivalent RHEL box; 1 reboot a month with 40 seconds boot time. Whether that's a realistic metric for production boxes is debatable, but it still shows that things have moved on a bit since Server 2003/XP environments (which appears to be about the level of Windows knowledge that the OP is running with).
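For what it's worth, that claim is easy to sanity-check with some back-of-the-envelope arithmetic (the once-a-month and 40-second figures are MS's claim as quoted, not measurements):

```python
# Availability implied by the quoted claim: one reboot a month,
# roughly 40 seconds of downtime per reboot.
seconds_per_month = 30 * 24 * 3600   # ~2.59 million seconds
downtime_seconds = 40
availability = 100 * (1 - downtime_seconds / seconds_per_month)
print(f"implied availability: {availability:.4f}%")  # ~99.9985%
```

So roughly 'four and a half nines' - plausible as a marketing number, rather less so once patch Tuesdays and hardware maintenance enter the picture.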
Besides, modern DC environments with clustering, HCI etc kinda make the uptime/downtime thing a bit irrelevant, no? I can happily shunt VMs around between boxes, shut down random hosts, and even knock my filers over without end users even noticing. We're using VMware, but I use hyper-v for my test environment because really, there ain't much difference between the two in terms of day-to-day features, and being able to just rattle off instructions in PS to a dozen boxes at once is more convenient than mucking about in the horrible vSphere web app.
Anyone ever get the feeling that DaveDaveDave is extremely anti-Semitic, and just projects his loathing onto literally every other subject? After all, he's declaring tax evasion to be fine here because anything else would be antisemitism... which implies that he thinks all Jews are tax evaders...
"They've already been bought by long time (and expensive) 3D printer maker Stratasys."
And that was 4-5 years ago, IIRC.
Makerbot had already lost most of its credibility and talent even then. Repeated shifts in direction that alienated it from the 3D printing community, a number of decidedly half-baked products, a couple of humiliating climbdowns and a CEO who pretty much pissed off everyone who ever met him (before being parachuted out to Stratasys in the buyout) all mean it's been a failing company for a while now.
"Also, if a chip uses less power under load, that doesn't necessarily mean that it uses less power when idle.
In other words, the Samsung chips might actually last longer in real-world scenario"
I'd expect the inverse tbh; the 14nm process chips would have greater leakage when idle but should have lower draw when active.
"Curious comment. Of course any modern device will stop charging its internal battery once it reaches 100%."
Ish. They usually charge to 100%, stop charging, run on battery down to 95%, and then begin charging again. The display remains fixed on 100% at this point (mostly to stop users questioning why their phone isn't charging even though it's plugged in and only on 96%). So while it's not charging constantly, it IS charging for about half the time it's listed at 100% and plugged in; the phone isn't running off the mains.
And yes, that still saps the battery life comparatively aggressively. Of course, given that most flagship phones are probably replaced after just a year to 18 months (as opposed to budget models, which are usually used for much longer and usually have replaceable batteries), it's probably not overly noticeable to anyone who's not still rocking an iPhone 4; the other flagships from that period pretty much all had user-swappable batteries. And really, how many Apple fanboys would still be using a phone that's 2 1/2 generations old?
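If it helps, the plugged-in hysteresis I'm describing is easy to sketch in code (the thresholds and rates here are purely illustrative - real charging firmware varies by vendor):

```python
# Toy simulation of the plugged-in charge/discharge cycle described above:
# charge to 100%, drain on battery down to 95%, top up again, while the
# displayed level stays pinned at 100% throughout.

def simulate(hours, drain_per_hour=5.0, charge_per_hour=20.0):
    """Return (true_levels, displayed_levels) for each simulated hour."""
    level, charging = 100.0, False
    true_levels, displayed = [], []
    for _ in range(hours):
        if charging:
            level = min(100.0, level + charge_per_hour)
            if level >= 100.0:
                charging = False      # charger cuts out at 100%
        else:
            level = max(0.0, level - drain_per_hour)
            if level <= 95.0:
                charging = True       # top-up kicks in at 95%
        true_levels.append(level)
        displayed.append(100)         # the UI hides the cycle entirely
    return true_levels, displayed
```

Run it for a few simulated hours and the true level bounces between 95% and 100% while the display never moves - which is exactly why the battery keeps being cycled even though the phone 'says' it's full.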
It won't.
It *should do*. But no, it won't. 90% of consumers won't care. It's easy to forget in the tech echo chamber, but really, Google is infinitely worse and yet still absolutely dominates in markets that are not simply limited to 'people who can't afford an iPhone'.
If you were right, then DuckDuckGo would be the most popular search engine on the planet. As opposed to Google and Bing having about 80% of the entire world search market between them (and I'm still fairly sure the users Bing does have spend about half their searches looking for Google).
People don't care as much as we wish they did, and if the Surface is seen as a cool piece of kit (which it increasingly is) they wouldn't give a damn if it sends Microsoft nude pictures of their grandparents in an envelope made of bank statements.
"Your examples show a company that does not have a real strategy."
This. Just this.
MS don't really have one, and haven't more or less since Gates left. Gates concentrated solely on strategy, and while he built a company everyone loved to hate as a result, he was insanely successful. Since Ballmer took over, and just as much under SatNad, MS have spent more time desperately trying to get people to LIKE them than outlining a serious business strategy. Sure, they still make a lot of money from their pre-2001 business strategies, but these haven't been developed much since. When asked about what the strategy is, SatNad just lists products ('Office 365, Windows, and Azure'); this isn't focusing on a sector or thinking ahead. This is just three things that you hope are going to sell well, without outlining HOW YOU'RE GOING TO SELL THEM.
I still remember when any MS event would be all suits showing up, presenting you this product that you were quite simply going to have to use (it wasn't even questioned, they knew it and we knew it too, even if we hated it), and then informed you how much more per year every person in the room was now going to be paying them. Now, it's all excited balding men in T-shirts giving out USB drives and talking excitedly about, well, crap.
Bill Gates was never 'pumped' about anything... He just worked a lot, made a lot of money, and was kind of a dick to everyone who worked for him. MS should remember that their core segment is not consumers. It's enterprise. Serious men in suits who have been professional purchasers for 30 years and who will be making their decisions based on compatibility, price, licensing deals etc... they're not the guys who go in for Apple-style cult meetings.
Basically, MS need to go back to doing what they do well - being about as likable as Oracle, and caring just as little about that as Oracle do. Ruthless corporate strategy and to hell with their brand image. It's too late to try and salvage their position in the public imagination now, MS will ALWAYS be considered an Evil Suit, regardless of how much they try to appear as a huggable teddy bear now.
"n my time you had to dope germanium yourself atom by atom using nothing but a screwdriver and a hammer and gramps was building wired-or logic gates with nothing but crystal detectors cobbled together from a sharpened graphite pencil lead and rusty Gillette blades*!"
Luxury!
When I were a nipper, we had to clump hydrogen together BY HAND until the density reached such pressures that thermonuclear fusion process began, then wait WITHOUT OXYGEN until enough heavy elements had been formed to begin the planetary formation process, usually with two or three supernovae interrupting the whole thing and scattering it all over the show, until we were able to begin hand-mining the individual atoms on the still-cooling surface of a newly-born planetoid. We used to DREAM of getting pre-formed individual Germanium atoms...
etc etc.
"Who ever thought there would be a Microsoft conference where Microsoft software was the elephant in the room everybody was studiously trying to ignore?"
Anyone who picked up on the theme of the event being hardware...? I for one would've been somewhat surprised if an event billed 'device day' had started with a 3-hour showcase of a desktop O/S. Complaining about the shortage of software is like going to a Google event on self-driving cars and complaining that there was no Android showcase included.
The Surface Book is the big ticket item, there's no denying that - and tbh it's a very good big ticket item. The Surface unit has been increasing in revenue enormously, and while Andrew may not have met anyone that's bought one, I know a lot of people who have - and I don't know anyone who's used a Surface Pro 3 who actually dislikes it. They're good machines, and the Book honestly looks like a very solid option at a very attractive price from this distance.
The Lumias will undoubtedly continue to be ludicrously niche devices, even though I don't share Orlowski's pessimism about Continuum (I for one have enough monitors lying around my house that the ability to convert a phone or tablet into a PC on the fly is genuinely interesting). But MS might as well accept that they're never, ever going to win a big slice of the consumer pie in smartphone land, and would do better to concentrate on taking up an enterprise-centric, AD-integrated strategy, adopting the niche that currently just contains the corpse of Blackberry.
And Band is about as exciting as other wearables, meaning 'not remotely'. Seriously, is anyone who's actually really tech-aware (i.e., genuine techies as opposed to journos and analysts) excited about ANY wearable yet? Has anyone found any real reason for their existence? Watch/Wear are most useful for saving you 3 seconds to remove your phone from your pocket; most of the rest are little more than pedometers and 1980s calculator watches. Hololens has more promise in this area, but suffers from making the wearer look like a prick, needing to be plugged into a PC, and being prohibitively expensive.
"Since Intel CPUs have been getting faster by a mere 5-10% per generation the last few years, I suspect by being "twice as fast" this laptop has twice the cores of the Macbook"
Or it might just be that it's using the latest architecture with more cores, higher frequencies and way more graphical grunt and RAM. Funny thing: Macbooks aren't automatically faster because there's an Apple logo on the front.
It's hard to judge either way though, given the differences between the OSes. Hackintoshes often perform worse than Windows 10 on equivalent hardware, though.
It's obviously true that yes, this requires a precursor hack to get the dll in place - and in any environment where the admins have any training at all, installing a dll onto the front-end MX should already require domain admin creds.
But this might still be relevant for 'after the breach' hacks - everyone changes password, the CISO is fired, the firewall is swapped for one that costs six times as much etc, but the exchange server remains compromised so the attackers can easily re-acquire credentials.
"If Google opted to turn Android into a notebook operating system, “we can see this product making real headway.” "
Hands up anyone who'd seriously agree with that? Making the jump from PC/notebook to mobile/tablet hasn't exactly worked well for anyone else who's tried it, and I've not seen much evidence going the other way either really. There's a good reason Steve Jobs kept iOS and OSX separate and never took suggestions that the two should be merged seriously.
The devices are for different purposes and analysts would do well to remember that, rather than lumping together everything with a CPU in it. When the IoT picks up (and when they've stopped masturbating furiously over the profit projections from it), are we going to have idiots at Canalys or Gartner claiming that a buoyant self-aware refrigerator market is likely to factor into reduced PC sales?
"Additionally what was the autonomy buy go to do with VC's any way. As I recall both Autonomy and HP were public quoted companies at the time of the deal so any VC's originally involved would have been long gone with their pound of flesh."
Indeed; surely any VCs who were still hanging around in Autonomy will have made out like bandits from the over-valued sale anyway?
"Updates to nuclear systems will always be infrequent because they're not the sort of systems where you can just dump the responsibility for finding bugs upon the end-users."
'Infrequent updates' vs 'uses systems that are now older than 90% of the people working there' are different things. Amongst other things, several silos have had hardware faults that cannot be repaired properly - hence the security door being propped open with the brick. No-one makes the electronics that are compatible with the security system anymore. The result? The security is bypassed by users, and so may as well not be there at all.
And this is where we keep the things that can end civilization as we know it. If the lock on my front door stops working, I replace it. If the lock on my nuclear weapons silo stops working, I just stop locking the door...
""by now" is indeed true: "by then" alas is not."
It does go some way to highlight the incredibly low priority that hardware upgrades get in and around nukes - both reactors and missiles. There was that silo in the mid-west where the (3 foot thick) security door had been propped open with a brick for the last ten years and they were still using 8" floppy disks because that's what was current when the computer system was installed... the US government is shockingly cavalier with these things.
"I'd prefer the guys stood next to the the hot bits to have full and unfettered access for speed and ease."
In a world where you need a 4-digit code to enter the shared laundry room, I think that we can perhaps expect the nuclear reactor control room to at least adopt a similar level of security to the machine you use to clean your y-fronts.
"Windows of old was the slightly annoying but attentive waiter in a restaurant."
Really?
I seem to recall that, at the time, we all felt Windows was the antisocial halfwit proprietor, who pissed in the soup, swore at customers and tended to either spit at them or ignore them when they asked him to do anything, but we had no choice but to put up with him because he owned all the restaurants in town. Well, aside from the weird restaurant with the penguin logo that you could only get into through a secret tunnel over an assault course, and once you were in all the other clientele sneered at you. And the Apple cult, who grew their own food that tasted dreadful, but got away with it by telling the devoted that they shouldn't like other food anyway.
Thinking about it, computing pre-2003 was basically horrible.
"Wake up, Apple is a target and is being breached."
Have an upvote for sanity, to balance out the hundreds of downvotes you'll get for implying Apple aren't perfect.
Apple's approach to security assumes they're smarter than the enemy. That's exactly the opposite of how security professionals are taught to think: always presume there's someone brighter than you who can crack anything you build.
"I wish they'd figure out a way I could use iCloud with a key that only I hold, "
Or, rather than wasting time doing that, they could just use standard public-key (PKI) encryption like everyone else, which seems to have done the trick for the last twenty years.
This doesn't need some 'Think Different' re-invention of the wheel. The security standards have all been there for years and work fine for this. Apple just haven't implemented them, out of a misplaced fear of damaging the 'user experience'.
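To be concrete about how un-magical this is: the public/private split is just well-trodden maths. Here's a toy Diffie-Hellman key agreement in pure Python - the prime is demo-sized and this is strictly an illustration of the principle, never something to ship (real systems use vetted, much larger groups via a proper crypto library):

```python
import secrets

# Demo-only parameters: 2**127 - 1 is a known (Mersenne) prime, but far
# too small for real-world use. Illustration of the principle only.
P = 2**127 - 1
G = 5

def keypair():
    private = secrets.randbelow(P - 2) + 2   # secret exponent, never shared
    public = pow(G, private, P)              # safe to publish
    return private, public

a_priv, a_pub = keypair()   # e.g. the user's device
b_priv, b_pub = keypair()   # e.g. the server end

# Each side combines its own private key with the other's public key and
# arrives at the same shared secret, which never travels over the wire.
shared_a = pow(b_pub, a_priv, P)
shared_b = pow(a_pub, b_priv, P)
assert shared_a == shared_b
```

Nobody would hand-roll any of this, of course - the point is that hardened implementations of exactly this idea have existed for decades, so 'we haven't figured out user-held keys yet' is a choice, not a research problem.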
It was the same thing which led to the iCloud hack last year (or rather, not a hack - just lots of private data being taken by unauthorized people. Which isn't a hack when it happens to Apple. Apparently); Apple could've implemented 2FA from the start, and they rolled it out in a matter of days afterwards. But they didn't put it in beforehand, because security pisses users off.
Apple don't take security seriously. They're getting better, but it's still way down their priority list, behind 'User experience' and 'looks cool' - so whenever there's a conflict, security takes a back seat.