I thought the ban on Huawei was not part of a trade war, but for national security reasons. Is DT trading away the safety of the US' infrastructure for short-term economic gain? Inquiring minds would like to know!
56 publicly visible posts • joined 13 Aug 2007
The data feeding frenzy will not stop because people suddenly protect their privacy and eschew free messaging. It will stop because everybody already has all the information, so it loses its monetary value.
Facebook: I'll sell you Andrew Orlowski's complete profile for 5 quid.
Facebook: £4. Special deal!
His mobile phone provider, teamed up with Amazon: 20p. It's not our primary business, but we have it.
At some point there is nothing left to steal...
I stand corrected. Yanks using imperial measures... Brits at least would have had an excuse. It was their empire at some point.
As to your insinuations about my name and certain political developments in this country: I signed up many years ago, in more innocent times, when "This is a tech forum. Why shouldn't I show my real name?" sounded entirely reasonable. The aforementioned developments would be a lot funnier if I didn't have my livelihood tied up on this island. Coming from a country that used to be (and still is) very good at xenophobia, I can tell you that it's not GCHQ that I am worried about. I can come up with a few bad scenarios, but let's not give them ideas.
Seems to be obvious. Or was it the Brits calculating in imperial measures again?
Btw., years of reading online publications have resulted in me automatically scrolling past Twitter screenshots, and now it's compulsive. Would it really be so hard, if you write an article, to actually write it, and not just pass on the source data?
There is literature outside the US and UK, y'know...
Stanislav Lem's "Fables for Robots" tells the fairytales that robots tell their offspring. The universe is populated by robots. In some of their legends, a particularly devious monster appears. Made largely of corrosive water, he is only a distant collective memory. An enemy that, in their early pre-history, may have tried to destroy them. Some even believe that he created them in the first place, but that's a disgusting idea and obviously nonsense.
Firefox is just another browser. We have FF and Chromium on our PC, so that my wife and I can keep our bookmarks and login names separate, but 99% of the time there is no difference. Even IE is pretty much the same nowadays.
Thunderbird, on the other hand, is our mail client of choice. Platform-independent (Windows and Linux; I'm told it also works on a Mac), in feature freeze for years now (which is GREAT!; I want a consistent mailer, not a completely different UI every two years, or something that wants to control my complete computer). If the open-sourcers keep up the security updates, I'm happy keeping TB and letting FF go the way of Netscape. If they don't, there is no obvious alternative I can think of.
"... could at least employ the STEM grads that won't actually find employment in their core disciplines." And who has suddenly identified this as a problem? Both in the sciences and the arts, there are fields that are only (or mainly) done at university, and if you don't want to have an academic career, you will try to apply what you learnt in a different field. If you have done astronomy (or arts history, for that matter), you knew from the start that there are 100 students to one academic position. You hopefully picked up logic and problem solving skills, in addition to presenting your results and (in STEM) a bit of maths and possibly programming. Somebody will hire you and retrain you for the job they want to fill.
That they will pay you 10% of an economics graduate or 1% of an MBA is a different problem...
It only hurts individuals. If the percentage of birth defects and the cancer rates go up by a factor of 10, wildlife will still thrive. Damaged newborns die immediately, sick adults die as well, but the reproduction rate easily covers that. So if you are happy with an AVERAGE life expectancy of 45 (you might die younger, or you may live until 100), and with an increased proportion of handicapped children (in contrast to animals, we don't get rid of them immediately, at least in most human societies), you're welcome to live there.
As a system administrator, I am somewhat surprised that the word seems to have changed meaning since I last used it. Didn't know it now means "upload for perusal". I'll quickly have a rifle through my users' backups for something saucy, or of interest to the competition. If I find it, I will use it for targeted advertising. "My silence. This week only! £1 Gazillion".
> I have loads of the stuff, it's coming out of my ears
Can you tell me where I can find your ears, so I can place a collecting basket under them?
I have a bunch of perfectly good laptops that were top of the range 10 years ago. A couple of project students every year, but no budget. They basically need a terminal (in the old days, we called it a thin client): something that runs xterms to ssh into a remote server, and an X server to look at remote graphics. Doesn't hurt if they can collect data in a spreadsheet and write it up locally. Should I spend 500 quid that I don't have on each of them, so they can have translucent windows and the ads in their browser move more smoothly?
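For anyone in a similar position, the thin-client setup described above is little more than an X-forwarded ssh session. A minimal sketch, assuming a lightweight Linux install with OpenSSH on the laptop and an X server running locally (the hostname and username are placeholders):

```shell
# Forward X11 from the remote server back to the old laptop;
# any X application started remotely then displays locally.
ssh -X student@remote.example.ac.uk

# On the remote machine:
#   xterm &

# On slow links, compression helps:
ssh -XC student@remote.example.ac.uk
```

Trusted forwarding (`-Y`) is sometimes needed for fussier applications, at a small cost in isolation.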
'All right, I've got my war face on, and I'm going to the fight,' He's watching too many war movies and not talking to his men (and women). It's probably more like "I'm going to kill 50 people today, who can't defend themselves. Most of them are truly our enemies, but 10% are innocent bystanders, and 5% just hold something in their hand that could possibly be a weapon. That child on the car's back seat yesterday looked just like my daughter".
But never worry, within 10 years this job will be taken over by an AI. Slightly less accurate, but nobody will be able to challenge it. Welcome to the brave new world.
You don't just tag and track whales. If you want to study whales properly, you kill them on sight, cut them up, and sell them as food. Just ask the nation with the greatest interest in whale research.
Alternatively, blast them with sonar until they beach themselves, and then pretend you weren't even in the area.
"not just an alternative to GNOME and Unity, but one that's every bit as sophisticated and refined". Please don't! I use an older version of Xfce because it leaves a few cycles and kilobytes for doing something useful, even on older hardware. And I don't want MORE docking behaviour when I move a window out of the way. I know where the "maximize" button is, thank you very much!
So which one am I supposed to use now?
Bought one lately, but had to return it. Brilliant speaker, except that you can't use it as a speaker.
OK, I should have done my homework, but it hadn't crossed my mind that the simplest functionality of a speaker could be missing. You can only play music (from specific sources), and only through their app. Possible explanations:
1) To enforce other people's copyright. Did I pay for my speaker, or did Sony?
2) They want to collect and monetise my listening habit. Did I pay for that speaker, or did Doubleclick (or whoever the data aggregator is)?
All I wanted was one single speaker, ideally wireless, instead of the cable spaghetti caused by wired subwoofer + 2 speakers that nevertheless stand in the same corner of the office, for Youtube and BBC News.
Having spent a career in HPC, I don't understand this disdain for execution speed. If you write a code that will be run a million times with different input parameters, by different users, it makes a difference if you get the results from one of these runs in a day or in a month. The measure chosen in this case may be too simple, and the neglect of parallelisation is unforgivable (but then, the scope for using MPI tends to correlate with the execution speed of a programming language anyway), but if you run a program often enough, the time it takes to write it becomes negligible. I regularly run for weeks on hundreds of cores, and a factor of two in execution speed compared to a competitor is enough to render a program useless.
Maintainability is much less language dependent than CS types try to make us believe. "Real programmers can write Fortran in any language", as the old essay said (quite unfairly, btw.). You can write maintainable and well-documented code in Assembler, or win the code obfuscation challenge in Mathematica.
So they want to join Google in the stalking business now. This would remove their last selling (sorry, giving away) point vs Chrome. New business model: we look like Chrome (http://www.theregister.co.uk/2014/04/29/firefox_29_redesign/), we act like Chrome, and we are nearly as good as Chrome. Good luck.
The worrying bit, if the allusions in the article are to be believed, is that this is borne out of financial desperation rather than opportunistic greed. Is there a good, FOSS, light-weight (as Firefox was meant to be, long ago) browser left? Don't say Chromium; their scraps are as poisoned as the real thing, offered to smother the competition, only to be withdrawn when no longer needed. See iGoogle, Sketchup, and numerous other services Google suffocated under its wings.
Warming is happening. Anthropogenic or otherwise doesn't matter. By and large it's not desirable, because most of us happen to live where conditions are favourable NOW. We can't relocate the inhabitants of Bangladesh to Siberia, or London to the Highlands. We know how to counter this development, in theory. This requires sacrifices. Since our willingness to make those sacrifices differs, those of good will will make them first, which only tells the more selfish ones that it's not so bad after all, and they can continue their wasteful ways. This means that said sacrifice has no effect whatsoever, except that the balance of economic success, and therefore power, moves towards the more ruthless.
Solutions on a postcard please.
(1) Explanation for those not familiar with British social history or economic literature: Villages and towns used to have "Common" grazing lands accessible to every member of the community. These tended to be over-exploited and neglected. Garrett Hardin made the term popular in a "Science" article in 1968.
> Interestingly the Romans seemed to like the idea of "metricated" multiple
Of course they were metric. Mile comes from "mille passuum" (a thousand paces, i.e. two thousand steps). Easy to count when you are marching long distances. The yard as well as the metre are roughly a step.
Does anybody remember the old "Long live Fortran" essay? "If you can't do it in Fortran, do it in assembly language. If you can't do it in assembly language, it isn't worth doing." "A real programmer can program Fortran in any language."
But seriously, since about Fortran 95, it does what you need to do. In 2003 they succumbed to the OO nonsense, but that's easy to ignore. (I bet 95% of all C++ is actually C, which is a slightly less readable version of Fortran IV with added system calls).
Whenever I have to write an output parser, I ask myself "Do I spend two days learning Python or one hour writing it in Fortran". And what if I afterwards decide to add something numerical to it, which might actually take more than 20 seconds and involve a matrix?
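For what it's worth, the sort of throwaway output parser in question is a few lines in Python too. A minimal sketch: pull the last energy value out of a log file. The "Total energy =" marker and the sample lines are invented for illustration, not any real program's format:

```python
def last_energy(lines):
    """Return the last 'Total energy = <value>' found, or None."""
    energy = None
    for line in lines:
        if "Total energy =" in line:
            # e.g. "Total energy = -76.24 Hartree" -> -76.24
            energy = float(line.split("=")[1].split()[0])
    return energy

sample = [
    "iteration 1",
    "Total energy = -76.02 Hartree",
    "iteration 2",
    "Total energy = -76.24 Hartree",
]
print(last_energy(sample))  # -76.24
```

The "add something numerical" point stands, though: the moment a matrix appears, you are reaching for NumPy or back in Fortran anyway.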
Anyway, in an industry that's notoriously youth-obsessed and overrun by one revolution after the other, this old fogey without any formal IT qualifications (in those days we learnt it on the side in physics and chemistry) isn't worried about his job prospects. I should have learnt COBOL, but I have my limits...
Well, it may be OK to spout abuse online, but actual threats are another level. It's not legal offline, so why should the medium make a difference? If you allow bullies to threaten, you remove freedom of expression, not enhance it. If you disagree with somebody, the English language offers a few options for expressing the strength of your disagreement:
- I am of a different opinion than you.
That's a fact, unless one of us is lying.
- You're wrong.
That's an opinion.
- You're an idiot.
This may be considered abuse, but it's still an opinion. You'll get away with it in most countries.
- You're stark raving mad and should be locked up.
Ditto. No direct threat.
- If you don't shut up, I'm gonna rape you with a baseball bat, and your little children too.
Doesn't count as an opinion in my book.
- Too late to shut up. I'm gonna rape you anyway.
That's I think the level we are discussing here. I don't know why it needs discussing.
USB 1/2 was a huge improvement, but I never understood why they didn't go the whole hog in reducing apparent symmetry at the time. PS/2 keyboard/mouse connectors (always at the rear of a computer chassis) were seemingly rotation invariant, so you had to guess the correct orientation out of an infinite number of possibilities, unless you had a clear view. USB (at that time also still only at the rear) reduced this to two options, making it infinitely easier, but still too fiddly. At some point they started universally colour-coding the (identical, but not exchangeable) keyboard and mouse plugs from grey/grey to lilac/green, which helps you bugger all when you can't see them. Can I claim back the hours I spent under desks trying to plug in keyboards and mice over 2 1/2 decades when I'm on my death bed and have only half a day left?
Well, since GCHQ and NSA share all their knowledge anyway (at least in one direction...), this won't make a difference.
A few years ago, long before Snowden, a UK University I know of decided to outsource their email. The students' accounts went to Google, but staff email is done by a local company who could guarantee that data won't end up on US servers. They are producing IP. They may do business with US companies of strategic interest (anybody big enough to afford a lobbyist in DC) or these companies' competitors, and the Patriot Act gives half the US public service access to any data they ask for.
Good thing Westminster doesn't produce any information worth keeping secret.
We are not made for other planets. Not in this solar system, anyway. If we depend on lots of supporting technology anyway, we may as well just stay in space. In a rotating space station, you can adjust gravity to exactly what you need. The radiation problem is solvable. Just make the walls thick enough and of the right material (I know it's not quite that simple, but it can be done). Energy is free and comes from the sun. Water and organic material can be largely recycled (it's called composting). It just has to be big enough. You will need some external supplies (unless you start mining asteroids or at least catching cosmic dust. That will take some technological development, but again, it can be done).
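The "adjust gravity to exactly what you need" bit is standard centripetal arithmetic: the apparent gravity at the rim is a = ω²r, so for a target a and radius r the spin rate is ω = √(a/r). A back-of-envelope sketch (the 250 m radius is an arbitrary illustrative choice):

```python
import math

g = 9.81   # m/s^2, target "gravity" at the rim
r = 250.0  # m, station radius (assumed for illustration)

omega = math.sqrt(g / r)          # rad/s, from a = omega^2 * r
rpm = omega * 60 / (2 * math.pi)  # revolutions per minute
print(f"{rpm:.2f} rpm")           # 1.89 rpm
```

Under 2 rpm is the figure usually quoted as comfortable, which is why the "big enough" caveat matters: halve the radius and the spin rate goes up by √2.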
And then those staying behind can continue destroying their stupid planet, because we're outta here!
You are criticising the use of computers on various levels, some of it justified, some not.
Where I'm with you is the current trend to replace causal understanding with statistics. Headlines starting with "studies show ...", and in the last paragraph some psychologist makes a wild-ass guess connected to evolution. You find this mostly in psychology (which is crackpot science) and medicine (where the argument "as long as it works and kills fewer people than it saves" holds). This has no place in the hard sciences.
Programming bugs are a short term problem, annoying as they can be. If the underlying theory and method is published, more than one group will implement it, and discrepancies will be spotted.
Zooming in on chemistry: the underlying physics is known and understood. But to get results for anything bigger than hydrogen or helium, you have to introduce increasing levels of approximation, trading accuracy for tractability with current hardware and software. The skill of a computational chemist lies mainly in finding the right balance here. This year's Nobel Prize is for the combination of several of these approximate methods in such a way that chemical or biological systems large enough to be of actual interest can be treated with meaningful accuracy.
There is the implied accusation that it makes things "too easy". I agree, but that's a side-effect of making computational tools accessible for non-theoreticians, and it's not the fault of the method developers if they are misused. When I was a lad, you had to study physics and theoretical chemistry for three years before you could coax the available software into producing a meaningful result. The developers (of which I was one, long ago) worked long and hard to make the software more accessible. The result is that people can't be bothered studying the manual for three hours. They click on shiny user interfaces and get meaningless results, which they publish to seven digits accuracy. It's not the fault of Pople (Gaussian, Nobel Prize 1998) or Karplus (Charmm, NP 2013). Or mine, for that matter.
"Some were like: "Meh, whatever!", some were busy looking their watches: "Is it lunchtime yet?""
Are you surprised? It's like announcing that you're a redhead: everybody knew already, and it doesn't make a difference whatsoever. That your boss used it as a context to get rid of you is sad, but it sounds like this would have happened anyway. The only difference the official diagnosis makes is that you could have had the b*st*rd for constructive dismissal.
As you can see from the overwhelming "me too" response to this article, there are jobs (particularly in IT) where half the workforce are somewhere on that spectrum, many proudly so. I am working with a maths department at a university, and there it's three quarters. And I feel right at home there...
The one reason why a self-driving car sounds tempting is that you could get plastered at the pub and be driven home afterwards without being a danger to yourself, others, and your driving licence.
The problem is that this won't be allowed. The official reason will be the liability question: somebody with the power to override the automatic settings must be legally responsible for the vehicle. The real reason is that once the government has managed to take away people's liberty to have an enjoyable evening at the expense of tomorrow's productivity, it will never give it back.
The mudslinging that this comment page is descending into, and the political convictions on both sides, aside, I think SH is wrong in this case. Fundamental science should be shared and disseminated irrespective of politics. As long as it's not used for military or other aggressive technology (not much danger in his case) or propaganda (OK, a "Presidential Conference" might be open to that accusation), a boycott is nearly always wrong. If you don't want to support countries whose governments don't live up to our standards of human rights, where do you stop? China? Russia? The USA (Mr O makes friendly noises, but Guantanamo remains open)? Go there and talk to the scientists about science. You can use the opportunity to talk to journalists or politicians about politics. Progress is made by dialogue, not by "I don't talk to you bastards".
"Could you then tell me what the correct temperature of the planet should be and what range of variations is considered normal? I can't find that information anywhere..."
The correct temperature is one that doesn't disrupt existing ecosystems too fast. If global warming is a fact and can't be stopped, there will be winners and losers, but the losers are sitting where it's good to live now. I can't see the entire population of Bangladesh migrating to Siberia in an orderly fashion, where they will be received with open arms, just because Bangladesh is under water, but the receding permafrost makes Siberia inhabitable.
And it's not just Bangladesh. It's the Netherlands, London, Louisiana, Florida, you name it.
It's a question of trust, and Canonical has lost mine. The Amazon search makes me wonder what they will spring on me next, with or without telling me. You're asked "update Y/n" when you switch your computer on, and suddenly the highest bidder owns your data (not yet, but after the Amazon thing, I think they are capable of it). With Google at least I know that this is the case, and I can avoid it.
I did discuss the Mint/Ubuntu connection. It makes me queasy. It's not a showstopper, but I fear it may creep in the same direction, or be abandoned by Canonical. I want to use a machine for 5 years without installing a new OS (another argument against Mint: no upgrade without reinstall).
I know I can change the desktop and switch things on and off. I could create my own distro if need be, but I have better things to do. But I am supporting maybe 20 (and counting) not overly computer literate Linux users with different requirements and preferences, and this is not my main job. I am trying to point them in the right direction and hope they largely leave me alone from then on.
For me anyway. I am not a hardcore fanboi (using Windows and Linux, but preferring the latter, because I have to do stuff on remote Linux machines all the time). They went off on a tangent with Unity and went totally overboard with the Amazon search debacle. At the moment, if somebody asks me to install Linux for them, I use Mint, but I'm actually strongly drawn back to CentOS. Mint depends on upstream support from Ubuntu, which I consider unreliable (not to mention the fact that the glut libraries shipping with it don't work with OpenGL applications on the old RH servers my computational chemist friends need to connect to).
Shuttleworth thinks that he can get away with anything, because of market share and inertia. This works for MS and Google, but they offer something you can't replace without quite a learning curve (MS) or that's just the best in its field (Google). Ubuntu is just another distro, and after a week of using a different one, people will have forgotten it ever existed.
...that I am not on Facebook, never have been, and don't intend to. But nevertheless keep reading every article about it.
More seriously: I see the lure of a fairly ubiquitous network where you can share family pictures and exchange messages without having to keep track of people's changing email addresses, but every time I read about another of its childish and patronising ("couple page") or exploitative (advertising messages purporting to come from you, just because you "liked" a product) "features", I think: nah, I'm happier alone in the woods.
My desktop is mine! I can see the point of mixing of local apps, a quick link to an installer, and online apps, as long as I am asked if I want it. But mixing paid-for advertising with a local desktop search is such a hare-brained rip-off idea that the members of the Axis of Evil (MS and Google) would not have tried to tread there. This is as bad as Smart Tags and worse than the fact that Google thinks nothing of searching my email, FFS. Even if they retract it now, it shows a mindset that makes me add Canonical to the companies to avoid if possible. In contrast to MS and Google, this one can be completely avoided.
Speaking as a semi-academic at a UK university, I do see why sheer inertia keeps the current business model of scientific publishers working, but it's such a rip-off that it must change in the long run. Here's the procedure by which a scientific article gets into a university library.
1) A scientist or group of scientists do some research. They are employed by a university, possibly with additional money from government funding bodies (EPSRC/BBSRC, etc. in UK, NSF/DoE/DoD,... in the US). They write up their results in an article.
2) They submit this article to a scientific journal. The main criterion in choosing a journal is how widely it is read in the community of researchers who might be interested. Actually, they submit it to a scientific editor. This is an academic who does this job because it adds to his standing in the academic community, not because he might be paid for it.
3) This editor, after sifting out any totally hopeless submissions, sends the article to 2-4 other academics in the same field for peer review.
4) The reviewers read the article and write a report, recommending to reject it, accept it, or demand changes. There is little advantage for them in doing this (for one thing, they remain anonymous), but it's part of what's expected from an academic. Abuses do happen at this point, but in my experience they are rare.
5) Based on these reports, the scientific editor accepts or rejects the article. If accepted, the (non-scientific, even though they usually have a degree in the field) editors of the journal create a publication-ready layout. This is the first step in the process that's paid by the publisher. Until a few years ago, this involved combining separately submitted text, figures and tables in a readable form. Nowadays journals require submission in the final publication format, so it's barely any work.
6) This publication-ready "proof" is sent back to the authors, who check for final typos and misspellings. These are sent back to the journal. For free, of course.
7) The article appears online. Web hosting needs to be paid by the publisher.
8) The paper appears in the next issue of the journal, which is sent out to subscribers (university libraries and very flush companies). This is a waste of dead trees. I'm doing literature search on a daily basis, but haven't looked at a bound copy of a scientific journal in years. But we have access to the online archive because we pay for the paper subscription...
- What the publisher pays for: Cursory editing. Web hosting. Paper, printing and postage.
- What the publisher charges: tens of thousands of pounds/dollars per year and institution. Having a de-facto monopoly on an important journal, they bundle it with less popular publications to hide the price.
The alternative: Open access journals. Drawback: It doesn't work.
The model: the author pays for being published. It's a few thousand pounds per article, which an academic without a grant just doesn't have. If you do apply for a grant, there is provision for it, but at the same time you are supposed to keep the cost down. And if you do get the money, you are tempted to spend it on something more useful and publish with Elsevier.
The solution: beats me. Either a wholesale switch to open access (with changes in funding to allow/enforce this), or a slow drift to slightly more reasonable publishers.
It's not for the leather-clad Neanderthal with an anger management problem who wants to go 150 mph when the cops aren't watching. It's for the commuter who wants to travel at any speed != 0 on the M25 (or its LA equivalent).
As for sound: if in the din of city traffic you rely on people hearing you coming, you are already betting your life on wrong expectations. Have you noticed the funny wires coming out of everybody's ears lately?
Paris, because the chick-attraction potential may be one of the few weaknesses. But being fortyish (going on fiftyish) and in a midlife crisis, not even a Harley would help.
Re the various comments about emergencies and otherwise unpredictable needs for a car: There's a tried and tested fallback option available, used by many who don't have any car: It's a technology called "Taxi". You spend 20p on a phone call and 20 pounds on the ride (which would have cost you 2 pounds in petrol in an old-fashioned car). If the emergency happens every day, keep your petrol car. If it's once a week, charge your E-car fully all the time at a higher price. If it's once a month, the taxi is cheaper. It's a simple optimisation problem, and you may need to adjust your response to the pricing structure and your circumstances.
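The optimisation problem above is one line of arithmetic. A sketch with the post's own per-trip numbers; the £150/month standing cost of keeping a petrol car (insurance, tax, depreciation) is an assumed figure for illustration:

```python
taxi_per_trip = 20.20       # phone call + fare, GBP
petrol_per_trip = 2.00      # fuel only, GBP
standing_per_month = 150.0  # assumed fixed cost of keeping the car

def cheaper_option(trips_per_month):
    """Compare total monthly cost of taxis vs keeping the petrol car."""
    taxi = taxi_per_trip * trips_per_month
    car = standing_per_month + petrol_per_trip * trips_per_month
    return "taxi" if taxi < car else "car"

print(cheaper_option(1))   # once a month: taxi
print(cheaper_option(30))  # every day: car
```

With these numbers the break-even sits around eight trips a month; adjust the standing cost to your own circumstances and the answer moves accordingly.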
Slightly off-topic, but one of my pet hates: How do you encourage people to write down their passwords and carry them around or store them in multiple places?
- Demand inanely long/complicated passwords that no-one will remember. Unless it's one permanent one that you could learn by heart.
- Force the user to change it after a small number of unsuccessful attempts, even if at different times. ("Twice wrong already. I'd better try the third time from home, after looking up that piece of paper.")
- Disallow resetting it to a previously used one. ("OK, I jumped through the hoops with birthdate etc. once, but now I do remember." Nice try. So much about learning by heart.)
- Give the impression that stealing minutes and hours from a user's life gives some kind of unbreakable security, when all a crim would actually need is the credit card (easily stolen) and the birthdate (was there a driving licence in that wallet too?).
Back to on-topic: I can assure you that the biweekly change of password won't make it into my will. It doesn't always make it onto the piece of paper at home.
> We're gonna need a bigger Kindle.
I don't know if you meant it like that, but what guarantees the survival of books is the existence of many copies. The books that have survived up to now are the ones of which at least one copy survived fires/floods/bombings/dry rot/religious zealots, the chance of which was proportional to the number of copies in different places.
If the Kindle Mark 15 in 2025 comes with the out-of-copyright section of the British Library preloaded on 10% of its storage capacity, and people can't be bothered deleting it but just keep copying the complete contents to their next device, the books will survive somewhere.
I can see only two developments invalidating my theory:
- The move to "cloud storage" means there aren't any actual copies.
- Disney and friends keep stopping old content falling out of copyright.
Let's not hope so...
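The "many copies" argument above can be made quantitative: if each copy independently survives a century with probability p, the chance that at least one of n copies makes it is 1 − (1 − p)ⁿ. The p = 0.10 figure is an arbitrary choice for illustration:

```python
p = 0.10  # assumed per-copy survival probability over a century

for n in (1, 10, 100):
    survival = 1 - (1 - p) ** n
    print(n, round(survival, 3))
```

Even with only a one-in-ten chance per copy, a hundred scattered copies make survival a near certainty, which is exactly why the cloud-storage caveat bites: one logical copy on one provider's servers puts n back to 1.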
> Perhaps. I've made similar arguments in the past.
My opinions may have changed, but not the fact that I'm right (Ashleigh Brilliant). You're inviting a cheap shot (hereby delivered).
> The moment it's perceived self-interest is furthered by contributing rather than
> free-riding, Amazon will contribute. And not until then.
The managers at Amazon know as much about the future as you, I, and your average astrologer. They make decisions based on company culture, prejudices, hunches, and untroubled by any technical knowledge. So Amazon doesn't contribute; Google does; pretty much any HW manufacturer does... Some companies do, some don't. Success and failure can be found on both sides of the fence (which isn't even a fence, but rather a broad continuum).
I don't know _WHY_ FOSS is successful. In an economic system explicitly based on selfishness and competition, it shouldn't. And yet, since I entered IT more than 20 years ago, when it was a ridiculed idea espoused by a bunch of hippies, it has developed into a billion-dollar industry. The likes of IBM, SGI, Intel, Sun (OK, they went bust, but so did many others) keep investing money into it. Red Hat and Google have built empires on it. All of them following the capitalist maxim of increasing shareholder value and crushing the competition. I don't know why, but enough of those hard-nosed capitalists see an advantage in being part of it. Not to mention the army of volunteers who contribute for a bit of short-term professional recognition.
Their change (card reader required; up to now this was only the case if you transferred money out of your account) seems to be meant to increase the bank's security by making it tedious and not worthwhile to use internet banking. They seem to be rowing back now, with some "two-step process" (whatever that may be) that doesn't require the card reader.
That's not "32% of notebook owners" who are affected. It's 32% of stolen laptops. For any meaningful international comparison, you would have to relate it to the number of laptops in circulation. Maybe they just nick even more computers outside the home in other countries, or maybe not. We don't know.
Most of the comments are from the developer's point of view. I don't know Python, and I don't want to learn it. My typical scenario is that I download a tar file of a chemical visualisation program (there are several written in Python) that a couple of grad students wrote five years ago. That today's developers of different codes "saw it coming" doesn't help me one bit!
In my experience, already 2.2/3/4 were not really compatible.