* Posts by tiger99

26 publicly visible posts • joined 20 Aug 2013

Linux kernel set to get live patching in release 3.20


@crayon - rebooting after network change

Yes, I have always wondered about that. Considering that M$ originally were too inept to write their own TCP/IP stack and borrowed it from xBSD (no problem with that, at least they got some good code), and that, as far as I have seen, no version of xBSD, Linux, Apple OS X, Solaris or anything vaguely UNIX-related needs a reboot after changing network settings, I have to wonder how they managed to achieve this absurdity. Perhaps the ever-incompetent Billy-boy was "illegally commingling" * the code again? As always, enquiring minds need to know...

At least one example of this remains, apart from the original IE abomination, and that is some "Telephony" service which when killed also kills networking. There are probably umpteen more spurious and inappropriate dependencies in Windoze.

* (illegally commingling is a quote from the judge in the Monopoly trial, and relates to the practice being "illegal" in properly constructed software. The judge did NOT say that it was illegal in the way that law works in his courtroom, but it was a wonderfully succinct way of saying what he thought about the coding practices at M$.)


Reboots, and some thoughts on patching....

I tried, very successfully, to avoid rebooting a nice but cheap ACER Aspire 7520 laptop for over a year. Kubuntu was updated regularly, I just skipped anything involving the kernel or drivers, as there was no indication whatsoever of problems. Then it went very slow....

On investigation, the cooling arrangements, fan and heat exchanger on the end of a very nicely made heatpipe, were heavily clogged with dust, and the CPU temperature sensor was throttling the speed to avoid a meltdown.

So I concluded that it did really need a shutdown, clean and reboot every six months.

I ran OpenBSD on a small tower of relatively low performance level, I think AMD K6/II-450, continuously for 3 years without a reboot. No delicate airways to clog. That was a successful trial to show that OpenBSD, plus KDE, was perfectly usable as a desktop OS, but the lack of nice (compared to apt-get, synaptic and now muon) package management tools brought it to an end, for now.

I have yet to see Windoze, even 7, which has been the best so far, stay up and working fully for more than two months. Something always goes wrong. If you turn on automatic updates, they force tediously slow reboots, and if you leave it off, you get hacked....

On modern hardware, I now always plan for a maintenance reboot on occasion. If it is a vital server, you can always configure another machine to substitute for it when needed.

Oh, and although I fully appreciate the use of live kernel patching, my instinct tells me that (on any kind of architecture, even a humble microcontroller), new code must be patched in via something like jump instructions, or changed subroutine call addresses, leaving unused code in situ, and adding bloat as well as some performance loss, compared to a freshly compiled kernel. But don't let that stop you, if minor performance loss is acceptable, as it very often will be. All I am saying is that a full, fresh rebuild every so often may be advantageous.
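That patching-by-redirection idea can be sketched in miniature in user space (a hedged illustration only: as I understand it, the real kernel livepatch machinery redirects calls by rewriting function entry points, but the trade-off is the same one described above, with the old code left in situ and every call paying a small indirection cost):

```python
# A toy model of live patching by redirection. The function names and the
# dispatch table are invented for illustration; this is not the Linux
# livepatch API, just the shape of the trade-off.

def checksum_v1(data):
    # Original (buggy) routine: forgets to mask the result to 8 bits.
    return sum(data)

def checksum_v2(data):
    # Patched routine with the fix applied.
    return sum(data) & 0xFF

# All callers go through this "trampoline" table rather than calling the
# function directly, so the target can be swapped while the system runs.
dispatch = {"checksum": checksum_v1}

def checksum(data):
    return dispatch["checksum"](data)

print(checksum([200, 100]))      # old behaviour: 300

# "Live patch": redirect the entry. checksum_v1 stays loaded as dead code,
# which is exactly the bloat a fresh rebuild would eliminate.
dispatch["checksum"] = checksum_v2

print(checksum([200, 100]))      # patched behaviour: 44
```

A freshly compiled kernel would call checksum_v2 directly and never carry the dead v1 code, which is the point about a periodic full rebuild.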

Having said that, I used to work with a fine piece of hardware of about 1985 vintage, a Tektronix 8560/8540 system. The 8560 was basically an LSI11/23+ with various extra bits, in essence a UNIX machine (truly wonderful in 1985!), but the 8540 was for in-circuit emulation, using dedicated hardware cards with a real 8086, 68000 or whatever, and lots of trace memory. Its control processor was a humble Signetics 2650, with, presumably, EPROM memory. (I think it was copied into static RAM at run time, then any patches overlaid.) There was about 2k (very expensive in those days!) of EEROM, which was used to hold patches.

We used to get a letter in the snail mail from Tektronix occasionally (no email in those days, and only 300 baud modems!), with an ASCII text listing of what patch to enter; it was duly entered, and fixed some bug or other (many did not apply to what we were doing). So I have successfully, and even very happily, used a system which overlaid patches on to its OS, a very long time ago.

M0n0wall comes tumbling down as dev throws in the trowel


@Frumious Bandersnatch

Yes, every small guy has to do that, to avoid being wiped out by legal action if a bug results in someone suffering loss or damage, despite his or her best efforts. If you are giving software away free, it is entirely unreasonable to be held to be liable. Developers already give a great deal of time and intellectual effort, with often no direct reward, except the immense satisfaction of doing a job well.

What is unacceptable is that the Big Bully Boys of the industry (pick whichever you hate most, M$, Apple, Oracle, SAP.....) are also able to hide behind similar get-out clauses, whereas the manufacturers of just about any other retail product are liable for the consequences of their products malfunctioning. Someone, I think it may have been the dangerous incompetent who ran M$ at the time, once claimed that this was because software was not yet a mature industry. That was more a reflection on the maturity of the individuals concerned, himself in particular, because, unknown to him (I don't think he is capable of understanding), techniques for producing rock solid, reliable code, and the necessary hardware to run it reliably, are well known and understood, but rarely practised.

Nevertheless such techniques are only within the scope of large teams with some capital to invest, and the bulk of software that is written is either quick hacks to satisfy the internal needs of a business, or open source for public distribution. The amount of expensive commercial software that is written, and can economically support the procedures needed to give it high reliability, is most likely well under half of the total, whether measured by lines of code, programmer hours or installed copies.

Nevertheless, the "small" developers can, and do, apply good practice to their coding in many instances, and turn out stuff that is as good as any commercial product, if not better. Long may that continue. People should always feel the need to do things well, rather than just adequately, it provides more satisfaction in the end, and everyone wins. Visibility of source code helps, of course.

The developer is to be congratulated and thanked for running a successful product for all those years. Closing down now is a sign of success, not failure. The successors to m0n0wall live on, much like how UNIX is virtually dead (thanks to a certain Mr. McBride who sued the world and lost), but one of its successors (Linux) is found in the majority of smartphones (all Androids and some more) and umpteen other places, being the numerically dominant full OS when all computing devices are considered. (OK, not very significant on the desktop, yet...). And look what the other successors, mostly the xBSD family, are doing. Apple products, many routers and NAS boxes, etc. Solaris lives on, despite almost the whole world hating, with good cause, its current owner, Oracle. So UNIX is all but gone (the remnant of the business lives on, still). Was it a failure? Of course not. Likewise m0n0wall was, and will always remain, an outstanding success.

Samsung gets KINKY with new Galaxy in 50 SHADES OF GREY



That is what they call, in some parts of the world, a tool for prying open expensive, supposedly sealed phones from a certain inferior, over-priced and obscenely profitable competitor, who on occasion has sued their key supplier of screens (Samsung) for daring to make rectangles with rounded corners or something similar. (Not a clever move, and I recall they were slapped down hard by a UK court, sitting as an EU court, a couple of years ago, for vaguely related things.) A company currently run by someone whose distant ancestor may have been good in the kitchen. (Maybe he is too?) Oh, and not Pear or Banana...

So, I think not a phone at all, merely a cleverly moulded piece of tough reinforced plastic for getting access to apple cores.

Seriously though, we will have to wait a bit to see what it really is. Right now, the information is insufficient to form any firm conclusions, and I suspect that we will all turn out to be wrong.

But guessing is fun....

Zimmermann slams Cameron’s ‘absurd’ plans for crypto ban


Locked out of your own data...

What Cameron should be banning is the misuse of encryption, especially if it locks YOU out of legitimate access to YOUR data. I would fully support and welcome that. I could not care whether GCHQ reads my often very boring emails, indeed they may occasionally learn something useful from them, as a few things I discuss may have educational value to a very few people. They will not learn anything of extreme personal importance, it just doesn't go by email, as far as I am concerned. Sadly, many others don't appreciate the risks, yet. Joe public can't reasonably be expected to be a security expert, but education and publicity of the risks will help.

My main grumble about encryption is that with things like HDMI camcorders, the development of downstream processing, mixing and recording hardware and software is difficult or impossible, because the protocol is sewn up tight in NDAs and licence fees way beyond the means of any FOSS developer. Allegedly this is so that the encryption (not needed on your own recordings, or mine, as the material is the property of the people making the video and performing in it) demanded by the dreaded MPAA, M$ and various other nasty organisations is not compromised. Of course serious pirates just do a hardware bitwise copy of anything, encrypted or not, so the encryption is all pointless and only harms legitimate users. And yes, I know that this is somewhat off topic, we are not discussing licensing and trade secrets as such here, but the underlying cause is an ill-informed obsession with encryption by the dinosaurs at the MPAA etc who have failed miserably to update their business models to work well in the internet age.

I would have to pay $10k to get a licence to access the HDMI protocol, just to make a simple gadget, important to a very niche market, maybe 100 worldwide, costing under £100. So, the entire revenue, manufacturing costs and design costs would be gobbled up, just to prop up encryption that should be illegal. And, some people can't get what they actually need, but don't yet know that they need, because they don't know what benefits it brings them. As things stand they never will, and I can't even make the one unit that I need for myself.

The UK, led by Cameron, has, if I recall correctly, recently relaxed copyright law to allow converting media to other formats, which of course has become necessary every time the standard format changes, VHS->DVD->BluRay, or LP/Cassette tape->CD etc. This was right and sensible, albeit rushed through just ahead of a contrary EU law, but in many cases we are prevented from doing exactly that, by a stupid, irrelevant and outdated obsession with encryption.

Then there is the somewhat serious risk of losing the encryption key, and all data, in cases where storage is encrypted. And, having lost your business data, going to jail because you can't give the key to the police to allow your data to be read. There has to be a better way. Hardware measures involving locks and keys, biometrics, etc, perhaps.

I would think that where data travels internationally (and sometimes it does, even between end points in the UK), there is a compelling need for encryption, as some countries (China, North Korea, US...) are entirely untrustworthy and can be relied on to steal both commercial and military information. But, within the UK, it may tip the balance unfairly towards the criminal element.

I (and no doubt countless others) have the technology to make the computer equivalent of the "one time pad", the only truly unbreakable (without physical access) method of encryption, as far as I am aware. I have abstained from trying to exploit it since 1992, because I feared the inevitable visit from two large gentlemen from an unspecified government agency which always follows the release of such things. I think it has become even less likely that it will ever be sold as a product, but there is nothing to prevent criminals creating the same type of thing and using it. How does Cameron propose to stop that?

A seemingly random block of data sent through the internet may be legitimate binary data, or, if labelled as something like a jpeg, may pass as modern art, or just about anything else. How do you actively detect and block only those data transfers which are encrypted communications? You must allow everything else to pass, with only minimum delay. And, if you don't block the transfer of the encrypted data, law enforcement after the event may be of no use whatsoever, or even be impossible, if it was sent from one PAYG mobile to another, or an internet cafe.
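The one-time pad principle is simple enough to sketch (a toy illustration, not a hardened product; secure pad generation, distribution and destruction are the hard parts in practice, and the function names are mine):

```python
import secrets

def otp_encrypt(plaintext: bytes, pad: bytes) -> bytes:
    # The pad must be truly random, at least as long as the message,
    # and never reused: those three conditions are what make the
    # one-time pad information-theoretically unbreakable.
    assert len(pad) >= len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, pad))

# Decryption is the identical XOR with the same pad.
otp_decrypt = otp_encrypt

message = b"nothing of extreme personal importance"
pad = secrets.token_bytes(len(message))

ciphertext = otp_encrypt(message, pad)
assert otp_decrypt(ciphertext, pad) == message
```

Note that the ciphertext is statistically indistinguishable from random bytes, which is exactly why a filter cannot reliably pick encrypted traffic out of everything else passing through the network.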

Just some random thoughts as to why Cameron is both right and wrong. He is right to restrict encryption, but wrong in not going after all the places where it already causes a nuisance, and in not formulating clear methods of rapidly identifying and terminating only illegal communications. He really needs to make it illegal to forcibly encrypt a legitimate user's data. The other aspect of that, the HDMI specs, could probably be dealt with by the same EU court that forced M$ to publish their networking specs, if anyone would put up the cash to fund a legal team. Or, Cameron, who has, as I said earlier, quite properly loosened copyright to bring it into the digital age, could take the lead...

This needs a lot more work at the political level, with good technical input, not just from GCHQ, but from security professionals, academics, end users and even some reformed crackers. It should be a comprehensive review of ALL aspects of the use of encryption, to see where it is already causing damage and should be eliminated, as well as where it may need to be made compulsory.

Bankruptcy could see RadioShack close doors for good – report


Tandy UK etc went the same way. Maplin beware, you are next!

We have seen all this before. They operated as Tandy in the UK and parts of Europe (also Australia, but I can't remember what they called themselves there), retailing grossly over-priced junk. In the Netherlands they went bust around 1992 (can't remember the exact date, but I did know someone who bought lots of cheap stuff in the liquidation). In the UK they sold out to Carphone Warehouse in 1999, but later someone acquired some rights to the name and began a small-scale retail operation, tandyonline.co.uk; I don't know how successful they are, the brand name being a real turn-off for those with past experience of it.

Same thing in Australia, they sold out to Dick Smith, who used to retail serious electronic stuff at good prices, but have recently moved into consumerist stuff only, no components, and remain a good place to buy phones (I got one at an excellent price in my last visit, and the sales person could not have been more helpful).

So history is repeating itself, this time in the US. All businesses which are lagging way behind the competition on both cost and quality are bound to fail, eventually.

As for Maplin, they are following the same path as Tandy. Having all but killed off a large number of local electronic retailers in the 1990s, they now have only a rapidly decreasing range of good components and basic hardware, the very roots of their business in the late 1960s, and an increasing number of retail outlets stuffed full of consumerist trash, at rather uncompetitive prices. I go there only if my need is desperate and I can't wait a day or so for a delivery from any of the excellent on-line retailers like CPC or Rapid. So I predict that they will be next...

The US, like the UK, Australia (try Jameco) and probably every developed or developing nation (I know that India, for instance, has decent suppliers of components and hardware), has at least a few efficient on-line retailers. Google knows all about them, so use it! If consumers have to use on-line suppliers for their electronic supplies, they will save money, even allowing for postage costs, so I don't see the loss of Radio Shack as being detrimental in any way, provided that the staff are re-employed in the new retail operation. Losing jobs is not at all good (people matter very much), but losing a useless, inefficient, ripoff business is a very good thing indeed.

systemd row ends with Debian getting forked


Re: End of the world

"It was binary logging that thrust the first icy dagger of despair into my soul..."

Yes, it smells very badly of something that originates in Redmond or Cupertino, very bad indeed, and inevitably the cause of much future misery, as well as a proven route to bug-infested code of the type practiced by Gates and his gang.

The technical success of *NIX since late 1969, 45 years now, is due in large part to the use of PLAIN TEXT for important things. Some time last year, that became a numeric success when the installed base of Linux kernels exceeded that of Windoze or any other OS (due in part to Android, also server farms, supercomputers, routers etc). Still growing....

'..."Follow us on Google+"'

I don't care that they use Google+. Some people take exception to it, but, just like the search engine, it is free and provides a useful service to many people. OK, it is not truly "free", it is funded by the cost of advertising on almost everything that you buy, but that is almost as fair as funding it from general taxation for the public good, like other basic services, which will never happen. Governments should fund other essentials like Wikipedia too.... However I have not seen flying pigs recently.

By the Rivers of Babylon, where the Antikythera Mechanism laid down


This is seriously interesting. We may be able to discover more....

All joking apart, I remain seriously impressed by this mechanism, and wonder just how much ancient history we are missing, because it is a hard, proven fact that this was made in a period when no such things were assumed to be happening. I am not a historian or archaeologist, but occasionally delight in the work of such experts, so I am hoping that those who research manuscripts and other sources of historical evidence will be able to come up with some references to such stuff, and maybe open a fresh line of investigation into what the ancients actually were able to achieve.

Work in metal, including brass, copper, iron and bronze, is recorded in the Bible well before this date, but there is no suggestion, as far as I can see, of complex mechanisms.

It is also apparent to a casual observer that, contrary to what we are generally conditioned to believe, long before acknowledged pioneers such as Marco Polo, there was occasional long distance travel across continents, so despite the Babylonian mathematics, we should not automatically assume that it did not come from much further afield.

All in all, there is yet much scope for fact-finding by those with the correct skill set. It may even be useful to correlate translations of every ancient manuscript, tablet etc that has been found, and this is where the hacking community, meaning any member of the public with some coding inclination, could make their contribution by doing some good data mining, given freely available translations of everything. That of course depends on the willingness of the custodians of such items, and some are guarded with obsessive secrecy, in the Vatican and elsewhere.

Mysterious BEAM outside London Googleplex ZAPPED


There are more credible explanations, with daft and unnecessary causes

Are they using ultrasound sensors on the driverless cars, and are they doing any experimentation on that therein? The SPL of a simple ultrasound burglar alarm (you see very few of these nowadays) used to be about 140dB, and an expert from NPL told me that he strongly suspected that ultrasound was as dangerous as audible sound, both to the ears and other body parts. Sustained 110dB is dangerous to ears and causes headaches, and 140dB is 30dB more, a thousand times the power! But, as with lasers, people were ignoring the safety aspects of what they can't see or hear. Testing these pesky burglar alarms in the lab was making people ill.
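The decibel comparison is easy to sanity-check with a line of arithmetic (a quick sketch; the 10 dB-per-decade rule applies to power ratios):

```python
def power_ratio(db_difference):
    # Sound pressure level is logarithmic: every 10 dB step is a
    # tenfold increase in power, so 30 dB is a factor of 1000.
    return 10 ** (db_difference / 10)

print(power_ratio(140 - 110))   # -> 1000.0
```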

As to electromagnetic effects, quite possibly, broken armouring and screening (there is a screen on high voltage underground cables) could be causing a dangerous voltage gradient across the ground surface, but I don't think that would account for the symptoms. It is likely that there was a faulty cable, probably unrelated.

There could be a dangerous medium to high frequency electric and/or magnetic field if anyone was using "contactless" charging technology at more than a very minimal power level. Again, suspect electric cars and other high tech ideas, where people with much enthusiasm and no common sense have ignored the major drawbacks. Never mind the health risk, there is an efficiency loss and needless energy waste, which is not in keeping with the fine idea of things like electric propulsion.

Let's get real, people, and make all our fine ideas work, without the daft and pointless features. Use a clever plug and socket for energy transfer, for instance, or low power microwave and optical (eye-safe diffused laser beam, for instance) for obstacle sensors, etc. No daft things like high power sub- or super-sonics, eye-dangerous lasers, high EM fields, etc. You can do everything that you need, and do it well, without any of those.

This 125mph train is fitted with LASERS. Sadly no sharks, though


Re: MENTOR does test the overhead LIVE!

Ledswinger, I guess that you never used the ECML between Kings Cross and Edinburgh post electrification. OK, the electrification was done on the cheap, and they are still fixing it, but the same trains, albeit refurbished a couple of times, and many of the same staff are still working the line, just with different bosses for the third time. I always found them to be polite and efficient. Likewise for some parts of what is now Scotrail. A large amount of new rolling stock was introduced by BR, and apart from the Pacers is still doing well. It was not their fault that for three years or more Mrs Milk Snatcher prevented them from buying any new trains, which really damaged our manufacturing industry too. The only real problem with BR is that they were controlled too tightly by the government and unable to borrow money.

Remember that every train with MK3 or MK4 bodyshells was designed by BR and their subcontractors, and that includes a lot, not just the HST coaching stock. There are EMUs of all 3 types, AC, DC and AC/DC, and plain non-HST coaches, and NOTHING is loved more by the passengers than an HST coach. They built the HST on a shoestring budget, and got it very nearly right at the first attempt. A world-beater for many years, no-one thought that they could do diesel at 125mph.

Remember also some of the leading managers like Chris Green, of Scotrail, and then NSE.

I remain convinced that BR really were getting very good at running trains, when they were allowed to.


Reading crisp packets....

Well, such things really do matter. Some researcher somewhere, probably in the field of sociology, needs to know whether consumers of Smiths, Walkers, Golden Wonder or Tudor crisps are more predisposed to dropping them on the railway, no doubt as a result of some socio-economic factors in their environment, which make them prefer one brand of crisp, or dump rubbish on the railway. So, the crisp packets have to be read, and entered into a database....

Sadly, I am not joking, people do really research daft subjects like that, at our expense! :-(

Fortunately, the orange army, who do their very best to keep the trains running, often in horrendously bad weather when the rest of us stay at home, have much more sensible things to do with the data, which will not involve the brand of crisps. However, they may well be interested in the brand and manufacturing date of concrete sleeper, invariably visible as text on the upper surface, because some may have a shorter lifespan than others, so the ability to read text on the move may be important.


MENTOR does test the overhead LIVE!

Not as stated. I know, because I designed one generation of the equipment used. It kept me busy for a couple of months. I suspect that it may have been replaced by now, as I did it around 1988, which would make it very old in terms of electronics. The previous generation was much bigger and heavier.

Anyway, it works, or worked in those days, by having an electronics box and a battery pack mounted on the pantograph base frame, which is carrying 25kV ac, plus huge spikes and transients. This energised and processed a number of strain gauge and LVDT sensors on the pantograph, such as left and right vertical load (the head being sprung lightly at both sides), and arm height. These were turned into FM signals in the audio band and propagated down what we jokingly called the world's longest optocoupler: basically a 1 metre insulating tube, oil-filled, containing two optical fibres (the second one sent commands upwards to switch the unit on and off, or select calibration signals).

Inside the MENTOR coach a fairly ancient rack of Schlumberger equipment demodulated the 6 channels of FM telemetry, and some simple analogue computation derived wire lateral position from the ratio of left and right vertical loads, as well as any bumps and other irregularities, and the data ended up in a multichannel UV chart recorder.
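The left/right load-ratio computation can be sketched numerically (a toy model only; the linear load-sharing assumption and the head half-width figure are mine for illustration, not the actual BR Research calibration):

```python
def wire_lateral_position(load_left, load_right, head_half_width_mm=300.0):
    # With the pantograph head lightly sprung at both ends, the contact
    # wire's offset from centre shifts load between the two sensors.
    # Assume a simple linear share: all load on one side would put the
    # wire at that extreme of the head (an illustrative assumption).
    total = load_left + load_right
    if total == 0:
        raise ValueError("no contact force measured")
    return head_half_width_mm * (load_right - load_left) / total

# Wire dead centre: equal loads, zero offset.
print(wire_lateral_position(50.0, 50.0))   # -> 0.0

# Wire staggered towards the right-hand sensor.
print(wire_lateral_position(30.0, 70.0))   # -> 120.0
```

In the real system this ratio was derived by simple analogue computation from the demodulated FM channels, as described above.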

Now I suspect it will all be done digitally. I wanted to at the time, but was only redesigning the upper set of equipment, the part that runs at 25kV, as they were, at that time, keeping the Schlumberger equipment, no doubt due to budgetary constraints. Now, of course, I would use at least 16-bit ADCs, and have a very much higher data rate, with a simple DSP processor at the live end just to multiplex the data, and some kind of digital system in the operational area of the coach to do the main processing and recording.

MENTOR can be added into any loco-hauled train, subject to its speed restriction (from memory, 100mph), and there is no need to be powering down the system to use it. It does however have to go into a place where there is no overhead wiring, pantograph lowered first of course, to have the battery charged. If I was doing one now, I could make an isolated power supply that would withstand the upwards of 50kV spikes that are present, to eliminate the battery. Such is progress...

Oh, and by the way, the method of extracting lateral wire position (it is staggered within certain limits to spread pantograph wear) was developed by the British Rail Research Labs at Derby, a very innovative bunch of people, who used to look after MENTOR. British Rail were actually getting rather good at running trains before they were unnecessarily privatised, and some of their better assets, including the researchers, dispersed.

Forget eyeballs and radar! Brits tackle GPS jammers with WWII technology


Re: eLoran, GPS jammers, Unsymmetrical Warefare ??

I was about to mention Decca, till I saw that you had already done so. I evaluated it as one of the backups in a very important system for locating certain road vehicles as long ago as 1988, and found that within mainland UK, as well as at sea, its performance was sufficient for that purpose. It also occupied almost zero bandwidth (unlike Loran, which is a nuisance to other spectrum users) and would be very hard to jam (again unlike Loran). The receiver technology eventually became very cheap, and nowadays could be integrated into a phone or tablet, if the demand was there. Sadly a very fine, low-budget British invention is gone, because no-one wanted to pay for its upkeep, believing that GPS was the ultimate answer!

But you CAN emulate and improve on the better aspects of Decca, and give Loran a sound thrashing, by using combinations of broadcast transmitters. A university research group demonstrated it 20 or maybe 30 years ago. Fixed monitoring stations, maybe 10 to 20 to cover the UK, measure the relative phase of the modulation on all transmitters carrying, say, Radio 4. You need to do that to cover for arbitrary delays in transmission links etc. Given that information, and a means to broadcast it, a mobile that can pick up 3 or more Radio 4 transmitters on different frequencies can do a simple phase comparison, much like the Decca coarse pattern used for lane ident. Then it can compare carrier phase, and get down to as good accuracy as Decca. If FM signals are available, you can get down to feet quite easily. It is in theory possible to use stations carrying different programmes, but the computation gets rather complex! But 2 of one programme and 2 of another gives you two simple locus lines, and one intersection, without fancy stuff.
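The phase-to-position step can be sketched as follows (illustrative only; the 1 kHz modulation tone is a made-up figure, not a real Radio 4 parameter, and a real receiver would intersect two such loci for a fix):

```python
import math

C = 299_792_458.0  # speed of light in m/s

def range_difference_m(phase_diff_rad, modulation_hz):
    # One full cycle of modulation phase corresponds to one modulation
    # wavelength of path difference. Phase alone only resolves position
    # within that "lane", which is why Decca needed its coarse pattern
    # for lane identification before the fine pattern could be used.
    wavelength_m = C / modulation_hz
    return (phase_diff_rad / (2 * math.pi)) * wavelength_m

# A quarter-cycle phase lead on a hypothetical 1 kHz modulation tone:
# the receiver is ~75 km nearer one transmitter than the other, which
# places it on a single hyperbolic line of position.
print(round(range_difference_m(math.pi / 2, 1000.0)))   # -> 74948
```

Comparing carrier phase instead of modulation phase shrinks the wavelength, and hence the lane width, by orders of magnitude, which is where the Decca-beating accuracy comes from.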

I think that the phase information could be fed relatively easily to the Coastguard monitoring stations which provide the correction data for differential GPS, and included in their broadcast data. There, problem solved, in the UK, and apart from anything that may already be patented by others, I hereby release this idea into the public domain. If open source hardware and software hackers want to knock up something that works, that would be very good....

Internet of Things? Hold my beer, I got this: ARM crafts OS to rule them all


Been around since 2009, and please don't underestimate its potential!

At least one earlier post (I have not read them all yet) identified it as having been around for a while. Wikipedia says September 21 2009. I know I have seen the name on many occasions over the last few years, so it is not correct to call it new.

That minor grumble apart, I think this is potentially good stuff, and those who are sceptical may like to pause for a moment and consider that ARM processors far outnumber any other on the face of this planet. Do not for a single moment imagine that a small (by global standards) British company cannot conquer the world, because they already have, unseen by most people.

I have no connection with ARM, but I do use an unknown number of their licensed products, at least 5 or 6 that I know of: Raspberry Pis and phones. I think they are in a couple of gadgets and NAS boxes too, but have not bothered to check.

This market penetration is a bit like free software, including Linux, GNU utils, Busybox, and umpteen other things, mostly produced by individuals or smallish businesses, by global standards. Such software is found almost everywhere, starting with most server rooms, every Android phone (by far the brand leader now, usually also running on ARM), and almost all supercomputers.

But basic commodities like wood, metal, plastics, food, air and water go everywhere too. Just because something will NEVER be a globally leading brand like Coke or McDonalds, does not mean that its uptake will be small or insignificant. A close examination of the facts (i.e. what stuff runs the world and keeps people happy) will show that lots of almost unseen, unknown things actually have massive commercial success behind the scenes.

mbed will stand or fall on its merits, not because it is emerging from the UK. UK bashers and Apple and M$ fanbois take note!

EVIL patent TROLLS poised to attack OpenStack, says Linux protection squad


The problem is the license

Without wanting to start a flame war, I would point out that the major weakness of the Apache, BSD and various similar licenses is that anyone can take the code and make private modifications to it, with no obligation to contribute anything back to the original developers or the wider community. The GPL prevents that very effectively, and GPLv3 effectively deals with the patent problem.

There are reasons why some people would prefer not to choose the GPL, however sometimes these reasons are based on misunderstandings or disinformation. All I can suggest is that everyone should take great care in picking the licence that best suits YOUR requirements, and if necessary get legal or economic advice about the impact of that choice. You may feel that making your entire work available freely, for others to take private again and do with as they wish, is bad business. I would not do that; why should I work as an unpaid developer for M$ or Oracle or whoever? But the dominating need for YOUR particular code to go EVERYWHERE may over-ride such concerns in certain cases.

Or, you may feel that you want others to help fix and improve YOUR work, so that YOU can benefit from their contribution. If you don't mind them benefitting too, then GPL is the way for you.

If you want no-one else to benefit, then you need a totally restrictive license. You may or may not (at your option) allow access to source code, but you can forbid it to be used, or modified, or insist on all contributions being fed back to yourself, who will have final say on what is done, and whether the software is sold, leased or given away. If so, get your lawyer to write a restrictive license, but don't expect much help from others.

There is no such thing as a one size fits all license. My personal preference is for GPL in all but the most exceptional circumstances, but that is what is appropriate in my circumstances. I can't force that choice on YOU, whoever you are, but I can strongly recommend it because it usually makes sense.

Windows 10: One for the suits, right Microsoft? Or so one THOUGHT



I am sure that my Linux machines all had multiple virtual desktops in the mid 1990s.

Now just what else were they very late with? 32 bits, and again 64 bits. Demand-paged virtual memory. Proper separation between user and system space. Serial over USB drivers that work, seamlessly. Proper security features. The web browser. Sensible disc partitioning, with user data not in the same partition as the system. Mountable drives. TCP/IP. IPv6. (Please feel free to add more...)

What were they in the lead with? The truly innovative BSOD, text by Ballmer. Bob, by Mrs. Gates. New forms of security hole, guaranteed by ActiveX. New forms of non-compliance with standards, led by IE and Outlook. New forms of illegal monopoly.

Considering that well over 50% of interactive computing devices (phones, mainly Android, tablets and web servers account for most, along with the odd supercomputer) now run Linux or a UNIX derivative (BSD in the case of Apple), are M$ even relevant now?

So long Lotus 1-2-3: IBM ceases support after over 30 years of code


Source code and licensing

If my memory is correct, the rights to CP/M passed from DRI via Novell to Caldera and then SCO, and the CP/M and MP/M source was released by Ransome Love, under a very permissive license. I have a full copy of the source, and license, somewhere.

But along came nasty Darl McBride (who, we all know, sued IBM and others over ownership of UNIX, Linux, and half of the world's software (slight over-simplification), and lost, big time. But I digress...) No new licenses were issued, but it is well-nigh impossible to take back that which has been issued, and of course trade secret protection, if it still applied, was automatically lost on public exposure.

The rights, including the copyright of the source, passed eventually to Lineo, and Bryan Sparks kindly made it available again under a permissive license. See here: http://www.cpm.z80.de/ and here: http://www.cpm.z80.de/license.html

IANAL so please check with your own lawyer before making use of the CP/M source in any way that might get you into trouble if I am wrong.

If anyone can expand on this information, or correct errors on my part, please feel free to do so. It would be nice to be sure of the current status of CP/M, since it is potentially still useful on small, simple, ultra low-power things.

Linux duo land $54m VC Xamarin cash bag


Re: Remember Mono?

Yes, and Miguel, who does seem to be a competent programmer, and for all I know may be a nice guy, utterly wiped out his credibility with the FOSS community. Unfortunately Gnome still has many Mono dependencies. The first thing I do when installing a new Linux distro is to expunge Mono. Fortunately that is rather easy on Kubuntu, as it is not there. But in some distros you end up with 100MB of run-time guff cluttering your RAM, and I have seen performance slow to a crawl on a thumping great quad core.

The Linux kernel has a good and consistent API, as do the main GUI layers such as Xorg etc, so why would anyone need something that emulates a layer that M$ put on top of a complete mess called Windoze, to give it a very slightly better API than bare Windoze? Who needs bloat on top of bloat on top of bug-infested bloat? And all because Sir Bill violated the terms of his Java license, and did not like the court's findings....

I would like to see Miguel applying his time, energy and talent to something much more useful than trying to foist an emulation of backward M$ technology on to those who do not need or want it.

Security chap writes recipe for Raspberry Pi honeypot network


Re: Raspberry Pis are possibly easy to detect

I am hoping that it is not much of a problem on a fairly busy network, because, for example, the timing "should" become slightly randomised by the switch needing to wait until the upstream link is free before forwarding a packet. There are always random delays like that in any busy network, quite often more than the odd millisecond. If the Pi had set up a direct link to somewhere else, that might operate in real time, but it would be either a point to point link or a circuit switched link, whereas all conventional uses of what we generally call Ethernet, certainly on Cat4 or Cat5 cable, are packet switched, hence the semi-randomisation of the timing.

Ok, you CAN set up an Ethernet type link with deterministic timing, indeed audio people do it regularly from stage box to mixer these days, but it is a point to point connection. For all I know, you may have special routers or switches that can set up a circuit switched connection for you, but neither of these are typical of the normal office environment.
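To make that semi-randomisation concrete, here is a toy simulation of the queueing delay described above: a packet arriving at a switch port sometimes finds the shared uplink busy with a cross-traffic frame and has to wait out a random fraction of it. This is purely illustrative; the parameters (`base_us`, `frame_us`, `load`) are made-up numbers for the sketch, not measurements from any real switch.

```python
import random
import statistics

def forwarding_delay(rng, base_us=5.0, frame_us=120.0, load=0.3):
    """Delay one packet sees through a busy switch port, in microseconds.

    base_us:  fixed store-and-forward latency of the switch (assumed)
    frame_us: time a competing cross-traffic frame occupies the uplink (assumed)
    load:     probability the uplink is already busy on arrival (assumed)
    """
    # If the uplink is busy, wait out a random remainder of the other frame.
    wait = rng.uniform(0.0, frame_us) if rng.random() < load else 0.0
    return base_us + wait

rng = random.Random(42)
delays = [forwarding_delay(rng) for _ in range(10_000)]
print(f"mean delay : {statistics.mean(delays):6.1f} us")
print(f"std dev    : {statistics.pstdev(delays):6.1f} us")
```

Even this crude model shows a spread of delays rather than a fixed latency, which is the effect that would blur any precise timing signature a detector might look for.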

Nevertheless, a point worth considering seriously, and hopefully more people will add their expertise to this discussion. It has got me thinking....

Security matters, and if cheap stuff like Pis can help, that is good.

UK.gov's Open Source switch WON'T get rid of Microsoft, y'know


Don't forget security!

There is one VERY good reason why government departments, like my employer, can not, will not, and never will be able to use "cloud" services for their core business, and that is security. In most cases it would be strictly forbidden, for very good reasons, to host government documents, or those containing personal data, in some arbitrary cloud controlled by an entity not subject to either UK Data Protection Law or UK national security regulations. So the cloud, as an alternative to keep M$ in the game, is just not going to happen.

No, this really is VERY good for free and open document formats which are controlled by legitimately obtained ISO standards, and are actively supported by multiple vendors, but very bad for M$.

Cheer up, Nokia fans. It can start making mobes again in 18 months


"No - it will make them look at low end Windows Phone handsets - exactly what Microsoft wants to happen and why they are ditching all of these legacy handsets."

Utterly ridiculous nonsense! Microsoft's penetration of the smartphone market has remained, and will continue to remain, close to zero. The fact is that no-one wants their ridiculously insecure bloatware any more. Those who can afford to do so are moving away from PCs to Apple. With phones the situation is more complex, but with at least 3 out of every 4 new phones being Android, and Apple still doing quite well, there remains no opportunity whatsoever for M$, even if they do eventually manage to create an acceptably good product. (They have had over 30 years of trying to do that, so far...) Time to market is everything, and every single one of their attempts at the phone market has been off by years, not just months.

You may well ponder why Android unseated Apple as market leader some time back, one of the very few exceptions to the time-to-market rule in any business, anywhere. (One other was the Japanese car industry practically wiping out Detroit, because the products of Detroit had been utter trash for a very long time, and were regarded as a bad joke everywhere except in the US. M$ will fail for the same reason; their products have been a bad joke to anyone competent in the field of software for a very long time now.) It was not time to market, because Android was rather late. Could it have been value for money, or perhaps just that people were utterly sick of Apple's walled garden approach to everything?

It is interesting that M$ are usually only able to mobilise Anonymous Cowards here, while others, who prefer to debate issues on the facts, not the dictates from Redmond, have the guts to use proper names. If I had posted so much sheer drivel, I would not want my name on it either. I expect that there is, or soon will be, a medical term for being obsessively positive about M$ products in the face of all evidence to the contrary.

Oh, and what occupies the majority of the market for servers large and small, supercomputers, gadgets, phones, and, one of these days, the desktop? I will give you a clue, it does not come from Redmond or Cupertino....

When the remnant of Nokia resumes making decent phones, which they still do know how to do, it will be despite M$, not because of them, and those phones will, like it or not, be running something based on Linux, or maybe xBSD, but not necessarily Android. The remnant of Nokia (not the part owned by M$) will eventually rise to a credible position in the industry again, despite the malignant incompetence of Elop, whereas M$ are on the way down, with severe challenges on all fronts and not a credible product in any of them, starting with Windoze 8...

Oracle vs Google redux: Appeals court says APIs CAN TOO be copyrighted


Re: W. T. F.

Just to add my support to what others here are saying, COPYRIGHT is very clearly the correct protection for software, and that means ALL software, from closed, proprietary stuff like Windoze, through to FOSS like the Linux kernel or GNU utilities. Even the GPL licenses, which give you, the end user, almost complete freedom to use the code in any way, except that you must not restrict anyone else's freedom, are founded on copyright law. But we don't want to get into a pointless flame war about licenses here. All I am saying is that every single line of code produced anywhere, unless it is expressly put into the public domain (not recommended, unnecessary, and not recognised by the legal systems in a number of countries), is covered by copyright law. It can be officially registered as such in some countries, relatively cheaply, to make enforcement in the courts simpler, but like any other piece of writing, the mere act of creating it confers protection automatically in many countries, although putting at least the (C) symbol, owner's name and date in every file is advisable. This is simple, efficient and effective.
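As a concrete illustration of that last point, a minimal per-file notice might look like the following. The name, year, project and licence shown are of course placeholders for your own details, not a prescription:

```python
# Copyright (C) 2014  A. N. Author
#
# This file is part of ExampleTool (a hypothetical project).
# Released under the GNU General Public License, version 3 or later;
# see the COPYING file in the source tree for the full licence text.
```

A couple of comment lines per file costs nothing and removes any doubt about ownership and terms if a dispute ever arises.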

Patents, on the other hand, are a legal minefield. Software is the ONLY field of endeavour which has, due to an incorrect court decision, become supposedly protectable by BOTH copyright and patents. Everything else is one, or the other. Why would software alone justify double protection? Patents cost money to obtain in one country, and vast sums of money worldwide. The patent system is hopelessly broken in the US, where, as we know, Apple effectively obtained a patent on a rectangle with rounded corners, something that has been known and used for centuries, if not millennia. Very recently, Amazon was awarded a patent for photographs taken against a white background. We could dredge up TENS OF THOUSANDS of imbecilically stupid patents on software or business processes which should never have been granted. What the USPTO is doing is actually illegal, because a patent MUST contain something which is novel (rectangle with rounded corners?) and not obvious to someone skilled in the art (which rules out almost all minor incremental advances in software, leaving perhaps 10 to 20 genuine advances, such as the assembler, then the high-level language and compiler, then the multitasking OS, then virtual memory, possibly a hardware patent, then the GUI concept, and a few other odd things, all of which are now over 20 years old, so would have lapsed by now), so NONE should be granted patents. The USPTO needs to be brought into line with what US and international law requires. (Meanwhile certain despicable people in the EU are trying, yet again, to sneak software patents through. It is a world-wide problem, no disrespect to the US intended; the problem is political and bureaucratic in nature, not national, but the US, having more lawyers per capita than anywhere else, succumbed to the problem first.)

The fundamental reason why software MUST be treated like any form of writing, such as a book, is its method of creation. Most software (over half) is created not by large corporations but by individuals, whether developing a small but necessary utility program for sale, or some FOSS tool, or an engineer at work just solving an immediate problem by making a quick and dirty tool, and lots of similar circumstances, i.e. negligible funding, or oversight by "management", just individuals doing what is expedient, or what they feel the urge to do. Exactly like writing a book, the creative spirit of the individual is at work.

Then there is the barrier to entry to consider. To invent something like a new medicinal drug may take very little, or many millions or billions, but the testing to have it legalised for use will most likely run into billions. Prototyping a novel machine will take tens of thousands, up to millions, and will need expensive machine tools, or expenditure with sub-contractors who have such things. Making a new model of car costs maybe a billion. Patents, RIGHTLY, allow an inventor of something genuinely novel to claw back his costs and earn a living, if the invention is successful, so the reward has to be set high, and it is right to have patents, and license fees if others want to use the technology. But if an individual in any even moderately developed part of the world has the inspiration to write a book, he or she needs (leaving out the obsolete and inefficient methods such as typewriter or manuscript) a cheap PC, and as a baseline some free software. The rest is pure inspiration and hard work. Now what does a prospective software author need? Starting point, a cheap PC, some free software, lots of inspiration, and hard work. Exactly the same as writing a book.

So, the barrier to entry being similar, the means of protecting the work, and providing rewards, can reasonably be expected to be similar. I don't hear authors demanding patent protection for their works! Indeed, it would be a VERY BAD THING, because any slight resemblance of one plot to another would result in one publishing company suing the other. There could only really be one book of each generic type, fictional or even non-fiction. VERY BAD!

Yet we have, for some years now, suffered the deranged rantings of various current, former (in one case a proven dangerous incompetent) and in some cases deceased, supposed leaders of the software industry, ranting and rambling about patents being necessary. Note that most of these individuals seem to have deep psychological problems. The only reason that they would want software patents is so that, every time they feel like it, they can sue a competitor for infringing some aspect of a patent that should never have been granted, on some feature of their buggy product that had been in common use for several decades, and whose method of implementation was common knowledge to many thousands of programmers or engineers (and even at least one judge). It is a business method, sue your competitor for patent infringement, when you can't beat him by fair competition. In the case of a small competitor, who can't afford 10 million in legal fees, sue him out of existence, then use the product of his hard work freely....

The people who do that are not businessmen, but SCUMBAGS. Software patents should never have been allowed, and ought to be expunged from the legal system as soon as possible. They hinder innovation, the very thing that patents were supposed to (and do, in the right circumstances) encourage.

Enough said, I think. Let us hope that software continues to enjoy proper copyright protection, but definitely not patent protection, which is just plain wrong.

Oh, and the Google code copying was not as straightforward as it seemed, and the offending code was subsequently removed, so the whole thing was blown out of all proportion. In other parts of the world, where perhaps the lawyers are not paid as much, or not as greedy, the case would have been quietly settled in private, possibly with a small cash payment. Large businesses do themselves no good at all by creating a public spectacle in court over mere trivia, and compared to the original fanciful allegations, the few lines of code were indeed mere trivia.

20 Freescale staff on vanished Malaysia Airlines flight MH370


I suspect that having several passengers with stolen or fake passports on board is not at all uncommon, despite security checks, and may be completely unrelated to the tragedy. However, I would be more concerned about enemies of Freescale, or IBM, or any other corporation whose staff were aboard, or even some individual. There have been airliners crashed previously, with the loss, usually, of everyone on board, due to personal vendettas and such like. In at least one fairly recent case, the pilot had something against his employer and decided to commit suicide, killing all on board. I am not saying that this disaster is murder, suicide, or anything else, just that we should remember all the possibilities, of which there are many, including mechanical failure, crew error, terrorism, strike by a meteorite, and, as this is a fly by wire aircraft, lightning strike with a current beyond the design limits, plus many more causes that I can't be bothered to mention. The fact is that we don't know, and if wreckage is not found, and no-one such as a terrorist organisation claims responsibility, we probably never will know. I suggest that we wait patiently for the investigators to report, and not allocate blame speculatively to anyone, regardless of whether they may have been travelling illegally.

Open-source hardware hacking effort 'smacked down' by USB overlords


If you think that this is bad (it is!), take a look at the truly obnoxious situation with HDMI. I need to make a device, which may cost about £100, and sell maybe 10 to 100 (maybe none) to those with the same particular need. The entire revenue would come nowhere near to covering the initial licence fee that you need, just to get a look at the specs. I can't even use someone else's chipset, because I can't get data sheets without a licence.

It is all locked down tight in a completely misguided attempt to prevent video piracy. As it happens, it doesn't, because serious pirates have DVD and BluRay bit-copying systems etc. My videos happen to be made by me, with full permission of all those involved, yet I am penalised by this.

There is no reason why a completely free and open interface can not be developed. Expansion cards could be supplied for desktop PCs in the first instance, laptops would need some kind of USB (oh no!) or PCMCIA (or the later thing whose name I forget) adaptor. But if it gained popularity, motherboard manufacturers would incorporate it soon enough. The thing that would prevent it becoming universal is that the locked-down Apple empire would never adopt it.

If someone does develop a new interface, I suggest using a circular plug, like a jack or DC power connector, to make it easy to connect for the elderly and those who are slightly visually challenged, avoiding the hassle of which way around a micro USB goes. Designing a new connector will be about half the development budget for the prototype.

Maybe the situation with Firewire would be better than USB, but sadly it seems to be becoming obsolete. It was useful, as it was fast and could supply a serious amount of peripheral power.

Or, with IPv6 coming along, why not just equip absolutely everything with Ethernet? I would welcome the thoughts of others on that subject.

Surface 2 and iPad Air: Prepare to meet YOUR DOOM under a 'Landfill Android' AVALANCHE


Re: A quick correction

Just to clarify the price of the HUDL, I got mine for ZERO cash, as they do indeed accept Clubcard points up to £60, but are giving double value for them!

I also use a larger, older, Android tablet, Motorola Xoom. The HUDL does not equal the battery life of the Xoom, which is hardly surprising, as the latter has a HUGE battery and much slower processor.

I don't yet know about Cyanogenmod for the HUDL, but I had it rooted within a week of it becoming available, and it does look as if the usual level of support from third parties such as the Cyanogenmod developers will be there. It is certainly not the "best" tablet around in the UK, but at the price, it is likely to be the "best value", which matters to many people.

I don't see why anyone would want something that is locked down tight for ever, such as the ARM-based, secure-boot-locked M$ devices, or indeed any Apple product. If the manufacturer loses interest, you are well and truly stuck as far as updates are concerned. In any case, if I pay good money (or in this case, Clubcard points) for something, I feel that it is only reasonable to be able to install anything that I want, at my own risk of course, not just things that some abusive corporation decides that I can install. It is not just the cost differential, which in some cases is negligible, that is causing Android to outsell the total output of Apple, M$ and their new subsidiary Nokia. It is the freedom to own your own equipment, and not be dictated to by someone else.

Legal bible Groklaw pulls plug in wake of Lavabit shutdown, NSA firestorm


Re: A Sad Day

Yes, Groklaw has on many occasions exposed the dishonest and dishonourable in the software industry, and has been a vital source of education on such things as why software patents don't work and should not be allowed. PJ maintained very high standards of accuracy, integrity and general politeness, more so than any other blog that I have seen.

Right now I think there will be great merriment in Redmond and Cupertino, homes of two of the most corrupt and obnoxious abusers of the patent system and other laws. The rest of the world will be feeling a great loss, the end of an era.