But it always assumes 'starting from scratch' which isn't the case - FAIL.
Here's a thought experiment proposed by the Linux Foundation today: If you had to start from scratch, what would it cost to create a Linux distribution? The short answer is: about $1.4bn for the Linux kernel and about $10.8bn for the Red Hat Fedora 9 development release, the latest one out this summer. Because Linux is an …
But how do they account for the amount of programming that need never have been carried out, because it gets replaced straight away by better, more concise code?
The point I'm making is: how much of the code that is written for Linux is just dumped because it's rubbish? With a big corporate, this sort of thing is kept to a minimum.
Not that I'm suggesting that Linux wouldn't cost a fair amount to develop from scratch, mind.
The question not asked is whether most businesses would pay for Linux or any other Unix derivative if the actual cost were being recouped via sales. Given the implementation and support costs of Unix-derived systems, I suspect that would be a big "no". Like DARPANet, if the development and deployment costs had not been initially underwritten by those subsisting on government largesse, Unix and Linux would have gone nowhere.
With Linux Code flooding the Money Markets and Invisible Trade Sectors , Crack Coders will be Gifted with Gifting and as Much Wealth as they can Spend for Good Effect ..... for All of their Enobling Endeavour and Selfless Effort Rewarded with just More than always More than Enough to Allow Spendings Gift ..... Freedom to Invent and Build Dream Scenarios .
AI Companion of the Venus Project ?..... http://www.thevenusproject.com/index.html
Thank our selfless FOSS contributors, without them my 8 year old son wouldn't be able to dual boot Ubuntu & Mandrake, as I wouldn't be able to afford them :-)
He hates Windows, with a will, after spending over 4 hours on a Scratch program only to see all his work lost when the Windoze PC BSoD'ed (totally unrelated to his Scratch session).
Penguin obviously, though I'd like to boot Paris.
NathanMeyer: "Like DARPANet, if the development and deployment costs had not been initially underwritten by those subsisting on government largesse, Unix and Linux would have gone nowhere."
Sounds like an excellent argument for government 'largesse', to me. Of course, a more accurate term would be 'extremely sensible investment'. To turn it around, the argument runs "if the government does not spend money on intelligent people doing interesting things which may not have immediate commercial value, no-one will". This is why we have publicly-funded research institutions in the first place, yes?
I thought Linux was a kernel? That's it. Then a bunch of coders from around the world built the software that runs on the kernel. Did Linus Torvalds spend a billion dollars creating a kernel?
In my experience most small business startups will have founders moonlighting to run the startup until enough income rolls in to work on it full-time.
The entire notion of some angel investor stepping in with salaries and benefits in the millions to create a product from scratch is not realistic. Any real-world attempt to create a new operating system would not waste time re-inventing the wheel. They would be aware of the processes that created OS X.
I'm sorry but I feel this article is a load of horse shit.
... the volunteers gained and your total will be negative.
Really, the people who did this work are volunteers. They chose to spend their own time in this pursuit and therefore gained some (intangible) benefits from doing so. Whether that was entertainment, peer-group recognition, bragging rights, magnanimous giving, the chance for fame or self-education is irrelevant; what they got out of the exercise was, as a whole, greater than what they put in.
So far as corporately sponsored or produced donations go - who can say? In true capitalist tradition, some choose to give away, or sell at a loss. Whether they get any long-term gain from that is part of the gamble. Since wealth is increasing over the long term, you've got to say that this strategy, too, pays off.
By trying to put a cash value on the development effort, the researcher betrays a mean-spirited attitude: that everything can (should?) be thought of purely in monetary terms. By assuming the costs are all in US dollars and include American rates of overhead (the 2.4-times-salary component), he/she/it demonstrates a level of parochialism that invalidates not only their attitude, but their methodology, too. Ignore.
I used to buy every second issue of SuSE until Novell took it over. I am back on openSUSE now.
I have paid for Mandriva before - why not, they have to eat. I only stopped using it 'cos the USB boot version they sold me only worked with my thumb pressed down on it.
I see no reason not to spring 30 quid every other year for something that so many people have worked on. At least the money goes toward something I like, rather than the Microsoft tax.
"if the development and deployment costs had not been inititially underwritten by those subsisting on government largesse, Unix and Linux would have gone nowhere."
I think you'll find otherwise. UNIX was developed at Bell Labs, funded by AT&T. Linux was developed initially by Linus Torvalds when he was attending the University of Helsinki. Likewise, most of the GNU tools (especially GCC, the GNU Compiler Collection) came out of colleges and universities world wide.
Sure, some lonely hobbyist codes bits of Linux occasionally (hands up, even I have), but in reality most of the work now is paid for by the likes of IBM or Tata, who love to sell services on the back of it, or people in universities with time on their hands.
As the saying goes - anything free is worth what you pay for it.
Linux inherited a well-tested basic design (though no actual code) from Unix in 1991 and has accumulated 17 years worth of real world experience since then. X and the GNU tools have matured for over twenty years (I remember installing the GNU toolset on Sun boxes in 1990 because they were so much better, even then, than what Sun offered at the time with SunOS). There's no way to value that experience.
Plus, Linux vendors choose what they think is the optimal subset of the universe of free software to include in their distributions. Most software (whether Free or commercial) is crap. The fact that the code has been included in a distribution means that it's much better than average.
If you just gave a team of 2,000 developers the money and time to produce something like a Linux distribution (as suggested by these models), 1) the code wouldn't have had the decades to mature and learn from mistakes, and 2) it would be of merely average quality (mostly crap, some good stuff) and not comparable to a distribution that only chooses the best software to include.
To make it more concrete, there's a huge selection of Free window managers to choose from (http://xwinman.org/). They all work (they do what a window manager needs to do). A Linux distribution gets to choose the best from these pre-existing applications.
If you had to create a window manager from scratch you could give some money to developers and get something that worked. But if you wanted to duplicate what goes into a Linux distribution, you'd have to pay 20(ish) teams to develop window managers and then choose the best.
@Nathan: Not sure how much government largesse was involved in the development of Linux, but most of the other core Unix variants would probably have happened along quite nicely with or without it. Unix as a saleable, commercial proposition was fairly successful from a reasonably early stage and most of the really big IT market players have made plenty of money from it over the years. Hell, most of them still do make a bit of cash from it in one way or another. Of course, you could always turn the situation around and wonder whether DOS and Windows would ever have gone anywhere if Microsoft and the PC manufacturers hadn't, essentially, stitched up the market and foisted them on us. (Actually, there might be more truth in that than there is in the Unix/Linux thing come to think of it...)
Now, according to my calculator, if 17 million lines of code takes 4500 man-years, that's 3778 lines a year, or if a developer does 200 working days a year, that's 19 lines a day. Or 204.5 million lines taking 59,389 man-years gives 17.2 lines a day. Now I've known some pretty slow developers, but if I were paying $75,000 a year and got 17.2 lines a day ($21 per line) I'd be mighty pissed off, and almost certainly out of business. I just checked my own little open source project. It's been going a year, has had some major re-writes in places, all the work done in the evening while maintaining a normal(ish) family life - it comes to 16,000(ish) lines of code as it stands. So that's 4.2 man-years and worth $336,000. There is something very wrong with either:
a) those figures
b) my calculator
(my guess is mainly a) with a dose of c).
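For what it's worth, the arithmetic above is easy to check. A quick sanity-check sketch, using the figures quoted in the article and the $75,000 salary the commenter assumes:

```python
# Sanity-check of the lines-per-day arithmetic above.
kernel_lines, kernel_man_years = 17_000_000, 4_500      # Linux kernel figures
distro_lines, distro_man_years = 204_500_000, 59_389    # Fedora 9 figures
working_days = 200   # working days per year, as assumed above
salary = 75_000      # US$ per developer-year, as assumed above

lines_per_day_kernel = kernel_lines / kernel_man_years / working_days
lines_per_day_distro = distro_lines / distro_man_years / working_days
cost_per_line = salary / (distro_lines / distro_man_years)

print(round(lines_per_day_kernel, 1))  # 18.9 lines/day
print(round(lines_per_day_distro, 1))  # 17.2 lines/day
print(round(cost_per_line, 1))         # 21.8 dollars/line
```

So the figures in the comment are internally consistent; the dispute is over whether ~17-19 finished lines per developer-day is a sane productivity estimate, not over the division.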
>> most of the GNU tools ...
Were made by The FSF with Richard Stallman leading the charge.
Man, you drop the GNU project from the name and it's suddenly missing from the history too. See, if you don't give credit where credit is due, you start saying silly things like "Linus created the Linux operating system". False.
"most of the GNU tools ...
Were made by The FSF with Richard Stallman leading the charge."
Exactly. And most members of the FSF at the time were, er, college/university students. Which is exactly the point Jake made. The Gnu's Not Unix project was (and still is) fired by the input of students. Read your history before you blame others for being historically correct, sir.
You're assuming that the kernel was a single-handed effort and has not received any volunteer contributions since. Also, re-inventing the wheel was necessary to produce an operating system whose source was available, so that future programmers wouldn't have to choose between lashing their work to the side of a sinking proprietary OS and reinventing the wheel themselves.
In other words, I'm sorry, but your comment is horse shit.
Thanks, open-sourcers. Now I feel guilty that I've taken advantage of their excellent products for many years without contributing! Guess I'll have to start.
Where's the GNU icon?
I also add my thanks to those who contribute, be they hobbyists who do it for the challenge/fun/recognition or the big corporations who are sponsoring OSS.
Over the years I have bought a few distros in order to support the producers.
And who gives a damn about how much money could have been spent developing it... the point is nowhere near this amount was invested as many people gave freely.
Do you think after reading this article Linus and RMS are banging their heads against walls thinking "Why didn't I go and code for money???" Nah... they are probably laughing their asses off.
"If you had to start from scratch, what would it cost to create a Linux distribution?"
How long have you got? It makes a huge difference to the answer.
If I had to write some code to solve a problem, then I might decide that I could save a packet by building on an existing package and save even more by doing *my* bit open source and thereby getting contributions from other people. We'd all have our own reasons (quite possibly hard commercial ones) for wanting to solve this particular problem and we'd end up with a joint solution for less than any one of us could have engineered an individual one. The net cost of *adding* to a Linux distribution therefore works out negative.
I think an AC pointed this out with regard to companies like IBM selling services on top of Linux. (I wouldn't want to claim this insight for myself, but I *do* want to emphasise it.) Linux passed the point some years ago where the incremental cost of addition was negative. It may by now have passed the point whereby the total cost (over history) is actually negative and the "seed capital" invested by governments and hobbyists has been repaid to society. (It certainly will pass that point eventually.)
So, you see, if you work on it for long enough, Linux actually works out costing you less than nothing. The figures cited by the Linux Foundation are a bit like judging a startup right at the end of the "pre-revenue" stage.
There is a well-known metric that, on average, a developer produces about 20 lines of written, tested, debugged and documented code a day. I think this is probably a bit out of date now, but the right order of magnitude.
Yes, your open source project may have 16k lines developed in a year, but is it fully (FULLY) tested, is it fully commented, and is it fully documented? If it has any of the above, then it's very unusual for open source, where the majority of stuff I have to work with is badly documented and appallingly commented, but does, to its credit, mostly work (though I have no idea whether it works in all scenarios).
"(but not the David Wheeler who got the world's first PhD in computer science in 1951 and went on to invent the programming subroutine)..."
Alan Turing was using subroutines long before that. I recall a remark of his about how, if programmers wanted "advanced operations" like addition and subtraction they could write subroutines for them - all he was putting into the hardware was logical AND and logical OR.
So he invented RISC too...
No focus on the benefits reaped worldwide by FOSS.
No focus on all the man-hours and expenses saved by FOSS.
No focus on all the 2nd/3rd World inhabitants able to get into programming/software for free, compared to paying thousands for IDEs to MS/Borland/Sun in days gone by.
No focus on altruism as being a valid facet of human nature just as much as greed.
Explains their current financial predicaments.
Ever held a door open for a woman?
Just imagine how much money it would have cost if they had a doorman to do just that!!
You are probably right, but how much of a Unix distro falls under the 'usual for Open Source' bracket? Much of the distro is a collection of 'standard' open source packages. Some will be well tested and documented, but many won't.
20 lines a day is, I think, rather out of date, considering modern development and test environments - but then I write C# in VS, not C++ in vi :)
(My project is, of course, not tested/doc'd, it's a hobby project and aimed at developers. Chances are the number of lines will not change much now, but the testing and documentation will. I guess in four years I'll still have 16,000 lines of code but it will have taken four years - but it still won't be worth $336,000)
How do you know ANY of the millions of hidden lines of code in Vista have been checked?
Which doesn't look good for Vista, but let that pass...
Now, if you have the Windows source, you can't compile it, and you have no proof that the code you have is the code in the binary you are checking.
With OSI-approved open source you CAN prove the code is right (except for Tivo-like bypassing of the open source license, where you can't install your checked code and see that this really IS the code running on your Tivo).
Not to total cost. Actually, because it's future value that economists use, the longer you take, the more expensive it is. £10 spent in the 1850s is worth a LOT more than a tenner spent now.
Oh, and in the ONE YEAR between releases from Red Hat, $7 billion of value has been added to Red Hat. So that would be a timescale of one year.
IF you consider timescale important.
NOTE: that was only possible because Red Hat didn't pay for all of that. IBM paid, Cisco paid, volunteers "paid". Each got enough value from their own contribution to make the spending worthwhile and gained so much more from other people's work there's no comparison.
"20 lines a day is, I think, rather out of date"
Measuring code in "lines" is, I think, rather out of date. Not only does it mostly measure the expressiveness of your language, or your use thereof, there's also the well-known witticism that a good day is one where you manage less than zero.
Bell Labs flourished because (under the Vannevar Bush scheme) they were getting vast sums of Federal dollars. As did Xerox PARC, etc. When the Feds cut funding, they choked and died. More immediately, the people who built UNIX had started the effort as part of MULTICS, which was a DoD project, of which Bell had a piece. There are a lot of legends surrounding the birth of UNIX, but as far as I can tell it was not an official project funded by AT&T. UNIX was a bunch of guys at loose ends waiting for the next project who hacked something together using software and hardware that had originally been bought for a different task. Not to demean their accomplishment; but it was never intended to be used for anything like what it is used for today.
By the way, MULTICS continued after Bell dropped out, and I hear the source is now in the public domain.
FAIL! You never need "$_ =~ s///;". Plain "s///;" is enough, since the =~ operator binds a regular expression operation to a variable; used without =~, the operation applies to $_ .
In Perl, if you call almost any function without an argument, it uses $_ . This allows you to avoid using temporary variables (like a stack only one space deep, and still much more useful than no stack). Also, where $_ is created by a foreach loop, it has highly magical scoping.
All you people coming from lesser languages really have no idea!
Mine's the camel-hair one.