* Posts by Ian Joyner

622 publicly visible posts • joined 6 Jun 2014


Classy move: C++ 20 wins final approval in ISO technical ballot, formal publication expected by end of year

Ian Joyner Bronze badge

Re: The old problem of the programmer periscope view of the world..

"> Multiple inheritence is too scary ...

One. Do you actually know how M"

I'm not sure who 'Anonymous Coward' is addressing here. However, the fundamental thing wrong in this answer is the ad hominem attack on 'academia'. This is a problem in C-think: 'oh, we deal in the real world; those academics only give small examples'.

That is because academics see a problem and boil it down to a small example that demonstrates the problem, so it is easily understood. They should not feel that they need to apologise for that. In fact, academics have considered programming in the wide and wild, far wider than most real-world practitioners. So I am very against this faulty C thinking that 'oh things are very nice in academia, but we know what really goes on'. That is garbage thinking.

"So I suggest you go back to your academic textbook MI examples. Because that is the only place you will ever see them."

Textbooks also boil it down to small examples. Orthogonality means that many elements can be combined. Non-orthogonality means things combine in surprising ways – I don't mean with pleasant outcomes, quite the contrary. Now these 'real-world' practitioners might think that non-orthogonality is a necessity in their 'real world', but it is not.

In fact, saying there is this difference between 'real world' and academia is wrong. Programming is not in the real world – it is in the virtual world of electronic circuits. Virtual worlds are what software is about. Software does not deal with physical things – that is why it is so powerful, flexible, and easy to change.

Those 'real world' difficulties and complexity are because someone did not understand this along the way, did not guard against non-orthogonality.

"So how did you learn your "real" OOP skills? From some TA who never wrote a line of shipped code?"

Again I don't know who 'AC' is addressing the remarks to, but this again is an example of the false attitude. As an example and general refutation: my first exposure to this world was being taught Simula by a professor who had worked in the UK with some of the great minds in computing (he did not say so at the time, but I have found out since). The term OO was not mentioned then. I went on to become one of the first adopters of OO in this country, have done languages and compilers and many large software projects, and have studied OO deeply.

Alan Kay noted that when he coined the term 'OO' he did not have C++ in mind. So accusing critics of C++ of just being taught by some 'TA' is nonsense.

"Because in my experience those are usually the only people who think stuff like MI is either important or relevant. Those of us in the the trenches shipping product most certainly don't."

You don't understand MI, or have seen it badly applied (it is easy to abuse MI). But MI allows you to break a system down into even smaller classes and then recombine those abstract classes in very flexible ways. That means a concept is not buried in some larger class, forcing you to repeat code somewhere else.
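To make that concrete, here is a minimal C++ sketch (class and method names are my own invention, not from any real library): two tiny abstract classes recombined in one concrete class, so neither concept is buried inside some larger base.

    #include <iostream>
    #include <string>

    // Two small, single-purpose abstract classes.
    struct Printable {
        virtual std::string print() const = 0;
        virtual ~Printable() = default;
    };

    struct Comparable {
        virtual bool less_than(const Comparable& other) const = 0;
        virtual ~Comparable() = default;
    };

    // A concrete class recombines the small abstractions via MI.
    class Money : public Printable, public Comparable {
        long cents;
    public:
        explicit Money(long c) : cents(c) {}
        std::string print() const override { return std::to_string(cents) + "c"; }
        bool less_than(const Comparable& other) const override {
            // For the sketch, assume 'other' is also a Money.
            return cents < static_cast<const Money&>(other).cents;
        }
    };

    int main() {
        Money a(150), b(200);
        std::cout << std::boolalpha << a.print() << " < " << b.print()
                  << "? " << a.less_than(b) << '\n';
    }

Printable and Comparable stay small and reusable; any other class that needs them inherits them rather than duplicating the code.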

As for 'trenches' – again, that attitude coming up because you are working within badly designed systems.

Ian Joyner Bronze badge

Re: "Competent, core language"

"Pascal is nice to teach programming with, but can't be used in the real world"

Absolutely wrong. I have extensively used Pascal. Now most universities had Pascal IMPLEMENTATIONS that only needed to support students. Both Apple and Burroughs had real-world implementations of Pascal that were extremely useful.

That is not to deny that Pascal's type system was a bit severe and better has been done since. But that does not excuse C's almost complete lack of types – little more than int and float added over typeless BCPL. Wirth moved on to take the lessons from Pascal and create Modula and Oberon.

Others took lessons from Simula and created Smalltalk and Eiffel. C++ took Simula and corrupted it. I don't think at the time Stroustrup had much clue.

The world moved on, but C remained in a time warp along with its followers who believe these myths that nice looking languages are just that way for teaching.

"Seriously not getting your complaint about C/C++ being old, stuck in a quagmire, etc... and then going on to quote languages which are fossilised or only of interest inside academia."

That proves your lack of understanding and narrow view of computing. You need to get rid of this false classification of languages into real world vs teaching or academia. C and C++ have become fossilised, even though C++ comes out with a new standard every three years – each release is still trying to fix the mistakes and move towards those other languages that you consider 'fossilised'.

And you should speak for yourself. Many people in the industry are now questioning the use of this complex language, where the complexity is accidental – built into the language itself – rather than essential, inherent in the problem being solved.

Your response and other responses are based on the cult of C-think of 'other languages are just for beginners', etc. It cannot be simpler – you are wrong.

Ian Joyner Bronze badge

Re: Object

"Programming – making things that do something. This goal can coexist with others, but you are always building a thing that does something."

And in computing that is an intellectual activity. Oh, I get it – C and C++ encourage this kind of anti-intellectual mentality.

Ian Joyner Bronze badge

Re: C++ is great

"The best thing about it is that if you don't like it you don't have to use it!"

It is much more than that. You need to learn what to use and what not to use, and that means understanding why a feature was there, or is still there. More than with any other language, you have to understand the history of C++.

C++ is non-orthogonal, which means you might end up using a feature in a way you didn't anticipate.

https://dl.acm.org/doi/pdf/10.5555/1352383.1352420
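One small, well-known illustration of the kind of surprise I mean (my own example, not one from the paper above) is the 'most vexing parse', where declaration syntax and initialisation syntax combine unexpectedly:

    struct Widget {
        int value() const { return 42; }
    };

    int main() {
        Widget w();   // surprise: this declares a FUNCTION named w returning
                      // Widget, not a default-constructed object
        Widget x{};   // C++11 brace initialisation was added to work around it
        return x.value() == 42 ? 0 : 1;
    }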

Ian Joyner Bronze badge

Re: C++ – never classy

"Can't see why. They both express concepts and change to express new concepts... or die from lack of use."

Because you seem locked in to the C-think cult. C changes very little.

https://www.quora.com/Will-C-get-new-features-in-the-future

C++ just keeps trying to fix the flaws.

http://trevorjim.com/c-and-cplusplus-are-not-context-free/

https://cacm.acm.org/magazines/2018/7/229036-c-is-not-a-low-level-language/fulltext

https://dl.acm.org/doi/pdf/10.5555/1352383.1352420

Like most bad technologies, they become black holes of lock-in. With other languages and technologies, practitioners seem able to move on.

Ian Joyner Bronze badge

Re: "Competent, core language"

"By Burrows system languages you mean ALGOL. And then you complain that C++ is old and tired."

Well, you don't even bother to spell Burroughs right.

ALGOL is still a more modern language than C or C++. But ALGOL moved on. It moved to Simula, ALGOL-68, Pascal, and then many more. C and C++ are just stuck in the same quagmire.

The worst technologies lead to lock-in. That was the case with IBM, and it is certainly now true for C and C++. The others have moved on.

Even so, the Burroughs languages make C look like a toy.

Ian Joyner Bronze badge

Re: C++ seems to generate a lot of hate among the people who failed to learn it properly

Don't excuse posts exposing what a terrible language C++ is as 'hate posts'. Saying C++ is a terrible language has nothing to do with hate – it simply is one.

"This reminds me a lot of people, who cannot swim talking to me about the dangers of water bodies..."

No, that analogy does not apply to programming languages. The warnings about problems in languages are technically based and have technical merit – and C++ gives many opportunities for criticism.

Those criticisms are most likely from people who understand programming much better than you do – like an Olympic gold-medal swimmer saying that swimming over Niagara Falls will most likely end in death. Now that is a good analogy.

Ian Joyner Bronze badge

"Some of these features sound like it's taking ideas from Rust"

Most of C++ came from Eiffel – multiple inheritance (C++ got that wrong), templates from generics (C++ got that wrong), contracts – oh wait until 2023 for C++ to get that wrong.

Ian Joyner Bronze badge

Re: Object

"not for intellectual genitalia waving - which is why I don't code in Z"

Sorry, but programming is an intellectual activity. We reason about and model the real world. So I think that is a silly comment.

Ian Joyner Bronze badge

Re: Object

I agree you don't want to program in Z. However, there is a lot to be learnt from it. I met Jean-Raymond Abrial when he was a guest of Bertrand Meyer at an OO conference. He was Meyer's teacher, and much of Z and its ideas made it into Eiffel, particularly Design by Contract.

Contracts still won't be in C++ until 2023, maybe. A generation of programmers have died waiting for C++ just to get it wrong (again).
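For anyone who has not seen Design by Contract: an Eiffel routine declares require (precondition) and ensure (postcondition) clauses that are checked at run time. In C++ today you can only emulate that by hand; a rough sketch with plain assertions (my own emulation, not any proposed C++ contract syntax):

    #include <cassert>
    #include <cmath>

    double checked_sqrt(double x) {
        assert(x >= 0.0);                // require: non-negative input
        double result = std::sqrt(x);
        // ensure: result squared gives back x (within rounding)
        assert(std::fabs(result * result - x) < 1e-9 * (x + 1.0));
        return result;
    }

    int main() {
        return checked_sqrt(9.0) == 3.0 ? 0 : 1;   // sqrt(9) is exactly 3
    }

In Eiffel the contract is part of the routine's published interface and is inherited; hand-rolled assertions give you neither.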

Ian Joyner Bronze badge

Re: C++ – never classy

"except for the lowest level machine interfaces in which C + asm shine"

Not at all. C is now constraining and hobbling processor design in the same way it has hobbled programming for a long time.

https://cacm.acm.org/magazines/2018/7/229036-c-is-not-a-low-level-language/fulltext

Ian Joyner Bronze badge

Re: C++ – never classy

"Then again, a language that doesn't change is dead"

I agree that languages can improve. But C++ changes to fix the mistakes – that is different and not a measure of anything good.

"that goes for both computing and spoken languages"

No that is not a good analogy. Programming languages are very different to spoken languages.

Ian Joyner Bronze badge

Re: "Competent, core language"

"You are quite correct, C is not a perfect language. There aren't any."

So you can invent any old garbage you want because you can always say others have small flaws? That is a non sequitur.

"However, C is a very good first approximation of a systems programming language."

Actually, C is a terrible systems language. A systems language should only be used for systems, not for applications. Where a systems language is used for applications, it means the systems programmers have failed to do their job. They probably don't even know that is their job.

"In fact, it is such a good first approximation that nobody has bothered with a second"

Burroughs system languages blow C out of the water and make C look like the toy it is. I know your response will be 'who uses Burroughs'. But the one thing about Burroughs system languages is that you only use them for the OS, or middleware, not everywhere as is expected of C.

Also Burroughs languages were around half a decade before C. Did you ever stop to wonder where the idea of writing system software in structured languages comes from, and where #define came from? From Burroughs, from an idea Donald Knuth gave them.

"at least not with any great degree of sincerity"

Oh, Burroughs languages are far more sincere than C. There have been many more sincere languages than C or C++, but they get roundly bashed by C/C++ people. Others are getting sick of C and C++ – they are now old and tired.

Ian Joyner Bronze badge

Re: Is C++ becoming too large and complex?

"If a language is simple enough for you to know all of it then its not complicated enough for modern use."

That is complete nonsense and completely misunderstands computing, programming, and languages.

Languages and tools should be simple – it is the problems we express in those that may be complex.

Complex solutions are most often wrong. And that is C++.

Ian Joyner Bronze badge

Re: Is C++ becoming too large and complex?

"So you would rather C++ be unchanging".

I don't see what you are getting at at all. Languages should be well designed in the first place. Even well-designed languages can undergo change. Even Eiffel has. Alan Kay has said he expected that Smalltalk would have been improved on by now.

But C++ started in a particularly bad place and has needed massive improvements, and practitioners have had to follow all the machinations of those 'improvements'. The claim that C++ is evolving and becoming better is a pathetic excuse for what was not a very good language in the first place.

C++ rode on the popularity of C, and the popularity of C is also misplaced.

Ian Joyner Bronze badge

C++ fails by its own measures

"Good C++ code should be easy to understand, according to Stroustrup. “One of the measures of good code is that I can understand it,” he said."

Ian Joyner Bronze badge

Re: Is C++ becoming too large and complex?

The problem with not knowing the whole language is that the thing you forgot will come and hit you in the head. This is because of non-orthogonality.

See my comment 'Evolving' for why this is not a good thing.

Ian Joyner Bronze badge

Evolving

A fundamental problem with C++ is that practitioners need to understand the history and evolution of C++. I have two books here on my desk: "The Evolution of C++" edited by Jim Waldo, and Stroustrup's own "The Design and Evolution of C++". While it is interesting to know the background and thinking of any language, C++ is the one language that demands understanding the evolution – why this or that construct is there, what you should use and what you should avoid.

Different people have different interpretations, likes and dislikes. Different shops will develop their own style rules based on their understandings of C++.

C++ as an evolving language is not a good thing. It is an excuse for the fact that C++ still has not got things right – things that were got right 50 years ago in Smalltalk and 30 years ago in Eiffel and other languages, which C and C++ people continually dismiss (and not very nicely).

C++ is like going back 100,000 years and saying there are modern humans, but let's go back a million years and evolve from there. It is time the computing industry lost patience with the approach of C++.

New programmers should absolutely not be taught C++ (or even C, or Java). The fundamentals of programming and computing should be taught with clean languages – and clean languages with semantic checks are not just beginner's languages with 'training wheels'. Semantic checks are a most advanced aspect of computing, not just to be dismissed as training wheels. Also simple languages can be used for all levels of programming.

Programming can use the simplest of tools to build the most complex of systems. C++ is a tool that thinks it must have the complexity built into the tool. This is fundamentally wrong.

Ian Joyner Bronze badge

Too large and complex

"Is C++ becoming too large and complex?

It is an intimidating language"

It passed that a long time ago. Programming is based on a few simple notions. Those who think it must be complex and build complex languages are going down the wrong path.

The two good languages that define the extremes of OO are Smalltalk and Eiffel: Smalltalk shows how simple you can get with the notion of messaging, and Eiffel is sophisticated and implements everything cleanly from the ground up – whereas C++ kludges everything in.

"Stroustrup said that C++ 20 is “the best approximation of C++ ideals so far”, and that C++ 23 will be better still."

The ideal of C++ has always been just to perpetuate C++. The ideals of programming and OO are already there in Smalltalk and Eiffel and people should look at those languages to understand programming and OO.

Ian Joyner Bronze badge

Re: Object

OO does have some reasonable definitions. Particularly Eiffel has a better basis:

OOSC (Eiffel)

It is a more detailed look at software development for professionals, but at least the first section will explain what we should be trying to do (which is programming, not coding).

https://archive.eiffel.com/doc/manuals/technology/oosc/page.html

Cardelli and Abadi have developed a more mathematical basis for OO:

http://lucacardelli.name/Papers/PrimObjSemLICS.A4.pdf

https://dl.acm.org/doi/book/10.5555/547964

Of course relational databases have a stronger mathematical foundation, that of relational algebra and calculus, but with OO you can build your own algebras in a class. So OO is more general than relational.
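As a small illustration of building your own algebra in a class (a toy example of mine): a rational-number type whose + and * are closed over the type.

    #include <iostream>

    // The class defines a carrier set (Rational) plus operations
    // (+, *) closed over it - an algebra of the programmer's own making.
    class Rational {
        long num, den;
    public:
        Rational(long n, long d) : num(n), den(d) {}
        Rational operator+(const Rational& r) const {
            return Rational(num * r.den + r.num * den, den * r.den);
        }
        Rational operator*(const Rational& r) const {
            return Rational(num * r.num, den * r.den);
        }
        friend std::ostream& operator<<(std::ostream& os, const Rational& r) {
            return os << r.num << '/' << r.den;
        }
    };

    int main() {
        Rational a(1, 2), b(1, 3);
        std::cout << a + b << ' ' << a * b << '\n';   // 5/6 1/6
    }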

Christopher Strachey was trying to give language development a more formal approach. He developed CPL; Martin Richards's BCPL was a temporary cut-down version of it, and BCPL in turn was adopted as the base of C.

https://www.cs.cmu.edu/~crary/819-f09/Strachey67.pdf

However, I see little hope for C++.

Ian Joyner Bronze badge

Contracts

"Contracts, a feature once slated for C++ 20, has been moved out of this release and into a study group, as they were not ready."

Still not there. Like multiple inheritance and generics, C++ is copying contracts from Eiffel. Eiffel has had contracts for 30 years. Eiffel did most things right over 30 years ago. C++ is still trying to do badly most of the things Eiffel did cleanly in the first place. Eiffel took a more integrated approach to the whole of the software development lifecycle.

The clean and constrained generics of Eiffel became the horror of templates in C++.

https://www.eiffel.com/values/design-by-contract/introduction/
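On the generics point, for comparison: Eiffel declares the constraint up front, as in class SORTED_LIST [G -> COMPARABLE]. C++ only reached something similar with C++20 concepts. A minimal sketch of the C++20 form (my own example):

    #include <concepts>
    #include <iostream>

    // The constraint is stated in the interface, Eiffel-style,
    // instead of erupting as an error deep inside the template body.
    template <typename T>
    concept LessComparable = requires(T a, T b) {
        { a < b } -> std::convertible_to<bool>;
    };

    template <LessComparable T>
    T smaller(T a, T b) {
        return (a < b) ? a : b;
    }

    int main() {
        std::cout << smaller(3, 7) << '\n';   // OK: int satisfies the concept
        // smaller(std::cout, std::clog);     // would be rejected at the call site
    }

It took C++ until 2020 to state in the interface what Eiffel stated over 30 years ago.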

Ian Joyner Bronze badge

Coroutines?

"Coroutines, functions that “can suspend execution to be resumed later,” used for asynchronous programming."

Simula 67 had coroutines. Yes, back in 1967. More than 50 years for C++ to catch up. Why did Stroustrup try to take the ideas of Simula and apply them to C?
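For those who have not met them, here is roughly the shape of the feature in C++20 – a minimal generator that suspends at each co_yield and is resumed by its caller (my own sketch; real code would wrap this in a reusable library type):

    #include <coroutine>
    #include <exception>
    #include <iostream>

    struct Generator {
        struct promise_type {
            int current = 0;
            Generator get_return_object() {
                return Generator{std::coroutine_handle<promise_type>::from_promise(*this)};
            }
            std::suspend_always initial_suspend() { return {}; }
            std::suspend_always final_suspend() noexcept { return {}; }
            std::suspend_always yield_value(int v) { current = v; return {}; }
            void return_void() {}
            void unhandled_exception() { std::terminate(); }
        };
        std::coroutine_handle<promise_type> h;
        explicit Generator(std::coroutine_handle<promise_type> hh) : h(hh) {}
        Generator(const Generator&) = delete;
        ~Generator() { if (h) h.destroy(); }
        bool next() { h.resume(); return !h.done(); }
        int value() const { return h.promise().current; }
    };

    Generator counter(int limit) {
        for (int i = 0; i < limit; ++i)
            co_yield i;            // suspend here; resumed by next()
    }

    int main() {
        auto g = counter(3);
        while (g.next())
            std::cout << g.value() << '\n';   // prints 0 1 2
    }

Simula's detach and resume did essentially this in 1967.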

Ian Joyner Bronze badge

C++ – never classy

It has been over 30 years since the original 'C with Classes', which was just a bunch of #defines to emulate the classes of better OO languages (mainly Simula). I did the same with ALGOL (using Burroughs define … #, which is where defines came from in the first place, from Don Knuth's suggestion), but really it was not a good way to do things.
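To show the flavour of that kind of emulation (purely my reconstruction, not Stroustrup's actual macros), something in this spirit compiles as plain C:

    #include <stdio.h>

    /* A crude 'class' emulation with #define - my reconstruction
       of the general approach, not historical code. */
    #define CLASS(name) typedef struct name name; struct name
    #define METHOD(cls, fn) cls##_##fn
    #define SEND(obj, cls, fn) METHOD(cls, fn)(obj)

    CLASS(Point) { int x, y; };

    void METHOD(Point, print)(Point* self) {
        printf("(%d, %d)\n", self->x, self->y);
    }

    int main(void) {
        Point p = {3, 4};
        SEND(&p, Point, print);   /* expands to Point_print(&p) */
        return 0;
    }

No inheritance, no dynamic dispatch, no encapsulation – which is exactly why it was not a good way to do things.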

The whole idea of merging OO with C was not classy, but tasteless. More than 30 years later they are still trying to get this language right. I contend IT NEVER WILL BE.

Instead of getting things right C++ just perpetuates the things that are wrong – the things that are wrong with both system programming based on C and application programming (which should have nothing to do with C).

C made compromises back in 1969, excusably because they wanted things that would work in a small environment. That soon became irrelevant. Other compromises were inexcusable.

C++ tries to fix some of these compromises but perpetuates others. To do this is not at all classy, but tasteless.

Struggling company pleads with landlords to slash rents as COVID-19 batters UK high street. The firm's name? Apple

Ian Joyner Bronze badge

Takes a big company to represent small interests

It is not just Apple – individual shop owners also need rents reduced because their shops are loss makers. For a big company, if a shop is losing money, it will be shut. Apple is making this issue more visible for all. Landlords have a long history of greed.

Apple warns developers API tweaks will flow from style guide changes that remove non-inclusive language

Ian Joyner Bronze badge

Years ago one of our hardware guys wrote a basic program with an instruction "Depress the return key" – Apple did not like that. I don't blame them.

Ian Joyner Bronze badge

Re: Yeah, that's going to work out alright!

Vi – an outdated text editor. Some programmers can never get away from such primitive tools.

Macs, iPhones, iPads to get encrypted DNS – how'd you like them Apples?

Ian Joyner Bronze badge

Bias

Whenever Reg reports on Apple, it has to make some derogatory remark, like idiot tax or 'fanboi', etc.

Journalistic integrity means being able to report the facts, perhaps with some intelligent editorial assessment. The Register continually fails at this.

Apple to keep Intel at Arm's length: macOS shifts from x86 to homegrown common CPU arch, will run iOS apps

Ian Joyner Bronze badge

Re: ARM?

Who are these sheeple? And what have they got to do with anything here?

The term sheeple is judgemental and just used by people who think they are superior to everyone who doesn't see things their way.

iPadOS 14: Apple's attempt to pry fondleslab from toddlers' mitts and make it more businesslike

Ian Joyner Bronze badge

Re: Samsung Galaxy Note

Not saying – just trolling.

ALGOL 60 at 60: The greatest computer language you've never used and grandaddy of the programming family tree

Ian Joyner Bronze badge

Re: Ugh! PASCAL

C – game over? If you like weak toys. C has too many flaws. It is not a solid language like ALGOL.

Ian Joyner Bronze badge

#define

Burroughs ALGOL originally implemented define … # from a suggestion by Donald Knuth. This was adopted in C as #define (# introducing the define rather than terminating it). Like many things copied in this industry, the original is usually better, and Burroughs defines are much better than C defines.
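A small C example for comparison (the Burroughs spelling above, define … #, is as I remember it):

    #include <stdio.h>

    #define LIMIT 100                 /* '#' introduces the define in C */
    #define SQUARE(x) ((x) * (x))     /* parentheses guard against expansion surprises */

    int main(void) {
        printf("%d %d\n", LIMIT, SQUARE(LIMIT));   /* prints 100 10000 */
        return 0;
    }

The extra parentheses in SQUARE are exactly the sort of fragility that textual macro expansion forces on C programmers.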

Ian Joyner Bronze badge

Re: Wirthless

Yes, Burroughs system languages beat C hands down. C looks like the toy it is in comparison. And MCP makes Unix look like a toy.

Burroughs machines are secure – they enforce boundary protection at the smallest level, with none of the pointer overruns or out-of-bounds indexing of the weak C language. Bounds protection is done at the hardware level so it cannot be subverted by C or assembler, as it can on other systems.

Ian Joyner Bronze badge

Re: Old languages

Not just merit – frequently better than current widespread languages.

Ian Joyner Bronze badge

BNF or EBNF is still used to better define language syntax; otherwise you end up with non-context-free messes like C (luckily C is simple, or it would not be predictable).

Other techniques, like denotational and axiomatic semantics, are used to define the semantics of language constructs.

BNF can be used not just for languages but for any system design.

I even use EBNF to present topic and lecture organisation to students.

BNF has a somewhat arcane syntax that was cleaned up, simplified and extended by EBNF.
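For instance, a tiny expression grammar in EBNF (a toy example of mine) – the braces for repetition and the grouping are exactly what raw BNF lacks:

    expression = term, { ("+" | "-"), term } ;
    term       = factor, { ("*" | "/"), factor } ;
    factor     = number | "(", expression, ")" ;
    number     = digit, { digit } ;
    digit      = "0" | "1" | "2" | "3" | "4" | "5" | "6" | "7" | "8" | "9" ;

In raw BNF each repetition would need an extra recursive rule.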

Ian Joyner Bronze badge

Not a language of card and paper tape

While the videos sort of tie ALGOL to paper tape and cards, ALGOL is far in advance of these long-gone technologies.

Burroughs systems were much more disk-based, like modern systems. When Edsger Dijkstra visited the plant in Pasadena he was amazed that his ALGOL program compiled in seconds. He wanted several Burroughs systems for Europe, but Ray Macdonald, then CEO of Burroughs, did not want to sell systems to Europe at that time.

Ian Joyner Bronze badge

A lot of people program in ALGOL 60

There is a lot of programming done in ALGOL 60 on Burroughs machines (now Unisys MCP machines). Burroughs extended the language with IO, some of it an extension of FORTRAN-style IO, but most programmers did their own direct formatting, since the FORTRAN IO was interpreted and slow.

I believe Burroughs ALGOL was based on Elliott ALGOL. I later worked with another Burroughs guy who developed a language based on Elliott ALGOL for the Apple II. This was at a company called Netcomm in Australia.

Actually Don Knuth wrote one of the first Burroughs ALGOL compilers on a summer break as a student. It was on the B200 (from memory), but it predated the B5000 ALGOL compiler.

Burroughs ALGOL is a really heavy-duty systems language that makes C look like the toy that it is. It is a shame that C has effectively killed language development. If ALGOL or its next generation (ALGOL-68, ALGOL-W, Pascal, CPL, etc) had continued, we’d probably have pretty solid languages by now, rather than the rather flimsy C.

Ian Joyner Bronze badge

Andy Herbert

I met Andy Herbert a few times at ISO Open Distributed Processing (ODP) meetings. I doubt he’d remember me.

The iMac at 22: How the computer 'too odd to succeed' changed everything ... for Apple, at least

Ian Joyner Bronze badge

I was there

I remember the Bondi Blue iMac being put on display in a roped-off area so you could see it and not touch it.

I shared my room with a teacher from a school (Apple Australia asked if I wouldn't mind sharing) who was worried about Apple dropping the 3.5” drive. Well, he pulled this trick of asking me to settle the hotel bill, saying he'd fix me up later. He had not by the time we left, and after six months of emails, I gave up. I hope he is ashamed if he is reading this!

In 2000 Apple paid for the room and I had it to myself, whereas others shared because Apple was paying.

Apple funnels Worldwide Developer Conference 2020 through iOS app, website amid coronavirus lockdowns

Ian Joyner Bronze badge

An Apple event Reg can attend

Enjoy!

Latest Apple gadget's production behind schedule, will come out at one month past iPhone 12, reportedly

Ian Joyner Bronze badge

Apple sales down due to long-lasting hardware

So Apple do the right thing – design phones that last a long time, so people aren't continually forced on to a new model – and the Reg spins that as an 'iPhone sales slump'. So those disparaged Apple 'fanbois' aren't rushing out just to buy the latest new thing.

It is a maturing market, and Apple is leading the way. But other companies will still be selling stuff that breaks, so users have to upgrade more.

Apple: We respect your privacy so much we've revealed a little about what we can track when you use Maps

Ian Joyner Bronze badge

What the others actually do

Not just what the others are capable of – what they actually do. With Android you do not actually own the device; the device exists for Android vendors to monitor you. With iOS, you own it. Apple says they do not use it in the same way others do – and if they were shown to do so, they could be taken to court and sued.

And yes, I do believe we should hold Apple to their word on this and not let them go down the path of Google, Amazon, etc – their intrusion into our lives is pernicious.

Ian Joyner Bronze badge

Re: Shocking

That was about five years ago. You still think that is news?

Ian Joyner Bronze badge

Re: Shocking

Yes, I use Apple Maps. What are you trying to say? I suspect nothing – just casting aspersions on Apple with FUD.

Furthermore, I do not use Google Maps for a very good reason – I do not trust Google. And that is based on what Google are known to do.

2020 MacBook Air teardown shows in graphic detail how butterfly keyboards were snipped for scissor switch

Ian Joyner Bronze badge

Re: Stop insulting your readers

And IBM brought out the PC in an attempt to kill Apple, so Apple was responsible for that.

Ian Joyner Bronze badge

Stop insulting your readers

A lot of people here realise spending a little extra for Apple is a good investment. They are not idiots.

Let’s consider some UK companies – Rolls Royce (oh well BMW now), Dyson – 10 times the price of a regular product?

Apple has brought down the cost of personal computing. As always happens when others see success, the most obvious thing they can do is sell a cheaper less durable product and undercut on price.

Stop making the obviously political point all the time of ‘idiot tax’. It is both wrong and childish and undermines the Register’s credibility.

Apple updates iPad Pro with a trackpad, faster processor. Is it a real computer now?

Ian Joyner Bronze badge

Professionals?

And who are these professionals? Many professionals have no idea. If it were up to them, the GUI developed at Xerox in Silicon Valley, and then brought to us by Apple, would not exist and we'd still be doing things on the command line. Computer professionals more often than not stand in the way of progress. They have been taught that a computer looks like this, that processor architectures are supposed to be RISC, CISC or whatever. They can't stand anything that is different. They used to be IBM people (even when IBM had really inferior products), then they went to Microsoft – all with the same single-vendor prejudices.

Take what a lot of computing professionals say with a grain of salt – they have been fed and taught a lot of false notions. Computing should really be about imagining what could be, not being stuck in the past of what is or was.

Ian Joyner Bronze badge

Trackpad is a misunderstanding

1) On a touchscreen device – THE SCREEN IS THE TRACKPAD.

2) Apple has insisted that apps are specifically developed for iOS, not just lazily ported from Mac or elsewhere. That means apps must use the touch facilities and USE THE SCREEN AS THE TRACKPAD.

3) Microsoft has taken the lazy approach of just putting Windows on its pads. Yes, that suits developers – no rework for a new form factor – but the user pays for it in a device that is not as easy to use. Then you get this nonsense about "no trackpad – it is not a real computer". That is garbage thinking, misleading the masses. THE SCREEN IS THE TRACKPAD.

4) When using a device you should not think of it as a computer or even be aware it is a computer. The whole criticism of iOS devices not being a computer is pure ignorance on the part of those who say it. The fundamental lesson of computer science is abstraction – making one machine look like something else (a Universal Turing Machine can emulate any Turing machine). Don’t buy a computer, buy a device that will run useful applications for you. Users should not have to do those computer-type things like set IP addresses, etc. Doing computer things on the device makes it less useful – NOT MORE USEFUL.

I hope Apple aren't losing focus on this, making things easy for lazy developers instead of standing up for the end user. The computer industry is NOT for computer people but for the end user, who should need to know very little about the workings of a computer. If users must know more than that, we computer specialists have failed.

Oh, we may have found the COVID-19 silver lining: Coronavirus pandemic halts Xerox hostile takeover of HP

Ian Joyner Bronze badge

They could have owned the industry

Xerox could have ruled the industry with their PARC technology. That would have been the good way to do it. Instead we are stuck with interminable takeovers, which contribute nothing to the advancement of technology or humanity – only to shareholder value.

One for the super rich fanbois: Ultra-rare functional Apple-1 computer goes on auction

Ian Joyner Bronze badge

Typical stupid comment from idiotic Apple hater.

Larry Tesler cut and pasted from this mortal coil: That thing you just did? He probably invented it

Ian Joyner Bronze badge

A gentleman

I met Larry Tesler once and he was a very softly spoken and lovely guy. Sad to know he is gone.
