* Posts by Ian Joyner

619 posts • joined 6 Jun 2014


Apple Watch Series 6 isn't a step back for repairability but in its own way that's a leap forward

Ian Joyner Bronze badge

Re: Watch repairs?

"Stop trying to put words in my mouth." I didn't say you'd said that, but others would use the same words to imply just that. "Seems we both agree it is not made to be repairable." So I'm glad we seem to agree.

Ian Joyner Bronze badge

Re: Watch repairs?

"It is not made to be repairable."

The suggestion that it is a deliberate dark tactic is wrong. It really is the only way this kind of device can be built. Thus it is that way by nature, not by deception.

Ian Joyner Bronze badge

Re: Watch repairs?

"Yes they were very repairable every part could be removed and replaced"

You have a point, to an extent. But electronic components are microscopic and soldered for greater reliability. Your counterexamples don't disprove the fact that most watches weren't at all user-serviceable.

Ian Joyner Bronze badge

Watch repairs?

Cast your mind back to when watches were driven by tiny cogs and springs. Were they very repairable?

No, you had to take them to a specialist who had a tool to unscrew the back, and then fiddle with the insides.

For anyone else, attempting that would have ended in disaster.

Ian Joyner Bronze badge

Hyperbole?

"usual fanboi whooping but heavy on the hyperbole"

That's patronising hyperbole on the part of The Register.

Surprise! Apple launches iOS 14 today, and developers were given just 24 hours' notice

Ian Joyner Bronze badge

Re: Deep Analysis

DaLo "that is what it is saying in the article"

That is your interpretation. OK, the sentence in the article – "Android devices have had this for some time and it must be implemented by developers on an app-by-app basis" – is badly written, but it seems only to be referring to Android, not to be making a claim about Apple's PiP.

Regardless of this fact, my original point still stands – that often the competition rush out a half-baked version of things, and Apple takes time to get it right.

Ian Joyner Bronze badge

Re: Deep Analysis

"So your "deep analysis" is stating that others create a "half-baked version" and…"

Sorry DaLo, what you say in that post makes so little sense I don't even know where to start responding to it. What did you not understand about Android having PiP because it was provided in individual apps: "Android devices have had this for some time and it must be implemented by developers on an app-by-app basis" (so long as The Register's reporting is correct)?

"Which even with some "light analysis" is obviously talking about Apple developers needing to implement it on an app-by-app basis."

That is not what at least the Reg article suggested at all.

Ian Joyner Bronze badge

Title Misleading (no not this one!)

"Surprise! Apple launches iOS 14 today, and developers were given just 24 hours' notice"

No, they have had the beta for several months, since WWDC. Everyone has known iOS 14 was imminent.

The last paragraph addresses this but relies on a couple of tweets by unknown people. The first:

"Apple announcing the iOS 14 release date 24 hours ahead of time as a summer ending gift to developers"

Well, that does not sound bad. If Apple had said "well, it will be available next month", people could have complained about the wait. In fact, people are used to this kind of announcement: an expectation of release (set at WWDC), then a sudden 'we have it'.

The reply tweet says "@KarlCheeseAji Apple really really hates developers"

That's it. So what? Maybe KarlCheeseAji is one of those people who trolls the net posting anti-Apple stuff.

It is not true anyway. But The Register uses a single tweet as absolute proof. Poor journalism.

I won't say it is a trouble-free relationship. Apple, like technology, moves fast. Personally, I have been caught a few times by this. But Apple does a whole lot to support developers too. One developer has been very successful – Microsoft. But Microsoft is also responsible for Apple being careful about what it lets out, because it is well known that Gates took advantage of Apple's early openness.

Ian Joyner Bronze badge

"There goes Apple, inventing things that already exist again."

A bit more deep analysis needed here. Other companies compete with Apple. They get an idea (maybe because Apple is working on it), and rush a half-baked version to market.

Apple takes its time and gets it right. The Register article says:

"Meanwhile, the Picture-in-Picture feature allows users to watch videos and place voice calls while using other applications. Android devices have had this for some time and it must be implemented by developers on an app-by-app basis"

Note that PiP is in Android because app developers have done it. It could be the same on iOS, but developers probably don't bother because they know Apple is doing it independently of apps.

Ian Joyner Bronze badge

Re: Shocked, shocked I say

Some Japanese guy posts a tweet that only says "Apple really really hates developers", and The Register uses this as absolute proof.

I won't say it is a trouble-free relationship. Apple, like technology, moves fast. Personally, I have been caught a few times by this. But Apple does a whole lot to support developers too. One developer has been very successful – Microsoft. But Microsoft is also responsible for Apple being careful about what it lets out, because it is well known that Gates took advantage of Apple's early openness.

Who cares what Apple's about to announce? It owes us a macOS x86 virtual appliance for non-Mac computers

Ian Joyner Bronze badge

Re: Apple Vaporware

'Apple vaporware prior to rollouts and the Pavlovian reaction of tech blog writers and fanbois has become something between a social phenomenon and a bad joke.'

Vapourware is preannouncing products. That is not what Apple does. So I think your comment is wrong (it is not very clear what you are getting at).

Ian Joyner Bronze badge

Re: Cloud-based virtual machines

"they don't see themselves as a software company and the software they do produce is with one intention - to sell Apple hardware."

Wrong – a software company is exactly what Apple is. They keep control of the hardware to support the software. Hardware is secondary to software. You can't trust hardware to hardware people and companies.

https://www.scribd.com/doc/61812037/Barton-B5000

Ian Joyner Bronze badge

Re: Hackintosh

"That said, the move to soldering everything is a real PITA"

No, soldering is more reliable than pluggable connections.

Ian Joyner Bronze badge

Industry Standards

"Microsoft and have made their hardware a little more open and/or standards complaint, paving the way for open source drivers that can potentially be written for any hardware-OS combination."

"Could Apple's jump to ARM be enough to cause not only a swing away from their products but also a reset of the 'industry standards' on some of the expensive propriety software and especially the property data-interchange formats that currently mean that switching software but isn't an option in some professions?"

I think the writer of the above does not understand industry standards. Industry standards are not whatever is most widely used. Industry standards are data layouts and protocols for interoperation, so that there is as little lock-in as possible.

It does not mean that Intel, as the most widely used, is the standard that vendors must use. That would forever hold the industry back. Standards should be as minimally prescriptive as possible.

Ian Joyner Bronze badge

Walled Garden? Linux?

People keep throwing in this now pejorative phrase 'walled garden'. This is nonsense.

The basis of security is to set boundaries and respect boundaries.

Actually, it is more than respecting boundaries: it is enforcing them. A platform and OS must enforce boundaries.

Some spread the myth that such controls are against freedom. For some this is childish, for others naive, for some dishonest – at the very least it is a complete misunderstanding of security.

Linux is also overrated. I think it is a good OS, but its adherents push the idea that it is technically superior to anything else. That is not true. Linux has traded security for performance. And that is why MacOS is based on a better version of Unix for end users. Mind you, I also think Unix is overrated, but at least the Mach, Darwin, and BSD underpinnings of Apple's OS are better than Linux.

Ian Joyner Bronze badge

Re: IBM seems to have done very well over the years

IBM have done well out of using the dirtiest tricks in the industry. Read Richard DeLamarter's 'Big Blue: IBM's Use and Abuse of Power'.

The financial world would be much better off using Burroughs mainframes from Unisys, with what is perhaps still the best processor architecture in the business. In fact, the ideas from these machines influenced Apple, because Alan Kay, who invented overlapping windows, was a student of the Burroughs designer Bob Barton. Burroughs machines were the first exclusively programmed in HLLs, and Burroughs' extensions to ALGOL make C look like a toy (which it is).

IBM systems saw people as being peripherals to the system. That was broken by Silicon Valley putting people in control of computers.

Burroughs systems take a fraction of the staff of IBM systems. But complexity ensures lock-in in this industry, and that is a very bad thing.

IBM eventually lost because it brought out the awful IBM PC which was only to crush Apple.

IBM doing well over the years has been to the detriment of the industry and of computing as a way of exploring what computing and humanity are.

Mind you IBM has done some very good stuff, like relational databases, but then somewhat wrecked that invention with SQL.

Ian Joyner Bronze badge

Why listen to someone who won't buy Apple anyway?

Why would Apple listen to you, when you probably won't buy anything from Apple anyway? Are you admitting that MacOS might be more desirable to run than your beloved Linux, Android, or Windows?

The best computing experience is from integration – integration of both hardware and software. The best security is also from integration.

Bob Barton – the original 'think different' guy – realised this. He went on to teach Alan Kay and others who really invented today's industry and influenced Apple in no small way.

https://www.scribd.com/doc/61812037/Barton-B5000

Classy move: C++ 20 wins final approval in ISO technical ballot, formal publication expected by end of year

Ian Joyner Bronze badge

Re: C++ – never classy

"It seems to me you know very little about the object of your vitriol." This is the typical response that tries to recast legitimate problems with C++ as vitriol on the part of those who raise them.

"Now there is an alternative to #include, there is no reason to use #defines either as the last use case, guard defines, are rendered redundant. Obvs."

No, they are orthogonal. You confused #include with #define in the first place.

Ian Joyner Bronze badge

Re: @Ian Joyner

"I'm still not Jake. Jake is one of at least two other entities here on ElReg."

Well, don't post as Jake.

Ian Joyner Bronze badge

Re: C++ – never classy

Someone Else: "Well then, I bet you're crying in your beer that C++ has effectively deprecated #defines with modules...."

Wait a minute – modules replace #include, not #define.

Ian Joyner Bronze badge

Re: Is C++ becoming too large and complex?

Man inna barrel

"Languages get messy because life is messy."

No, that is absolutely the wrong approach to languages.

Yes, the problems are complex. But that (essential) complexity should remain in the problems. We should solve complex problems with simple tools – paradoxical as that sounds – not accept that "languages get messy because life is messy." When the languages themselves get messy, that is accidental complexity, and you end up wasting a lot of time dealing with complexity that is not necessary.

"C is actually a very good language, because it provides just enough abstraction from the underlying hardware"

Which underlying hardware are you talking about? This is actually not true, but it is one of the simplistic maxims of C. I have ported large C systems (from well-known vendors) to platforms that C does not abstract well at all; C needs its own sandbox in that case. Mind you, I knew the internals of that C compiler (written in Pascal!) and it was very well implemented.

"With C++, I am still learning C++11, so I am way behind. I lapped up many of the new concepts, but I still have a lot to learn."

Well, why this process of learning each step of C++? This is why I say that to understand C++ you actually need to understand its history and evolution, much more than with any other language.

Ian Joyner Bronze badge

Re: @Stjalodbaer-- Object

"Like most programming languages, it has no formal logical or mathematical basis and is just a bundle of nostrums, as it seemed good in someone’s judgement."

Someone Else: "So a language that can be formally defined by BNF 'has no formal logical or mathematical basis', and is 'just a bundle of nostrums'?"

Wow! Just plain Wow...

So what constitutes a "formal logical or mathematical basis", in your nsHO?

BNF is only for syntax. A language should also be semantically rigorously defined. The problem with C++ is that it lacked this at the beginning. Strachey (the designer of CPL, the forerunner of C) did work on denotational semantics.

And insofar as you can shoehorn C++ into BNF, that does not actually prove a formal syntax – the grammar was retrofitted as well.

http://trevorjim.com/c-and-cplusplus-are-not-context-free/

Why talk about these language-lawyer and formal things? Because if you get them right, the language is simpler for everyday programmers. When they are wrong, you get all sorts of problems and headaches. Complexity is a pain, and C++ inflicts a lot of it.

Ian Joyner Bronze badge

Re: C++ – never classy

Me: "Now here's the thing. People who have used such other languages and systems seem to be able to move on."

"Yes, and they seem to have moved on to C and C++...."

Like all matter gets sucked into a black hole.

Lock-in is the second-worst problem in the industry after security – which C and C++ are also very bad at. It is not a good thing, and the state of programming is not in good shape.

Ian Joyner Bronze badge

Re: C++ is great

So I said "It is you who miss the point. C and C++ are self-obsessed languages. You absolutely have to know what there is and why it might be marked deprecated or something softer. To understand C++, you must understand that."

Someone Else (SE) responds: "Complete garbage. I believe I have a firm understanding of C++ (I've been using it successfully and commercially since '92,"

That proves my point: SE has been following this for nearly 30 years – about the same time since I was assigned to a large X.500 C++ project. People who have been involved with the language for that long don't realise how much history and change they have absorbed along the way, but for someone new to the language, trying to understand all of that is difficult. That is why I said C and C++ are 'self-obsessed' languages.

"But understanding the history of the changes? Bah." You just don't get it: you have been doing it for 30 years. Perhaps you understand to use references or smart pointers rather than raw pointers, or whatever. It is not just a case of syntax (although C++ is peculiarly convoluted and ugly). To understand C++ and why it is the way it is, you need to read the history and evolution books, more so than for any other language. And that is for regular programmers, when it should really only be for the language lawyers. In other words, C++ exposes all of that where other languages abstract it away.

"Oh, and "self-obsessed"? Puh-LEEZE! You have a lot of damn gall to project your personal opinion on the entirety of the ANSI and ISO standards committees. Methinks you overreach, my friend."

OK, I'll address that again. Complexity becomes self-obsessed. Complex technologies get lock-in. Note how obsessed C and C++ people are, always leaping to the defence whenever anyone criticises them. There is an obsession around C and C++ that is almost cult-like – deny anything is wrong and abuse those who point out the problems.

I've been involved in ISO myself and there is a lot of posturing that goes on. The observations are not overreach. There are probably some good minds trying to fix C++ and get it right, but it still is not right in 2020, and they still have things deferred for fixing in 2023 – things that have been around in other languages for 30 years. The problem is trying to kludge them into a language that was not suitable in the first place.

Ian Joyner Bronze badge

"I don't get all the hate for C++"

You don't get it because it is not hate. All programming languages are technical artefacts and should be analysed for what they are good for – where they match some problem domains better than others. But they should also be criticised for their cross-domain flaws.

What computer people should aim for is simplicity. C++ is the antithesis of that. That is not hate on the part of people pointing that out – the problem IS WITH C++.

"If that's not your use case: pick another language. It's as simple as that"

No, it's not as 'simple as that'. Many are stuck having to suffer C++ and its flaws. Some can see the flaws; others just don't know that that is why they are having a hard time programming.

"It is definitely evolving."

Evolving towards what other languages were 30 years ago. Contracts, for instance, are still put off until C++23. Many people who know other languages roll their eyes and say 'C++ finally got that'. But then they look at it, and the C++ way is so much more obscure, and then you find it interacts badly with other parts of the language in non-orthogonal ways.

"The right tool for the right job should be the mantra for every programmer." The C and C++ world is full of these false mantras. Yes, there is some truth in it, but in practice C++ tries to be the tool for all jobs and in doing so becomes overly complex. The right tools are simple and sophisticated; C++ takes the unsophisticated and complex path.

Ian Joyner Bronze badge

Re: Evolving

Timrowledge:

Yes, I quite agree.

"C users write a file with a cpp extension and claim “hey, I’m programming objects!”"

This is a very good observation, and no less than David Parnas said it somewhere as "the worst thing about OO is that people think they are doing good programming just by using an OO language" – maybe not in those exact words. Anyway, he was making the point that it is the clean and enforced design of APIs that is important – that is, defining all possible interactions of objects.

With C++ you have pointers which can subvert the published APIs of objects.

"I’d argue that new programmers should be taught documenting before any sort of programming language!"

Well, a programming language is actually a documentation language, but what makes it better is that a program is a living document. That is one of the great things about programming.

Ian Joyner Bronze badge

Re: @Ian Joyner

Jake: "That's not ad-hom, it's an observation based on available evidence."

Oh, yes it is, you said:

"Quite frankly, you are coming off as a jaded, disillusioned academic."

You used the word 'you' referring to me and then made a remark, which is just wrong. Instead of addressing the subject which is C++, you attacked the person. That is ad hominem.

Jake: "Whoever I am, I am not Jake."

So you don't understand (or hopefully you do now, if you read my last post) that system programmers should be hiding the details of the platform so that applications programmers don't all have to deal with those details (that is the aim of software at all levels) – and yet you want to hide your own identity.

I can only think you have come here to troll, and this discussion has certainly declined into ranting.

Ian Joyner Bronze badge

Re: "Competent, core language"

Jake: "Wait, what? If a systems language is used for applications, the systems programmers have failed to do their jobs? How on Earth is it the system programmer's fault if an application programmer chooses an/the incorrect language for the application? And how the fuck does it follow that the system programmer doesn't understand their job, based of what some other person does?

Your illogic is mind boggling."

OK, you don't understand what the fundamental goal of system programming is. That is to provide a platform free of the underlying considerations of the hardware – the 'bare metal', as C programmers love to call it. Any hardware timing concerns that might affect the outcome of computations are handled, along with any other foibles of the hardware platform. Those details are abstracted away.

Similarly in distributed systems, middleware abstracts from the differences between systems to provide a single consistent view.

Put even more simply, the goal of any level of software is to provide a strong and consistent abstraction.

When system programmers think other programmers should be using C, exposing the lower level details that should have been handled, it means the system programmers have not done their job.

I'm sorry, if I wasn't clear on that before, and I hope that explains it to you.

Ian Joyner Bronze badge

Re: Is C++ becoming too large and complex?

"Do keep carrying on talking about the 1950s."

What I'm suggesting is we should not forget the language ambitions of those times. They were very far sighted. For example Christopher Strachey's CPL, which could not be implemented at the time, so Martin Richards did BCPL, out of which C came.

But C was a compromise at the time both because it was done on limited machines, and because it did not understand the vision of the other languages.

So anything based on C is going to fall short of what could be done in languages. C++ tried to fix some of that, but really just added its own level of cruft.

Yes, Dan 55, do move on. The rest of us have.

Ian Joyner Bronze badge

Re: "Competent, core language"

"Hilarious. Keep on claiming Smalltalk, ALGOL, Simula, Smalltalk, Eiffel, Pascal, Oberon, etc... are all perfectly designed"

No, I did not say that. However, it does not matter what people think – C++ is a terribly designed language.

Dan 55 is descending to ranting.

Ian Joyner Bronze badge

Re: "Competent, core language"

"None of the languages you cite are getting traction in the commercial world and you miss all the langagues that provide real competition like Java or are up-and-coming like Go and Rust. But apparently I am wrong."

See, that is what always happens: people like Dan 55 fall back on "oh, it's not used widely". That is the problem of today's populism. People like Dan 55 can't break out of the mould and are too scared to admit the flaws in C and C++, so they can only keep beating the same drum.

Ian Joyner Bronze badge

Re: Is C++ becoming too large and complex?

" was trying to get the point across that there was no contemporary competition to C "

You are living in the narrow world of C. There were plenty of other languages around when C came out. In the 1960s there was much language activity. C people like you like to take the view that C had no contenders. It did.

"I don't know, if you don't like C++ and even don't like C but cite languages like ALGOL and Pascal in other posts"

Because, again, that was to contradict your contention that there was nothing else. ALGOL was an ancestor of C and yet in many ways better – better, and rigorously defined. C was a step backwards. ALGOL was improved by CPL, ALGOL 68, Pascal and beyond. C and C++ have become stuck. Even though C++ might seem to have improved with new versions, that is still fixing mistakes, not real improvement.

I have more than moved on, but we should not ignore the lessons from the late 1950s and beyond, because those people understood foundations of computing that are lacking in today's practitioners, who think programming comes down to a couple of flawed languages.

Ian Joyner Bronze badge

Re: Is C++ becoming too large and complex?

Dan 55: "while the language itself remained pristine and unsullied and unusable so the academics were happy.

Meanwhile, in the real world"

You have this false division between academic and real world. While you persist in that false belief and distinction there is no discussion to be had with you. You live in some kind of 'fool's hell'.

Programming is an intellectual activity. It is based on computational models and virtual machines, some of which are implemented in electrical circuits.

You just want to feel like you can dismiss anything else as not being 'real world'. That is a narrow and silly view.

Computing must improve and move on and if that is a move to academic purity, that is a good thing.

Ian Joyner Bronze badge

Re: Is C++ becoming too large and complex?

Dan 55 "See also Why Pascal is Not My Favorite Programming Language."

There is another thing here: C people treat whatever Kernighan, Ritchie, and Stroustrup have written as some kind of holy scripture and the last word on the subject.

Like I said, Kernighan has a lot of things right in that paper (which I have read several times over the years), but it is not the last word, and Pascal wasn't where that stream of languages ended. Like most things, people moved on from Pascal. But C and C++ seem to attract the kind of people who refuse to move on.

Ian Joyner Bronze badge

Re: Is C++ becoming too large and complex?

Dan 55: >>Pascal could manage a glorified bootstrapper with a GUI, which I guess is an improvement in MS-DOS, but Apple moved away from Pascal with the move towards co-operative multitasking and PowerPC.

See also Why Pascal is Not My Favorite Programming Language.<<

Apple moved to NeXT and OS X. So?

And the Kernighan paper – I agree with most of it. But most C people take it too widely and think it applies to any language with Pascal-like syntax, and that becomes an excuse to remain in the C black hole. Kernighan's paper is not an excuse to ignore everything else – it is about some very specific things in Pascal, most notably having array sizes as part of a type, which is wrong.

Kernighan's paper is not permission or exhortation to just ignore everything else.

Many of Kernighan's complaints were addressed in commercial Pascal implementations.

Ian Joyner Bronze badge

Re: C++ – never classy

Someone Else: "Modules"

Modules? Isn't that what separate files with separate compilation did – or maybe namespaces?

In another post you said just don't use things if you don't like them and you don't need to know all the history.

If modules really replace #define, then you need to know that you should use modules instead and why you should not use #define. It all comes back to history, and to having to know about C++ that 'it has this thing, but don't use it'.

Ian Joyner Bronze badge

Re: C++ is great

Someone Else (yet another cowardly anonymous poster)

>>It is much more than that. You need to learn what to use and not use and that means understanding why it was there, or still is there.<<

"No, you miss the point (and now I finally understand why you whine so much about C++ and revere academic oddities like Eiffel). You don't need to learn what to use, based on some abstruse history lesson. Rather, you need to learn when to use a feature, and by induction, when not to use it."

It is you who miss the point. C and C++ are self-obsessed languages. You absolutely have to know what there is and why it might be marked deprecated or something softer. To understand C++, you must understand that.

"revere academic oddities like Eiffel" – that comment is so misinformed it does not even deserve a response.

"It's a different mindset, and one that may not be available to you."

No, it is a mindset I consciously reject because it is a wrong mindset – one stuck in the mistakes of the past. One that is locked in and unwilling to see that there have been other much better things.

Ian Joyner Bronze badge

Re: C++ – never classy

Someone Else: "Well then, I bet you're crying in your beer that C++ has effectively deprecated #defines with modules...."

No, the Burroughs defines were argued about when they were first included. They were much more powerful than the weak C defines, with several levels.

It was very much text-based processing along the lines of the General Purpose Macrogenerator (which one of my language teachers worked on).

https://en.wikipedia.org/wiki/General-purpose_macro_processor#General_Purpose_Macrogenerator

Amazing what you find. Here is an article by Andrew Herbert (who I knew at ISO ODP meetings) on GPM:

https://www.computerconservationsociety.org/software/elliott903/more903/Manuals/CCS%20Resurrection%20GPM%20Article.pdf

Strachey then developed CPL, which became BCPL, B, and C.

A couple of us Burroughs people did a language called SDL on the Apple II, which was a cut-down (because it had to be) of Burroughs and Elliott ALGOL. The main language designer did not include #define in it – he invented a rather neat macro mechanism to directly insert 6502 assembler code for things not worth doing in the language. However, on a bigger system (which we have these days) you would put such systems stuff directly in the language.

Now here's the thing. People who have used such other languages and systems seem to be able to move on. Sadly the same is not true for C and C++ people who seem to become rusted on to the deficiencies and bad ideas in C and C++.

Ian Joyner Bronze badge

Re: "Competent, core language"

Doctor Syntax:

Re: "Competent, core language"

""A systems language should only be used for systems, not for applications. Where a systems language is used for applications, it means the systems programmers have failed to do their job."

I think there's a couple of non sequiturs there."

You are wrong. And you don't even bother to say what the non sequiturs are.

I'll say it again, because it is right:

A systems language should only be used for systems, not for applications. Where a systems language is used for applications, it means the systems programmers have failed to do their job.

And more than failed to do their job – it probably means they don't even understand what their job is.

Ian Joyner Bronze badge

Doctor Syntax:

""'Best approximation of C++ ideals so far,' ... but is it too big and complex?"

I thought too big and complex were the ideals."

No, sophistication is the ideal – and sophistication results in simplicity. It is the complexity of the problems that we should handle, not complexity in the tools. Problem complexity is essential complexity; complexity in tools is accidental, self-inflicted complexity. That is the problem with C++.

Ian Joyner Bronze badge

Re: Too large and complex

"There a huge body of software written in C and its derivatives (including Java)."

Popularity is not a measure of quality. Neither is the misguided belief in the industry that C and its derivatives are good and all else bad.

Ian Joyner Bronze badge

Re: Coroutines?

"C++ was an OO systems programming language leveraging the advantages of C and providing alternatives"

Most programming is NOT systems programming. The fact that systems programmers think all other programmers should be using C and C++ indicates the systems programmers have NOT done their job (they probably don't even understand that is their job).

Simula (like any language) can very simply be extended for system facilities – system facilities are actually very small and they should be limited to small areas of an OS and device drivers.

C and C++ are completely the wrong approach.

Ian Joyner Bronze badge

Re: C++ is great

The problem in computing is programming is really powerful and you can do anything in just about any language, no matter how bad it is. BASIC programmers used to extol how good BASIC was and would not move on.

You must learn what orthogonality is – it will greatly help build better software.

Basically, orthogonality means including features that are independent of one another and that don't interact in bad ways – independence in the sense of being at 90°. Non-orthogonality means things interact in surprising and unpleasant ways. Complex languages are more likely to suffer from non-orthogonality, which makes it difficult to predict what will happen.

http://searchstorage.techtarget.com/definition/orthogonal

Orthogonality is fundamental to good software design. Mind you, I do sympathise with you: because of all the pretentious academic-speak around 'orthogonality' (the kind I hate, and usually the speaker doesn't know what it means either), it took me years to understand it – but the definition is simplicity itself (see what I did there!).

Ian Joyner Bronze badge

Re: @Ian Joyner

"Quite frankly, you are coming off as a jaded, disillusioned academic."

You descend to ad hominem. I just wrote this in another answer: my first exposure to this world was being taught Simula by a professor who had worked in the UK with some of the great minds in computing (he did not say so at the time, but I have found out since). The term OO was not mentioned then. I went on to become one of the first adopters of OO in this country, have done languages and compilers and many large software projects, and have studied OO deeply.

So you can take your false assessments and you know what to do with them.

As for giving you (Jake, whoever you are) examples – well you can do your own research. C++ people just bash everything.

Ian Joyner Bronze badge

Re: The old problem of the programmer perisope view of the world..

"> Multiple inheritence is too scary ...

One. Do you actually know how M"

I'm not sure who 'Anonymous Coward' is addressing here. However, the fundamental thing wrong in this answer is the ad hominem attack on 'academia'. This is a problem with C-think: 'oh, we deal in the real world; those academics only give small examples'.

That is because academics see a problem and boil it down to a small example that demonstrates it, so it is easily understood. They should not feel they need to apologise for that. In fact, academics have considered programming in the wide and the wild, far more broadly than most real-world practitioners. So I am very much against this faulty C-think of 'oh, things are very nice in academia, but we know what really goes on'. That is garbage thinking.

"So I suggest you go back to your academic textbook MI examples. Because that is the only place you will ever see them."

Textbooks also boil it down to small examples. Orthogonality means that many elements can be combined. Non-orthogonality means things combine in surprising ways – I don't mean with pleasant outcomes, quite the contrary. Now these 'real-world' practitioners might think that non-orthogonality is a necessity in their 'real world', but it is not.

In fact, saying there is this difference between 'real world' and academia is wrong. Programming is not in the real world – it is in the virtual world of electronic circuits. Virtual worlds are what software is about. Software does not deal with physical things – that is why it is really powerful and easy to change and flexible.

Those 'real world' difficulties and complexity are because someone did not understand this along the way, did not guard against non-orthogonality.

"So how did you learn your "real" OOP skills? From some TA who never wrote a line of shipped code?"

Again I don't know to whom 'AC' is addressing the remarks, but this is another example of the false attitude. As an example and general refutation: my first exposure to this world was being taught Simula by a professor who had worked in the UK with some of the great minds in computing (he did not say so at the time, but I have found out since). The term OO was not mentioned then. I went on to become one of the first adopters of OO in this country, have done languages and compilers and many large software projects, and have studied OO deeply.

Alan Kay noted that when he coined the term 'OO' he did not have C++ in mind. So accusing those who criticise C++ of having merely been taught by some 'TA' is nonsense.

"Because in my experience those are usually the only people who think stuff like MI is either important or relevant. Those of us in the the trenches shipping product most certainly don't."

You don't understand MI, or you have seen it badly applied (it is easy to abuse MI). But MI allows you to break a system down into even smaller classes and then recombine those abstract classes in very flexible ways. That means a concept is not buried in some larger class, forcing you to repeat the code somewhere else.

As for 'trenches' – again, that attitude coming up because you are working within badly designed systems.

Ian Joyner Bronze badge

Re: "Competent, core language"

"Pascal is nice to teach programming with, but can't be used in the real world"

Absolutely wrong. I have extensively used Pascal. Now most universities had Pascal IMPLEMENTATIONS that only needed to support students. Both Apple and Burroughs had real-world implementations of Pascal that were extremely useful.

That is not to deny that Pascal's type system was a bit severe, and better has been done since. But that does not excuse C's almost complete lack of types beyond int and float (its main advance over typeless BCPL). Wirth, meanwhile, moved on, taking the lessons from Pascal to create Modula and Oberon.

Others took lessons from Simula and created Smalltalk and Eiffel. C++ took Simula and corrupted it. I don't think at the time Stroustrup had much clue.

The world moved on, but C remained in a time warp along with its followers who believe these myths that nice looking languages are just that way for teaching.

"Seriously not getting your complaint about C/C++ being old, stuck in a quagmire, etc... and then going on to quote languages which are fossilised or only of interest inside academia."

That proves your lack of understanding and narrow view of computing. You need to get rid of this false classification of languages into 'real world' versus 'teaching and academia'. C and C++ have become fossilised even though C++ gets a new standard every three years – each one still trying to fix the mistakes and move towards those other languages you consider 'fossilised'.

And you should speak for yourself. Many people in the industry are now questioning the use of this complex language whose complexity is accidental – part of the language itself – rather than essential – part of the problem being solved.

Your response and other responses are based on the cult of C-think of 'other languages are just for beginners', etc. It cannot be simpler – you are wrong.

Ian Joyner Bronze badge

Re: Object

"Programming – making things that do something. This goal can coexist with others, but you are always building a thing that does something."

And in computing that is an intellectual activity. Oh, I get it – C and C++ encourage this kind of anti-intellectual mentality.

Ian Joyner Bronze badge

Re: C++ is great

"The best thing about it is that if you don't like it you don't have to use it!"

It is much more than that. You need to learn what to use and what not to use, and that means understanding why a feature was put there, or is still there. More than with any other language, you have to understand the history of C++.

C++ is non-orthogonal, which means its features can interact in ways you didn't anticipate.

https://dl.acm.org/doi/pdf/10.5555/1352383.1352420

Ian Joyner Bronze badge

Re: C++ – never classy

"Can't see why. They both express concepts and change to express new concepts... or die from lack of use."

Because you seem locked into the C-think cult. C changes very little.

https://www.quora.com/Will-C-get-new-features-in-the-future

C++ just keeps trying to fix the flaws.

http://trevorjim.com/c-and-cplusplus-are-not-context-free/

https://cacm.acm.org/magazines/2018/7/229036-c-is-not-a-low-level-language/fulltext

https://dl.acm.org/doi/pdf/10.5555/1352383.1352420

Like most bad technologies, they become black holes of lock in. With other languages and technologies, the practitioners seem to be able to move on.

Ian Joyner Bronze badge

Re: "Competent, core language"

"By Burrows system languages you mean ALGOL. And then you complain that C++ is old and tired."

Well, you don't even bother to spell Burroughs right.

ALGOL is still a more modern language than C or C++. But ALGOL moved on. It moved to Simula, ALGOL-68, Pascal, and then many more. C and C++ are just stuck in the same quagmire.

The worst technologies lead to lock in. That was the case with IBM, and certainly is now true for C and C++. The others have moved on.

Even so, the Burroughs languages make C look like a toy.
