Re: Ugh! PASCAL
C – game over? If you like weak toys. C has too many flaws. It is not a solid language like ALGOL.
Burroughs ALGOL originally implemented DEFINE … # from a suggestion by Donald Knuth. This was adopted in C as #define (the # introducing the define rather than terminating it). Like many copied things in this industry, the original is usually better, and Burroughs defines are much better than C defines.
Yes, Burroughs system languages beat C hands down. C looks like the toy it is in comparison. And MCP makes Unix look like a toy.
Burroughs machines are secure – they enforce boundary protection at the smallest level, none of this weak pointer overruns or indexes out of bounds junk in the weak C language. Bounds protection is done at the hardware level so it cannot be subverted by C or assembler as on other systems.
BNF or EBNF is still used to better define language syntax, or you end up with non-context-free messes like C (luckily C is simple, or it would not be parsable).
Other techniques like denotational and axiomatic semantics are used for semantically defining the effects of language constructs.
BNF can be used not just for languages but any system designs.
I even use EBNF to present topic and lecture organisation to students.
BNF has a somewhat arcane syntax that was cleaned up, simplified and extended by EBNF.
While the videos sort of tie ALGOL to paper tape and cards, ALGOL is far in advance of these long-gone technologies.
Burroughs systems were much more disk-based, like modern systems. When Edsger Dijkstra visited the plant in Pasadena he was amazed that the ALGOL program he brought compiled in seconds. He wanted several Burroughs systems for Europe, but Ray Macdonald, then CEO of Burroughs, did not want to sell systems to Europe at that time.
There is a lot of programming done in ALGOL 60 on Burroughs machines (now Unisys MCP machines). Burroughs extended the language with I/O, some of it an extension of FORTRAN-style I/O, but most programmers did their own direct formatting, since the FORTRAN I/O was interpreted and slow.
I believe Burroughs ALGOL was based on Elliott ALGOL. I later worked with another Burroughs guy who developed a language based on Elliott ALGOL for the Apple II. This was at a company called Netcomm in Australia.
Actually Don Knuth wrote one of the first Burroughs ALGOL compilers on a summer break as a student. It was on the B200 (from memory), but it predated the B5000 ALGOL compiler.
Burroughs ALGOL is a really heavy-duty systems language that makes C look like the toy that it is. It is a shame that C has effectively killed language development. If ALGOL or its next generation (ALGOL 68, ALGOL W, Pascal, CPL, etc) had continued, we'd probably have pretty solid languages by now, rather than the flimsy C.
I remember the Bondi Blue iMac being put on display in a roped-off area so you could see it and not touch it.
I shared my room with a teacher from a school (Apple Australia asked if I wouldn’t mind sharing) who was worried about dropping the 3.5” drive. Well, he pulled this trick of asking me to settle the hotel bill and he’d fix me up later. He did not before we left, and after six months of emails, I gave up. I hope he is ashamed if he is reading this!
In 2000 Apple paid for the room and I had it to myself, whereas others shared because Apple was paying.
So Apple does the right thing and designs phones that last a long time, so people aren't continually forced onto a new model. The Reg spins that as an 'iPhone sales slump'. So those disparaged Apple 'fanboys' aren't rushing out just to buy the latest new thing.
It is a maturing market, and Apple is leading the way. But other companies will still be selling stuff that breaks, so users have to upgrade more.
Not just what the others are capable of – what they actually do. Android – you do not actually own the device, the device is for the use of Android vendors to monitor you. With iOS, you own it, Apple says they do not use it in the same way others do – and if they were shown to do so, they could be taken to court and sued.
And yes, I do believe we should hold Apple to their word on this and not let them go down the path of Google, Amazon, etc – their intrusion into our lives is pernicious.
A lot of people here realise spending a little extra for Apple is a good investment. They are not idiots.
Let’s consider some UK companies – Rolls Royce (oh well BMW now), Dyson – 10 times the price of a regular product?
Apple has brought down the cost of personal computing. As always happens when others see success, the most obvious thing they can do is sell a cheaper less durable product and undercut on price.
Stop making the obviously political point all the time of ‘idiot tax’. It is both wrong and childish and undermines the Register’s credibility.
And who are these professionals? Many professionals have no idea. If it were up to them, the GUI developed in Silicon Valley at Xerox, and then brought to us by Apple, would not exist and we'd still be doing things on the command line. Computer professionals more often than not stand in the way of progress. They have been taught that a computer looks like this, that processor architectures are supposed to be ... RISC, CISC or whatever. They can't stand anything that is different. They used to be IBM people (even when IBM had really inferior products), then they went to Microsoft – all with the same single-vendor-supporting prejudices.
Take with a grain of salt what a lot of computing professionals say – they have been fed and taught a lot of false notions. Computing should really be about imagining what could be, not being stuck in the past of what is or was.
1) On a touchscreen device – THE SCREEN IS THE TRACKPAD.
2) Apple has insisted that apps are specifically developed for iOS, not just lazily ported from Mac or elsewhere. That means apps must use the touch facilities and USE THE SCREEN AS THE TRACKPAD.
3) Microsoft has taken the lazy approach of just putting Windows on its pads. Yes, that suits developers – no rework for a new form factor – but the user pays for that in a device that is not as easy to use. Then you get this nonsense about “no trackpad – it is not a real computer”. That is garbage thinking, misleading the masses. THE SCREEN IS THE TRACKPAD.
4) When using a device you should not think of it as a computer or even be aware it is a computer. The whole criticism of iOS devices not being a computer is pure ignorance on the part of those who say it. The fundamental lesson of computer science is abstraction – making one machine look like something else (a Universal Turing Machine can emulate any Turing machine). Don’t buy a computer, buy a device that will run useful applications for you. Users should not have to do those computer-type things like set IP addresses, etc. Doing computer things on the device makes it less useful – NOT MORE USEFUL.
I hope Apple aren’t losing focus on this and making things easy for lazy developers and not standing up for the end user. The computer industry is NOT for computer people but for the end user who should need to know very little about the workings of a computer. If we insist on that, we computer specialists have failed.
Processor instruction sets won't be interesting until they have security built in. The main requirement is support for structured memory with bounds checking. Building such facilities into hardware is the fastest way to do it. Without this, personal devices can never be secure.
The claim here is that RISC-V will support AI. But AI is being used more and more against the individual. Security protects the individual.
Computing is moving in the wrong direction – support for anti-individual AI while ignoring and denying individual security.
Well, it was good to see IBM toppled by Microsoft (after being loosened by Apple). But Microsoft replaced IBM with something just as evil (they inherited IBM's practices). But are we in danger of Google displacing Microsoft with something just as bad or worse?
Seems instead of people being in control of computing and the world, computing giants have found ever new and pernicious ways to control us. Mind you Satya Nadella does seem to be changing Microsoft into a more reasonable company.
So how do we escape this trap of computing?
How do we force computing to be the servant of all, rather than just for the benefit of the few?
I suspect it looks like a lot of Samsung gear in the shops – pretty and enticing in the shop, but not so good when you get it home. The few Samsung things I bought did not last long (a TV needed replacement after just one month; an external computer monitor lasted about two years).
IBM was the originator of dirty tricks. Thomas J Watson senior learned his tactics from Patterson of NCR. Patterson would denigrate all competitors' cash registers, spreading FUD. Patterson was gaoled; Watson got off, but then learned how to apply dirty tricks more stealthily.
All this can be read about in Richard DeLamarter's "Big Blue: IBM's Use and Abuse of Power".
IBM were truly a nasty company.
This is why Samsung is one of my least favourite companies, along with IBM and Microsoft – companies that want to take over everything. But at least I can name very relevant researchers at IBM and Microsoft; Samsung has no one. It obviously likes its employees to be anonymous – that way they contribute nothing to the scientific world, only to Samsung's profits.
I took my relatives from Southampton to the locked gates of where the Skippy park was last week. Glad to know Skippy has defeated the wicked empire! Shame everything was blanketed in smoke for them. I remember after a bushfire burnt out that area when I was young ending up in the Skippy park overlooking the valley with small fires still burning.
So when exactly did Google become evil? Is it when advertising became the big revenue, and advertising corrupts? Is it because of the lies advertising and propaganda spread, and Google's (and Facebook's, etc) revenue now depends on that? Is it because when you are funded by advertising you don't have to have good products anymore, because that is not where your revenue comes from? Is it because the majority of people want to have their bad beliefs propped up by the lies that can be propagated on these platforms?
For a start, Apple's advertising is much more subtle than many others'. You have no point there apart from Apple bashing.
As far as Apple marketing goes, the strategy is put into making great products. The others copy what Apple does (even the advertising).
Apple analyse the usage of these devices – that is how they set the form factor that the others have followed, with inferior products based on off-the-shelf Linux and Android software. The others are just hardware companies that copy, but really don't have an understanding.
The others also subsidise their prices from other markets in electronics and white goods as well as selling your details to other advertisers. Apple protects users from that – but this is obviously resented by the advertising industry who want your details.
If you want small, things cannot be repairable and reliable at the same time.
The smaller you want something the more expensive it is, until enough people buy to get economies of scale and thus cheaper manufacturing.
(I'll stick to the inconvenience of wired earbuds for now.)
"We're a little surprised not to see either the inexplicable Apple mark-up, nor the stupidly expensive Xbox Elite Controller, which, at $179.99, would have been more the fruity firm's style."
The Register's surprise is due to the Register's misconceptions and continual spreading of garbage about Apple. Waiting for your next fondling and loving article about 'Sammy'.
"Your total misrepresentation of my ability is due to your total miscomprehension of my post."
My comments on your ability are based on the explicit misunderstandings in your post. All you want to do is throw around lazy phrases like 'walled garden' as if they prove something. But these phrases come from lack of understanding of security.
"A platform and OS has no right to force its boundaries against the *owner* of a device against their will."
I don't know where to start on your lack of understanding. Computers are universal: whatever one person can do on a device, someone else can do too. So if an end user can install something from anywhere (at great risk), someone else can as well – that is how viruses and worms spread.
Hackers have no right to your device and it is the OS's duty to protect the device and its owner. That is about enforcing boundaries.
You really don't understand what security is about.
Strange how you are protecting yourself by using the 'walled garden' of 'anonymous coward'.
People keep throwing in this now-pejorative phrase 'walled garden'. This is nonsense.
The basis of security is to set boundaries and respect boundaries.
Actually, it is more than respecting boundaries: it is enforcing boundaries. A platform and OS must enforce boundaries.
Some spread this myth that such controls are against freedom. For some this is childish, for others naive, for some dishonest – at the very least it is a complete misunderstanding of security.
(Now, I must go, I have to lecture on the topic of security for 8 hours today.)
Foxconn builds electronics for many companies, not just Apple. Maybe people attack Apple because they are the most influential and they do care about their image. But remember many other companies are complicit as well.
In fact the whole issue of widespread almost-slave labour is what gives the west so many cheap products, especially clothing and shoes which we might get cheaply, or pay $100s for when the labour costs a few dollars.
"And they always lie to us." Do they? You must be very paranoid if you think that is true.
That ad was based on a truth: there were two opposing views of computing – 1) computers are there to control people and a workforce; 2) computers are the servants of people.
IBM was there to control people. Silicon Valley took view 2, and this was the philosophy behind the Macintosh – it was no lie.
Even IBM's PCs were still about controlling the office environment, and I still see that in Windows today.
Re: Typical Corporate Greed
"You don't need a headphone jack, so it was removed. To support old devices, they included an adaptor. I use it all the time."
"Then YOU do need a headphone jack."
No, I told you I don't, and why I and others don't. This headphone jack thing is just a typical anti-Apple beat up.
"They have taken a big leaf out of Apple arrogance"
Right, Samsung steals from Apple. But arrogance? Apple broke the arrogance of the computer industry – IBM's and then Microsoft's arrogance (the arrogance that says when you start up an application, it takes over the whole screen). The arrogance of 'boffins' – well, it is still around. With Apple you have far less dependence on support people. Of course, those same boffins and support people don't like that, so they put it around that it is Apple's arrogance, not their own.
"Headphone jack removed."
You don't need a headphone jack, so it was removed. To support old devices, they included an adaptor. I use it all the time. It does what is needed. Saying otherwise is just spreading FUD.
"Besides, reducing customer choice is TM Apple Inc."
In one sense you are correct, but I fear not in the way you mean. Ridiculous choice is what the marketing people want to give you – keep the consumer confused. Apple did that in the 1990s, until Jobs came back and reduced choice to a table with four cells. He identified what people need, not what they think they want.
“I find out what the world needs, then I proceed to invent it.” Thomas Edison
Note he says “needs” not “wants”.
Karlkarl: "But that's just the thing, C++ isn't hard, you just need to not waste time faffing with other languages."
That is exactly why programmers should learn other languages. C++ makes things inordinately difficult. Every workplace will have different style guidelines of what to use or not use depending on their understanding.
Even for languages that were perfect and simple I'd still say learn another language. But if the languages are simple that will be easy to do.
Of course, if they are not simple, like C++, there will be a lot of resistance to doing that – just as karlkarl shows.
I object to being called a shill. And on 'big language', you completely misunderstand what I'm saying: C++ is such a big language that we should replace it with small, powerful, to-the-point languages.
Read John Backus' Turing Award lecture, "Can Programming Be Liberated from the von Neumann Style?", for his comments on fat and flabby but weak languages.