* Posts by Ian Joyner

622 publicly visible posts • joined 6 Jun 2014


WannaCrypt: Pwnage is a fact of life but cleanup could and should be way easier

Ian Joyner Bronze badge

MS-DOS was distressing

Interesting Quora question when someone asked why Alan Kay said MS-DOS was distressing.

https://www.quora.com/Why-did-Alan-Kay-say-MS-DOS-was-a-distressing-thing

We are still suffering in 2017 because MS foisted this junk on us.

Ian Joyner Bronze badge

Software AND hardware need improvement

It is true that higher layers of software introduce system vulnerabilities which should be avoided at those levels. System development software also needs to be more secure. But we need security built in at the lower levels. We must have hardware that detects out-of-bounds accesses - that is fundamental to both security and software correctness.

While I agree much of the blame is on MS, a lot must also be put on systems developers from the Unix background. C is an inherently flaky language. Languages developed for writing an OS should NOT be used for other applications. Much of what they provide should be forbidden in applications or even higher-level system software. But the cult-of-C has seen it used everywhere, even for some OO languages I believe. This is not a good situation. The mean look of C syntax is even used for many other languages.

While the C philosophy of "trust the programmer" now seems at best naive, it really was stupidity, and perhaps in the future should be treated as criminal negligence. It really is time to sit up and take notice of the warnings many of us have been making about C for a long time, and the inherently weak processor architectures that are underneath C.
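To make the bounds-checking point concrete, here is a minimal sketch - my own illustration, not taken from any real product or exploit - of the kind of silent overrun that plain C and conventional CPUs permit, and that hardware bounds checking is meant to trap:

```c
/* Minimal sketch: a classic unchecked out-of-bounds write in C.
 * Most compilers accept this and most CPUs execute it; nothing at
 * the language or hardware level stops the overrun at run time. */
#include <stdio.h>
#include <string.h>

int main(void)
{
    char admin_flag = 0;   /* may happen to sit next to buf in this frame */
    char buf[8];

    /* 16 characters copied into an 8-byte buffer: the extra bytes
     * spill into whatever is adjacent on the stack. */
    strcpy(buf, "AAAAAAAAAAAAAAAA");

    printf("admin_flag = %d\n", admin_flag);   /* may no longer be 0 */
    return 0;
}
```

A descriptor-based machine in the B5000 tradition faults on the copy itself; on a commodity CPU the program simply carries on.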

http://ianjoyner.name/C++.html

Ian Joyner Bronze badge

IT people also to blame

Sure, MS did the wrong thing. But the IT crowd came from the mentality that everything had to be IBM, and then that everything had to be MS. IBM architectures were never really that good anyway - the IBM 360 was a horror. It was all marketing. MS inherited that from IBM. The PC and MS-DOS were really inferior products, but IT people just wanted to go along with the lazy mainstream instead of setting the industry up to be good and secure. We are now paying the price for these lazy IT people who set this up in conjunction with MS and IBM.

Note that while MS encouraged developers to use undocumented hacks, Apple always discouraged developers from using anything that was not documented. This is the basis of object-oriented and modular programming (as proposed by David Parnas). Interfaces should be small and neat for both security and correctness. But IT people and developers wanted to ignore that.

And now hackers have been the beneficiaries. But the innocent parties in this have been the end users. They are caught between greedy monopolistic companies that have ignored quality and security, developers who did not want to be told what to do, and lazy IT people who just wanted to maintain the status quo.

Samsung Galaxy S8+: Seriously. What were they thinking?

Ian Joyner Bronze badge

Fingerprint Camera

Oh, I get what Samsung is doing - they are trying to integrate the fingerprint function with the camera to save on space and cost.

OTOH, it does not matter how many more pixels your camera has than competitors: if the lens has finger smudges on it, forget picture quality. Photographers' rule number 1 is never touch the lens with a finger. That is why Apple continually points out that camera quality is much more than just quoting megapixels. Again, you have to dig deeper with this complex technology.

Why being late isn't fatal for Samsung Pay

Ian Joyner Bronze badge

Security?

>>it will get you through the barriers without having to wake up and unlock the phone, as rivals do<<

Security is always a balance between convenience and inconvenience. Convenience appeals on the surface, making it attractive to users, but the inconvenience of security breaches is hidden. It is always the same with computing products - you need to dig deeper and not just be fooled by any in-the-shop whizzbangery.

Samsung are masters of making stuff look good in a shop, but the deeper functionality is a bit thin, meaning less convenience in the long run.

It has been the age-old argument in computing to trade security (and correctness) for speed. Mostly, programmers come down on the side of speed, but this has been the wrong decision. We now know that security is of the utmost importance. It sounds from The Register's article that Samsung have done it again and come down on the side of shallow convenience which can lead to long-term inconvenience of having your security compromised.

It's 2017 – and your Mac, iPad, iPhone can all be pwned by an e-book

Ian Joyner Bronze badge

Language

>>Apple has kicked out iOS 10.3.2.<<

Do you mean dropped? Oh you mean released. Stop with this trendy, yet ambiguous language - I had to stop and think about what you meant. You are wasting my time.

WannaCrypt ransomware snatches NSA exploit, fscks over Telefónica, other orgs in Spain

Ian Joyner Bronze badge

The IT industry must change - time to address security

If anyone is hit by this today, because you are using un-updated Windows, this might be a fix.

https://www.youtube.com/watch?v=uVLDIynuaL4

Our warnings that Windows is not a good environment have not been heeded. Here is my computing landscape:

Windows - don't use it, bad security, bad UI, use Linux.

Linux - don't use it - more secure than Windows, but not enough, and still bad UI - really for use in servers, not end-user systems. Use MacOS.

MacOS - better security because each IPC call is brokered through Mach. Excellent UI. If you really want security use Burroughs/Unisys MCP systems.

MCP systems - mainframes meant for servers. Each instruction is checked for things like out-of-bounds access - something that must be in all system architectures in the future. As Rik Ferguson (security expert at Trend Micro) says, security should be built in from the ground up. All these virus scanners, etc. are just after-the-fact detection. MCP-like systems for end users are critical to develop in the future.

Nothing is invulnerable to failure though. As Mark Nunnikhoven (associate of Rik Ferguson), Vice President of Cloud Research at Trend Micro, noted in a talk to our students recently (while running Keynote slides on MacOS), go out and buy a second 3-4 TB hard drive to back up your files. Way cheaper than the $300 which ransomware will demand.

Keep backup disks offline - only putting them online to do a backup - to minimise any risk. That kind of defeats Time Machine (Apple's realtime backup and versioning system), but it means backup disks won't be encrypted by ransomware, which can happen if they are always online. I haven't done this yet, but have budgeted for it in next month's expenditure. Meanwhile, I just turn on the backup disk and Time Machine once a week. Most stuff is also backed up in iCloud for $5 per month.

Apple leaks new thinner, lighter iPad ... revenues

Ian Joyner Bronze badge

Re: you really got to admire apple

>>quite comparable smartphone from any other manufacturer<<

Not comparable at all. If you look at comparable offerings, the price difference is usually within $50.

Ian Joyner Bronze badge

Non sequitur

>>Cook said, blaming the flat sales instead on anticipation of the not-yet-real iPhone 8.

That's right, Apple blames hype around its iPhones for its sales problems. The irony.<<

That is a non sequitur. Sales will always drop on products where consumers are waiting for a new model to come out. They also wait for price drops on current models. Remember Apple does NOT comment on new products - that is anti-hype. Unlike other companies that talk up what they might release six months down the track.

Cook's observation is quite correct. The Register just makes something out of nothing. Nonsense.

Apple fanbois are officially sheeple. Yes, you heard. Deal with it

Ian Joyner Bronze badge

All computing locks you into a platform.

Ian Joyner Bronze badge

Fanbois and Sheeple

Fanbois and sheeple are two words I despise. Those who use them usually have an arrogant sense of superiority over those they see as feeble-minded sheeple.

Here The Register uses both words, and constantly uses 'fanboi' whenever Apple is mentioned.

Time somebody at The Register grew up and stuck to holding technology companies to account without insulting those who have to buy technology to participate in the modern world.

Dormant Linux kernel vulnerability finally slayed

Ian Joyner Bronze badge

Security problem is not servers - it's consumers

>>Given the flaw's age, Linux enterprise servers and devices have been vulnerable for some time<<

Such servers are run by professionals. Updates and application installation are only done after much testing, etc. These are well-regulated environments. Security is provided in a manual sense.

But consumer devices are run by people who like to download the latest app and don't control their machine - they need automatic security.

While Linux might be good for servers, Linux - and Android - are not so good for end users.

Samsung Smart TV pwnable over Wi-Fi Direct, pentester says

Ian Joyner Bronze badge

Sammy?

Really - stop calling Samsung 'Sammy' as if we are all big friends. Samsung wants to dominate the market and cuts corners to do so. They are yet again caught out with sloppy practices and bringing substandard products to market. Samsung make things that look good in a shop, but dig deeper and it does not stack up. Security breaches (and anything software) are not as obvious and exciting as hardware catching on fire as in Note 7 and washing machines.

People accuse Apple of making bling and only being interested in how something looks with UI. But that is not true. Apple digs deeper, did not take the lazy path of adopting Linux (less security) and makes sure that underneath things are as secure as possible. However, with software, even with the best development practices, things go wrong and issues that weren't considered arise. Yet Register always disparages Apple as 'that fruity company', while being loving towards 'Sammy', whatever they do.

As Dan 55 noted before "Samsung are everything that's wrong with software development."

Apple's zippy silicon leaves Android rivals choking on dust

Ian Joyner Bronze badge

Re: It's just a shame

>>That iOS is such a piece of out of date crap.<<

Oh dear. With solid reasoning like that, where do you go?

Instead of being technical and using facts, let's just use emotional language to sway the masses. Sadly, your method works - it's how we got Brexit and Trump.

Ian Joyner Bronze badge

>>It's nothing to do with silicon. iOS is a ancient platform...<<

Really? iOS is based on one of the best and most secure Unixes. Linux trades security for performance. Darwin and Mach are more secure - yet as the benchmark shows wins in performance as well.

Writing in C? Another tradeoff of security, correctness, and development speed for performance. And yet C will most likely lose in performance as well.

For security reasons both C and Linux should be avoided. That would be a sophisticated system.

Linux is good for servers, where it is tightly controlled for security - but for end-user devices, no. In end-user environments the security that should be built into the OS must instead be added on top of Linux. That is a poor approach to security.

Security is now the biggest issue in town - sophisticated systems like iOS address security from the base up, not as an afterthought.

The more I read comments like >>very limited multitasking essentially a toy OS<<, the more I think you are wrong and your post is nonsense.

Teenagers think Doritos are cooler than Apple

Ian Joyner Bronze badge

Ugh

Entities (people and companies) that proclaim themselves as cool or try to act cool certainly are NOT cool.

Marketing at Oreo proclaims it 'the world's favourite cookie'. Something that tastes like cardboard - roll eyes - 'Only Oreo'!

It's 30 years ago: IBM's final battle with reality

Ian Joyner Bronze badge

Big Blue: IBM's Use and Abuse of Power

To understand the lead up to PCs people should read Richard DeLamarter's Big Blue: IBM's Use and Abuse of Power.

It goes back to the 1890s and shows how IBM became dominant through less than ethical practices. The antitrust suit against IBM was dropped under Ronald Reagan, which prompted DeLamarter to write the book.

Banking group denied access to iPhones' NFC chips for alt.Apple.Pay

Ian Joyner Bronze badge

Banks and who else could get access?

Problem is if Apple gives banks access to NFC, others with not such noble intents could also get access. A case of Apple protecting the consumer again?

'Clearance sale' shows Apple's iPad is over. It's done

Ian Joyner Bronze badge

Re: Register Bias

>>Most people would like the security fixes on their existing OS, instead of forced upgrade to newer more resource hungry OS. SO uses apple products, made conscious derision to be more at risk by not upgrading due to bad previous experience of once zippy device becoming too slow & useless to use effectively.<<

This is difficult to answer - because your second sentence barely makes any sense.

The first sentence uses emotive language like 'being forced'.

Apple users don't feel forced to upgrade iOS - they gladly do it for the improvements made.

Security fixes do come more frequently.

Ian Joyner Bronze badge

Re: So...

>>Buy according to size, RAM, processing power, budget, etc, and pop whichever OS and software you like on there.<<

No, you buy according to how useful it is and what you are using it for. About the only measurement there that applies is screen size. RAM, processing power, and OS are irrelevant. These measures are all relative and depend on the software being run. Some software runs poorly and hence makes a fast processor irrelevant compared to software that runs efficiently.

But really, it is still the user functionality that is important - not raw tech specs.

Ian Joyner Bronze badge

Re: Margin is the issue here...

>>Apple are addicted to profit margin.<<

Where do you get that simplistic analysis from? It is wrong. Companies must make a profit to stay in business. Apple are addicted to that.

Other companies try to compete with Apple by undercutting on price. That does not mean they make their devices for less - equivalent devices cost more or less the same.

How do they do it? They cut the quality - drastically. They use free open source from Google and Android - software that has been developed freely by many open-source developers.

Secondly, a large, diverse company like Samsung can subsidise their tablet/phone market from other sources. Once they succeed in putting Apple out of business (not that I'm saying they will), they put up the prices again.

Ian Joyner Bronze badge

>>Style over substance. Wont be investing in another, it's Surface Pro for me next time around.<<

Surface Pro? You must be kidding. Who would want the nightmare of running Windows on a tablet? Surface really is an unintegrated device just cobbled together to grab some market. They come with a trackpad, which means MS has not integrated the touch screen sufficiently. It has also not encouraged/forced developers to embrace the form factor - they just run the same old tired, crummy Windows apps.

I don't know what apps you are running to draw down the iPad's power. Have you tried terminating some apps?

>>aside from the worries about breaking an expensive piece of kit<<

I just used my iPad Pro 12.9 on a 5 week trip through Africa - came back without a scratch, very useful.

>>Style over substance<<

That is the old lazy argument against Apple - it is completely wrong. Apple have adopted the form factor and made it work. Made touch screen work so you don't need a track pad for anything. But you can't get away without a trackpad on Surface.

The inclusion of a trackpad shows that Microsoft really don't understand touchscreen. With touchscreen a trackpad should be obsolescent. I pointed that out to someone with a touchscreen computer that included a trackpad the other day. His response was that all he can do with touchscreen is scroll up and down, otherwise it is useless without the trackpad.

That indicates that Microsoft has not thought it through and has not required application developers to also rethink their applications for the smaller form factor with touchscreen.

Apple - on the other hand - took significant trouble to completely rethink the user experience, design their software around it and encourage application developers to do likewise.

Had Apple done the same as MS, touchscreen would just be a curious fad, instead of the very useful input mechanism it has become. So your "style over substance" comment is quite wrong.

Ian Joyner Bronze badge

Re: Tablets

Before the iPad, tablets were crap. After the iPad, they became a very useful device as can be seen by the number of people using them as well as iPhones.

Alan Kay's original 1970s idea of the Dynabook was not sufficiently implemented until the iPad.

Ian Joyner Bronze badge

Register Bias

Once again Register bias shows in an Apple story. Most of the article is just spin.

For example, 'Apple makes your device go slower with each iOS upgrade', as if this is a deliberate strategy by Apple to force an upgrade. Each OS upgrade provides new features, things like better graphics, etc. The trade-off here is that new features use more processor. Apple optimize the hell out of anything they do due to their integrated hardware/software approach. Other manufacturers can't do that. A five-year-old device will naturally start to groan under the load.

Apple has also upgraded the processor from A8 to A9. Imagine if Samsung did that - The Register would be singing Samsung's praises.

As many of us know - from other technology news sources - Samsung has had many woes in the last few months. Samsung's reputation has become severely tarnished. They rush things to market, and they have had dodgy dealings in South Korea.

But in stories about Samsung, The Register lovingly calls it 'Sammy' and has been pretty silent on Samsung's woes.

Samsung's Bixby totally isn't a Siri ripoff because look – it'll go in phones, TVs, fridges, air con...

Ian Joyner Bronze badge

Many say that Apple don't invent things but get them from elsewhere. That is true, but not in the pejorative sense people intend. For instance, I work on the same floor in the building where WiFi was invented. But Steve Jobs gave WiFi its first mainstream demonstration, with AirPort on the iBook, in 1999.

Apple frequently takes these ideas and gives them life outside of being curiosity in labs. Even the GUI, which would have remained a curiosity at Xerox PARC. However, Apple and Jef Raskin were working on similar ideas at the time.

Apple works out how people can use these things. Other companies just add the hardware, etc into their products but don't really work out how people will use them. Apple is a champion at putting technology together with usage patterns to make the technology really useful.

Their products become successful and profitable. Other companies see the profit and want some of that so just copy - and usually in an inferior way because they are trying to get their product to market as quickly as possible. Samsung is such a company and some of their failures are testament to that. Before it was Microsoft with the inferior Windows. But MS is still doing that with its Surface products - tailor the OS to portable form factor? No, it's quicker just to force Windows into everything so they can compete now.

Apple empties gas can, strikes match, burns bridge to hot-patch apps

Ian Joyner Bronze badge

Re: Code injection.

I mainly agree with DougS. I can think of lots of things that Apple has done that include original ideas, or ideas taken from others and popularised (including WiFi, which was invented in the building where I work at Macquarie University). I can put names and faces to people who have worked at Apple. The others are just electronics companies. I can't think of what Dell or Samsung have contributed to computing, nor of any names that are known or revered in the industry. Yes, Google (where I can think of things they have done and names) has also just become an advertising company, making its money out of anything but computing.

Ian Joyner Bronze badge

Re: Yeah

>>Rolling out flawless code is not technically impossible...your application was not released yet and even crap software is better than no software<<

There is some truth to that. But we should also not give up. Precedent says that Apple spent years developing the Macintosh. Microsoft ripped some of it off them (Atkinson's QuickDraw) and announced Windows 1 before the Macintosh even shipped, though it did not actually ship until late 1985. Windows 1 was complete 'crap', but Windows ended up winning. But it won because IBM-backed MS got that horrible QDOS-derived MS-DOS installed everywhere, and all the IBMers were happy because IBM almost crushed another good company - Apple.

After that experience did Apple 'learn its lesson'? No, they stuck to producing better systems, spending the time, effort, money, risk to do it. Eventually, that model has proven a success.

People talk about the software crisis - that is rubbish software getting out there without sufficient testing, leading to bugs and security flaws. If we are to get around this problem, we need to become much more professional and follow Apple's lead in this. We must keep going and put the cowboys with their 'crap' software out of business. Under the cowboy model, the consumer loses and will continue to lose.

Ian Joyner Bronze badge

Re: Code injection.

>>Apple couldn't care less about you. All they want to do is stop you bypassing the app-store to do things.<<

That is nonsense. Apple very much cares about end users. Security is paramount. End users need security provided for them. Even as a security 'expert' I value being protected. It is just impossible to keep up with everything that is happening, or even to understand it all (in that stack of security books I have on my floor).

Should we rely on Apple's review process? No - nothing can be 100% secure. We don't know how Apple tests the apps. I suspect some automated tool looks for funny calls, i.e. a virus scanner that scans for odd bits of code, long before an app gets onto any end user's machine. No virus scanner can be relied on 100% either.
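For what it's worth, here is a toy sketch of what 'scanning for odd bits of code' could amount to. This is purely my own guesswork - not Apple's actual review tooling - and the 'suspicious' marker strings are hypothetical:

```c
/* Toy static scanner: read a binary and flag byte strings that a
 * hypothetical review tool might consider suspicious. Illustration
 * only - not how Apple actually reviews apps. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static const char *suspicious[] = {          /* hypothetical markers */
    "_dlopen", "performSelector:", "dynamicCodePatch"
};

int main(int argc, char **argv)
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s <binary>\n", argv[0]);
        return 1;
    }

    FILE *f = fopen(argv[1], "rb");
    if (!f) { perror("fopen"); return 1; }

    fseek(f, 0, SEEK_END);
    long size = ftell(f);
    rewind(f);

    char *buf = malloc((size_t)size + 1);
    if (!buf || fread(buf, 1, (size_t)size, f) != (size_t)size) {
        fprintf(stderr, "read failed\n");
        return 1;
    }
    buf[size] = '\0';   /* lets strstr run over the data; strstr stops at
                           embedded NUL bytes, so a real tool would search
                           the raw memory instead - fine for a sketch */
    fclose(f);

    /* Flag any marker found anywhere in the binary. */
    for (size_t i = 0; i < sizeof suspicious / sizeof *suspicious; i++)
        if (strstr(buf, suspicious[i]))
            printf("flagged: contains \"%s\"\n", suspicious[i]);

    free(buf);
    return 0;
}
```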

No company wants the systems it sells to be violated. But Apple have the strongest ecosystem to do just that - to protect its own systems and its end users.

While you made some good points about not relying on that, since security cannot be 100%, your concluding paragraph is complete rubbish.

Apple's macOS is the safer choice – but not for the reason you think

Ian Joyner Bronze badge

Not - secure or not secure - it's a scale

You can't say a system is safe or not safe, it's a scale. MacOS is more secure than Windows and Linux - maybe much more secure - for architectural reasons.

Hackers are still going for low-hanging fruit, it would seem. Yes, Apple takes security seriously. Security means certain things can't be done, or you must live within the restrictions. Seems reasonable - security has far and away become the most serious issue in the world of computing and its interaction with the physical world.

Apple vs. Samsung goes back to court, again, to re-assess the value of a rounded corner

Ian Joyner Bronze badge

Re: Why do people buy non-Apple products?

Well, Mr Imaginarynumber, you certainly are living in an imaginary world (sorry cheap shot).

No one denies that many technologies were developed long before they were put into a product. Most stuff like touch were demonstrations like 'look you put your finger here and it does something', but nothing particularly useful. In other words just little hardware demonstrations.

Apple put the whole bundle together and in such a way that showed people 'this is how you can use this stuff'. All these little demonstrations come together to make something useful.

You and many Apple detractors make out this is something trivial, that the real work is in devising the basic hardware. That is where you make your mistake, although you persist with this mistake deliberately in order to disparage Apple. Knocking and mocking is just so easy, but it doesn't really show you know what you are talking about - in fact, the opposite. There is a long tradition in the computing industry of spreading FUD against those who come out with products first. This kind of bashing actually proves the point.

What Apple has done is non-trivial. Take a look at the status quo of phones before the iPhone, and then suddenly afterwards - because the iPhone was a success - the rest all adopt Apple's look and feel. Strange that. I won't say any more, but this has been adequately addressed in Quora answers:

https://www.quora.com/The-iPhone-was-not-the-first-touchscreen-phone-nor-the-first-phone-with-apps-so-in-what-way-did-it-change-everything

Especially the answer from Karta Sutanto.

Ian Joyner Bronze badge

Why do people buy non-Apple products?

People buy other brands than Apple for one simple reason - they work much the same as Apple at a base level. This is because companies like Samsung and Google with Android have substantially copied the Apple look and feel.

They really have stolen much of Apple's investment and development, which is non-trivial. Much of the industry sits back, waiting to see which new products are successful or not. They can easily capitalize on the successful ones and save themselves the unsuccessful ones.

Catching Samsung on rounded corners is really like gaoling Al Capone for tax evasion, which is all they could get Capone on.

Ian Joyner Bronze badge

Re: Ah, yes the Apple idea

Taking out a patent on an idea and developing a product are two different things. The concept of patent is to protect the investment of those who have developed products.

This gets misused by people who just want to stop others making products altogether, without the original patent holder actually making a product themselves. This is against the spirit of protection, which is why some people get really upset and passionate against the patent system.

The key point here is that Apple actually built the product and spent a good deal of risk and investment to do so. Now others just copy that, and bring products to market for far less investment. Apple's use of patents is legitimate and in the spirit of patents.

Android tops 2016 vuln list, with 523 bugs

Ian Joyner Bronze badge

Linux trades security for performance

Because of the Linux monolithic kernel architecture, which provides speed instead of the inherent security of a microkernel, Linux is more susceptible to security flaws.
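As a purely illustrative sketch - my own toy code in C, not real Linux or Mach source, and all the names are invented - the difference amounts to whether one component can write into another's state directly, or must send a message through a broker that validates it first:

```c
/* Toy contrast between monolithic-style direct access and
 * microkernel-style brokered IPC. Illustration only. */
#include <stdio.h>
#include <string.h>

#define MAX_MSG 64

struct message {
    int    dest;                /* which service the message is for */
    size_t len;                 /* payload length                   */
    char   payload[MAX_MSG];
};

static char service_state[32] = "initial";   /* a service's private data */

/* Monolithic style: the caller writes straight into the service's
 * memory. Fast, but nothing checks what is written or by whom. */
static void direct_call(const char *data, size_t len)
{
    memcpy(service_state, data, len);         /* no validation at all */
}

/* Microkernel style: every message passes through the broker, which
 * checks destination and size before any copy is allowed. */
static int broker_send(const struct message *m)
{
    if (m->dest != 1)                   return -1;   /* unknown service */
    if (m->len >= sizeof service_state) return -1;   /* oversized       */
    memcpy(service_state, m->payload, m->len);
    service_state[m->len] = '\0';
    return 0;
}

int main(void)
{
    struct message ok  = { .dest = 1, .len = 5 };
    struct message bad = { .dest = 1, .len = 60 };   /* too big */
    memcpy(ok.payload, "hello", 5);

    if (broker_send(&ok) == 0)
        printf("delivered via broker: %s\n", service_state);
    if (broker_send(&bad) != 0)
        printf("oversized message rejected by broker\n");

    direct_call(ok.payload, ok.len);   /* the direct path has no gatekeeper */
    return 0;
}
```

The tradeoff described in this post is exactly the extra copying and switching that brokering costs, against the checking it buys.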

Security is best built in intrinsically at the lowest levels. Adding security as an afterthought still leaves the original problems there.

While Linux has proven good for well-managed server systems where performance is required, it is bad for end users who don't maintain their machines or want the freedom to download apps and use their devices for 'fun'. These users want automatic security built in, rather than managed security.

This does seem like a paradox that security is more important on end user devices than servers. However, it is how that security is provided - built in to the OS, or managed by IT professionals. When a user's machine is compromised, it does not just affect that user - hackers can mount DDoS attacks against servers. This also applies to unmanaged security on IoT devices.

'Twas Brillo but then Android Things, which watched as Google Weaved its Nest

Ian Joyner Bronze badge

Re: @ Ian Joyner

Allan George Dyer

I thought parts of your answer were good, but you ended with the old 'real world' chestnut. That is exactly what we are talking about.

So, here are some responses.

Yes, iOS and MacOS (formerly OS X) are more secure being based on Mach. See my explanation of how IPC works in these different environments of microkernel and non-microkernel.

CIA certainly exists at higher levels of abstraction and security must be applied at all levels. However, when your lowest level is insecure, all higher levels are affected. Security must be built into systems, not bolted on as an afterthought. However, the discussion was about microkernels, which is technical. So your criticism in point ii really amounts to nothing, and neither, then, does your point about context.

iii I think we know what malware is - how does giving a 'precise' definition here help? It doesn't. OK, malware is software that has been installed on your machine for nefarious ends. As I said, data centres and servers are carefully managed, end user systems (mostly) not. Thus the more sensible tradeoff is performance for security, not the Linux tradeoff, which is the other way.

iv well most servers (at least ones of any scale) are set up by experts - I never applied the adjective 'perfect'. No, even 'experts' need to review their security constantly.

Of course, there is always more to security, but every part must be right - our discussion was about microkernels.

Ian Joyner Bronze badge

Re: @ Ian Joyner

Well, 'common-sense' you really don't know what you are talking about, and put up nothing against what I have said.

It is a fact that Linux trades off security for performance.

Security is very important in data centres and servers, but these environments are very well controlled by experts who only install new software and versions after planning and then testing. Thus they are less dependent on the OS being strong.

End users - at the edge of the Internet - don't plan or test before installing new software. That software can come packaged with malware. In Linux it is easier to get into other processes, since IPC is fast and direct, not brokered through a microkernel.

Now, go away and do some thinking before you come and call what I said claptrap.

Ian Joyner Bronze badge

Linux should be kept away from security. Contrary to what most people think, security is most important on end-user and IoT devices. These are uncontrolled devices. On servers, security is not so much of an issue (although arguably more important), since servers are very controlled and behind physically locked doors.

The tradeoff is performance. Security gives a massive performance hit. That is significant on servers trying to serve 1,000s of users at once. But on end-user and IoT devices doing one specific thing, performance is not so important.

Linux trades off security for performance and is thus not so desirable out there in the wilds (at the edge) of the Internet.

Oh no, software has bugs, we need antivirus. Oh no, bug-squasher has bugs, we need ...

Ian Joyner Bronze badge

Re: Software/hardware paradigm is the problem

9rune5: I'm not really sure whether, in the last part, you are replying to my comment or to the original article.

If we get our hardware/software paradigm right with defences built in from the ground up, the need for bolt-on defence goes away.

What we have at the moment is watch towers dotted sporadically around the country watching for advancing attackers. The encampment itself is surrounded by a garden fence, not a castle wall. The watch towers are put up by the anti-malware industry. But we need to be more intrinsically secure. It can be done.

But I agree that even when we clean up that part, we still need to be vigilant. However, end users are not vigilant. So there will always be opportunities for attackers if we rely on vigilance. We actually need to protect the end users of devices. Today's architectures and languages do not do that.

I hope that clarifies what I am saying.

Ian Joyner Bronze badge

Software/hardware paradigm is the problem

We have ignored this problem for too long. The problem is insufficient architectures coupled with low-level programming that targets the weaknesses in those architectures. People like C.A.R. Hoare knew in the early 1960s that software should be verified, and built verification - like bounds checks - into Elliott ALGOL. These checks were dynamic and thus slowed processing down. Performance was critical in the 1960s, and the scientific programming/hardware community won out and did not put checks in, especially soft checks like those in Elliott ALGOL.

However, Bob Barton at Burroughs decided around 1962 that such checks would be better done in hardware - for speed and for security. This still came at a small performance penalty - but it was an example of complete systems design, not just a CPU. Such checks are not just software verification checks - in a multitasking environment they are critical security checks. Burroughs released the B5000 in the early 1960s and these machines are still going as Unisys ClearPath MCP. The scientific community hated the B5000 because it spent cycles on built-in security checks. (Burroughs later came out with a scientific processor, the BSP, as a backend to these systems - watch for this architecture in quantum computing.)

Then it was decided we could statically check software with type checks. Programmers hated types - "why should we have training wheels?" - but this thinking rests on a completely false analogy.

Fast forward to the early 1970s: Dennis Ritchie threw out most of the advances of ALGOL over FORTRAN, except for the better ALGOL-based syntax and block structure. C was built around low-level CPU instruction sets (the DEC PDPs, whose auto-increment addressing is often credited for the awful ++ operator). That was a strength of C, but also its prime weakness. Yes, you could let the programmer do anything, which appealed to programmers' egos (and it is also great to teach this level to programmers, but that would be the equivalent of training wheels), but it has proven to be completely the wrong approach to non-scientific, everyday computing. End-user computing needs to be more secure than anything else. Server computers are run by professionals with tight controls. (Linux is good here, but not appropriate for end-user systems - but that's another, although related, topic.)

C's philosophy was 'trust the programmer'. But in retrospect, that was naive because not all programmers have noble intentions. At the least now it is a stupid philosophy, but more likely negligent, and due to security problems, it should become criminally negligent. If engineers built such a sloppy bridge, they'd be gaoled.

We could build verification into code generated by compilers. But that is still not good enough. We need to build verification checks into CPUs as in the B5000. We have plenty of silicon on a chip to do it now. Programmable Logic Controller (PLC - the hardware that directly controls physical-world objects) designers are coming to realise this due to Stuxnet, but we now need to apply it to rational CPU design as well. Security experts and CPU designers need to study the B5000 architecture to understand the basis of what to do in the future. (The current release is downloadable from Unisys and runs on PCs.)
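As a minimal sketch of what such verification amounts to - my own illustration in C, not B5000 or MCP code - every reference carries its bounds and every access is checked before memory is touched, which is roughly what a descriptor does in hardware:

```c
/* Toy "descriptor": a pointer that carries its own bounds, checked on
 * every access - done here in software, done by the hardware on
 * B5000-style machines. Illustration only. */
#include <stdio.h>
#include <stdlib.h>

struct descriptor {
    int    *base;
    size_t  length;
};

static int checked_read(struct descriptor d, size_t index)
{
    if (index >= d.length) {          /* the check a raw C pointer never makes */
        fprintf(stderr, "bounds violation: index %zu, length %zu\n",
                index, d.length);
        abort();                      /* fault, as the hardware would */
    }
    return d.base[index];
}

int main(void)
{
    int data[4] = { 10, 20, 30, 40 };
    struct descriptor d = { data, 4 };

    printf("%d\n", checked_read(d, 2));   /* fine: prints 30         */
    printf("%d\n", checked_read(d, 9));   /* trapped: program aborts */
    return 0;
}
```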

Of course, there are security flaws at higher-levels of abstraction, but until we build strong legs and a sufficient foundation, the rest of the body will be vulnerable at the lowest levels.

Make no mistake, the big elephant in the room is low-level programming with languages like C, C++, and assembler. C, C++, and most CPU architectures must be replaced and the sooner the better. Stop ignoring the elephant in the room.

Note: this is not a popular message. Like the issue of climate change, it is unpopular with many people, who will try anything (mostly bogus) to try to deny this message. The problem is that they are having fun, and those with messages like security and climate change (planetary security) are unpopular party poopers.

Cynical Apple gouges UK with 20 per cent price hike

Ian Joyner Bronze badge

Re: All the good engineers are dead

>>Obviously you've never used Windows 10 with a touchscreen because if you had you would know it's actually very good<<

It's still Windows - I need say no more.

Ian Joyner Bronze badge

Re: All the good engineers are dead

>>touchscreen laptops they are still stubbornly refusing to do it, giving you a crappy ribbon touchscreen and all because their OS on the desktop is so terrible<<

So many errors in less than a single sentence.

For other companies it is about the electronics - yeah let's put in a touchscreen and sell that. Oh, OK, it's not about electronics, it's about marketing. But what do you use that for and how do you use it? These are the questions that Apple asks and solves before it just adds hardware features.

Apple is still full of excellent engineers who ask and solve these problems.

Ian Joyner Bronze badge

Is it that bad?

I complained about Apple doing this in Australia. But their reply made sense. Apple does not move prices up and down at frequent intervals. That is so people buying their products don't wait because 'the currency might rise', making it cheaper. Apple are setting their prices seeing that the pound is continuing to fall. That way you can expect the Apple price to remain stable for six months, even if the pound drops further - even by more than 20% - in which case Apple will take the loss. Who will be complaining then? It is not Apple's fault - it is the fault of Farage, Johnson, and those idiots that voted for Brexit.

This creates a stable price for consumers. Even among different stores the price remains the same.

While most marketing people love to create a plethora of options and price differentials, Apple goes against this. I think this is good for the consumer. You don't pick up your iPhone from one shop and then pass the next and find it is £50 cheaper which results in dissatisfaction.

So, I was brought to understand this, and I hope that helps others to understand that Apple have a stabilising strategy.

Ian Joyner Bronze badge

Re: All according to plan

I agree with you. Although not living in the UK, it is to my advantage that the pound is low, so I can come back for a cheap visit.

However, selfish motive aside, it seems that governments are all too pleased to devalue our savings in order to prop up industries that should be more efficient, because they could cut out middle (and even upper) management doing rubbish jobs.

http://www.economist.com/blogs/freeexchange/2013/08/labour-markets-0

Samsung's free-falling financial flameout

Ian Joyner Bronze badge

Exploding washing machines

This is not the first time Samsung have released defective products.

Ian Joyner Bronze badge

>>Remember that story about Samsung paying shills to big up their company and to bad mouth the competition.<<

I missed that - do you have any specific references? Thanks.

Ian Joyner Bronze badge

Re: I'm sticking with Samsung

>>As has been pointed out by another commentard this could happen to any manufacturer.<<

Yes, it could, but it doesn't, because Samsung's business model against Apple is to quickly copy and rush a product to market. This strategy failed spectacularly and exposes a lot about Samsung.

The other difference is that Samsung is an electronics company, not a systems design company. Alas a lot of people when they look at a phone think electronics, but it is the systems and applications ecosystem that make smart phones interesting. That is what Apple invents.

Microsoft keeps schtum as more battery woes hit Surface sufferers

Ian Joyner Bronze badge

Business model

Microsoft have long had a business model - like IBM before them, and more lately Samsung - of just putting all others out of business, especially Apple.

One strategy for doing this is, once Apple have put in a few years of research to develop a product, to quickly make an ersatz copy and try to undermine Apple and steal their advantage.

IBM also had this tactic as explained in Richard DeLamarter's Big Blue: IBM's Use and Abuse of Power.

Microsoft rushed out Windows 1 - announced even before the Macintosh shipped, though it did not ship until late 1985 - and it was absolute rubbish. It also lifted heavily from the Macintosh, which is why Jobs went ballistic at Gates.

Lately Samsung has been literally burnt by this tactic. Alas, capitalism says leave it up to consumers to decide. Most often consumers don't know the facts and background in making a purchasing decision, and are hardly concerned with business ethics.

Samsung to fab 10nm FinFET SoCs for next year's exploding phones

Ian Joyner Bronze badge

Jokes?

It seems pro-Samsung people are making out that last week's very serious and legitimate criticism of Samsung's rush to market with faulty products was just a joke. No joke. Not funny.

However, I'll give Samsung credit for their legitimate business. And chip fabrication is something Samsung does well.

Samsung should stick to chip fabrication and TVs, not rush products to market to compete with Apple, who are in fact a large customer for Samsung's chips. Samsung should stick to their legitimate business, not be pirates in others'.

Cheer up Samsung! You might get back $400m for copying the iPhone

Ian Joyner Bronze badge

Anonymous Coward

>>What is this shit?

El Reg, start blocking some trolls, perhaps?

Grown-ups only from now on!<<

You seem so grown up, you won't even put your name to your post.

Ian Joyner Bronze badge

Well, you got that one right. Samsung's reputation is really damaged. However, people will go on buying it, because they didn't notice, or they forget (who remembers the washing machine fiasco now?). Some feel they must justify their position and get cognitive dissonance over it.

Ian Joyner Bronze badge

Actually, Michael, I mostly respond to the unfounded criticism of Apple and those who buy Apple. The assumption is that to buy Apple you must be a deluded fool and fanboi. I reject that notion because Apple has developed a lot of this stuff, led the industry, and made very good products. To buy Apple is not foolish fanboy stuff.

That might come across as being an Apple fanboy to some. Those who have some sort of religious faith here are those defending Samsung even though it has made some very bad blunders in moving into Apple's space because it sees Apple has established a product, a market, and Samsung wants some of that. It happened before to Apple, and Microsoft almost put them out of business.

I also write from the perspective of one who saw the industry before Apple or Microsoft were big, when other companies had much better computers than IBM, but the pro-IBM people rubbished the developments of others. When IBM failed they just moved into a hatred of Apple and aligned themselves to Microsoft.

So, actually, perhaps it is you who shows an "inability to recognise your own biases and narrow viewpoint".
