* Posts by Ian Joyner

622 publicly visible posts • joined 6 Jun 2014


Yes, your old iPhone is slowing down: iOS hits brakes on CPUs as batteries wear out

Ian Joyner Bronze badge

Re: No, this was a stupid idea, or half of an idea at best

"in Apple's case its to get more money from you"

No, they are extending the life of the product. Physical things deteriorate over time. This extends the life of the battery, so you won't have to replace it before it is really necessary.

It is amazing people are getting this exactly backwards. And this is being whipped up by the usual anti-Apple brigade who aren't interested in truth.

Ian Joyner Bronze badge

Re: Not much else you can do

"You'd think the Greens would be all over this, yet the handful I know in the office all have iPhones"

I think Greens and anyone intelligent can see this is extending the life of batteries and the iPhone.

Ian Joyner Bronze badge

Re: Apple acknowledged the situation in an email to The Register

"No, I think most people seeing their phone battery slowly reducing in capacity"

They might also think it's time for a new phone. What Apple has done gives the phone and battery extra life. That is a good thing, but it leaves them open to the anti-Apple spinners who make it look otherwise.

Ian Joyner Bronze badge

Re: Big Bro Apple knows whats best for us

Apple is the new IBM? Hardly. Do you understand how IBM did business to dominate the industry in a really bad, monopolistic fashion? Apple still has nowhere near the power to do that, but they wouldn't anyway.

Read Richard DeLamarter's "Big Blue: IBM's Use and Abuse of Power" to see just how different Apple and IBM are.

Ian Joyner Bronze badge

A good feature

Apple are extending your battery life. Slowing down operations keeps your phone going all day. Nothing would give people more incentive to replace their phone than a battery that lasts just two hours a day.

Batteries are an environmental problem, so keeping batteries going longer is a good thing. If Apple did not do this, people would be exchanging their batteries and phones even more frequently. Apple is thus extending your product's life and protecting your investment.

As for disclosing it: yes, it would have been nice, but they have done so now. They get criticised (mostly unfairly) either way. Don't listen to the critics – they are usually wrong, and pushing a barrow for the competition.

Ian Joyner Bronze badge

Re: Apple acknowledged the situation in an email to The Register

"Fucking hell, Apple really are scum."

Why do you say that? If you hate Apple so much, you had better not use any personal computer or smartphone or tablet because they are all modelled on what Apple has done.

In fact, you don't understand that Apple are extending your battery life. Slowing down operations keeps your phone going all day. Nothing would give people more incentive to replace their phone than a battery that lasts just two hours a day.

The End of Abandondroid? Treble might rescue Google from OTA Hell

Ian Joyner Bronze badge

Re: Yet Again Fail

drewsup: "I call BS in this, modern phones ARE a pc, with keyboard,modem and screen built in, in fact high end phones are being touted as "pc power""

No, if you read what I said before, many devices use computers inside. The user is hardly aware of the computer, just the general device functionality. Just because they run a computer does not make them a PC. The form factor of a phone is far away from a PC and the functionality is different. I still have a Macintosh because I don't expect the iPhone form factor to do the same things, nor even the 12" iPad.

The same will be true for Windows, even if full Windows is run on everything. That does not make it a good idea. Different systems for different forms is a much better idea.

Ian Joyner Bronze badge

Re: Yet Again Fail

"You really don't get it do you? Why are you comparing PCs to mobile phones?" Someone said.

"A smartphone IS a PC" I think someone replied.

A smartphone is absolutely NOT a PC. Would you suggest your washing machine is a PC because it has a computer in it? Your car? Etc? Etc? Etc?

No. These are all devices that happen to be driven by a computer. The thing is not to focus on the computer but focus on the device. You can use a PC for many things, but phones you use in completely different ways.

An example – you can write a book on a PC. While you could possibly do this on a phone, you most likely would not want to.

I think whoever said a smartphone is just a PC was being an apologist for Microsoft's "Windows Everywhere" approach. This is the wrong approach. It is the cheap way of doing it. Software is much better when it is tailored to the functionality and the hardware form factor.

Ian Joyner Bronze badge

Java advantage - only if it makes coffee

"Android has two advantages - it's essentially a Java phone, and Java was designed to be portable, and the base is open source code"

Those are not advantages. As we have seen time and again, consumers don't care what language any software is written in – only about what it does.

Neither is open source an advantage. But the Darwin base of iOS is also open source – so you are wrong on two counts there.

Apple sprays down bug-ridden iOS 11 with more fixes

Ian Joyner Bronze badge

Root - not a flaw in MacOS

This is not a flaw in MacOS. It was a flaw opened up in the 10.13.1 release. Before then it didn't exist. I clarified this with its discoverer, since experiments on my machine did not show up the problem. That was because it was patched very quickly and I already had the update installed.

Prior to this, root (the Unix all-powerful user) is disabled by default on MacOS (and OS X before). If you really have a use for root, you have to do a few things to enable it. Without root, no one can log in to it, but when enabled you can do things that only researchers and hobbyists might be interested in (not even developers generally need root).

So someone had temporarily broken the way this worked. The guy who spotted it reported it to Apple and it was quickly closed.

Voyager 1 fires thrusters last used in 1980 – and they worked!

Ian Joyner Bronze badge

Code does not deteriorate

If code was error free decades ago, it is still error free. If there are bugs in there, they were there in 1980. However, NASA spends a lot of time and money on testing – the kind of time and money most projects don't have.

So it is not surprising that code written and working in 1980 still works today. Hardware deteriorates, so the miracle is that the hardware still worked.

This illustrates a difference between software and hardware. But bespoke hardware built and thoroughly tested for NASA at great expense would be much more rugged than modern off-the-shelf hardware that most of us use and can afford.

Arm Inside: Is Apple ready for the next big switch?

Ian Joyner Bronze badge

Re: Complete rethink

"Well, if things like tagged memory are so cheap to implement, why haven't they been implemented in an opt-in fashion already"

So what is your point here? You are not making a technical point of any credibility. Besides, I told you it is not necessarily tagged memory (a topic you raised), but it is the kind of security tags give you. If there are other ways of doing this, fine.

Yes, change is radical and breaks things. In my experience, I – and others – have taken buggy code (from well-known developers), run it in environments where its behaviour is checked, and found many latent bugs. That is a good breaking of legacy applications; it tightens them up.

"Another possibility may be to implement this at an OS level to make it less architecture-dependen"

That will be much slower and not nearly as secure.

"Name an instance where a computer directly killed a user (analogous to when a car crashes into a tree or the like)."

I'm sure there have been many - like the recent Tesla car crash. Remember the Ariane V disaster - luckily no one was killed, but software interacts with the real world. Again you have no point because your original point that software doesn't kill people is patently false.

Ian Joyner Bronze badge

Re: Complete rethink

"You either take a noticeable performance hit or pay through the nose."

No, you are way overstating your case against a straw man, since you introduce tagged architectures, which I have not mentioned in this thread. Tagged architectures are one way of enhancing security. But the performance hit is probably negligible, because you can offset the need to concatenate tag+instruction with sophisticated look-ahead logic. Cost? Well, memory is really cheap these days, so a four-bit overhead on a 64-bit word is not much.
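
(A rough back-of-the-envelope figure of my own, not something from the thread: four tag bits on every 64-bit word is 4/64, i.e. about 6.25 per cent extra memory – a small price next to the cost of a breach.)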

Compared to the cost of security breaches these days, the cost of hardware is negligible. Many of these arguments come from a time when hardware was expensive and the equation was to make hardware cheaper. But the assumptions have changed. Now security problems cost a bomb and hardware is cheap.

"computers can't kill them"

Oh, yes they can.

Ian Joyner Bronze badge

Re: Complete rethink

"Well, security doesn't sell unless it's part of your job description."

Right. Security is a negative – it is trying to stop people doing things. An unfortunate but necessary aspect. The philosophy of 'the customer knows what they want, so do what they ask' has given way to 'the customer does not know what they want, so do what they need'. Security is in this category.

But we do know that security must be built in at the lowest levels, or we are always chasing our tail, trying to install some magic software to provide security, and that doesn't work.

Security is also the balance between making computing easy for a legitimate user but as hard as possible for a malicious attacker. As far as the legitimate user is concerned, security facilities built in at the lowest levels, such as bounds checking, make no difference and certainly do not adversely impact anything that is computable. In fact, in addition to security, they help developers develop correct programs.

Ian Joyner Bronze badge

Re: Complete rethink

Hello Milton. "As for security vs science, I respectfully suggest that may be missing the point."

I'm using science as a very broad category, though maybe it needs to be more tightly defined as really time-critical computation. Such work can indeed benefit from very low-level coding, with compilers directly generating microcode as in RISC. When computers were big and expensive, getting every ounce out of a CPU was of course the primary concern, and a large proportion of computing was used for scientific purposes.

But that is now a really small part of the market, and such systems should most likely not be connected to anything. For the rest, security is critical, much more so than performance. However, building security such as bounds checks into hardware can give you both security and performance for most applications.

Unfortunately, the performance considerations of the 1950s have long dominated the thinking of computing people, and in many minds performance still trumps security and correctness. That is what I am saying needs a complete rethink, or at least a change of perspective.

Ian Joyner Bronze badge

Re: Complete rethink

JoS. I mostly agree. We can get so much on a chip now that we really don't need RISC anymore. But we should also beware the excesses of CISC. I was going to say that Niklaus Wirth was doing RISC as Regular Instruction Set Computers a long time ago, but a little search found he is still working on such ideas:

https://www.inf.ethz.ch/personal/wirth/FPGA-relatedWork/RISC.pdf

Rust? Although I have had a quick look, I think we need to get further away from C and move towards zero-syntax or at least syntax-independent programming. C syntax now looks incredibly dated.

Ian Joyner Bronze badge

Complete rethink

What we need is a complete rethink of computer architectures. We have two kinds of processing requirements - scientific and the rest. Scientific needs raw speed, but everything else needs security. In scientific number crunching very little is shared – the processor is dedicated to a single task. Low-level RISC seems a good thing with a flat memory model.

However, for the rest, the system is shared between many applications and connected to the net. Security should be built in. If security is not built in at the lowest levels, it must be added as an afterthought. This means it is less secure, since you are always trying to catch up. This wastes more processor time and people time.

Shared systems work on non-linear memory allocated in blocks. Memory blocks should not be shared among processes (perhaps an exception can be made for very well-controlled and tightly-coupled processes – but memory-block sharing results in tight-coupling which is mostly not desirable).

Processes should only communicate via messaging. This means that applications can also be distributed, since processes are loosely coupled.

Since memory blocks should be kept separate and memory viewed as blocks rather than a sequential array, support for blocks and memory management should be built into the way a processor works – not added on as a memory-management unit (MMU). Bounds checking would become fundamental, and a whole class of security problems exploited by viruses and worms would be eliminated.

Program correctness would also receive a great boost, since one of the most common unwitting errors that programmers make is out-of-bounds access.
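
To make the bounds-checking point concrete, here is an illustrative sketch in Python (the check is done by the language runtime here, but the same check could live in the processor – the buffer and function names are just made up for the example):

# Illustrative sketch: what a checked environment does with an
# out-of-bounds access, versus C-style code that would silently
# scribble past the end of the buffer.
buffer = [0] * 16                 # a 16-element buffer

def careless_write(buf, index, value):
    # In unchecked C, buf[index] = value with index == 32 would corrupt
    # whatever happens to live after the buffer; nothing stops it until
    # something else breaks, possibly much later.
    buf[index] = value

try:
    careless_write(buffer, 32, 0xFF)      # deliberate out-of-bounds write
except IndexError as err:
    # With bounds checking, the fault is caught at the point of the error
    # rather than surfacing later as corruption or an exploit.
    print("Caught out-of-bounds access:", err)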

A special systems programming language for the lowest layers (memory allocation, etc.) should be developed, and used only for the lowest layers of the OS, nowhere else. Higher levels of the OS begin to look like applications programming, for which a more general-purpose language could be used. C should be relegated to being a legacy systems language, and languages that avoid the undesirable non-orthogonality of C (and especially C++) should be developed to improve the software development process.

There is a lot of work to do to achieve secure and correct computing. It does need to start at the lowest levels with a complete rethink.

Apple quietly wheels out 'Voxelnet' driverless car tech paper

Ian Joyner Bronze badge

Notorious?

"Apple is notorious for keeping its technology advances largely under wraps"

Please don't use silly adjectives to cast aspersions.

Apple is secretive because others are notorious for ripping off their ideas and work. Look at what happened when Gates rushed Windows to market after stealing Macintosh code. Apple decided - not again.

Some 'security people are f*cking morons' says Linus Torvalds

Ian Joyner Bronze badge

Re: Google's Pixel security team

"C Programmers are magicians." More like "Systems programmers are the high priests of a low cult." (1967)

"The Open Channel". Computer. 13 (3): 78–79. Mar 1980. doi:10.1109/MC.1980.1653540.

https://en.wikipedia.org/wiki/Robert_S._Barton#Quotes

It is a frequent excuse for C that you will get bugs in whatever language. But other languages will check for common mistakes and build an abstraction that is checkable when the system is built. With C you mainly have to wait until the system is deployed and hope that some nice person reports the bug, rather than suing you – or worse, that a malicious hacker takes advantage of it.

"You will have to worry about them later as someone will discover them and let you know, you then need to go back and fix them."

Bad philosophy - too late, too costly.

Apple succeeds in failing wearables

Ian Joyner Bronze badge

Re: "You have to tap or flick your wrist."

>> An expensive and clunky-looking watch that can’t tell you the time*

*Footnote

You have to tap or flick your wrist. Still.<<

So, your footnote contradicts your earlier sentence – it does tell the time, and it tells the time in whatever format you want. And if it didn't dim the display, you would be complaining that battery life was not very long. But it is good that mostly the watch face is not showing until you make a motion to look at it. It's called contextual computing.

Samsung shows off Linux desktops on Galaxy smartmobes

Ian Joyner Bronze badge

What for?

While hobbyists (which include a lot of IT people) might want to tinker about with Linux on a small device, most people want to use their phone as a communications device. Linux to them is a big ho-hum. But it just shows that Samsung want to pander to the IT hobbyist crowd, probably because lots of people go to such people and say "you are a computer expert, what should I buy?" and get the answer "Oh, Samsung, because it runs Linux". But that is a silly answer.

Brace yourselves, fanboys. Winter is coming. And the iPhone X can't handle the cold

Ian Joyner Bronze badge

More childish guff

More childish guff from the Register. When are you going to grow up and do sensible reporting on Apple?

From the headline, it sounded like iPhone X would be completely inoperable in slightly cold weather. But the admission half-way down:

"After several seconds the screen will become fully responsive again."

Before that though, the inclusion of your typical cliche "idiot-tax". Oh, all those other profit-making companies like Samsung are so noble - as if!

No, Samsung, you really do owe Apple $120m for patent infringement

Ian Joyner Bronze badge

Re: Who invented what?

"Quick Google of Samsung patents reveals they filed more US patents last year than any other company."

And patents are more often than not used to impede progress and hamper other companies that are prepared to put money into those ideas. That makes Samsung go down even further in my estimation.

Ian Joyner Bronze badge

Re: Who invented what?

Well, I guess no one can name anyone at Samsung who has made any - let alone significant - contribution to computing. Apple has long supported such people and their ideas. After several days, no response to my challenge.

That is a big difference between Apple and Samsung.

Ian Joyner Bronze badge

Re: Who invented what?

"Name any famous names associated with Samsung"

Well, still no one has risen to this challenge.

Firstly, I'll say that I admire what Samsung does in electronics. They mostly do a good job there. But these are the components out of which devices are made. (And Samsung have had a couple of disasters there, quite apart from the TV I bought that broke down after a month, and the external computer monitor that developed problems after only a year.)

But assembling electronics into devices is another complex process. For instance, we know touch screens had been around for years, but used in niche areas – it took Apple to design them into something generally useful, and the touch screen became useful because of the software and UI that Apple designed. Neither Samsung nor the others did that.

Perhaps an electronic engineer could point to academic research that Samsung has contributed.

But I still make the point that while I can name many people associated with Apple, Microsoft, IBM, Burroughs/Unisys who have furthered computing at one level or another, I can't think of anyone at Samsung.

Yes, Samsung make bold claims about their R&D:

http://www.samsung.com/us/aboutsamsung/samsung_electronics/business_area/rd_page/

but in many ways that R&D has just been to study Apple's ideas:

http://bgr.com/2012/08/08/apple-samsung-patent-lawsuit-internal-report-copy-iphone/

Samsung and other companies are also using open source for their research base:

http://ianjoyner.name/Open_Source.html

Ian Joyner Bronze badge

Re: Who invented what?

"That says a lot about your ignorance of Samsung, and almost nothing about their inventiveness."

Well, fill us in. Name any famous names associated with Samsung, their contributions to furthering computing, and papers they might have published.

Ian Joyner Bronze badge

Who invented what?

I can name a whole lot of famous people who have worked at and contributed to Apple. People who are well known in the industry apart from Apple. Apple understood their ideas and employed them, and they were happy to contribute those ideas because Apple was prepared to take the risk on what were unknown ideas at the time, even ideas that were contrary to industry-accepted wisdom.

When it comes to Samsung, I cannot think of a single name, or really any contributions Samsung have made to computing research.

I don't really like Microsoft or IBM, but at least I can think of names and significant contributions they have made. Google also. But Samsung? A big question mark.

Samsung to let proper Linux distros run on Galaxy smartmobes

Ian Joyner Bronze badge

Re: What I've been searching for... almost

No, I disagree. It is the wrong approach. See my response under "You are right!"

Ian Joyner Bronze badge

You are right!

From the article:

>>Your correspondent imagines that plenty of Linux users will enjoy the chance to run their preferred distribution on a smartphone. But the notion that developers will code “on-the-go” using Linux on a five-inch screen seems largely fanciful. Laptops are pretty good these days, as are the Android emulators that run on them. One of you will doubtless prove me wrong in the comments.<<

No, you are quite right. We have many useful devices and computers are useful in the implementation of these devices. The fact that they use computers at all should be transparent to the user. Computers have become ubiquitous.

So why this fetish with trying to make every device that runs a computer into a development system, or a system you want to play with different operating systems on? This is ridiculous. Development systems should be kept far away from user devices. Samsung is quite wrong in trying to make these devices into computers for computer hobbyists. Most people want useful devices, not computers – those who want to play with computers (like myself) are now a small minority.

In fact, Linux should be locked up in rooms with secure servers run by professionals, not put on end-user devices. Putting it on end-user devices comes from a view of the world in which computers are for computing people. Well, that is not true anymore and has not been true for a long time. Linux really isn't secure enough for end-user devices. End users require security to be taken care of automatically for them, whereas servers operate in secure environments. Of course the debate about Linux security has been going on for quite a while:

http://www.cs.vu.nl/~ast/reliable-os/

http://www.oreilly.com/openbook/opensources/book/appa.html

GarageBanned: Apple's music app silenced in iOS 11 iCloud blunder

Ian Joyner Bronze badge

Re: Yawn

It is not only, as you say, all those options (as Joel Spolsky – developer of Microsoft's best product! – has observed). Throwing in many options, I agree, shows lack of thought. But it is also a marketing thing: "let's make this product look powerful and complete". So you can fool many of the people much of the time – and that is what Microsoft does.

Apple by contrast works out carefully what will be useful – what we are really trying to do with a product – and sticks to that. They thus come up with far fewer options, but it takes sophisticated users (with a lot of analysis) to realise this minimalistic-looking product is just as powerful and useful as (probably more so than) a system with lots of options and buttons.

This has even been the problem in computing - people think computers must be complex and expose that complexity. On the contrary, computers are to control the underlying complexity of the world and present users with something that is both usable and useful.

Ian Joyner Bronze badge

Re: "Countless 32-bit apps"

>>It IS Apple's fault, because APPLE is the one who changed. Sorry if some of us buy things and expect them to last for a few years.<<

And Apple things tend to last for years longer than products from any other manufacturer. Your whole point just disappeared.

Ian Joyner Bronze badge

Re: "Countless 32-bit apps"

We've already been through this. Firstly, countless – no, not countless. And of those apps, most developers will have already recompiled in Xcode long ago and converted to 64-bit.

The example that Register gave was the Pure sound system. Well, that is Pure's fault, not Apple's. It is Pure that has not looked after their customers and (as I understand it), Pure outsourced their software development to a company that has disappeared along with the source code.

Apple has given developers more than enough warning to update their apps.

Ian Joyner Bronze badge

Re: Yawn

>>If you buy Apple expect to be made redundant<<

No, that is technology in general. Technology moves.

Actually Apple do a very good job of making sure that applications and their developers remain technology independent and thus move with the times. Some developers ignore Apple's guidelines thinking they can do better, or they might save a few processor cycles – but they pay for that in the end.

The move from OS 9 to OS X certainly did not happen overnight. Blue Box was provided for several years to allow OS 9 apps to be run under OS X. Blue Box was named after a telephone spoofing device Jobs and Wozniak had used in their youth.

https://en.wikipedia.org/wiki/Blue_box

So really, if you buy Apple, you are mainly protected from technology change – but Apple is also a leader in putting old technologies to bed in order to force the situation.

Ian Joyner Bronze badge

Re: "Countless 32-bit apps"

>>Apple fragmentation. Android just works<<

A touch of sarcasm there. But it's not true. Apple insisting that apps be updated to 64-bit actually reduces fragmentation. And it has done so in an orderly way.

A major part of application development is to test and make sure your app works on a new system release.

Linus Torvalds lauds fuzzing for improving Linux security

Ian Joyner Bronze badge

>>Rather a new tool called fuzzing to use in something still called testing, according to the Wiki anyway.<<

A new trendy name does not mean a new technique.

“The fuzzing of programs with random inputs dates back to the 1950s”

https://en.wikipedia.org/wiki/Fuzzing
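
For anyone who hasn't seen it in action, the core idea really is that simple. A toy sketch in Python (purely illustrative – the parser and its behaviour are made up for the example; real fuzzers add coverage feedback, corpus management and so on):

import random

def parse_length_prefixed(data: bytes) -> bytes:
    # A deliberately fragile parser: first byte is a length, rest is payload.
    length = data[0]                 # blows up on empty input
    return data[1:1 + length]        # silently truncates if the length lies

def fuzz(rounds: int = 10000) -> None:
    random.seed(0)
    for i in range(rounds):
        # Throw short random byte strings (including the empty one) at the parser.
        blob = bytes(random.randrange(256) for _ in range(random.randrange(8)))
        try:
            parse_length_prefixed(blob)
        except Exception as err:
            print(f"round {i}: input {blob!r} crashed the parser: {err!r}")
            break

if __name__ == "__main__":
    fuzz()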

Apple's iPhone X won't experience the joy of 6...

Ian Joyner Bronze badge

Why not buy the original?

Because Samsung is not the original – sure, they can make claims that the glass wraps around the edge, but what use is that? As is typical with Samsung, it looks good in the shop, but the reality is something different. The display is distorted by the curve, so it's fairly useless. Then you want to protect the glass, so you put a cover over it, which covers the edge.

Apple does not compete with that kind of 'innovation'. If there is a legitimate use for something, Apple will include it, but they don't just include things for a 'wow' factor to make a sale, like Samsung does.

Ian Joyner Bronze badge

Logic up in a puff of smoke.

Let's see: sales won't be as high as expected, which means the expected sales will be less than that, so the new expectation is less than the original expectation, which wasn't an expectation at all, because we weren't expecting the sales to be that high.

But maybe sales will be even lower than that expectation, meaning that wasn't an expectation either. Pretty soon we get down to the expectation that Apple won't sell any iPhone Xs, or maybe people will be giving them back to Apple even before they have been manufactured.

So for analysts or journalists to claim that sales won't live up to expectations means they have just disappeared into their own logic.

Hitting 3 nanometers to cost chipmaker TSMC at least US$20 billion

Ian Joyner Bronze badge

Price or opportunity?

Another Register spin trying to make it sound bad that TSMC must keep ahead in fabrication. Actually, since Scamscum will do this too, the Register could put it as the price of competing with that Korean company. So Scamscum are the bad guys in this.

But actually to suggest that it is either Apple or Scamscum is rubbish – the ever-shrinking fabrication processes due to Moore's law are what all fabricators are chasing. It's just that their customers want it and the competition is also trying to do it.

So Register's attempt at a negative portrayal of Apple here is just more childish garbage.

But I'll balance that with some kudos because most Register articles are good and informative - just drop the negative Apple thinking that brings out the Apple bashers.

Mainframes are hip now! Compuware fires its dev environment into cloud

Ian Joyner Bronze badge

Burroughs was best

Probably the most advanced architecture ever was the Burroughs B5000. It should be studied now as a regular instruction set with security baked in. The B5000 was not just designed as a CPU, but as an entire system.

https://en.wikipedia.org/wiki/Burroughs_large_systems

Ian Joyner Bronze badge

COBOL?

Dijkstra's take on COBOL was:

"The use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offence."

Now we have 'newer' languages to fulfil that role:

The use of C++ cripples the mind; its teaching should, therefore, be regarded as a criminal offence.

Microsoft: We've made a coding language for a quantum computer that may or may not exist

Ian Joyner Bronze badge

Re: Good old Microsoft

Microsoft did not pioneer vapourware; they got that from IBM. IBM pre-announced their STRETCH computer in order to slow the march of CDC, which was doing very well with its machines. A long-running anti-trust case against IBM was just dropped overnight by Ronald Reagan.

See Richard DeLamarter's "Big Blue: IBM's Use and Abuse of Power".

Australia commits to establish space agency with no budget, plan, name, deadline …

Ian Joyner Bronze badge

Space Aid?

Some Australians argue that foreign aid should be cut because we should help our own first. The thing is that those same people are the last to help our own anyway.

Now we are asking them to put money into space?

Bill Gates says he'd do CTRL-ALT-DEL with one key if given the chance to go back through time

Ian Joyner Bronze badge

Just horrible

It's just an all-round horrible concept. Makes using a computer seem like using a computer - cryptic.

How Apple is taming the ad biz. Just don't expect Google or Zuck to follow

Ian Joyner Bronze badge

The real price?

I think people are finally starting to realise the real cost of some of that hardware they buy – it comes at the cost of their privacy. Also, other electronics companies have very large markets apart from computing and smartphones, and subsidise their phone business to compete with (steal sales from) Apple.

Behold iOS 11, an entirely new computer platform from Apple

Ian Joyner Bronze badge

Computer?

A computer is exactly what a device like the iPad should not be. Computers are for computer people in white coats to tinker with. Apple changed the paradigm with the Macintosh, introducing an appliance. These devices and appliances happen to be implemented with electronic computer technology, but users should not be exposed to that fact of the implementation. Apple even dropped the word 'computer' from its name.

I'm a bit concerned about exposing things like file systems, although as an old-time computer person, file systems are ingrained in the way that I think. But files expose the memory hierarchy which itself is just implementation. Users should never have to think in these terms - only in terms of what they need to do.

I hope this is not a backward step for iPad, but I have only had iOS 11 for 24 hours and mostly it seems pretty good.

The OS X dock came in for criticism by an old Apple designer and UI expert Bruce Tognazzini:

http://www.asktog.com/columns/044top10docksucks.html

Apocalypse now: Ad biz cries foul over Apple's great AI cookie purge

Ian Joyner Bronze badge

1984

The two competing views of computers were that 1) they were centralised and could be used to control people and direct an army of workers to do what management wanted – this was business; and 2) computers should be a tool for people to use for their own creative processes (Doug Engelbart, Vannevar Bush, Ted Nelson, etc). That set up 1 against 2 – IBM in the first corner and Apple in the second corner.

There are still companies that want to control how people think and that is what marketing and advertising are about. Apple is not going to be popular over this one.

https://www.youtube.com/watch?v=lSiQA6KKyJo

So now maybe 2017 won't be like 1984.

Samsung mobile launches bug bounty program

Ian Joyner Bronze badge

User testing

Just like before, when Samsung (with which the Register continues its love fest by calling it Sammy) did not test sufficiently and left it to users to discover whether their batteries would burst into flames.

Sacre bleu! Apple's high price, marginal gain iPhone strategy leaves it stuck in the mud

Ian Joyner Bronze badge

Re: Compare like with like

The thing about open source is it is being used by companies as a quick entry to market to make huge profits out of the free work of many open source contributors, who work on open source precisely because they don't like those kinds of companies.

http://ianjoyner.name/Open_Source.html

Apple’s facial recognition: Well, it is more secure for the, er, sleeping user

Ian Joyner Bronze badge

Security

What we are trying to do is to make computers as easy as possible to use for legitimate users, but as difficult to use as possible for illegitimate users.

Those two extremes are difficult to achieve.

Security is based on what you are, what you know, and what you have. Facial-feature recognition is the 'what you are' factor. Two-factor authentication is also important, since combining mechanisms makes things more secure.

Kerckhoffs's principles are still important.

>>In 1883 Auguste Kerckhoffs [2] wrote two journal articles on La Cryptographie Militaire,[3] in which he stated six design principles for military ciphers. Translated from French, they are:[4]

The system must be practically, if not mathematically, indecipherable;

It should not require secrecy, and it should not be a problem if it falls into enemy hands;

It must be possible to communicate and remember the key without using written notes, and correspondents must be able to change or modify it at will;

It must be applicable to telegraph communications;

It must be portable, and should not require several persons to handle or operate;

Lastly, given the circumstances in which it is to be used, the system must be easy to use and should not be stressful to use or require its users to know and comply with a long list of rules.

Some are no longer relevant given the ability of computers to perform complex encryption, but his second axiom, now known as Kerckhoffs's principle, is still critically important.<<

https://en.wikipedia.org/wiki/Kerckhoffs%27s_principle

Five ways Apple can fix the iPhone, but won't

Ian Joyner Bronze badge

What I expected and what I got...

When you said five ways Apple can fix the iPhone but won't, I expected some sensible points that would stop me going and buying the iPhone 8 this year (since I am most likely to put off purchases when I think feature x, y, z will be in the next model).

But instead I only really got one 'sort of' reason, about the battery. Actually, a battery that lasts 10 days would be really great. I'm sure if they had a two-day battery (which is what most of us get anyway), the Register would be saying Apple should have put in a four-day battery.

Well, at least Apple tests their products (really thoroughly) and don't put in BATTERIES THAT EXPLODE like some!

Meanwhile Apple is putting research into batteries, even car batteries from which everyone will benefit, even Apple's competitors which aren't ashamed to copy anyway.

https://www.macrumors.com/2017/07/20/apple-catl-battery-research-report/

As for the other four reasons, Register even contradicts itself by saying Apple probably will do something along these lines. That contradicts the 'but won't'. The headphone debate has already been won by Apple.

So, why I won't buy a new iPhone 8 this year is because I only just got the iPhone 7 in January, and it's still doing fine.
