* Posts by Ian Joyner

622 publicly visible posts • joined 6 Jun 2014


Apple Pay a haven for 'rampant' credit card fraud, say experts

Ian Joyner Bronze badge

Re: ApplePay is very secure

"I guess "ease of use" and "ease of fraud" go hand in hand"

I'll address that another way: making systems hard to use is security by obscurity, and that is known not to be a good security strategy. Excellent security systems are also simple and provide ease of use. Apple has really excelled on that count with ApplePay.

Ian Joyner Bronze badge

Re: ApplePay is very secure

"I dont get how Apple Pay is more secure than carrying your credit card in your pocket.

With my card, its a physical card, it has my signature, chip and a PIN number to verify that it is in fact me using a physical card."

The physical card can easily be stolen. If they steal your iPhone, they need your fingerprint to access the credit information. ApplePay is more secure than physical cards.

"I guess "ease of use" and "ease of fraud" go hand in hand"

No, that is absolutely not true.

Ian Joyner Bronze badge

ApplePay is very secure

ApplePay is far more secure than carrying your credit card around in your wallet. It is far less prone to fraud thanks to its inbuilt security mechanisms.

However, fraudsters will always try different avenues, and these apply to any contactless payment, not just ApplePay (it's just that Apple is so visible that putting Apple in a headline makes for a better story).

If fraudsters access a retail vendor's server database, that is the fault of the vendor, who is in turn a victim of the fraudsters.

However, loading stolen card details onto a smartphone is a risk for the fraudster, since the phone is more likely to be tracked (as with Find My iPhone). Maybe those anti-fraud measures are not in place at the moment, but it is easy to see that the back end could be tightened up in this way. No need for FUD against ApplePay, thanks.

IBM chasing ex-staffers for $20 payments

Ian Joyner Bronze badge

Read Delamarter's Big Blue: IBM's Use and Abuse of Power

http://library.alibris.com/Big-Blue-IBMs-Use-and-Abuse-of-Power-Richard-Thomas-Delamarter/book/689875

Windows is TAKING the TABLET market... what's left of it, anyway

Ian Joyner Bronze badge

What are computers for - the two basic paradigms

The discussion here is really based on the two basic views of computers. IBM's view was based on business - computers become the focus of work and drive the workforce to do data entry, etc. This is the first paradigm - computers control people. That's the 'Business' in IBM.

Apple came along (with Xerox PARC) and said people should control computers and use them as a tool for creative purposes. This overturned the thinking of a lot of the industry. A lot of people still don't like this reversal of the paradigm and hence bash Apple at every opportunity.

There is actually a third paradigm that John McCarthy (inventor of LISP) believed in - that AI would completely replace human intelligence. But that paradigm seems rather pointless.

Ian Joyner Bronze badge

Re: Slabs losing sales? Not surprising

"Execs have tried iPads and finally admitted that they were a solution looking for a problem."

Just a meaningless slogan. In fact, it is the opposite: Apple looks at the way people will use a platform and builds the platform around that. MS's earlier attempts to build a tablet were very much a solution looking for a problem, but the success of the iPad has come precisely because Apple worked out what the problem was and built a solution around it. They have always worked out ways for the masses to use computers - not just geeks, nerds, and managers.

Ian Joyner Bronze badge

Re: 2.3%?

That just shows that a lot of old IT people want to damage Apple.

Give it up. Apple has for a long time produced better hardware and software, as well as having had the right ideas in the first place.

The old computing environment propped up IBM with FUD against worthy others such as Burroughs; the same was then applied in favour of MS against Apple.

This is the old industry - time it died.

Ian Joyner Bronze badge

Re: I'd consider Surface more of a super ultrabook than a tablet

Yes, sadly, there are still a lot of old-time IT people around who in the old days were only allowed to buy IBM (when there were vastly better mainframes available) and then succumbed to Microsoft as the replacement IBM, even though the IBM PC was way behind the Macintosh in 1984.

I get sick of reading specs for orthogonal systems (ones that aren't dependent on an MS ecosystem) that say the solution must be MS-based, because the IT people are way behind the sophistication of the average user these days.

Ian Joyner Bronze badge

Re: I'd consider Surface more of a super ultrabook than a tablet

The tablet form is much more suited to content consumption than to content production. If you are a serious producer, the desktop is still best, with laptops a close second.

Apple tailored the OS to the tablet form and it will be more efficient in terms of using limited resources such as memory, processing power, and battery life.

They have tailored their content-producing iWork package for the tablet and this is useful, but you don't want to use it for really serious, everyday work. Any tablet will have exactly the same limitations, whether you attempt to run heavyweight software on it or not.

Of course, the form factors may merge down the track, but for now a rethink of the software as Apple has done is a breath of fresh air. The others have followed, except for Microsoft, which has simply rushed to play catch-up.

'If you see a stylus, they BLEW it' – Steve Jobs. REMEMBER, Apple?

Ian Joyner Bronze badge

Re: Fascinating how things always evolve back towards the Dynabook

>>Anything good enough to allow for experimentation would, in effect, be a programming language, and the Apple Store has explicit rules against that.<<

Absolutely not. There are lots of ways of learning that don't require a programming language.

If you want to do programming - do it on a Mac.

Ian Joyner Bronze badge

Re: Fascinating how things always evolve back towards the Dynabook

Dynabook was the brainchild of Alan Kay at Xerox PARC. Kay was a student of Bob Barton - designer of the B5000. When Xerox did not want to take PARC's work anywhere, Kay and many others (Larry Tesler among them) joined up with Steve Jobs and Jef Raskin at Apple, who were doing similar things.

The iPad eventually fulfilled Kay's vision. Many of Kay's writings are online and are all interesting reads.

Ian Joyner Bronze badge

Apple regularly challenges and breaks the rules - including its own sometimes.

They were first to introduce 5.25" floppies instead of cassette tapes. When IBM PCs adopted 5.25s, Apple went to 3.5" 800K hard-cased disks. The IBM brigade were horrified - 3.5s "weren't real floppies".

Apple then were first to drop 3.5s. Then they dropped CDs and DVDs. You just download content directly and more cheaply from the net.

Apple has adopted many technologies only to drop them. They are not precious and religious as many in the computing industry are.

This latest story - IF TRUE - would just be another example of that.

Scientific consensus that 2014 was record hottest year? No

Ian Joyner Bronze badge

Re: Well

code junky >>100%. 90%? 80? 0.1 in a margin of error of 0.5? To provide such wonderful benefits as producing more co2 for new methods of 'ehem' energy production which doesnt really work (but dont say it too loud) and is massively inflating the cost of energy while increasing the chance of energy shortage.<<

Science is never 100% sure. Even Newton has been proven to be wrong - however his equations are still good enough to get us to the moon and planets.

That is the same kind of certainty that we have with global warming. As I say, even if global warming is wrong (and we all hope it is), the effect of getting more efficient (read cheaper), less polluting (pollution being a cost that polluters don't currently pay for) energy that is available to many more people is a good thing.

So irrespective of whether AGW is correct, it looks like a win-win situation and at worst a lose-win.

Ian Joyner Bronze badge

Re: warmists or sceptics

I think what you are describing are the arguments on the denialist side: straw men, ad hominem attacks, cherry-picking, false petitions, etc., etc.

However, we should not be affected by the bad arguments on one side or the other. Truth is separate from the aggressive arguments of some.

Ian Joyner Bronze badge

Re: Well

Where do you get this nonsense from? Are you just putting it in here to confuse people? That seems to be the tactic these days:

http://www.filmsforaction.org/watch/nonlinear-warfare-a-new-system-of-political-control-2014/

Ian Joyner Bronze badge

Re: Well

High water marks change all the time. The highs and lows of tides are not constant. I have very amusing pictures of a king tide at Richmond last year when the Thames came up over the road and a jeep parked there looked like it was walking on water.

Tides vary the world over - they can be 1 metre in one place and 9 metres in another. Low-lying island nations are complaining that the sea is rising on them.

How much will sea levels rise if the Arctic entirely melts? I'll tell you: by nothing! But the Arctic ice is melting, and the Antarctic will follow, which will result in rises.

Even if we cannot be 100% sure - the safest course of action is to take evasive action. This will result in a whole lot of beneficial effects like cheap energy for many people who don't have it now.

One thing about people and scientists who acknowledge AGW as real is that we hope we are proved wrong - but until then, don't take the risk. That is a huge gamble that denialists are taking.

Ian Joyner Bronze badge

Re: You want some real nonsense?

You got it exactly, James. This isn't the first topic where I have had to point out Anonymous Coward's nonsensical statements. He or she can't even put their name to what they say - too ashamed, I'd say.

Ian Joyner Bronze badge

Re: Decide for yourself.

Complete lies, Haefen. Scientists are not after money and power - it is the climate-change deniers who have money tied up in fossil fuels who are after money and power.

I see so many lies going around on the part of climate deniers that they have no credibility left. Money and power is just one of the lies. Cherry picking hot years, lack of understanding about what CO2 is, etc, etc.

Having concern for the planet and people is not being after money and power at all.

Ian Joyner Bronze badge

Re: Picking ever shorter time periods to deny climate change

Anonymous Coward cites:

http://www.drroyspencer.com/latest-global-temperatures/

Looking at this site, it is a climate-denier site, and it says:

"Believe it or not, very little research has ever been funded to search for natural mechanisms of warming…it has simply been assumed that global warming is manmade"

This is a common anthem of climate deniers - that the warming figures don't take into account natural causes (Bob Carter cites volcanoes and the heat-island effect of cities). But this is not true; climate science does take these factors into account.

>>The earth surface measuring network is so flawed that it beggars belief that so much bad policy is based on it as the primary source.<<

Where do you get this assertion from?

Ian Joyner Bronze badge

Lewis Page's reading of the BEST study

http://static.berkeleyearth.org/memos/Global-Warming-2014-Berkeley-Earth-Newsletter.pdf

is extremely selective. Note what they say:

>>1. The global surface temperature average (land and sea) for 2014 was nominally the warmest since the global instrumental record began in 1850; however, within the margin of error, it is tied with 2005 and 2010 and so we can't be certain it set a new record.

2. For the land, 2014 was nominally the 4th warmest year since 1753 (when the land surface temperature record began).

3. For the sea, 2014 was the warmest year on record since 1850.

4. For the contiguous United States, 2014 ranked nominally as the 38th warmest year on record since 1850.<<

Point 3 says for the sea 2014 WAS the warmest year on record. The oceans account for 70% of the Earth's surface area and absorb most of the heat (90% if I recall right). Combine this with point 2, the land being the 4th warmest year, and it is easy to see that for the whole planet 2014 was the hottest. Of course, that could still be an anomaly, and if so, denialists in the future will be cherry-picking 2014 and saying "there has been no appreciable warming since 2014", just as they cherry-pick 1998 and 2004/5.

Ian Joyner Bronze badge

>>But it might be worth remembering that the former are arguing for massive government and economic action, action which people would not take voluntarily - that is action which will make people poorer, then. In other words the warmists want to take away your money and your standard of living (for your own good, they would say).<<

Now that IS complete rubbish. People who are concerned about global warming are concerned about preserving our way of life, making it even better, NOT destroying it. Lewis' statement here is alarmist against the 'warmists'.

The proof is in Lewis' next sentence:

>>And standard of living is not just consumer goods, it's health care, it's regular showers and clean clothes, it's space programmes<<

Right, so NASA is really behind the warming agenda, trying to kill off its own programme. If NASA were dishonest they would be keeping global warming a secret. Either that, or Lewis's statements about 'trying to destroy lifestyle' are wrong.

Big Blue's biggest mainframe yet is the size of a fridge

Ian Joyner Bronze badge

More information about Burroughs is available at:

http://ianjoyner.name/Burroughs.html

Ian Joyner Bronze badge

Hello Michael. >>Similarly, while ALGOL has ... some things to recommend it, it also has some grievous infelicities, such as pass-by-name<<

No, pass-by-name is the basis of today's higher-order functions. It is pass-by-reference that is a disaster, and it is the mainstay of C, which makes C a low-level language rather than an HLL.

C really isn't an HLL. Operators such as ++ attest to that. OK, characterize C as a low-level language instead of a structured assembler, if you will.

>>As for Dijkstra's comments - well, the man was a crank, frankly.<<

That pretty much undermines any value in your comments. Dijkstra was one of the programming geniuses of the 20th century. What are you thinking of as his complaints? "Go to statement considered harmful", perhaps? In fact, Dijkstra hated that title, but was forced to use it by his publisher. However, if we are to have structured programming, unbridled gotos are forbidden. Donald Knuth showed how gotos may be used in an entirely structured way, but that does not justify the unbridled goto.

Dijkstra's writings on many topics are available on the web and still make very good reading.

One of his observations was that there is a big difference between European computation science (software based) and American computer science (hardware and product based). That is really the difference between Burroughs and other vendors - Burroughs (and Robert Barton) believed in computers that should be programmable and designed by programmers. All other vendors more-or-less designed circuits first and then told programmers to try and make something of it.

http://en.wikipedia.org/wiki/Robert_S._Barton

The IBM 360 does have something to commend it - the best thing to come out of it was Fred Brooks' The Mythical Man-Month. But perhaps he is just another crank or curmudgeon in your assessment?

Ian Joyner Bronze badge

Re: Mainframes are dead are they?

There is a very good book about Eckert, Mauchly, and Univac called "A Few Good Men from Univac".

Ian Joyner Bronze badge

Re: Burroughs beat IBM by over a year with the B5000.

Hi Arnaut. >>The B5000 may have been groundbreaking but, as it ended up being emulated on Xeon processors, doesn't that mean that the original architecture was a dead end?<<

Not at all. The B5000 (now Unisys ClearPath MCP) was designed as a high-level architecture with features such as virtual memory and security baked in. Security was baked in since processes are not allowed to write out of the bounds of allocated memory (both on and off stack). This was surprising for 1961, when security was hardly an issue. The basis of most viruses and worms is being able to write out of bounds.
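To illustrate the idea, here is a toy sketch in Python (mine, purely illustrative - not Burroughs code): every reference carries its bound, and every access is checked against it, so an out-of-bounds write faults instead of silently corrupting a neighbouring area - exactly the behaviour most viruses and worms rely on.

# Illustrative toy only - not Burroughs code: a 'descriptor' carries the
# bound of the area it references, and every access is checked against it,
# so an out-of-bounds write faults instead of corrupting a neighbour.
class Descriptor:
    def __init__(self, length):
        self._data = [0] * length   # the allocated area
        self._length = length       # the bound travels with the reference

    def _check(self, index):
        if not 0 <= index < self._length:
            raise MemoryError("bounds violation at index %d" % index)

    def write(self, index, value):
        self._check(index)
        self._data[index] = value

    def read(self, index):
        self._check(index)
        return self._data[index]

buf = Descriptor(8)
buf.write(3, 42)    # within bounds: fine
# buf.write(9, 99)  # would raise MemoryError rather than overwrite a neighbour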

The B5000 took a different approach to computer architecture - it had software people design the architecture to be programmable, rather than leaving it to electronics engineers who designed circuits pleasing to themselves and then handed them to software engineers to program in assembler, only to find the machine was pretty well unprogrammable.

http://www.computer.org/csdl/proceedings/afips/1961/5058/00/50580393.pdf

Towards the late 1970s, with more powerful hardware, it was realised the architecture could be emulated on cheap commodity processors. This is the basis of today's virtual machines such as the JVM. It also came out of the B1000, which had a different emulated machine for every language.

http://www.computer.org/csdl/proceedings/afips/1963/5062/00/50620169.pdf

The B5900 came out in the early 1980s.

http://jack.hoa.org/hoajaa/BurrMain.html

Since then the B5000 line has practically been all emulated.

As for the IBM 360's success - IBM had a very 'effective' (to put it nicely) marketing force; Burroughs' marketing was hopeless. But that kind of commercial success does not reflect the architectural merits of the machines, where Burroughs wins hands down. It is not just a clever architecture - it is a very practical, secure, and efficient one. HP's stack machines also came out of Burroughs thinking, which is where their reverse-Polish calculators came from.

"Affordable and scalable downwards has turned out to be the key to success"

That is exactly why current B5000s are emulated. You can go from running it on a laptop to the biggest transaction-processing mainframes.

Now I could bemoan the fact that Unisys management did not get it (the way Xerox management did not get what PARC were doing). The architecture should have been decoupled from mainframes and made for personal processors.

Any future development in architecture should take what is in the B5000 and extend it to today's needs, rather than wallowing around in low-level insecure architectures.

Ian Joyner Bronze badge

IBM did not usher in the mainframe era - Burroughs beat IBM by over a year with the B5000.

https://wiki.cc.gatech.edu/fol...

The B5000 not only beat IBM in terms of time, it was a far more capable machine - the first commercial machine with virtual memory, an idea from Manchester University, ten years before IBM claimed to have invented it. In fact, most systems need MMUs to implement paging, but in the B5000 it is built in from the ground up, with no need for an MMU.

Burroughs used plug and play - not only did it allocate and deallocate memory on the fly (part of its stack mechanism as well as virtual memory), but peripherals could be plugged in without the need to do an IBM-style SYSGEN. I was told last year that on IBM z/OS you still need to allocate memory partitions for programs - surely this cannot still be true?

The B5000 line is still well in advance of IBM in the Unisys Clearpath MCP systems.

There were so many other innovations in the B5000 that it is still ahead of its time, even in 2015. In fact, all computing people should study its architecture and understand why it is still an achievement over 50 years later.

http://en.wikipedia.org/wiki/B...

The IBM 360 by contrast was retrograde and a disappointment to the computer architects of the time, notably Edsger Dijkstra who said "In my Turing Lecture I described the week that I studied the specifications of the 360, it was [laughter] the darkest week in my professional life".

http://cacm.acm.org/magazines/...

https://www.cs.utexas.edu/user...

The achievement of Burroughs and its chief designer, Bob Barton - whom very few computing people have heard of, though they know the names of Amdahl and Cray - should not be underestimated, especially as Barton went on to impart many of his ideas to students as a professor at the University of Utah, including Alan Kay and John Warnock.

The B5000 was also designed around a high-level language, ALGOL, which is far superior to C; C is not really an HLL but rather a structured assembler, exposing all the foibles of the underlying machine. The B5000 had its OS and all systems software written exclusively in ALGOL, years before Unix with C.

If one gets excited by mainframes, the Burroughs line is it. IBM is just ho hum.

Ian Joyner Bronze badge

Unisys are also releasing mainframes all the time, notably ClearPath MCP machines continuing the original high-level system structure of Bob Barton's B5000 from 1961.

http://en.wikipedia.org/wiki/Burroughs_large_systems

http://en.wikipedia.org/wiki/Burroughs_MCP

http://www.unisys.com/offerings/high-end-servers/clearpath-systems/clearpath-libra-systems/clearpath-libra-8380-and-8390-systems

and you can get it on a laptop

http://www.unisys.com/offerings/high-end-servers/clearpath-systems/clearpath-libra-systems/software-developer’s-kit-on-clearpath-lx-laptop

Apple's 16GB iPhones are a big fat lie, claims iOS 8 storage hog lawsuit

Ian Joyner Bronze badge

Re: Would a car buyer complain about the space the engine takes up?

More interesting reading:

http://en.wikipedia.org/wiki/Von_Neumann_architecture#Von_Neumann_bottleneck

http://web.stanford.edu/class/cs242/readings/backus.pdf

https://news.ycombinator.com/item?id=5038330

https://books.google.com.au/books?id=D4nLBQAAQBAJ&pg=PA18&lpg=PA18&dq=burroughs+von+neumann+bottleneck&source=bl&ots=EwrEP_nBzD&sig=ds59XJIJpu7yGw27O5l46RUQqMs&hl=en&sa=X&ei=dQiyVNvWFcm58gWBhoDwBw&ved=0CFcQ6AEwCQ#v=onepage&q=burroughs%20von%20neumann%20bottleneck&f=false

Thanks for pointing out the Harvard architecture - it has clarified what is meant by the Von Neumann bottleneck.

Ian Joyner Bronze badge

Re: Would a car buyer complain about the space the engine takes up?

Charles - logical and virtual partitioning is fine, but a physical partition, or one that is fixed at system compile time (or SYSGEN, as this horrible mechanism was called on IBM systems), is a very bad idea. Modern computing has moved on from that. Do you mean Android really goes backwards and does that (aside from Dalvik making registers visible to programmers - a really bad, but common, idea)?

Partitioning is static and is known to waste resources, particularly memory. Customers aren't happy when their disk space runs out while there is a heap of free but wasted space the OS can't make use of.

Your PS is confusing two things. Programs are loaded into the same RAM memory as data. This is the von Neumann model. However, a program should not be treated as data itself which can be overwritten (except in a virtual environment like LISP and descendants).

If a program can overwrite program code, that results in all sorts of security breaches.

I really suggest you study the B5000 architecture where both these aims are achieved - so yes the two are possible (because it is dynamic and the system configures itself on the fly, not at SYSGEN time).

Ian Joyner Bronze badge

Re: The storage is there, as advertised.

Actually, the web was developed on NeXT machines, whose operating system is the basis of today's OS X on the Mac.

[1] An OS is a resource manager. As such it gives programs as perfect an environment as possible. Program loading is part of this function, as is memory allocation (and garbage collection), media management, etc.

[2] That is the power of computers - you can really get it wrong and they are still very powerful (Church-Turing thesis).

[3] I can't improve on that one!!!!!!!! ;)

Ian Joyner Bronze badge

Re: Would a car buyer complain about the space the engine takes up?

Charles 9: >>Well, whatever happened to TWO nonvolatile stores: one for the OS that ISN'T counted, and one for the user space which IS counted?<<

That went out sometime in the 1950s when John von Neumann (and actually people before him, but he is the most widely recognized one) recognized that programs and data could both go together in main memory. Before that, programs were held separately.

Partitioning is a pain. If the manufacturer kept a separate partition for the OS, it would either

1) risk that the next version of the OS was larger than the partition, or

2) waste storage space reserved in the partition for updates.

Partitioning is a bad thing (when I asked someone last year, I was told IBM still sets up their systems this way, with separate partitions for programs in main memory - I'd hope he is wrong).

So the idea of a separate partition for the OS is not practical or flexible. We like to keep flexibility in computing, even if it means some overhead.
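To put rough numbers on those two failure modes, here is a toy sketch in Python with assumed sizes (not real OS figures): a fixed partition is either outgrown by a later OS version or sits partly wasted, whereas dynamic allocation uses only what is needed.

# Toy arithmetic with assumed sizes, not real OS figures: a fixed partition
# either wastes space (failure mode 2) or is outgrown by a later OS
# version (failure mode 1); dynamic allocation avoids both.
PARTITION_GB = 6.0

for os_version, os_size_gb in (("v1", 3.5), ("v2", 4.5), ("v3", 6.5)):
    if os_size_gb > PARTITION_GB:
        print("%s (%.1f GB): no longer fits the fixed %.1f GB partition"
              % (os_version, os_size_gb, PARTITION_GB))
    else:
        print("%s (%.1f GB): %.1f GB sits wasted in the partition"
              % (os_version, os_size_gb, PARTITION_GB - os_size_gb))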

Ian Joyner Bronze badge

Re: @Vic (was: The storage is there, as advertised.)

MS-DOS on IBM PCs - who cares? A garbage OS on garbage machines.

Ian Joyner Bronze badge

Re: The storage is there, as advertised.

Jake:

1) For a start, I'm not self-proclaimed anything. Did I say anything to that effect?

2) Which wiki are you talking about? There are lots of them.

2a) If you mean Wikipedia, it is quite good on technical issues.

3) If you think I am wrong, put up an argument, not smug derision and personal attacks.

Ian Joyner Bronze badge

Re: Would a car buyer complain about the space the engine takes up?

Charles 9: >>Then how about this? The interior of the car has 10 m^3, but the seats and dash occupy 4-5 m^3 of it. At some point, this smacks of "half the truth, twice the lie." <<

You are trying to disprove the original analogy by obfuscation and complexity. I think the original clearly explains to users why you don't get all the space as specified.

Software that takes up non-volatile storage space is the norm. It is just too complex to measure it another way.

Ian Joyner Bronze badge

Re: @ Ian Joyner (was:@Vic (was: The storage is there, as advertised.))

Vic: >>programmers should never have direct access to registers

Errr - bullshit.<<

Defend that claim. I'll defend mine (I hope you can understand it). All computable functions can be computed with a single flat memory space. Hierarchical memory adds nothing to computational power - it is merely an implementation detail for efficiency (computations go faster when data is close to the CPU).

When programmers start deciding which registers data should be loaded into, their programs become implementation-dependent and not so easily moved between machines. Register allocation should be determined only by compilers or, better yet, the runtime system (Google's Dalvik is just weird).

Consider Turing machines, with their flat, infinite memory. Then for a practical example look at the Burroughs B5000/Unisys MCP systems - the best CPU and systems architecture ever.

http://en.wikipedia.org/wiki/Burroughs_large_systems

Yet another example is cache memory - what is kept in cache is decided entirely without the programmer's knowledge - and that should be extended to registers, which are a kind of cache. In fact, variables in programs are just a cache of previously computed results. Perhaps we can do away with programmer-accessible variables altogether.

What programmers must learn is that most of what they have been taught is rubbish.

Ian Joyner Bronze badge

Re: @ Ian Joyner (was:@Vic (was: The storage is there, as advertised.))

>>"Gates' mother was on same charity committee as IBM chairman"

Rumor & innuendo. Post proof this had anything to do with anything? I have never seen any. Wiki is not "proof".<<

Another reference: Overdrive by James Wallace, p. 145:

"Gates said his mother never stopped stressing the importance of family... He recalled the well-known story of how his mother's connections with former IBM Chairman John Opel had helped Microsoft make the deal of the century when Big Blue needed an OS for it first PC in 1980 and came calling on MS. At the time, Opel and Mary Gates served together on the national board of United Way."

See - you really can't make this stuff up.

Ian Joyner Bronze badge

Re: Would a car buyer complain about the space the engine takes up?

>>No, the case is equivalent to you ordering a car with space for 7 people, but, when it's delivered, there are only 5 seats. The engine is taking up the space where the other 2 people could have fitted.<<

Seemingly a good point, but that's not the analogy at all. The analogy just says the total space is something like 10 cubic metres. Computer storage has nothing as concrete as the number of people you can get in a car - your files can be any size whatsoever. The 16, 32, 64, or 128 GB figure is just an indication of how much user data you'll fit on, but there is no definite measurement equivalent to a person's size.

Ian Joyner Bronze badge

Re: @ Ian Joyner (was:@Vic (was: The storage is there, as advertised.))

>>"Gates' mother was on same charity committee as IBM chairman"

Rumor & innuendo. Post proof this had anything to do with anything? I have never seen any. Wiki is not "proof".<<

Not rumour or innuendo at all, but fact.

http://en.wikipedia.org/wiki/Mary_Maxwell_Gates

http://www.nytimes.com/1994/06/11/obituaries/mary-gates-64-helped-her-son-start-microsoft.html

The whole story including QDOS (do you deny that as well?)

http://forwardthinking.pcmag.com/software/286148-the-rise-of-dos-how-microsoft-got-the-ibm-pc-os-contract

This stuff is not hard to find. I suggest you do some research, as with my comment on computational science and infinite memories. And actually Wikipedia is quite good on technical issues.

>>"(which was very similar to the Apple II in overall architecture)."

So 6502s are similar to 8088s? Interesting. I never observed that. Care to share the particulars? I might just learn something.<<

I'm not saying the 6502 is similar to the 8088, although the programmer-accessible, register-based microprocessor architecture is similar (programmers should never have direct access to registers).

The system architecture is very similar: 5.25" floppies, slots for extra devices. Those were copied from the Apple II.

>>Note that I'm not a PC+Microsoft fan, nor am I anti-Apple[1]. <<

Good.

>>I'm just irritated by all these supposed "facts" (which aren't) being propagated as "truth".<<

No, you are irritated by the fact that you have an erroneous picture of the world in your head and that is being disproved by THE facts.

>>I was hacking BSD on vaxen back then, and rather perplexed at the interest in the fairly useless single-tasking word-processing machines.<<

Unix was a toy OS compared to OSs like the Burroughs MCP - the first OS written in a high-level language. Unix and C came along years later, and C is a terrible language - it could not even copy the Burroughs DEFINE mechanism correctly.

You see I have the facts (although happy to change when new information comes along) and I put my real name on everything I write, whereas people like Jake throw mud around and hide behind anonymity.

Ian Joyner Bronze badge

Would a car buyer complain about the space the engine takes up?

The heading says it. This case and story are equivalent to someone complaining to a car manufacturer that the engine is taking up room that could otherwise have been used for their shopping.

I hope the lawyers think of that one - case closed - thrown out of court.

Ian Joyner Bronze badge

Re: @ Ian Joyner (was: The storage is there, as advertised.)

Me: "But in CS theory, you must have an infinite amount of memory"

Jake: "No. Just no. Kids these days."

Jake just showed complete ignorance of computational theory. It is important to remove any restriction on memory size when considering computability. Of course, a computation that requires a lot of memory will take a long time, and one that needed infinite memory would take an infinite amount of time. Thus time becomes the only constraint (and the only thing relevant to what is mistakenly called 'software engineering'). Go away and do some research. OK, I'll help you start:

http://en.wikipedia.org/wiki/Turing_machine
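And for anyone who wants something more concrete than the Wikipedia article, here is a toy sketch of my own in Python (purely illustrative): the tape grows on demand in both directions, which is how the 'infinite memory' of the Turing model is usually simulated. These particular rules just increment a binary number.

# Toy Turing machine: the tape grows on demand, simulating the unbounded
# tape of the model, so memory size never enters the definition.
def run(tape, rules, state="start", head=0, blank="_"):
    while state != "halt":
        if head < 0:                 # grow the tape to the left
            tape.insert(0, blank)
            head = 0
        if head >= len(tape):        # grow the tape to the right
            tape.append(blank)
        write, move, state = rules[(state, tape[head])]
        tape[head] = write
        head += 1 if move == "R" else -1
    return tape

# A tiny rule table: scan left from the blank after the number, adding one.
rules = {
    ("start", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),
    ("carry", "0"): ("1", "R", "halt"),
    ("carry", "_"): ("1", "R", "halt"),
}

print(run(list("1011_"), rules, head=4))   # binary increment: 1011 -> 1100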

Ian Joyner Bronze badge

Re: You're GigaByting it wrong

AC: >>"The folks that cling to Apple, it's exorbitant prices and its censorship do so for a reason."

Yes, they don't want to admit they made a tragic mistake.<<

No, they got it right. You seem to think that people who buy Apple are stupid or something. Well, I know computing theory and what goes on in the industry really well along with a lot of others who are certainly not just 'dumb' users. We know that computing systems should be designed to not burden the user with the details of running a computer - that is just moving work from one place to another.

Many IT people who have a job running round after IBM and Windows junk don't like that and put up all of this nonsense against Apple and equivalents.

Your assessment of a tragic mistake is tragically wrong.

Ian Joyner Bronze badge

Re: The storage is there, as advertised.

>>Sometimes I look at the modern world and despair over the sheer waste ...<<

But in CS theory, you must have an infinite amount of memory (that's flat memory - no hierarchies of registers, cache, RAM, disk, etc.; all those are an implementation issue) to solve all computable problems. This is the Turing model.

Some computable problems will fail because you run out of memory, others because you run out of time (intractable), and still other problems are impossible. Computing tends to stick to easy, tractable problems.

Ref: Algorithmics, The Spirit of Computing by David Harel.
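A quick back-of-the-envelope illustration of the time constraint (my own numbers, assuming a machine doing about a billion simple operations per second): polynomial work stays feasible long after exponential work has stopped being so, well before memory is the limit.

# Rough arithmetic only: compare an O(n^3) algorithm with an O(2^n) one
# at an assumed 10^9 simple operations per second.
OPS_PER_SECOND = 1e9

for n in (20, 40, 60, 80):
    polynomial_seconds = n ** 3 / OPS_PER_SECOND
    exponential_seconds = 2 ** n / OPS_PER_SECOND
    print("n=%2d: n^3 takes %.1e s, 2^n takes %.1e s"
          % (n, polynomial_seconds, exponential_seconds))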

Ian Joyner Bronze badge

Re: @Vic (was: The storage is there, as advertised.)

The original IBM PC was nothing two kids in a garage could not have done - only two kids probably would have thought of an operating system, which IBM overlooked. Hence IBM ended up at Microsoft (Gates' mother was on the same charity committee as IBM's chairman), which sold IBM a system called QDOS (for Quick and Dirty Operating System) bought from Seattle Computer Products.

Burroughs already had the B20 on the market (built by Convergent Technologies), a much more capable, multiprocessing machine using the 8086.

Apple was well advanced in developing the Macintosh (released a few years later) to do away with the sort of garbage the IBM PC was (which was very similar to the Apple II in overall architecture).

Ian Joyner Bronze badge

Re: @Vic (was: The storage is there, as advertised.)

Jake: >>And don't get me started on Intel's lack of MMU ...<<

MMUs are not needed - if you design your virtual memory system right. The only system I know to do that was the Burroughs B5000 (now Unisys ClearPath MCP), which had the original commercial virtual memory system. No MMUs are needed on those machines, since virtual memory was tightly coupled to the design.

Ian Joyner Bronze badge

Anonymous Coward: >>>>It is time to stop the industry using these tactics.

Agreed. It's worse in the business side of things. RFPs are often written by vendors on behalf of customers, to ensure the competition will fail. It's corruption, basically.<<

Now there we agree, AC. I'm sick of seeing system specifications built around particular solutions. Specific technology is the last thing that should be specified. See the ODP Viewpoint analysis scheme.

http://ianjoyner.name/ODP.html

Ian Joyner Bronze badge

Re: You're GigaByting it wrong

SQL God: >>Apple has always been know for its Mother-Hen attitude in protecting its users, vetting everything that goes on an iDevice and charging a premium for the service. The folks that cling to Apple, it's exorbitant prices and its censorship do so for a reason.<<

Garbage. What Apple is known for is making computers directly useful to non-technical people, not to tech heads or to non-techs with an army of technical support to fix up the mess.

Apple protects its users from malicious people - that could be overt, as with someone who wants to put viruses on your system or compromise your data, or more subtle, as with those who want your details for marketing campaigns.

Apple does a good and successful job at that - their users are not afflicted with numerous viruses and worms, etc.

As for exorbitant prices? With Apple you don't buy a cut-price machine which you then have to add so much to, and then pay for expensive software like Microsoft Office on top. You get iWork (a very capable package which does 99% of what you want) for free. If you look beyond the purchase price of the box, Apple will most often work out the cheaper buy.

Then there is TCO. You spend less overhead time running and maintaining your computer - that is the sort of thing the OS should do. And you are far less likely to be affected by malicious software, as I have already explained.

So there is a very good reason for sticking to Apple - it IS far better. You can get on with your work and life and NOT have to be a tech nerd.

Ian Joyner Bronze badge

Anonymous Coward >>"Funny you mention that, because Windows Phone does (or at least did) merge its main memory with the SD card memory."

Nope - Windows Phone has never done that. External storage is selectable, not merged.<<

That is exposing the memory hierarchy - which is an implementation detail - to the user. You probably don't understand what Apple has done (or is doing), or will argue against it as an old techie. Apple stores your data safely wherever is appropriate. The user does not have to think about where or, more importantly, how. Users no longer think in terms of lower-level abstractions such as files to store entities.

On iOS, there is no 'save' - whatever you are working on is automatically saved to storage. Save is being de-emphasised on OS X too.

Programmers also should not think about the memory hierarchy - they should not think about moving data from main memory to registers or about where data gets allocated. That is a real backward move in Dalvik (Android's runtime), where Google has exposed registers in the memory hierarchy.

Apple is going in the right direction here.

Ian Joyner Bronze badge

The Anonymous Coward writes: >>No, what it SHOULD be is that you have 32GB (ALL 32GB) available to you and the OS resides in its own dedicated space or whatever. Yes, I consider the PC analogue to be deceptive also. At least with a hard drive, you don't expect 20% of the space to be stuffed with programs, and I hate shared-memory graphics chips.<<

That is much too confusing for users. Which version of the OS do you have? They all take different amounts of space. With 16GB the OS takes up proportionately more than with 32GB, and that more than with 64 or 128GB.
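As a rough worked example (the footprint figure is my assumption for illustration, not an Apple specification), the same OS install is a much bigger slice of a 16GB device than of a 128GB one:

# Back-of-the-envelope only: OS_FOOTPRINT_GB is an assumed figure for
# illustration, not an Apple specification.
OS_FOOTPRINT_GB = 4.0

for capacity_gb in (16, 32, 64, 128):
    share = OS_FOOTPRINT_GB / capacity_gb * 100
    print("%3d GB device: OS takes roughly %2.0f%% of storage"
          % (capacity_gb, share))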

This really is still a non-story.

Ian Joyner Bronze badge

Anonymous Coward wrote :"This is par for the course with how Apple designs products. It is far more important that they maximize their profit than to make the product useful and usable. So no SD cards as then people wouldn't pay the exorbitant Apple storage ta..."

No wonder you post as Anonymous Coward - what you write is complete and utter rubbish. I have written elsewhere in this thread that there is no shortage of people like you who want to hold on to the old industry where computers rule people. Apple changed that so that people rule and use computers. There are still many people who resent that.

The power of the IT professional is broken - get over it.

Ian Joyner Bronze badge

It is a non-story in the sense that this is not peculiar to Apple - it is done by every vendor that sells an OS with their machine. The whole focus of the story was a beat-up on Apple, as if it were something Apple does that no one else does.

This kind of story harks back to IBM, since throwing mud at other vendors was how they obfuscated and confused the industry. Their army of followers were quite happy to spread the mud and FUD, and these people, while thankfully dying off, still make their presence felt.

In fact, it goes back to before IBM, to NCR, where Patterson put around stories about rival cash registers. His partner in crime, T.J. Watson Senior, went on to head IBM and use the same tactics, except that from his experience at NCR he had learnt how to be more subtle about it so as not to get caught.

Reference: Richard Delamarter "Big Blue: IBM's Use and Abuse of Power"

It is time to stop the industry using these tactics.
