flame on
Let me say up front that this gentleman gave the industry, and indeed the world, more than I ever will, and RIP - but man, that CV after 2000 is painful to read. Wanted to get paid, I suppose.
Larry Tesler – self-described "primary inventor of modeless editing and cut, copy, paste" – has died at the age of 74. Tesler had a hand in many of the computing concepts taken for granted today. On his website he wrote: "I have been mistakenly identified as 'the father of the graphical user interface for the Macintosh'. I was …
Personally, I don't like GUIs, and I don't like modeless UIs - I actually had an extended argument about this topic with a fellow student way back in my undergrad years, and have not changed my position in the decades since.
Nonetheless I recognize that user interfaces, user interaction models, and user experience, while not part of computing theory in the strict sense, are very much a part of computer science. They are consequential. Tesler was responsible for some extremely influential innovations in those areas.
The worst anti-intellectuals are those who denigrate work outside their narrow domain of preference.
Tesler was also known for the so-called 'AI Effect', which he formulated as follows:
"“Intelligence is whatever machines haven't done yet”. Many people define humanity partly by our allegedly unique intelligence. Whatever a machine—or an animal—can do must (those people say) be something other than intelligence."
Very sad to hear that he has passed away.
Mind you, we use consciousness less than we think we do. If I consciously think about typing I make more mistakes, for example. But I needed it to learn to type in the first place (we did typing on manuals, not even golfballs, in the first year of secondary school in NZ). There's also the thing where you're driving/cycling/running/walking along a well-known route and you don't remember sections of it. You were in automatic mode.
But we are quite a way from genuine AI. We have a good inkling that thought relies on the fact that our neurons have receptors that are configurable on the fly, and other flexibilities such as reverse-flowing action potentials which do things to the dendrites. IOW, consciousness requires architectural flexibility.
Therefore trying to create it in inflexible silicon is not going to cut it. On the bright side, fears about the Singularity are overblown. But then, to us neuroscientists*, listening to physicists and that ilk talk about consciousness is often absolutely hilarious. Many of them seem to see it as something akin to magic. Quite what it is about the photon that enters my eye causing the universe to split vs the one which I missed is never explained. There's a strong whiff of 'god did it' about the whole thing.
*Muscle is an excitable tissue: it fires action potentials, so it falls under the aegis of neuroscience. I have cut and counted spinal ventral root axon numbers as well.
Quite what it is about the photon that enters my eye causing the universe to split vs the one which I missed is never explained.
You're a big collection of atoms, as soon as you interact with anything you become entangled with it and you're too big to be in states that remain coherent enough to show interference. The photon that misses you does the same thing to whatever it hits, you catch up when you interact with that.
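To put that in textbook notation (a sketch only, nothing specific to this thread): with two possible photon paths the detection probability carries an interference term, and entanglement with anything macroscopic multiplies that term by an overlap that is effectively zero,

    P = \tfrac{1}{2}|\psi_A|^2 + \tfrac{1}{2}|\psi_B|^2 + \mathrm{Re}(\psi_A^* \psi_B)

    \tfrac{1}{\sqrt{2}}\left(|A\rangle + |B\rangle\right) \;\longrightarrow\; \tfrac{1}{\sqrt{2}}\left(|A\rangle|E_A\rangle + |B\rangle|E_B\rangle\right), \qquad \text{cross term} \propto \langle E_A|E_B\rangle \approx 0

So nothing magical happens to either photon; the interference term is still there in the bookkeeping, it just becomes unmeasurable once something the size of a person is entangled with the outcome.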
We tried to deliver your second photon, but unfortunately our driver had 100% certainty regarding your address, so we could not make the delivery. We left it with a random inhabitant of another universe. Please do not call customer services as we will not be able to assist you.
Probably not. It will take a few components, of which we as yet have no real grasp, put together in the right way. Looking at nature we see things with a few hundred neurons doing tasks we find difficult to do with massive computers, and yet if we take the time to emulate the structure of (say) a nematode brain it is far more capable than we had imagined. Arrogance is getting in the way of research here.
A few hundred neurons yes, but with extremely complicated connections, and a billion years of natural selection to get it right.
It is interesting to compare an analog computer with a digital computer. Many years ago I designed a simple piece of test equipment which used three analog ICs, one of which was a multiplier. Because people wanted a digital readout (not just numbers) we redesigned it using a microcontroller (programmed, of course, in machine code in those days), an A/D converter and a two-character display. The analog version would have had well under 100 transistors in total across its three ICs; the microcontroller had to do actual arithmetic and required thousands.
Yes, you could do more things with the microcontroller, but nematodes and sea slugs don't spend much time thinking about the meaning of life.
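For what it's worth, here's a rough sketch in C of the sort of work the digital version ends up doing; the ADC hook, channel meanings, sample values and scaling are all invented for illustration, not taken from the original design:

    #include <stdint.h>
    #include <stdio.h>

    /* Stubbed hardware hooks - names and sample values are hypothetical. */
    static uint16_t adc_read(uint8_t channel)      /* pretend 10-bit A/D, 0..1023 */
    {
        return channel == 0 ? 512 : 700;           /* made-up readings */
    }

    static void display_two_digits(uint8_t value)  /* stand-in for the 2-char display */
    {
        printf("%02u\n", (unsigned)value);
    }

    int main(void)
    {
        /* The analog box did this with one multiplier IC; the digital version
           needs an explicit integer multiply plus scaling. */
        uint16_t a = adc_read(0);
        uint16_t b = adc_read(1);
        uint32_t product = (uint32_t)a * b;        /* up to 1023 * 1023 */

        /* Map the product onto the two-character display's 0..99 range. */
        uint8_t reading = (uint8_t)((product * 99UL) / (1023UL * 1023UL));

        display_two_digits(reading);
        return 0;
    }

Even that trivial multiply-and-scale drags in an ALU, registers and program storage, which is roughly where the thousands of transistors go.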
Well, that was Roger Penrose's argument, that consciousness may depend upon quantum phenomena.
Back to the question of Artificial Intelligence, I'm with Iain Banks' reasoning of why General AI should be possible: because to accept otherwise means believing either that nothing is intelligent, or that intelligence requires some magic that we humans can never reproduce through technology.
Penrose can spout a lot of bollocks when he puts his mind to it. As far as I am aware the only thing that relies on quantum processing in the biological world, other than its 'normal' chemical or physical effects, is improving the efficiency of photosynthesis in grasses etc.
Very cool, but not really related to consciousness. ;)
I think part of the problem is no one can agree on what consciousness *is*, exactly, or how to test for it. Until we can answer whether a chimpanzee has consciousness, or a raven, trying to create it in an AI is pointless -- we won't know it when we see it. As was noted above, we've essentially defined it as something unique to humans, which makes it a bit tautological.
Damn - I thought I invented that! It does seem to be an emergent property of meat-based computers when faced with smart silicon. Won't be long before AI insecurity is built in.
I'd not come across him before - possibly due to being pissed off with his homonym being regarded as a god by those without electrical knowledge. Seems I missed out on a lot!
Fascinating. I remember enquiring on a games forum last year what a particular poster knew about "the AI Effect", seeing as he came across as being so incredibly knowledgeable about everything.
He replied: "Nah. I don't use it anymore. The M6 may not be the best way for heading up north, but it beats the A1 any day."
Although I have never actually seen or touched one, I always hankered after a Newton.
Yeah, it's probably just as well I never got my hands on one, but 25 years later a screen that I can scribble on, which then magically becomes orderly text, is still appealing...
... a screen that I can scribble on, which then magically becomes orderly text ...
Your salvation is at hand! Hie thee to any recent Samsung tablet wiv a stylus and fire up Samsung Notes.
For me it's been a godsend, as handwritten notes on an A5 reporter's pad are a bugger to search, while my tablet takes my filthy scrawl (took it a bit of time to get used to my handwriting, but these days it's better at recognising my scrawl than I am) and turns it into proper, searchable text, so that finding something from a meeting 3 months ago becomes a breeze. Unlike my Surface Book, which still recognises 1 word in 10 and insists on putting everything on multiple Post-it notes.
Of course, it means being seen in meetings with a Samsung* tablet, but you can't have everything.
* other stylus-driven tablets are (presumably) available.
I'm not sure they would be that much different. At the time silicon was simply too puny to do much and data connections were too feeble. The smartphones included a lot of different technologies that had been developed separately over the previous decade. Apple understood how to package these into a new and exciting, nay, magical product, but it couldn't have been done if the phone industry hadn't bankrolled much of the development.
Agreed. Palm and others showed that there was a market for handheld computers, and over the course of that era they pretty well showed what could be done (in an economically feasible manner) with the technology available. We got smartphones when smartphones became technologically and economically viable.
The closest we came in that era was the Handspring Treo series of PalmOS phones. They were nicely integrated, not just a phone taped to a PalmOS device -- for example, SMS and phone calls were handled by PalmOS apps, and PalmOS apps could use cellular data. But the market for such things at the time was not that big, and their hardware was a bit flimsy.
Apple had to wait for multitouch technology to become available. Smartphones and tablet PCs were already there and worked, but without multitouch they were much less versatile to use. Displays didn't have much resolution either, and cellular data was neither fast nor, above all, cheap.
Palm devices were all you could get with the Newton-era technology. Not enough for Jobs, evidently.
Could have mentioned that instead of the Newton. The NoteTaker was a 7" touchscreen 5 MHz 8086 computer, complete with Ethernet, developed in 1978. I would advisedly skip the word 'portable' as it weighed a hefty 22 kg; nevertheless, Tesler did once demonstrate using it on an aircraft.
I have memories of times in front of character-based screens controlled purely by the keyboard where certain repetitive workflows achieved speeds that cannot be matched by the dynamic nature of pointers. The keys are always in the same place, but using a mouse to find a menu item, starting from a variable place in a document, requires a far higher level of interaction to achieve the same result. Two or three unconscious keystrokes can deliver something a lot faster than waving a mouse around and having to hit the right menu items in quick succession.
The great thing about key-bindings is that they're so flexible. The worst thing about key-bindings is that they're so flexible.
They're great for frequently performed tasks and bloody useless for infrequently done ones. And they're system-specific, in that the ones that work for one program / environment won't work for others.
GUIs have basically won because, as Luke Wroblewski puts it, "obvious always wins".
A menu doesn't mean you can't have "shortcuts".
But the problem was not keyboard-controlled apps; it was applications that need to enter and exit command and edit modes, like vi, where keys do different things depending on which mode you are in.
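To make the mode problem concrete, here's a toy sketch in C of a modal input loop (nothing to do with real vi internals, just an illustration of why the same keystroke can mean two different things):

    #include <stdio.h>

    /* Toy modal editor loop: what a key does depends on the current mode -
       exactly the pattern Tesler's modeless designs set out to avoid. */
    enum mode { COMMAND, INSERT };

    int main(void)
    {
        enum mode m = COMMAND;
        int c;

        while ((c = getchar()) != EOF) {
            if (m == COMMAND) {
                if (c == 'i')       m = INSERT;            /* 'i' enters insert mode */
                else if (c == 'x')  printf("[delete char]\n");
                else if (c == 'q')  break;                 /* quit */
            } else {                                       /* INSERT: keys are just text... */
                if (c == 27)        m = COMMAND;           /* ...except Esc, which exits */
                else                putchar(c);
            }
        }
        return 0;
    }

Type 'x' in command mode and a character is deleted; type the same 'x' in insert mode and it lands in the text, which is precisely the surprise a modeless design avoids.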
Ah, he was the guy that fought for modeless interfaces. Apple was good at promoting that in the past, but they've lost it a bit: non-modal also means that you don't have to click OK for everything, or force the user to do something before they can continue. Like in Adobe Photoshop (*froths at the mouth*)
Somebody said that he just c&p'd Pentti Kanerva's work (fact check: "wiki", "clipboard").
But those Finns are doomed to that sort of thing. Now Trump is supposed to copy & paste the idea of assimilating (the rest of) Nokia as the salvation of the MAGA idea (this time 5G instead of Windows Mobile). And Linus has already saluted c&p to WSL. So it has to be Larry.