The Canon Cat – remembering the computer that tried to banish mice

Vintage systems guru Cameron Kaiser documents stripping down and fixing a Canon Cat – the most revolutionary computer you've never heard of. The Old VCR blog – it's short for Old Vintage Computing Research – is …

  1. An_Old_Dog Silver badge

    A Pebble is Not a Raindrop

    There has been a succession of computer UI creators who have attempted to "simplify" computers by papering over the details. Doing so limits the usefulness of those computers.

    There's the "app mentality", wherein you do things by interacting with apps, and the data is considered a non-concern -- it's just an artifact of using an app, and lives in some unspecified data store. Data export/import/revision control is left up to the app, typically poorly-implemented, if implemented at all.

    There's the "spreadsheet mentality" where people use spreadsheets for everything. Want a database? Put it in a spreadsheet. Want a text document? Put it in a spreadsheet ...

    As much as I dislike using mice and other pointing devices (does anyone remember "light pens"?), they have their place. I would not want to try creating an AutoCAD drawing with keyboard commands and the arrow keys. Nor would I want to attempt to edit still images or videos with a keyboard.

    1. Liam Proven (Written by Reg staff) Silver badge

      Re: A Pebble is Not a Raindrop

      [Author here]

      > I would not want to try creating an AutoCAD drawing with keyboard commands and the arrow keys. Nor would I want to attempt to edit still images or videos with a keyboard.

      Absolutely fair.

      The thing is, how many people need to do that?

      I have never built a CAD drawing in my life. Indeed, all I do with images is rotate and crop them. My preferred graphics app is IrfanView, which replaced PaintShopPro when PSP became a bloated mess because it tried to be all things to all men.

      I loved Acorn RISC OS. I loved Classic MacOS. I loved Psion EPOC and EPOC32. I loved BeOS. I was extremely fond of NewtonOS.

      I am a Unix professional and have been for nudging 40 years now, and I don't like Unix and never have. I am not a developer. I don't want to be. I do not want or need an OS that is designed to be the weapon of choice for a C developer. But that is what I use most of the time because it's the best option available to me today.

      There is plenty of room for lots of OSes with different use cases for different people.

      Of course there are tasks that are absolutely essential for some people and I want them to have a tool which is good for them. But I don't want their tool.

      I don't make music. I don't draw pictures. I don't record or edit video. I don't write code. I don't want tools optimised for any of those things. I am uninterested and indeed disinterested.

      I write words. I would like an OS that is heavily optimised for that, with extremely rich keyboard controls, based on the CUA standard and extended in clean linear ways. I am not interested in learning new UIs and new controls at my age. I want loads of formats, I want outlining built in, word counts and character counts and paragraph counts. I want online quote dictionaries and grammar and style checkers, and if it EVER tries to parse my text as program code it goes in the shredder.

      I want tools optimised for the job. When I am cooking I want a Sabatier knife, not a Swiss Army Knife and not a chainsaw.

      One size does not fit all and it is foolish to think it can.

      1. Mike 137 Silver badge

        "One size does not fit all and it is foolish to think it can"

        "PSP became a bloated mess because it tried to be all things to all men"

        "I am not a developer. I don't want to be. I do not want or need an OS that is designed to be the weapon of choice for a C developer"

        Surely this is the key point. The very best (i.e. most convenient, useful and effective) technologies tend to be dedicated to single tasks (or at least to single categories of task) if for no other reason than the need to thoroughly understand the problems to be solved in order to design an appropriate tool for their solution. An 'all things to all men' tool will inevitably both fulfil some purposes less well than others and also omit some use cases that have been overlooked by its (inevitably generalist) creators.

        This lesson should be broadcast loudly and often in the face of attempts to create general-purpose AI, where the distinction between its incipient offerings and application-specific AI is already clearly apparent. It's a fantasy that 'technological progress' will automatically eliminate that distinction -- it's inherent in the nature of real-world problems and the capacities (whether human or machine) needed to solve them.

      2. Jeff3171351982

        Re: A Pebble is Not a Raindrop

        This reply could be lyrics to a song.

      3. doublelayer Silver badge

        Re: A Pebble is Not a Raindrop

        This is where I think it is good to make a distinction between a computer system and an application. By computer system, I mean not only the hardware, but the kernel, the UI(s), the tools available for developers, and all the things that, as a non-developer, you don't want to have to deal with. Computers should not be designed for a single purpose, because for everyone who has slightly different priorities to you, it won't be worthwhile. So they won't buy it. So the company making it will have to increase the price so that only those with your priorities can support their development efforts. So you won't buy another one and people like you won't either. So your version won't get updates or support. So the entire thing will be seen as a failure and dumped into the dustbin of computing history.

        The computer system should be designed in such a way that as many applications as possible are possible, and then the applications can be written to fit your requirements. What you appear to want is a really full-featured word processor. If you had that, you could stay in it for almost all your time. The problem with making something that is only that word processor is that there might come a time when you need to do something it can't, and then you'll want some other application and you probably don't want to buy new hardware to get it.

        That's why you need general purpose tools like a mouse. You may not need it very often in your word processor, but other applications will, and the computer will only be useful to anyone, including you, if those other use cases are possible with someone else's applications. If you don't want to use the mouse, you can always unplug it. Removing those things won't help you even if you don't use them.

        That's also why ditching the filesystem doesn't work, because in order to move data around in an organized way, you have to be able to find the specific chunk of data you're looking for. The highest-profile attempt to hide that recently was Apple's iOS, and it kind of worked for a while because you can't create that many things on an iPhone and, in the early days, Dropbox was a de facto filesystem for a lot of iOS apps. Of course, it didn't work forever and there's now a partially available file system and a client on every iOS device.

      4. An_Old_Dog Silver badge
        WTF?

        Re: A Pebble is Not a Raindrop

        Is it that you want an operating system optimised for your specific task, or that you want an environment optimised for your specific task?

        A program can easily enough be that environment. People who "live in an app" - e.g. an editing or typesetting system, an auto-parts catalog/ordering system, SAP, or whatever - can (if the system is properly set up) live in that app/environment while being uncaring and unknowing about the underlying operating system and how it works. They show up at work, turn on their terminal, desktop PC, or thin-client micro-box, sign in with their user id and passphrase, and are automatically taken into their app or menu shell.

        The problems with Cat-like devices come when the user wants/needs to back up their data, restore their data (or just *some* of their data), export their data to the outside world, or discovers (as many people do) that they now wish their single-purpose device to perform an additional function which it doesn't currently do.

        I believe that, for example, dedicated word processors were obsoleted because they could not import data (customer address lists, customer account amounts owing past 30 days, etc.), and they could not export data, or export data in a form which the rest of the world could understand ("Just email me a copy of the doc so I can bring it up on my PC. No, don't snail-mail me hard-copy, or a 'CPT Data Cassette', whatever that is...").

        The old Atari 400/800 series home computers seemed to bring the best of both worlds. Pop a game cartridge into the right slot, press [RESET], and you had a "dedicated" game-playing machine. Pop the AtariWriter cartridge into the right slot, press [RESET], and you had a dedicated word processor. Pop a programming cartridge, e.g. 8K BASIC, into the left slot, press [RESET], and you had a "regular", user-programmable computer.

        Pity the slot and cartridge contacts wore out so quickly.

        1. heyrick Silver badge

          Re: A Pebble is Not a Raindrop

          Another example could be the Amstrad E-Mailer. While the primary failure of this device was the hidden costs (you paid per minute online, and per email sent - both via premium-rate numbers), the secondary failure is that it was primarily aimed at firing off short emails... and not a lot else (other than intrusive advertising). They eventually made a souped-up version that could (drum roll) play various classic Spectrum games... that were rented.

          Essentially a basic single purpose device that was rather expensive for what it actually did, it was quickly destined for the dustbin of tech failures.

          This was then followed by the Bush Internet box, a better idea that connected between your phone line and TV to provide "the internet" (and without the Amstrad premium-rate nonsense). Unfortunately, being based on a version of Fresco running on a version of RISC OS, both stripped down to the absolute minimum, it failed to support various commonplace protocols (I think it originally only supported 40-bit SSL), supported only two or three specific printers, and more importantly had no Flash (which was starting to be used heavily at the time) and only very basic JavaScript (ditto), which meant that an increasing number of sites were either essentially blank pages or short messages telling you to turn on features that you didn't have access to.

          Alas, even though the tech specs weren't bad, it had absolutely no offline functionality. You couldn't compose emails peacefully at your own speed, or even run any applications. So it too was eventually flogged off dirt cheap and thrown into the dustbin of tech failures. Though it's worth noting that a Zip drive and some fiddling can get a proper RISC OS desktop (and apps) running on it. It was just dumbed down too much, targeted too hard at its one single purpose.

      5. BobChip
        Pint

        Re: How many people need to do that?

        How many people need to do that?

        Well I do! - and I can confirm that trying to use CAD (Libre CAD), graphics (Inkscape), or mapping software (QGIS) without a mouse, while just about do-able is, for any practical purpose, essentially impossible. On the credit side, the effort WILL drive you to drink. My wireless mouse broke this afternoon, and I can't find a spare!

        My next strategy is to have another generous glass of Ozzie 14.5% Shiraz - sh***! is that all that's left in the bottle already!

        I'm on Amazon for TWO new mice first thing tomorrow, and then off to the local wine shop...

        1. Alistair
          Windows

          Re: How many people need to do that?

          While you're at the mouse wrangling, grab a trackball and give your wrist a break.

          *grin*

          1. An_Old_Dog Silver badge
            Go

            Re: How many people need to do that? [Pointing Devices]

            Oh, yeah, trackballs for the win when using CAD-like programs, or game-map-building programs (Worldcraft, Jackhammer, etc.)!

        2. An_Old_Dog Silver badge

          Re: How many people need to do that?

          As a computer tech, I had to modify AutoCAD drawings all the time, to specify where in a room a customer wanted network jacks installed. Facilities had all the buildings in our central campus + our metro-wide dotting of auxiliary locations in AutoCAD, and gave the PC techs their own layer to play in, while all the other layers were read-only for us.

          I use GIMP heavily for hobby still image stuff. I use Inkscape lightly. I've used, and supported, ArcGIS for work. I use several video editors for hobby stuff. And by preference, I use text editors in keyboard-only mode.

        3. Ian Johnston Silver badge

          Re: How many people need to do that?

          I haven't owned a mouse for twenty years, ever since I discovered that IBM made keyboards with trackpoints. I'm writing this on a Lenovo X220 which has both a touchpad and a trackpoint. The touchpad has been disabled since the day I bought it ...

          1. Missing Semicolon Silver badge
            Joke

            Re: How many people need to do that?

            ... weirdo. :-)

        4. John Brown (no body) Silver badge

          Re: How many people need to do that?

          "Well I do! - and I can confirm that trying to use CAD (Libre CAD), graphics (Inkscape), or mapping software (QGIS) without a mouse, while just about do-able is, for any practical purpose, essentially impossible. On the credit side, the effort WILL drive you to drink. My wireless mouse broke this afternoon, and I can't find a spare!"

          Not so very long ago (might even still be the case for all I know), if you used CAD in a drawing office, you very likely had an A4 or A3 size drawing tablet with pen and maybe a puck. Most or even all of your menu options and choices would be on an overlay on the tablet so you rarely ever had to leave that input device and use the keyboard other than to enter text on the drawing or when naming a file to save.

      6. An_Old_Dog Silver badge

        Liam, Why are You Suffering?

        It sounds as though you know what you want, and do not want, in your word processor, and, that current programs don't fit your needs.

        Why are you suffering with this situation? Why not recruit a like-minded group of wordsmiths, establish a list of features and non-features, and a time- and money-budget, write up an RFP, and see if some company or programmer(s) will configure and/or modify an open-source editor to do what you want?

        Emacs seems the logical choice of editor to base this project on.

        1. John Brown (no body) Silver badge
          Trollface

          Re: Liam, Why are You Suffering?

          "Emacs seems the logical choice of editor to base this project on."

          Shirley you mean VIM!!!

          1. An_Old_Dog Silver badge

            Re: Liam, Why are You Suffering?

            VIM is a fine editor, and I use it about as frequently as I use Emacs, though for different sorts of things.

            I suggested Emacs as a base for Liam's potential project because I know one can change Emacs keybindings via built-in menus. I'm guessing there are Emacs modes already written which will do much of what Liam desires.

            As to which is "easier", modifying the guts of VIM (written in C), or modifying the guts of Emacs (written in Elisp), I can't say.

      7. Yet Another Anonymous coward Silver badge

        Re: A Pebble is Not a Raindrop

        >I would not want to try creating an AutoCAD drawing with keyboard commands and the arrow keys

        What I miss about classic AutoCAD is having the command line.

        Most drawing construction consists of entering lines as angles and offsets; it's only the editing that needs a pointy thing. And even then a pad + pen is easier than a mouse.

      8. SammyB

        Re: A Pebble is Not a Raindrop

        So what you are really looking for is WordPerfect.

      9. deadlockvictim

        Re: A Pebble is Not a Raindrop

        Back when I wrote as a hobby in the mid 1990s, I liked the PowerBooks and Nisus Writer. I had a PowerBook 520c and then a Wallstreet with Nisus Writer versions 4 to 6 and they made a great combination. I could type in Japanese (as I was wont to do), formatting was great, spell-checking was great, and there were keyboard commands for everything.

        This is not a recommendation or a suggestion; I'm just waxing nostalgic.

    2. ovation1357

      Re: A Pebble is Not a Raindrop

      I'd argue quite strongly that the current offenders in the UI sphere (although they like to call it UX these days) are all in a race to the bottom right now to see who can paper over the most details.

      I'm a huge fan of keyboard-driven applications and there are cases where you still find people using an old mainframe terminal application such as in a car spares department where the operators are lightning fast on the keyboard, no mouse, no graphics; they can find a full list of parts for your vehicle along with stock level and pricing (even location in the warehouse) within a split second of you telling them your registration. The 'modern' equivalent driven by mouse and keyboard is likely to be much slower.

      In my world I'm fighting the UI zealots behind the Gnome/Wayland projects. Not because I object to them doing wild and radical things in their UI, but because those wild and radical changes are being imposed on users of other UIs; choice is being removed and long-standing features are being dropped on the basis that they're old.

      We've had a period of relative stability for a couple of decades, where most OSes and applications support doing most things using either the keyboard or the mouse. Now they're all chasing after touch screens and forgetting all the people who prefer keyboards and mice :-(

      1. Gene Cash Silver badge

        Re: A Pebble is Not a Raindrop

        Yes. I found FVWM lets you customize EVERYTHING, so everything now has a keystroke. I can blaze through whatever I need, then I go back to Windows and start screaming "Why isn't there a keystroke to minimize all the windows?!? Why isn't there a keystroke to make a window full-height?!?"

        Instead of having a machine that can do only one thing, like apparently the author wants, I have a machine that does everything well.

        When I'm creating something for 3D printing, and I have 3 OpenSCAD windows, 8 EMACS windows, 2 PrusaSlicer windows, and half a dozen xterms and browser windows, it lets me manage it all and stay on top of what's what, without ever touching the mouse.

        1. alanjmcf

          Re: A Pebble is Not a Raindrop

          Windows Key + M - Minimize all windows.

          Windows Key + Shift + Up arrow key - Stretch desktop window to the top and bottom of the screen.

          https://support.microsoft.com/en-gb/windows/keyboard-shortcuts-in-windows-dcc61a57-8ff0-cffe-9796-cb9706c75eec

          It never gets old: folk “screaming” because they simply expect all operating systems to work exactly the same way as the one they normally use.

          1. Evil Scot Bronze badge

            Re: A Pebble is Not a Raindrop

            I prefer Win + D to Toggle Desktop view.

            Also Planet + D on my Android Phone returns to the App Launcher.

        2. David 132 Silver badge

          Re: A Pebble is Not a Raindrop

          Not disagreeing with your fundamental point, but I thought I’d mention AutoIt (and its cousin AutoHotKey) as worth a look - they both make it pretty easy to create keyboard macros for just about anything.

          “Minimize the current window, whatever it is, and pop Excel to the fore, zoomed to 150%, with the mouse positioned over the first cell”? Easy.

          1. Missing Semicolon Silver badge

            Re: A Pebble is Not a Raindrop

            AutoHotkey was an absolute boon: I had windows zipping all over the screen wherever I wanted them, with (and this is important) no modality. If a window was 1/4 of the screen, that's the size it was. And cascading windows!

            Now, I'm pretty certain that with a bit of Python I could create something similar (rough sketch below) - but it is going to be an uphill struggle.

            And why does no Linux desktop allow "cascade all matching windows" any more?
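
            For what it's worth, here is a minimal sketch of that "bit of Python" idea - cascading every window whose title matches a pattern by shelling out to the standard wmctrl tool. It assumes wmctrl is installed and an EWMH-compliant window manager; the pattern, geometry and 40-pixel cascade step are made-up illustrative choices, not anything a desktop actually ships:

                # Rough sketch: "cascade all matching windows" via the wmctrl CLI.
                # Assumes wmctrl is installed and the window manager speaks EWMH.
                import subprocess
                import sys

                def list_windows():
                    """Return (window_id, title) pairs parsed from `wmctrl -l`."""
                    out = subprocess.run(["wmctrl", "-l"], capture_output=True,
                                         text=True, check=True)
                    windows = []
                    for line in out.stdout.splitlines():
                        # wmctrl -l columns: <id> <desktop> <host> <title...>
                        parts = line.split(None, 3)
                        if len(parts) == 4:
                            windows.append((parts[0], parts[3]))
                    return windows

                def cascade(pattern, x=100, y=100, step=40, width=800, height=600):
                    """Move/resize every window whose title contains `pattern`."""
                    for win_id, title in list_windows():
                        if pattern.lower() in title.lower():
                            # -i: treat the argument as a window id
                            # -e: gravity,x,y,width,height
                            subprocess.run(["wmctrl", "-i", "-r", win_id,
                                            "-e", f"0,{x},{y},{width},{height}"],
                                           check=True)
                            x += step
                            y += step

                if __name__ == "__main__":
                    cascade(sys.argv[1] if len(sys.argv) > 1 else "xterm")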

    3. Roland6 Silver badge

      Re: A Pebble is Not a Raindrop

      I think you need to refresh your memory of the personal computing scene in the 1980s.

      Tools like AutoCAD were still new and needed expensive (largely dedicated) workstations, which were different to the terminals (and PCs) spreading across the offices being used as “more functional” typewriters. Back in the mid 1980s I basically ran Word, Excel and Harvard Graphics/PowerPoint on my PC, i.e. the basic office toolset. For programming etc. there were the Suns.

      1. doublelayer Silver badge

        Re: A Pebble is Not a Raindrop

        If this was just an argument about whether the Cat should have had a mouse, that would be relevant. However, with Liam suggesting similar restrictions today, it no longer is. The separate CAD machines in the 1980s had a point: the software and hardware on general-purpose computers at the time were insufficient. That no longer applies today, and what was a minor problem in the 1980s would now be a major problem.

        That also assumes that CAD is the only reason why a mouse would be recommended. CAD is just one use case where mice are considered useful. I speak as someone who tends to avoid using one when possible, but Xerox and Apple didn't put mice on their GUI computers for CAD which neither of them could even run at launch. They were needed for lots of classes of applications, and those computers succeeded because they could run lots of classes of applications.

        Single-purpose devices will always remain niche, and where they take similar amounts of hardware and software to build, niche is very expensive when you have to pay people to build them. When you add in the extra proviso that the niche thing can't do anything a general-purpose computer can't, then you reach the fatal zone. For instance, people buy book reader devices with e-paper screens. Those can't do a lot, and therefore you spend a lot more for a relatively low-resolution screen, a cheap low-end SoC, and one application than you would for anything else. People still buy them because they like the screen. Now try to make and sell a book reader device with an LCD on it and I guarantee you a failure. People won't have to look far to realize that you're selling a tablet that's locked down to one app, and they could have any other tablet and run this app on it rather than buying your device. The standalone word processor is similarly doomed.

  2. 45RPM Silver badge

    With all the experience (prejudice) that I have built up over nearly half a century of computer use, the Cat seems to me a howling-at-the-moon mad concept - and I’m very glad it exists. We need more experimentation - or we won’t advance.

    The Z88 was also very interesting and somewhat appliance-like. Everything on that was a spreadsheet. The Newton was also curious - if I remember correctly, there were no files or folders at all - everything was in one searchable ‘soup’.

    Nowadays, and much as I love Unix, everything works the way Unix does - under the bonnet at least. I wonder if that’s really progress?

    1. An_Old_Dog Silver badge

      Under the Bonnet / 9Front

      If you want something "different under the bonnet," try the open-source descendant of "Plan 9" -- it's called "9Front". Warning: don't expect things to be like Unix or Linux.

      Plan 9 was the Unix inventors' next research operating system.

  3. AVR Bronze badge

    Older mice - well, circa 1990, the first time I saw one - were pretty crappy, with a tendency to get stuck on one vertical or horizontal line and to collect dirt and hand grease as if that was their purpose. I can see wanting to make a computer which bypasses the entire concept. As we know, that didn't work though. Keyboards also suck in their own way when dealing with graphical displays.

    1. Yet Another Anonymous coward Silver badge

      In 1990 they were great, as long as you didn't rotate the special Sun optical mouse mat 90 degrees.

      1. timrowledge

        Not that anyone would ever do that to some poor unsuspecting person. No way.

        1. David 132 Silver badge
          Pint

          You forgot the "muhahahahaha" on the end of your comment!

  4. simmonsm

    Secret Weapons Of Commodore ?

    I clicked on your link to the 'Secret Weapons Of Commodore' site and my first thought was "Really? Unreadable. They need a web site designer!"

    1. Liam Proven (Written by Reg staff) Silver badge

      Re: Secret Weapons Of Commodore ?

      Personally, give me text over fancy design any day.

      I recommend the Reader View in Firefox for de-crappifying over-designed sites:

      https://support.mozilla.org/en-US/kb/firefox-reader-view-clutter-free-web-pages

      https://add0n.com/reader-view.html

      1. Martin Gregorie

        Re: Secret Weapons Of Commodore ?

        I'm with you about plain text over fancy graphics: the documentation that comes with a Fritz!Box router is, IMHO anyway, a good example of how NOT to do it, but fortunately, there's a huge and relatively comprehensive plain text manual that you can download from their website. I haven't found anything in the FritzBox built-in semigraphical manual about sharing a printer among attached PCs, but it is described in the comprehensive downloadable text version, which even has both a decent layout and an index!

        If your job involves designing user interfaces that will both help the operators to use the system you're designing and be a good match with the tasks they are carrying out, IMHO you should at least take a look at James Martin's "Design of Man-Computer Dialogues". This book describes almost any type of interface device you can think of, the types of interaction they suit best and any gotchas they introduce. It was published in 1973 and is still relevant even though some of the interface types it covers (punched cards, teletypes) are no longer in common use.

        In short, IME if you're a systems designer this book deserves a place in your library just as much as Wirth's "Algorithms + Data Structures = Programs" and Sedgewick's "Algorithms" belong in a developer's library, and Kernighan & Pike's "The Practice of Programming" is a must-have for helping a new developer to acquire good coding habits and write easily maintainable code.

        1. Mike 137 Silver badge

          Re: Secret Weapons Of Commodore ?

          There's also been masses of solid research by our universities into the ergonomics of graphical screen interfaces over the last three or so decades. For example, there is (or was) a unit at the University of Bedfordshire (UK) where they did experiments in element clarity, eye tracking and mouse movement to optimise the user experience. Sadly none of it seems to have been read by the current echelon of "UX" designers, who seem to think that making things harder for users (e.g. pale grey text on a dark grey background, windows without obvious borders, and narrow scroll bars that are invisible until you mouse over where they should be) is clever.

      2. This post has been deleted by its author

    2. tiggity Silver badge

      Re: Secret Weapons Of Commodore ?

      I could read it and navigate with JS disabled for the site (my default on "unknown" sites - I decide whether to enable some or all JS for a site after inspecting it, and update my allow/block settings accordingly) - so that is a big win from me (it is ******* ludicrous how many sites just show a blank page with JS disabled). And it was quick to load with a small footprint ("proper" small images instead of a multi-MB image with the browser then displaying it at 100 x 100 (or whatever), as is far too common these days) compared to many sites.

      Did not mind the slightly retro look & feel as generally if I look at a web page I am after information, not pointless bells & whistles, so not

  5. Anonymous Coward
    Anonymous Coward

    FORTH ....

    If you think debugging Python or C++ is hard, try FORTH ...

    I might still have the assembler source code of the version I wrote knocking about somewhere lol

    Am amazed to see that they are still going ...

  6. anthonyhegedus Silver badge

    UI for its time

    It does appear that LEAP was a UI for its time. The mouse was still a new concept, and people didn't all take to it. Keeping your hands on the keyboard because you spend all your time typing was still a thing. The keyboard was familiar to computer 'operators' and a specially designed one with the two LEAP keys seemed to make sense at the time. In truth, it was just a more complex way of doing simple operations - albeit faster once you knew how it worked.

    The thing though is that it couldn't do _everything_ that a mouse could do, and it was just a clever sort of shift key. Keeping everything in one 'app' had advantages but ultimately it was a jack of many trades and a master of none. Other systems tried to do the same thing (Lotus Symphony anyone?) and ultimately failed. There might be quicker ways of doing things, but we need ways that introduce the least cognitive load, and that's what all these things don't do very well.

    The current paradigm of apps and files and a desktop controlling it is more like how we innately work (we use graph paper to draw a graph, lined paper to do an essay, blank paper to draw a picture). Current OSes try to relieve any pain points and there are ways of automating repetitive work. But not everything is perfect. A current pain point with apps is when things are greyed out and you don't understand WHY. Another is having to dig through menus to look for an option (anyone tried to do anything in Word or Excel that isn't a common function, and tried unsuccessfully to navigate those ribbon menus?).

    So we're not there yet. AI control looks like the way forward: rather than work out how to do something, just tell the computer what you're trying to do and it'll work it out for you. That's not without its issues or dangers, but one thing modern UI designers have got right is keeping mouse control, keyboard control and voice control available all at the same time. Everyone is different and works a different way.

    LEAP is just a different way of doing things, and maybe modern UI (UX) designers can learn a thing or two.

  7. ArguablyShrugs

    System‑wide LEAP keys could be pretty nice, even with our current mouse UIs

    Writing some long text, jump directly to a previous instance of "cat" by holding LEAP BCK and typing out c a t, instead of cmd+f, cat, enter, cmd-shift-g.

    Reading a long thread on a forum, jump to the very first mention of "cat" by LEAP FWD and c a t

    Et cetera.

    The key thing for me would be that it's system‑wide, obviously. I know I can do some of the above in many different apps already, but each has its own way of searching, different find dialog etc. Some file managers have used a similar concept for years to quickly jump to a typed file or folder, but each does it a bit differently and not consistently.

    Keep the Find dialog in all its glory of regex and text replacements, but a very quick, consistent search could indeed be nice. Could even use holding Space for a LEAP dead key, as I don't think any recent keyboards have them...
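
    To make the idea concrete, here's a minimal sketch of just the jump logic - plain Python over a text buffer, with no OS-level key hooks. The function name, the demo buffer and the "stay put if not found" behaviour are illustrative assumptions, not a description of how the Cat actually implemented LEAP:

        # Sketch of LEAP-style jumping within a text buffer (no OS integration).
        def leap(buffer: str, cursor: int, term: str, forward: bool = True) -> int:
            """Return the index of the nearest occurrence of `term` from `cursor`,
            searching forward or backward; leave the cursor alone if not found."""
            if not term:
                return cursor
            if forward:
                hit = buffer.find(term, cursor + 1)
            else:
                hit = buffer.rfind(term, 0, cursor)
            return hit if hit != -1 else cursor

        if __name__ == "__main__":
            text = "the cat sat on the mat; the cat then left"
            pos = leap(text, 0, "cat")           # LEAP FWD c a t -> 4
            pos = leap(text, pos, "cat")         # LEAP FWD again -> 28
            pos = leap(text, pos, "the", False)  # LEAP BCK t h e -> 24
            print(pos)

    The hard part, as doublelayer notes below, is hooking something like this in system-wide rather than per application.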

    1. doublelayer Silver badge

      Re: System‑wide LEAP keys could be pretty nice, even with our current mouse UIs

      I'm not sure you'll be able to make it system-wide because it generally relies on the application to search for you. If you made it common enough, people might adopt it voluntarily, but it wouldn't change the situation where an application decides to reinterpret what it does even if you'd be happier with using the previously standard behavior. For instance, I can see a browser using those keys as back and forward buttons, then because they've implemented behaviors for the keys, they don't bother implementing the hold and type method.

      I also have an ergonomic objection to your suggestion, which is that holding one key and typing out a string means either using one hand to type while holding it or performing finger acrobatics to use the remaining four, and the latter only works if you put it close to the character keys which probably requires moving more keys. You could implement it differently. For one example, not necessarily the best, press both leap keys to enter a term, then press one of them to move by that term. The term can be saved so subsequent presses of a single key jumps between them.

  8. cuvtixo

    When I first glanced at the title, my mind went back to the CueCat from 2000, a fairly useless barcode scanner, and apparently that memory is more relevant than I would have thought when first discovering my mistake. Ironically quite a few young people learned to hack on this device just because it was originally free, then liquidation meant they could be bought for pennies, and... well, it's yet another story to tell. Perhaps some of you can connect the story of this non-mouse peripheral to modern UI design, and pebbles and raindrops and whatever.

    1. An_Old_Dog Silver badge

      Cue Cat

      The Cue Cat was a keyboard-wedge barcode scanner which was intended to be used via an MS-Windows-only program which reported the unique Cue Cat serial number to the Cue Cat company's server(s), along with the bar code you had scanned. The servers saved that info to help them build a marketers' profile on you, then returned the text description of the barcode you had scanned to the Cue Cat program, which displayed it on-screen.

      That functionality -- including profile-building -- has been replaced by various smartphone apps / Google Android functionality which use your smartphone's camera to scan 2-D QR barcodes.

      I had a few Cue Cats. As they were keyboard wedges, they worked fine as simple barcode scanners under any PC OS, though the product-lookup feature worked only via the MS-Windows program.

      Some simple hardware surgery would neuter your Cue Cat by returning a serial number of all-zeroes.
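
      Any keyboard-wedge scanner can be driven from a script the same way: it simply "types" a code followed by Enter, so a program only has to read lines from stdin. A minimal sketch, with an invented lookup table standing in for the vendor's server-side database (a stock Cue Cat's output is lightly obfuscated, so in practice it would need decoding before the lookup):

          # Sketch: treating a keyboard-wedge scanner as ordinary line input.
          # The catalogue below is invented for illustration; the real Cue Cat
          # software did the lookup against the vendor's servers instead.
          import sys

          CATALOGUE = {
              "9780131103627": "The C Programming Language, 2nd ed.",
              "5012345678900": "Widget, 24-pack",
          }

          def main():
              print("Scan a barcode (Ctrl-D to quit):")
              for line in sys.stdin:
                  code = line.strip()
                  if code:
                      print(code, "->", CATALOGUE.get(code, "unknown item"))

          if __name__ == "__main__":
              main()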

  9. Ian Johnston Silver badge

    Using a Cat, you just typed text… but you could edit it, or enter a table of numbers and tell the computer to total or average them. It wasn't just a word processor: its single program was also a spreadsheet. You could highlight some of the text on screen and format it, or tell the device to print just that part, or to send it over to another computer – because the Cat was one of the first computers to have a built-in modem to communicate over the telephone network.

    It sounds very much like the Cambridge Computers Z88, which had a combined spreadsheet/word processor program called PipeDream. Very nice, too - I particularly liked the very small outline of your document which appeared to one side and made navigating with a 640x64 display surprisingly easy. No inbuilt modem, as far as I can recall.

    I wrote and published a Z88 <-> Atari ST data transfer program. It sold precisely one copy.

    1. ChrisElvidge Bronze badge

      Lotus had Symphony for MS-DOS - combined spreadsheet, word processor, database.

      Used it extensively in the early '80s on Apricot computers.

  10. Jason Hindle Silver badge

    Forth

    “ Its software was implemented in the legendarily efficient Forth language (as also used in the far less radical Jupiter Ace home computer)”

    I'm not ashamed to admit Forth was a paradigm too far* for me.

    * In my defence, I was 13 and haven't looked at it since.

  11. Anonymous Coward
    Anonymous Coward

    Back in 1987 nobody cared about the Cat because it was nothing original

    By 1987 I had shipped my third Mac product and was running my first dev team. For a startup. In California. There had been lots of rumors about what Raskin had been working on since he had been pushed out of the Mac Dev Team. And rightly so. The Mac he wanted to ship would have been as big a bust as the Cat. Even so, the first time I saw a Canon Cat at a trade show and read the reviews (in InfoWorld, MacWeek etc) it was a huge yawn - so that's it? Because it was little more than a rehash of a whole bunch of UI ideas that had been kicking around in various UI shells that had been running on MS/DOS, Apple II etc for years.

    The Cat failed because it was basically a vanity project which paid not the slightest attention to who the users were back then or how they used software, and, more importantly, totally blew off anyone who might even think of writing software for it. A glorified VDU with more in common with the CDC PLATO system of almost two decades before than with the personal computers of that era. And more importantly, what they became in the next decade. It failed for the same reason X Terminals failed. There was no market. Of actual users.

    I bought Raskin's book when it came out and it was also a big disappointment. Very thin gruel. In the era when HCI people like Bruce Tognazzini, John Carroll, and Ben Shneiderman were writing really exciting and original books about Human Computer Interaction and better ways for people to interact with software and harness the power of computers, Raskin's book at the time struck me as neither original, exciting nor innovative. For those of us who were looking for ways of improving the ways software and users interacted. In the real world. It struck me as rather backward looking. Reminded me of something you would read in an issue of Creative Computing circa 1978. Interesting in 1978, but the personal computer universe was a totally different universe 10 / 20 years later.

    The Canon Cat was a product of a 1970s mindset. Which is why it totally failed. Honestly, no one cared about the Cat back in 1987. Far more interesting stuff was going on back then. Which actually did change the way hundreds of millions of people used computer software to get things done.

    Don't know the details about upper management at Canon back then but they seemed to back serial losers. After losing about $10M+ on the Cat they turned around and blew $300M+ on NeXT, Inc. With only 30k seats sold by 1993 to show for all that money. And if Amelio had not been so desperate and so suckered in by Jobs' double dealing, that would have been the end of that particular failed product investment.

    Things looked very different if you were actually there at the time. In 1987. Working in the business.

    1. fg_swe Silver badge

      False /Next

      Next is the root of both MacOS 10 and later AND iOS. Arguably two of the most important products of our age, measured by user count, revenue and profits.

      Also, Next is a leap forward in technology as compared to the Unix contraptions of its time, but it is still a powerful Unix at the core. Much better than X11/Motif and similar GUI monstrosities.

      Jobs was a genius and Canon could see that. Not sure they lost their investment.

      Compare Next to the drudgery of HPUX/AIX/SOLARIS and the beancounters who led those enterprises. Then you can see the light of the Jobs genius.

      All of that is true even if you consider him to "just" be the orchestrator of Next.

      1. fg_swe Silver badge

        iphone

        Jobs also made the Smartphone happen, while Nokia was asleep at the wheel.

        He could imagine a software-heavy phone; he knew there should be a single app store, a touch UI, an always-on internet connection.

        All based on Next, to the present day.

        One of the greatest men of the western world, because he proved all the collectivists who inhabit the bank industry wrong. The MBAs ran NOKIA into the ground.

      2. Anonymous Coward
        Anonymous Coward

        Re: False /Next. So you weren't there either...

        ...In the Valley. In the early 1990's

        If Amelio had not fallen for the BS spiel, that would have been the end of NeXT. End of story. Read Amelio's book. It took real balls to admit you were a total sucker.

        What Jobs did with the iPhone was break the monopoly of the carriers, who had totally bollocksed up and balkanized J2ME so there was no actual viable app platform. So no app market. The Symbian platform was an even bigger mess.

        The same thing happened with SMS in the US market. In 2006 SMS was ubiquitous in Europe and a huge revenue stream. In the US texting was almost non-existent, with at least six carrier-specific ring-fenced systems with not very compatible (and expensive) gateways between them. A tiny revenue market. The iPhone and Android broke the carrier ring-fenced text monopoly and soon SMS was everywhere in the US.

        All the iPhone did was break the carriers' control of the mobile platforms and then Android waltzed in and quickly took 80%+ of the world market. iPhones are for the rich people's markets. Not Asia, Africa and South America, which is actually most of the world market. Which the companies who publish market share "reports" don't give a damn about. Apart from China and India, that is.

        As for MacOS X. A dog's dinner which eventually mostly worked (just like with XCode). Only took them 15 years. Shipped any products for it? I have. I'd take developing for Win32 any day. MacOS X has a unit market share pretty much the same as MacOS during the worst years almost 30 years ago. Less than 5%. But more importantly the MacOS software market share collapsed with MacOS X from 15%/20% of desktop OS software in the mid 1990s to the margin-of-error levels of the last two decades. Less than 3%. That's why there is no MacOS dev ecosystem anymore. Mostly just upgrades of legacy MacOS 8/9 titles. By revenue.

        Jobs a "genius"? Really? Then you obviously dont work in the business. Or know the history of the business. He was a really nasty criminal psychopath. Not only is every bad story published about Jobs true but the many unpublished ones known to those who were around at the time are much worse. NeXTStep was always Trade Show demoware software. As the people who tried to port heavy duty desktop applications to it discovered the hard way. Thats why there were so few successful ports to NextStep. Many tried in the early days. All had given up by 1992.

        But I suppose if you were only used to a command line on a SPARCstation or a SGI Indy those flashy NeXTStep demos must have looked great.

        1. fg_swe Silver badge

          F.U.D. ?

          1.) Apple is one of the highest capitalized companies on the globe. They rake in fantastic profits. Something they must do right.

          2.) Apple hardware and software looks nice and is ergonomic, unlike most competitors. I know from personal use of Linux, Windows, Apple.

          3.) I did a bit of MFC and a bit of the GNU Next UI clone. The Next thing was much better.

          4.) I also used Motif, which was horrible compared to Next.

          5.) Of course, where there is light, there are shadows. The walled garden of iOS, for example.

          6.) Android is a knock-off of iOS. Jobs/Apple invented it. All your explanations cannot change that.
