Whenever your browser is in privacy mode it deactivates the device ;)
For years, the way we interact with computers hasn't changed much. Keyboards and mice have been the mainstay of computing for decades. Of course, there have been experiments, like clunky VR gloves, and more successful gizmos like the Kinect. Leap Motion, for example, tried to bring gesture control to your PC in 2013, where it …
OK, with that out of the way, all these motion control things seem to forget one very important point - if you're not more convenient than the alternatives, no-one is going to care. Why would I want to drop £200 for the privilege of being able to control Netflix by flailing around wildly when I can achieve exactly the same using a remote or a mouse? Why would I want to be required to act out a charades routine when giving a presentation when a pointer or mouse works just as well? Or rather better, given that you'd appear to need a pointer in addition to this thing anyway. Motion controls have failed every time so far because they invariably make controls less accurate and less convenient.
The presentation mode does include an on-screen pointer. But yes, while it's a neat idea I have mixed feelings about how useful it could be. I suspect that it may be a boon to people with some conditions who find other types of controller difficult to use, but for a lot of us you're entirely right that a normal remote is likely to be simpler for a presentation.
Until you pop into the kitchen to make a cup of tea and have a wank, only to come back into the lounge and find that your smart TV is now tuned to QVC and you've ordered thirteen gross of Disney Princess™ hand towels, an illuminated jacuzzi and booked a 14 day holiday for three to a Swedish cheese farm.
Dunno about masturbation, but I did get one of these and an Arduino with a Bluetooth module and a motor control shield, wrote some software for my laptop, connected the Arduino to a vibrator, and made a gesture-controlled sex toy. It's kind of fun, winding up a girlfriend just by gesture, though in fairness I doubt it will ever be a killer app.
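The laptop-side glue in a build like that can be very small: map each recognised gesture to a motor intensity and push one byte over the Bluetooth serial link for the Arduino to feed into `analogWrite`. A minimal sketch of that idea, where the gesture names, the one-byte protocol, and the duty values are all assumptions for illustration, not the actual Myo SDK:

```python
# Illustrative mapping from a recognised gesture to a PWM duty byte
# (0-255) for the motor. Names and values are hypothetical.
GESTURE_TO_DUTY = {
    "rest": 0,
    "fist": 255,
    "wave_in": 96,
    "wave_out": 160,
    "fingers_spread": 64,
}

def duty_for_gesture(gesture: str) -> int:
    """Return the PWM duty for a gesture, defaulting to motor off."""
    return GESTURE_TO_DUTY.get(gesture, 0)

def encode_command(gesture: str) -> bytes:
    """Encode a gesture as the single byte the Arduino would read."""
    return bytes([duty_for_gesture(gesture)])
```

In the real setup the encoded byte would be written to a serial port (e.g. via pyserial) and the Arduino would read it and drive the motor shield; the encoding above is just the shape of that glue.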
I don't see why not - all it's doing is reading the electrical signals beneath the skin. Obviously the calibration is set up with arms in mind, but you can create your own profile or even access the raw EMG data.
So, you might need to do some fiddling to come up with appropriate gestures, but I don't see why not. Maybe I'll have a play later and see what I can make it do.
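For anyone poking at that raw EMG data, a common first step (generic DSP, not necessarily what the Myo does internally) is a moving RMS envelope plus a threshold to decide when a muscle is active. A rough sketch, with the window size and threshold as guesses to tune against real readings:

```python
import math

def rms_envelope(samples, window=8):
    """Moving root-mean-square of an EMG sample stream."""
    out = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1): i + 1]
        out.append(math.sqrt(sum(x * x for x in chunk) / len(chunk)))
    return out

def is_active(samples, threshold=50.0, window=8):
    """True if the latest envelope value crosses the threshold."""
    return rms_envelope(samples, window)[-1] >= threshold
```

A quiet baseline stays below the threshold while a contraction burst pushes the envelope over it, which is enough to hang a custom gesture on.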
I bought two: one for a friend with extreme hand problems, and one for me to maybe extend its capabilities for the friend if it worked out. It's slightly irritating: it kept wanting me to re-sync my gestures. My friend had problems using it too. In the end it's a bit too limited. It does work to pause videos from the lounge, though.
Myo didn't live up to the potential its Kickstarter campaign optimistically predicted. I've not had a great run with Kickstarter projects; at least this one delivered.
Sigh. Another of these periodic attempts to map the "wow omg technology is so cooool" mindset of the ¡Bong! Startup Set into real life - and it will fail.
Douglas Adams characterized the biggest problem with gesture-based technology far, far better than I ever could:
"The machine was rather difficult to operate. For years radios had been operated by means of pressing buttons and turning dials; then as the technology became more sophisticated the controls were made touch-sensitive--you merely had to brush the panels with your fingers; now all you had to do was wave your hand in the general direction of the components and hope. It saved a lot of muscular expenditure, of course, but meant that you had to sit infuriatingly still if you wanted to keep listening to the same program. "
Humans evolved to make large motions with our limbs (moving from one place to another, lifting things, etc.) and to manipulate small things with small, precise movements of the hands.
For the vast majority of the population, those with no handicaps, the small precision movements that hands are capable of are the most efficient. In our high-data-rate environment, hands are the only thing that is going to cut it, and until a device can operate with minimal force and displacement whilst maintaining accuracy and minimising false readings, it is not going to succeed.
We are constrained by our anatomy, and I can't see the mouse and keyboard being replaced until we have reliable brain interface translators to bypass our collection of bones and muscles.
"[...] until we have reliable brain interface translators to bypass our collection of bones and muscles."
Whether we execute thought actions is undecided until we actually activate the limbs or vocal cords. So any cyber interface ends up having to depend on the nerve signals to the muscles. Merely thinking about something can be a "shall I - shan't I?" internal conversation in which we might "practice" the move several times before deciding what to do.
Biting the hand that feeds IT © 1998–2021