A wild rumor on some of what Apple might have up their sleeve sometime soon.
The Multi-Touch Screen
(David Pogue, New York Times)
After the Jobs demo, I called Jeff Han, fully expecting to hear how angry he was that Apple had stolen his idea without permission or consultation (it’s happened before).
Instead, he knew all about Apple’s project. He didn’t say that Apple bought his technology, nor that Apple stole it—only that he’d known what had happened, and that there was a lot he wasn’t allowed to say.
Anyway, he returned to TED this year for a new presentation, showing how far the multi-touch technology had progressed (hint: a lot). He also set up his eight-foot touch screens in the TED common area, so anyone could try it.
(via AppleInsider)
Anyway, it looks like Apple has hired this guy or licensed his technology or some such. (They also bought a company called FingerWorks that developed similar technology for touchpads.) They showed this technology in the keynote where they introduced the iPhone. But this video shows it can be used way beyond a cellphone screen. Very cool looking stuff. It takes the original “you will want to lick it” of Apple’s Aqua a step or two further. You want to fondle it. Definitely watch the video. There is a second video linked from the Pogue post too, but at the moment it isn’t working for me.
I’m not sure exactly how it would work in, say, a new iMac. I mean, do I really want to touch my screen to move windows around and such? But if anybody can do cool things with a technology like that, it would be Apple. If it turns out this is one of the “hidden features” in Leopard and the new range of iMacs all end up having multi-touch screens and all, I will be quite jealous that Amy is the next one in the family in line for a new computer…
It reminds me of the futuristic movies (like Minority Report) where they grab screens with their hands and move them around.
Having had and used a FingerWorks keyboard for some time, I wasn’t the least bit surprised to have read the rumors (though I’ve seen no definitive confirmation) of the FingerWorks acquisition. The sort of gesturing that could be done on the keyboard was very similar to what is seen on the iPhone.
Unfortunately, it didn’t understand the gesture that I often give my computers. Apparently, it requires some contact.
That’s pretty much what it is… but real.
So Greg, did you actually use the gestures extensively? It looks cool and all, but I admit to having a hard time imagining actually using it a lot… but then again, I’ve never tried it.
I did use the gestures. In fact, about the only things I could do effectively on the keyboard were the gestures. Much as with my switch to the Kinesis, I couldn’t keyboard at the time, and learning to do so on the FingerWorks keyboard was nearly impossible for me.
As a result, I could do all sorts of cool things with gestures: copying and pasting, switching applications, scrolling every which way, and all of the application-specific stuff. Unfortunately, I couldn’t ever get any text entered anywhere at a reasonable pace. I had to sit and stare at the keyboard to type.
In short, I see the gesturing as pretty useful for navigation and such. And now that I can actually touch-type, I might even be able to type on the thing. But I don’t know, because I gave the keyboard away to someone else at work.
Hmmm… guess I’ll just have to try it.