Apple patents touchscreens/The work of Jeff Han

It's certainly a fascinating video, though for clarity it would probably be best to describe it as being about touchscreen interfaces that recognise multiple simultaneous touches.

Also, nobody has mentioned any source for the news that Apple is looking into this at all. I have seen some of the NYU stuff before, as well as the SmartBoards and a music controller for DJs that uses the technology. Have you got a link to the patent?
 
Yeah, like I said, I don't know any more about it than what it says on that there page, but this is the rumours forum, and it's a fun video, so there you go...
Apologies for the appalling typing in the first post; it was very late...
 
Mikuro said:
They have a link to Apple's patent info.

Thanks. :)

The sensing display includes display elements integrated with image sensing elements. As a result, the integrated sensing device can not only output images (e.g., as a display) but also input images (e.g., as a camera).

A device comprising: a display area; an array of display elements located within the display area, each display element capable of displaying a pixel of information, either alone or in combination with other display elements; and an array of image elements located within the display area, each image element being capable of capturing visual information from a source in front of the display area; wherein each image element has a lens that does not interfere with any display elements.

This doesn't sound anything like the multi-touch technology shown in the NYU video. Rather, it sounds like an ingenious way to use the display as a camera/scanner. If this could be mass produced as cheaply as existing displays, you could use your laptop screen as a flatbed scanner ... I've said it before and I'll say it again, technology never ceases to amaze me.
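Just to get my head around the claim, here's a rough toy sketch in Python of the idea as I read it: a panel where every few pixel positions also host a tiny sensing element, so the same surface can show an image and read one back. All the names, grid sizes and spacing here are my own made-up assumptions, nothing from the actual patent, and obviously the real thing is hardware, not software.

```python
# Toy model of the "integrated sensing display" idea from the patent claim.
# Purely illustrative: the grid size, interleave spacing and method names
# are assumptions for the sketch, not anything from the filing itself.

class SensingDisplay:
    def __init__(self, width, height, sensor_spacing=4):
        self.width = width
        self.height = height
        # Every Nth position also hosts a small image-sensing element.
        self.sensor_spacing = sensor_spacing
        # Display buffer: a brightness value per display pixel (0-255).
        self.display = [[0] * width for _ in range(height)]
        # Sensor buffer: a light reading at each sensor site, None elsewhere.
        self.sensed = [[None] * width for _ in range(height)]

    def is_sensor_site(self, x, y):
        """A sparse sub-array of image elements sits among the display pixels."""
        return x % self.sensor_spacing == 0 and y % self.sensor_spacing == 0

    def draw(self, x, y, value):
        """Output path: light up a display pixel."""
        self.display[y][x] = value

    def capture(self, scene):
        """Input path: read the scene in front of the panel at each sensor site.

        `scene(x, y)` is any callable returning the light level at that spot,
        standing in for whatever is held up against the screen.
        """
        for y in range(self.height):
            for x in range(self.width):
                if self.is_sensor_site(x, y):
                    self.sensed[y][x] = scene(x, y)
        return self.sensed


# Example: "scan" a fake document that is bright everywhere except a dark stripe.
panel = SensingDisplay(width=16, height=16)
image = panel.capture(lambda x, y: 30 if 4 <= x < 8 else 220)
print(image[0])  # sparse readings at the sensor sites, None in between
```

Crude as it is, it helped me see how one panel could be both "output" and "input" at once, which is what the flatbed-scanner-in-your-laptop-screen idea would need.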
 
By golly, my fantasy of converting my basement into a near-functional scale replica of the USS Enterprise bridge is just this much closer! I was going to go original series all the way, but now...? :)

Just as the mouse and its characteristics defined interaction with the GUI, so too will a touchscreen-based UI define whatever comes next. It will be very interesting to watch this evolve.
 
I cannot wait (actually I can) until giant multi-touch screens are de rigueur for computing. Goodbye RSI, ass-growth and eyestrain. It'll be fun again.
 
I was reminded of this thread and thought I'd look to see whether there is any more news from Jeff Han. The Perceptive Pixel website still seems to be a single page, but I found a video interview with Jeff Han, in which he demonstrates his creation to the interviewer as a follow-up to the Perceptive Pixel demo.

http://blog.centopeia.com/2007/03/21/jeff-hans-multitouch-demo-ii/

In the video, he mentions that 3D interfaces don't seem as suitable, as humans really need something to push against. It seems to me this also relates to complaints that touchscreens don't offer enough feedback for users (e.g. compared to pressing actual keys). Manipulating images of objects in 3D would prove all the more difficult, I'm sure...

A Wired article has a text-based interview with Jeff Han. In it, he says they are already shipping and that their customers include the CIA! Apparently, they use it for manipulating geographical data...
 