LucidTouch: See-through UIs come to life

I’m really having trouble with this video. I don’t think this whole idea — essentially the transparent devices we’ve seen before in patent filings — would ever really take off, simply because it seems like a solution looking for a problem. Here’s how it works: there’s a camera on the back of the device and a touchscreen on the front. The camera senses your hand position behind the device and relays it to the display, which draws a semi-transparent image of your hand over the UI and responds to your taps and drags from the rear. Why, though, would you want to do this?
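For the curious, the basic idea reduces to two steps: mirror the rear camera’s coordinates so the drawn hand lines up with your actual finger, then alpha-blend the hand image over the UI. A minimal sketch (my own illustration, not LucidTouch’s actual code — the coordinate mapping and blend factor are assumptions):

```python
def rear_to_screen(x, y, width):
    """The rear camera sees the hand from behind, so its x-axis is
    mirrored relative to the front display; flip it to line up."""
    return width - 1 - x, y

def blend(ui_pixel, hand_pixel, alpha=0.4):
    """Alpha-blend the hand image over the UI so the interface stays
    visible 'through' the pseudo-transparent hand."""
    return tuple(round((1 - alpha) * u + alpha * h)
                 for u, h in zip(ui_pixel, hand_pixel))

if __name__ == "__main__":
    print(rear_to_screen(10, 50, width=320))      # (309, 50)
    print(blend((200, 200, 200), (50, 50, 50)))   # (140, 140, 140)
```

That’s the whole trick: the “transparency” is just a live composite, not actual see-through hardware.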

“As soon as you put your hands on the display you [obstruct] the screen,” says Wigdor, describing what he calls the “occlusion problem”. Users of iPhones have other problems too, he adds. “Multi-touch devices detect the entirety of the touch area,” Wigdor continues. “That’s what we call the ‘fat finger’ problem.”

That I understand, but I’m just not feeling this. Anyone else having trouble with this interface design? Am I just being an old fuddy-duddy?

‘Transparent’ gadget could trump iPhone interface [NewScientist]