PHI: Phantom Haptic Interface

Status
Not open for further replies.

Rick

Valued Senior Member
Hi everyone,
==============================================
the Phantom Haptic Interface is an innovative device
developed by the Haptics Group of MIT's Artificial Intelligence Laboratory. This device imparts a realistic feel to an imaginary object, allowing people to touch and feel it.
The Phantom exerts a robust external force on the user's fingertip, creating an illusion of interaction with a "solid" but still virtual object. Smooth spheres, flat walls, sharp corners, even texture can be felt! You just have to insert your fingertip into the Phantom socket to start feeling the virtual objects. PHI's technology makes it superior to buzzing tactile stimulators: the device has low inertia, low friction and no unbalanced weight, so movements through free virtual space are unimpeded and smooth.
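The core idea behind this kind of force feedback can be sketched as a simple penalty-based servo loop: when the fingertip position penetrates a virtual surface, the device pushes back with a spring force proportional to the penetration depth, and in free space it applies no force at all. This is only a generic illustration, not the Phantom's actual control code; the stiffness value and the 1 kHz rate mentioned in the comments are assumptions.

```python
# Penalty-based haptic rendering of a flat virtual wall at x = 0.
# When the fingertip penetrates the wall (x < 0), the device pushes
# back with a spring force F = k * penetration; in free space the
# force is zero, so motion there feels unimpeded.

WALL_STIFFNESS = 500.0  # N/m, illustrative value


def wall_force(fingertip_x: float) -> float:
    """Return the restoring force (N) for a wall occupying x < 0."""
    penetration = -fingertip_x           # positive when inside the wall
    if penetration <= 0.0:
        return 0.0                       # free space: no force
    return WALL_STIFFNESS * penetration  # spring pushes the finger out


# A real device would evaluate this in a fast (~1 kHz) servo loop;
# here we just sample a few fingertip positions (in metres).
for x in (0.01, 0.0, -0.002, -0.005):
    print(f"x = {x:+.3f} m -> force = {wall_force(x):.1f} N")
```

The spring constant controls how "hard" the virtual wall feels: stiffer springs give crisper surfaces but demand a faster control loop to stay stable.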

The Haptics lab of MIT, more famously known as the Touch Lab, is also pioneering work on human touch perception. It is conducting extensive research in neurology, psychophysics, motor control and computational models to understand human haptics better and make this technology more effective.

bye!
 
Great idea. Hope we will see some real applications soon, and not more white papers and seminars and funding for someone's retirement program...
 
famine?

IS THERE A FAMINE OF POSTS OUT HERE???? NO POSTS, NO REPLIES. :mad: Just a couple of guys hang around a while, say something in a few posts perhaps, and then this goes blank...

What's up, guys?
No posts, nothing...

I hope you read this, Merlijn.

bye!
 
Interesting stuff. I attended a seminar a few days ago put on by the leader of one of the UI groups at the MIT Media Lab (there are a lot!). He was involved in fusing physical interfaces with virtual interfaces.

One of the more interesting things he talked about was the use of standard image projectors along with some sensing equipment to project your digital information from a computer onto a surface. The kicker was that you could interact with physical objects on the surface, whose movement would be noticed by the sensors and would directly change the virtual environment. If you ever saw Final Fantasy, it's similar to their 3D holographic interfaces where you'd grab and manipulate information directly with your hands.

This sort of thing is best described with some examples (both of which are in systems designed for architects).

The first example was the use of real, physical models of buildings placed on a flat interaction surface with the projector mounted directly above. Using some geographical information, the shadow of the building was simulated on the table. Moving a building around updated its computed shadow in real time. Speeding up the simulation meant you could see in a few seconds the movement of the shadow as the sun went overhead.
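The shadow simulation described above boils down to simple solar geometry: on flat ground, a vertical building of height h casts a shadow of length h / tan(elevation), where elevation is the sun's angle above the horizon. A minimal sketch of that relationship follows; the elevation angles are made-up sample values, not real ephemeris data for any site.

```python
import math


def shadow_length(building_height_m: float, sun_elevation_deg: float) -> float:
    """Length (m) of the shadow a vertical building casts on flat ground."""
    if sun_elevation_deg <= 0.0:
        return math.inf  # sun at or below the horizon: no defined shadow edge
    return building_height_m / math.tan(math.radians(sun_elevation_deg))


# "Speeding up the simulation" just means stepping the sun angle quickly:
for elevation in (15, 30, 45, 60):  # illustrative morning elevations
    length = shadow_length(20.0, elevation)
    print(f"sun at {elevation:2d} deg -> shadow {length:6.1f} m")
```

The real system would also need the shadow's direction (opposite the sun's azimuth) and the building's footprint, but the length calculation above is the heart of it.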

The second example involved the use of clay models of landscapes. Apparently landscape architects often make these clay contours by hand when designing a landscape. In this case, the projector was again mounted above and a laser was used to measure the height at each point on the clay surface. The projector then projected computed information, such as coloured topographic contours, drainage routes, wind flow vectors, shadows, etc., all in real time. You'd just sit there playing with the clay and be able to see the impact on all factors immediately. Very cool stuff.
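The drainage overlay in particular can be approximated with a classic steepest-descent rule on the scanned height grid: each cell drains toward its lowest neighbour, and a cell lower than all its neighbours is a pit where water would pool. This is a toy sketch of that idea, not the Media Lab system's actual algorithm; the 3x3 height grid is invented for illustration, where a real system would use the laser-scanned clay surface.

```python
# Toy steepest-descent drainage on a height grid: each cell's flow
# direction points to its lowest 8-connected neighbour, or None if the
# cell is a local minimum (a pit where water would pool).

def flow_direction(heights, row, col):
    """Return the step (dr, dc) toward the lowest neighbour, or None."""
    best, best_step = heights[row][col], None
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if (dr, dc) == (0, 0):
                continue
            r, c = row + dr, col + dc
            if 0 <= r < len(heights) and 0 <= c < len(heights[0]):
                if heights[r][c] < best:
                    best, best_step = heights[r][c], (dr, dc)
    return best_step


# Invented "clay" surface sloping down toward the bottom-right corner.
grid = [
    [9.0, 8.0, 7.0],
    [8.0, 6.0, 4.0],
    [7.0, 5.0, 1.0],
]
print(flow_direction(grid, 0, 0))  # drains toward the centre: (1, 1)
print(flow_direction(grid, 2, 2))  # local minimum (pit): None
```

Reshaping the clay changes the height grid, so recomputing these directions each frame is what makes the projected drainage routes update as you play with the model.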
 