[sldev] LL: Most major projects done? Time to return to haptics :-)

Dale Mahalko dmahalko at gmail.com
Tue Apr 8 01:19:16 PDT 2008


We are rapidly reaching the point where LL is completing a bunch of
major projects that have been discussed literally for years now.
Database scaling, physics, viewer updating, and soon Mono... ?

Poor LL may be running out of incredibly huge and ambitious things to
occupy their time with these majors out of the way. So let me make a
small suggestion for your next big project. :-)


I would really like to see a way to interact more directly with
virtual environments, and current haptics technologies just do not cut
it. Most of them involve sitting in an office chair still staring at a
computer monitor while twiddling a small movement-limited knob or
joystick. Even the best available technology, the CAVE, still has us
just waving our hands around at nonphysical things we cannot touch.

You know what I would like to do? I would like to rez a cube, and have
the option to reach out with my hand, grab it, and stretch it, turn
it, move it around, etc. I'd like to be able to select by pointing at
a prim and waggling my finger at it.

I'd like to go for a stroll, physically walking through the virtual
world, sitting down on a virtual park bench, resting my back on a
virtual chair cushion. Then stand up and literally fly up into the air
soaring through the clouds, able to feel my body doing physical
somersaults and barrel rolls in the air...


Sure, sure, this will supposedly someday be possible through nerve
implants, the Matrix, blah blah blah. We are decades and possibly a
century or more away from that occurring. And besides, I really don't
want wires jabbed into my spinal cord, or people hacking my brain,
thank you very much.

So what's the next best thing?

There are already people working on a full-body exosuit that works as
a strength-enhancing exoskeleton. It has sensors that detect small
muscle movements and multiply that force several times over to assist
in lifting and moving heavy objects. This is intended for common
everyday use, in job situations where nurses, for example, must lift
and move hospital patients with high strength but also careful
sensitivity.

I see this exosuit concept as possibly the ultimate virtual reality
haptics interface, if it were reimplemented for a completely different
purpose. Rather than using the suit to lift physical objects, it could
be turned around to apply forces back against its wearer.


The suit would be mounted on a suspension boom in midair, with the
ability to rotate freely 360 degrees on any axis and with
force-reaction cylinders permitting sudden rapid movement of the
exosuit up to a meter in any direction. The exosuit would also include
a high-resolution stereoscopic HUD and 3D stereophonic sound that
fully immerse the wearer in the virtual environment.

The exosuit sensors operate so that the wearer feels no weight on
their body from the supporting exostructure, and the sensors can
quickly move the suit to match any movement of the wearer.

The key detail here is that the joints of the exosuit are translated
to apply to an avatar in the virtual 3D world, such that movement of
the wearer of the exosuit is duplicated by motion of the avatar in the
3D virtual environment.

Raise your physical arm and wave to a friend and your avatar
immediately does the same. Nod your head, and your avatar does the
same. Eye-tracking in the HUD and ultrasonic 3D topography scanning of
the face could allow your avatar to match (or map to) your physical
facial expressions.
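
The joint-translation idea above is really just a one-to-one mapping
from suit sensor readings to avatar joints. A minimal sketch of what I
mean (the joint names and angle limits here are purely made up for
illustration):

```python
# Hypothetical one-to-one mapping of exosuit joint sensors to avatar
# joints. Joint names and angle limits (degrees) are invented here.

AVATAR_JOINT_LIMITS = {
    "shoulder_pitch": (-90.0, 180.0),
    "elbow_flex": (0.0, 150.0),
    "neck_yaw": (-80.0, 80.0),
}

def suit_to_avatar_pose(suit_angles):
    """Translate raw suit joint angles into a clamped avatar pose."""
    pose = {}
    for joint, angle in suit_angles.items():
        lo, hi = AVATAR_JOINT_LIMITS[joint]
        # Clamp to the avatar's range of motion so a sensor glitch
        # can't bend the avatar into an impossible position.
        pose[joint] = max(lo, min(hi, angle))
    return pose
```

Each sensor update would run through something like this and be
streamed to the simulator, so raising your physical arm raises the
avatar's arm on the next frame.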

The exosuit also receives feedback from the virtual environment to
provide direct force-feedback to the suit wearer, such that walking on
rough ground in the virtual environment feels rough, as the exosuit
boots move against the wearer's feet, to simulate the rough unstable
ground.
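
The feedback direction (virtual world back to the suit) could work by
sampling the virtual terrain under each footstep and pushing the boot
plate against the wearer's foot by the difference in height. A rough
sketch, with a toy heightmap standing in for the real terrain data:

```python
# Sketch: drive a boot actuator from virtual terrain roughness.
# The heightmap and the actuator-delta convention are illustrative.

def terrain_height(x, y):
    """Toy heightmap: small pseudo-random bumps (meters)."""
    return 0.02 * ((x * 7 + y * 13) % 5)

def boot_offset(step_x, step_y, prev_height):
    """Return (new_height, actuator_delta): how far the boot plate
    should move against the foot relative to the previous footstep."""
    h = terrain_height(step_x, step_y)
    return h, h - prev_height
```

Walking over rough virtual ground then shows up as a stream of small
actuator deltas against the soles of the feet.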

If the person stumbles and falls forward, the suit flips the wearer
into a suspended prone position and suddenly jerks hard upward to
simulate falling to the virtual ground. Reaching up, the wearer's
avatar can grab a ledge and pull themselves up to a sitting position,
with the exosuit rotating and moving in midair to now provide a
"solid" ground platform, firmly supporting the legs in a straight out
position from the hips, as if the wearer were really sitting on the
virtual ground.

The strength of the exosuit force-reactiveness could be controlled as
levels of "user-familiarity", from novice to advanced user, with the
full force potentially strong enough to cause skin bruising from
high-impact events such as falling off a high-speed virtual motorcycle
and the avatar slamming into a virtual brick wall.
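
In practice the familiarity setting amounts to scaling the raw
physics-engine forces and capping them at a per-level safety limit.
A minimal sketch (the level names, scale factors, and force caps are
all invented for illustration):

```python
# Scale raw impact forces by user-familiarity level, with a hard cap.
# Level names and numbers are illustrative, not a real specification.

FAMILIARITY = {
    "novice": (0.2, 50.0),        # (scale, max force in newtons)
    "intermediate": (0.6, 200.0),
    "advanced": (1.0, 800.0),     # enough to bruise on a hard impact
}

def feedback_force(raw_force, level):
    """Return the force actually applied to the wearer."""
    scale, cap = FAMILIARITY[level]
    return min(raw_force * scale, cap)
```

So the same virtual motorcycle crash that merely nudges a novice could
genuinely knock the wind out of an advanced user.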


We've already had force-feedback amusement rides at Disney parks for
years, and with the work on the exosuit concept, all that needs to be
done is to bring the two technologies together for a new application.

An exosuit applied to virtual applications could be far lighter than
an independently mobile strength exosuit, since the VR exosuit has no
need to carry its own power source. A mobile strength exosuit must
also have its own heavy actuators mounted directly on the frame, while
a VR exosuit can operate using external hydraulic pressure supplied
from the support gantry, allowing it to be very light and nimble.

So I say enough with these desktop "force feedback" toys and "VR"
stereo glasses, and put an end to the CAVEs. I want to literally step
into my Second Life and physically manipulate and experience it
directly.

How about it, LL? It doesn't seem like you'll be too busy after Mono
is released.. ;-)

- Scalar Tardis / Dale Mahalko

