[sldev] Body motion and facial expression tracking, Microsoft did it

Argent Stonecutter secret.argent at gmail.com
Thu Jun 4 05:00:53 PDT 2009


On 2009-06-03, at 14:01, Tigro Spottystripes wrote:
> handshaking without haptic feedback and with SL's usual latency
> wouldn't work, but I imagine it would be possible to make a handshake
> work in SL by adapting the IK code used to position the hand and arm
> while selecting an object at arm's reach, and then locally on each
> client playing an animation over the IK that moves both avatars'
> hands (or the IK target itself) up and down. (If it was done
> server-side, or just on one client and communicated to the other, or
> with each client animating only its own av, there would be the risk
> of things not matching or lagging too much, I guess.)

Yes, that's something I've often thought of. I think just allowing a
small amount of scripted IK, based on having avatar joints track a prim
if they were within a certain distance, would allow this and more. With
or without camera tracking... you'd wear a handshake attachment that
would track the matching prim. If one or both users had camera-tracking
code, that would be used to drive the handshake; if not, the handshake
would track the avatar's canned animations.

And sit or stand animations with appropriate scripted overrides could
actually put your hands on your hips or your knees, no matter what size
your avatar is.

llTrackAnimation(integer joint, key target, float strength);
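
To make that concrete, here's a rough sketch of what a handshake
attachment might look like if that call existed. Only the
llTrackAnimation() signature above is from the proposal; the
"Handshake Target" prim name, the JOINT_RIGHT_HAND value, the 2 m
range, and the sensor plumbing are all invented for illustration, not
a real Second Life API.

    // Hypothetical handshake attachment, assuming the proposed
    // llTrackAnimation() existed.  JOINT_RIGHT_HAND, the prim name
    // and the 2.0 m range are made up for the sake of the sketch.
    integer JOINT_RIGHT_HAND = 7;   // imaginary joint index

    default
    {
        attach(key id)
        {
            if (id != NULL_KEY)
                // look for the other avatar's handshake prim nearby
                llSensorRepeat("Handshake Target", NULL_KEY,
                               ACTIVE | PASSIVE, 2.0, PI, 1.0);
            else
                llSensorRemove();
        }

        sensor(integer n)
        {
            // pull the right hand's IK toward the matching prim;
            // a strength of 1.0 would fully override the canned animation
            llTrackAnimation(JOINT_RIGHT_HAND, llDetectedKey(0), 1.0);
        }

        no_sensor()
        {
            // out of range: release the joint back to the normal animation
            llTrackAnimation(JOINT_RIGHT_HAND, NULL_KEY, 0.0);
        }
    }

Presumably the strength parameter would let the IK blend against
whatever canned animation is playing, which is also what the
hands-on-hips sit override above would need.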

