[sldev] Avatar Animation

Melinda Green melinda at superliminal.com
Fri May 22 13:48:35 PDT 2009


Second Life already supports all of these things, but they're not 
terribly well tuned. Someone already mentioned that it supports 
voice-triggered gestures as well as lip sync, though the quality is 
questionable. It also supports a rich avatar attention system that I did 
a lot of work on to make it tunable and customizable. It's currently 
difficult to notice, but when your head joint is not overridden by 
something of higher priority, your head will sometimes turn to look at 
nearby people when they type. Your head will also lock onto anything or 
anyone you alt-click, though that's hard to see from your own viewpoint 
because alt-click also moves your camera; other people, however, will 
see you turn to look at those things and people. The work I did also 
lets you choose what or whom you look at by left-clicking, which is a 
great way to explicitly give attention to other people. Notice also that 
when you look at an object, you'll look at its center, but when you look 
at a person, you'll look directly into their eyes.

If you want to play with this system, the easiest way is to start by 
replacing the attentions.xml file in the character directory with the 
example attentionsN.xml. Examine their differences and you'll start to 
see the power of what can be done simply by tuning the parameters.
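
For anyone who wants a feel for the kind of tuning involved before 
opening the real file, here's a rough, hypothetical sketch of what such 
entries might look like. The element and attribute names below are 
illustrative assumptions, not the viewer's actual schema; the 
attentions.xml shipped in the character directory is the authoritative 
reference:

    <!-- Hypothetical sketch of attention-tuning entries; names are
         illustrative, not the viewer's real schema. Higher-priority
         sources win control of the head joint, and the timeout bounds
         how long a target holds your gaze. -->
    <attentions>
      <!-- Glance at nearby avatars when they start typing. -->
      <attention name="typing" priority="0.4" timeout="5.0"  range="10.0"/>
      <!-- Lock onto whatever the user alt-clicks or left-clicks. -->
      <attention name="select" priority="0.8" timeout="20.0" range="20.0"/>
      <!-- Low-priority idle gaze wandering when nothing else applies. -->
      <attention name="idle"   priority="0.1" timeout="-1.0" range="5.0"/>
    </attentions>

Raising a source's priority or timeout makes that behavior more 
prominent; comparing attentions.xml with attentionsN.xml shows which 
knobs the real file actually exposes.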

-Melinda

rweave at gmail.com wrote:
> This is my first post and I hope I am not far off topic here.  If I 
> am, I ask your tolerance.
>  
> I have been following this thread concerning user tracking to animate 
> the avatar.  I would like to point out an interesting approach used 
> by There.com.  There.com uses speech recognition to extract cues to 
> animate the avatars.  As one is speaking, the avatar makes very 
> lifelike movements.  Lip sync is excellent, and facial expressions as 
> well as posture and hand gestures quite realistically follow the 
> content of the speech.  While this may not be as sophisticated as head 
> tracking, it does lend a great deal of realism to the experience there.
>  
> It would seem to me that this method of avatar animation would be 
> easier to implement than tracking the user's head movements and facial 
> expressions and then translating these data to avatar movements.  
>  
> As a slight aside, avatars in There.com frequently make eye contact; 
> its absence in Second Life is sadly depersonalizing.
>  
> I enjoy this mailing list a great deal and I know I am among esteemed 
> company here.
>  
> eRobert Allen
