[sldev] Body motion and facial expression tracking, Microsoft did it
Jan Ciger
jan.ciger at gmail.com
Thu Jun 4 06:55:52 PDT 2009
Argent Stonecutter wrote:
>
> Yes, that's something I've often thought of. I think just allowing a
> small amount of scripted IK based on having avatar joints track a prim
> if they were within a certain distance would allow this and more.
Argent, read my comment to Tigro's mail. It wouldn't work, at least not
in a nice way. For reaching and grasping you need much more IK than just
the three arm joints, and then you hit a severely under-constrained and
computationally expensive problem. The trouble with IK is that it treats
all joints equally, and many poses that are computationally possible do
not look realistic. IK has no intrinsic notion of a "comfortable" or
believable pose, and the solver has no means to prefer one.
E.g. in one case I have seen the solver keep the hands next to the
avatar's waist but stick the pelvis forward to reach a goal. Was the
objective achieved? Yes. Was it believable animation? No way. We
"fixed" that by tweaking the joint weights (giving more priority to the
arm joints than to the pelvis), but then it blows up in another case
that the tweak actually makes worse (a pose where moving the pelvis
really is preferable to flailing the arms). This leads to a lot of
special cases that have to be defined, which is definitely not a
scalable solution.
IK is a nice tool, but extremely hard to use unless you have an animator
guiding it.
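To make the joint-weight problem concrete, here is a minimal sketch (my
own illustration, not anything from the SL codebase) of a 2D cyclic
coordinate descent (CCD) IK solver with per-joint weights. A low weight
makes the solver reluctant to rotate that joint, which is exactly the
kind of tuning described above: it fixes one pose and can make another
worse, because the weights are global, not per-situation.

```python
import math

def forward(lengths, angles):
    """Forward kinematics for a 2D chain: returns joint positions,
    base first, end effector last. Angles are relative to the parent."""
    x = y = theta = 0.0
    pts = [(0.0, 0.0)]
    for seg_len, a in zip(lengths, angles):
        theta += a
        x += seg_len * math.cos(theta)
        y += seg_len * math.sin(theta)
        pts.append((x, y))
    return pts

def ccd_ik(lengths, angles, weights, target, iterations=100):
    """Weighted CCD: each pass rotates every joint toward the target,
    scaled by that joint's weight (0 = frozen, 1 = full correction)."""
    angles = list(angles)
    for _ in range(iterations):
        for j in reversed(range(len(angles))):
            pts = forward(lengths, angles)
            joint, effector = pts[j], pts[-1]
            # Angle of the effector and of the target as seen from this joint.
            a_eff = math.atan2(effector[1] - joint[1], effector[0] - joint[0])
            a_tgt = math.atan2(target[1] - joint[1], target[0] - joint[0])
            # Wrap the correction into [-pi, pi] before applying the weight.
            delta = (a_tgt - a_eff + math.pi) % (2 * math.pi) - math.pi
            angles[j] += weights[j] * delta
    return angles

# Three unit segments; the root joint (think "pelvis") is down-weighted,
# so the solver prefers to bend the distal ("arm") joints to reach (2, 1).
solved = ccd_ik([1.0, 1.0, 1.0], [0.3, 0.3, 0.3],
                [0.2, 1.0, 1.0], (2.0, 1.0))
```

The target is still reached, but which joints do the work depends
entirely on those hand-tuned weights; there is no notion of "comfort"
anywhere in the solver.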
On the other hand, being able to do scripted/procedural animation (IK
falls into that category) instead of only replaying keyframe animations
would be great.
Regards,
Jan