[sldev] Body motion and facial expression tracking, Microsoft did it
Jan Ciger
jan.ciger at gmail.com
Wed Jun 3 13:06:45 PDT 2009
Thomas Grimshaw wrote:
> BUT I really would like to see the lab building an
> /interface/ to allow this level of live avatar control which can be
> hooked into by third parties in the future.
All that you need is to expose the positions/quaternions of all bones in
the avatar's skeleton using some kind of API. Then it can be driven even
by live motion capture, if that is what you desire.
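As a rough illustration of what such an interface might look like, here is a minimal sketch in Python. The class and method names (AvatarSkeleton, set_bone) and the bone names are assumptions for illustration only, not an actual Second Life API:

```python
from dataclasses import dataclass

@dataclass
class BonePose:
    # Position as (x, y, z) and orientation as a unit quaternion
    # (w, x, y, z), relative to the parent bone.
    position: tuple
    rotation: tuple

class AvatarSkeleton:
    """Hypothetical API exposing per-bone poses so third-party code
    (e.g. a live motion-capture driver) can puppeteer the avatar."""

    def __init__(self, bone_names):
        identity = BonePose((0.0, 0.0, 0.0), (1.0, 0.0, 0.0, 0.0))
        self._bones = {name: identity for name in bone_names}

    def set_bone(self, name, position, rotation):
        # A capture driver would call this once per bone, every frame.
        self._bones[name] = BonePose(position, rotation)

    def get_bone(self, name):
        return self._bones[name]

# Example: a driver updates the head bone from tracker data.
skel = AvatarSkeleton(["mPelvis", "mTorso", "mHead"])
skel.set_bone("mHead", (0.0, 0.0, 1.6), (0.7071, 0.0, 0.7071, 0.0))
print(skel.get_bone("mHead").position)  # → (0.0, 0.0, 1.6)
```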
However, apart from some kind of puppeteering, this technique is not all
that useful. You can't really walk in SL or grasp objects with your
hands using motion capture alone.
Regards,
Jan