[sldev] Call for requirements: ISO
MPEG-V (mpeg for virtual worlds) Deadline: July 16, 2008
Lawson English
lenglish5 at cox.net
Sun May 25 13:32:24 PDT 2008
Argent Stonecutter wrote:
> On 2008-05-24, at 20:59, Lawson English wrote:
>> Well, in theory, you could do what Bungie did with the original
>> Marathon and use the game engine as a QuickTime codec (which would
>> be MPEG-4 compatible I should think) for streamed user events and
>> use a QuickTime movie file containing time-stamped keypress events as
>> a playback movie.
>
> Quicktime is an encapsulation format, like the encapsulation part of
> MPEG 4. It can encapsulate MPEG-4 codecs (in fact it uses MPEG-4 audio
> as the default audio format).
>
> That's interesting, but I'm not sure what the application for it is.
> For a demo, sure, but even there it doesn't seem practical for SL.
> There's orders of magnitude more data to stream.
Sure, but MPEG-4 streams audio/video anyway, and I doubt that there's
more data coming into SL than a streaming QT movie, MPEG-4 or not.
MPEG-4 just allows for a more efficient division of labor as far as
compression/decompression of components of the stream goes, or such is
my understanding. Different data types can have their own sub-streams
(and therefore their own compression algorithms) within the MPEG-4 stream.
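To make the "sub-streams with their own compression" point concrete, here is a toy sketch in Python. It is not the real MPEG-4 Systems layer -- the stream IDs, frame layout, and per-type "codecs" (zlib as a stand-in) are all invented for illustration -- but it shows the division-of-labor idea: each data type gets its own elementary stream and its own compressor, and the packets are interleaved into one multiplexed byte stream.

```python
import zlib

# Invented per-type "codecs" -- zlib stands in for real video/audio codecs.
CODECS = {
    "video": lambda b: zlib.compress(b, 9),   # stand-in for a video codec
    "audio": lambda b: zlib.compress(b, 1),   # stand-in for an audio codec
    "events": lambda b: b,                    # time-stamped events, left raw
}
STREAM_IDS = {"video": 0, "audio": 1, "events": 2}

def mux(packets):
    """Interleave (type, payload) packets into one framed byte stream.

    Frame layout (invented): 1-byte stream id, 4-byte big-endian
    payload length, then the per-type-compressed payload.
    """
    out = bytearray()
    for kind, payload in packets:
        data = CODECS[kind](payload)
        out.append(STREAM_IDS[kind])
        out += len(data).to_bytes(4, "big")
        out += data
    return bytes(out)

def demux(stream):
    """Split the multiplexed stream back into per-type sub-streams."""
    names = {v: k for k, v in STREAM_IDS.items()}
    subs = {k: [] for k in STREAM_IDS}
    i = 0
    while i < len(stream):
        sid = stream[i]
        n = int.from_bytes(stream[i + 1:i + 5], "big")
        subs[names[sid]].append(stream[i + 5:i + 5 + n])
        i += 5 + n
    return subs
```

Each receiver-side sub-stream can then be handed to its own decoder independently, which is the efficiency win being described.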
> It's making the encapsulated content something you can manipulate and
> interact with without the SL engine or the Croquet engine or the
> whatever else engine that would make it something like HTML. If you
> can't do that then it makes more sense to take something that is at
> least halfway standards based, and getting more so, and getting
> actually used... like the SL content stream... and document and
> encapsulate the individual components of that... or document an
> encapsulation format that you could interchange with the SL content
> stream that you could use to create a viewer that something like libsl
> (or whatever they're calling it these days) could provide a gateway for.
Even so, I think the idea is that with an MPEG-V, you could stream it to
whatever MPEG-V box you have, and use the "standard" codec for a
virtual world to interpret the incoming data. I think the main
difference (haven't read the discussion) would be in the degree of
interactivity. You could design a one-way codec for SL with MPEG-4
already. You just couldn't MOVE. Not sure how to handle ping requests
from the SL server, but that might not be insurmountable. I suspect the
main issue is that the level of interactivity would be too low to be
useful (assuming you could ack PINGs in some way -- otherwise the current
SL setup simply wouldn't work: the avie would die within a few seconds).
>
>> They could never figure out a use for VRML in MPEG-4 which is why
>> no-one has used it in any major products.
>
> Apple uses QTVR, which is more or less the Quicktime equivalent, for
> providing "hands on" views of objects and scenes.
Yeah, but the interactivity is all client-side, and a QTVR is a single
image built from multiple perspectives that have been stitched into a
fisheye view of a panoramic scene (or a rotating view of a single
object/scene stitched into the inverse of that, with the camera focused
at the center of the object).
Regardless, QTVR's interactivity is purely client-side, so it's not a good
model for what an MPEG-xx codec would need to do.
Lawson