[sldev] Call for requirements: ISO MPEG-V (mpeg for virtual worlds) Deadline: July 16, 2008

Lawson English lenglish5 at cox.net
Sat May 24 18:59:40 PDT 2008


Argent Stonecutter wrote:
> On 2008-05-24, at 19:35, Lawson English wrote:
>> Would it be possible to encapsulate SL protocols into such a system? 
>> There's no interactivity between nodes in MPEG-4 that I am aware of. 
>> Would it be possible to extend it in such a way that SL could be 
>> streamed into a set-top box as part of a larger chunk of data?
>
> You'd have to standardize and encapsulate all the SL object and 
> texture and everything else into something that was standardized.

Well, in theory, you could do what Bungie did with the original Marathon 
and use the game engine as a QuickTime codec (which would be MPEG-4 
compatible, I should think) for streamed user events, with a QuickTime 
movie file containing time-stamped keypress events as the playback movie. 
But the scenegraph of Marathon was completely static and all MOB 
activities were completely determined by the user input. It was very cool 
though: it could record an hour's net play between 6 players in a .mov 
file of only a few hundred KB, complete with switchable perspective from 
player to player and speedup/slowdown movie controls.
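
To make that trick concrete, here's a rough Python sketch of the same 
idea: log time-stamped inputs, then re-feed them to a deterministic 
engine at playback time. The engine interface (apply_input, step) is 
purely hypothetical, just to show the shape of it:

    import json, time

    def record(events, path):
        # events: iterable of (timestamp_sec, player_id, key) tuples;
        # an hour of keypresses serializes to a tiny file.
        with open(path, "w") as f:
            json.dump(list(events), f)

    def replay(path, engine, speed=1.0):
        with open(path) as f:
            events = json.load(f)
        start = time.monotonic()
        for ts, player, key in events:
            # Wait until the (possibly scaled) timestamp, then re-feed
            # the input; a fully deterministic engine reproduces the
            # whole match from nothing but this log.
            while (time.monotonic() - start) * speed < ts:
                time.sleep(0.001)
            engine.apply_input(player, key)
            engine.step()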

I'm sure some kind of MPEG-V would be at least as flexible. The issue is 
interactivity: would it be possible to interleave SL events/textures/etc 
in a reasonable way with more mainstream multimedia stuff? I think it 
would be possible, but what specifically would you gain? The ability to 
play SL embedded within a regular Hollywood movie? The ability for 
utility software to extract SL data and massage it in realtime within 
your TiVo? For SL to be usable, you'd still need the ability to send 
messages upstream. Would that be done using some built-in aspect of 
MPEG-V streams, or would your MPEG-V player need to support some outgoing 
pipe separate from MPEG-V itself?
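
Just to make the interleaving half concrete, here's a rough Python 
sketch of what I have in mind. The chunk framing below is completely 
made up (it isn't any real MPEG container format); the point is only 
that typed, time-stamped SL chunks could ride alongside ordinary media 
chunks in one downstream pipe:

    import struct

    MEDIA, SL_EVENT = 0, 1

    def write_chunk(stream, kind, timestamp_ms, payload):
        # [kind: 1 byte][timestamp_ms: 4 bytes][length: 4 bytes][payload]
        stream.write(struct.pack(">BII", kind, timestamp_ms, len(payload)))
        stream.write(payload)

    def read_chunk(stream):
        header = stream.read(9)
        if len(header) < 9:
            return None  # end of stream
        kind, timestamp_ms, length = struct.unpack(">BII", header)
        return kind, timestamp_ms, stream.read(length)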

>
> Seems to me like standardizing HTML in 1978 using the Plato system as 
> a model.
>
Eh, the incoming stuff would be quite doable, I think, even with MPEG-4. 
It's the outgoing data for interactivity that is the issue. If MPEG-V is 
supposed to support outgoing user-generated data, it should be completely 
doable. Come to think of it, some extensions to MPEG-4 do support some 
level of interactivity, but perhaps not enough.
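
For the outgoing half, the simplest thing I can picture is that the 
player just opens its own connection back to the simulator, entirely 
outside the media stream. A hypothetical Python sketch (the host, port, 
and message format are all invented for illustration):

    import json, socket

    def send_upstream(host, port, event):
        # The media stream is one-way, so user-generated events go back
        # over a separate connection; framing here is newline-delimited
        # JSON, purely as an example.
        with socket.create_connection((host, port)) as s:
            s.sendall((json.dumps(event) + "\n").encode("utf-8"))

    # e.g. send_upstream("sim.example.com", 9000,
    #                    {"type": "keypress", "key": "W", "ts": 1234})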

The question is: at what level of the protocols do you standardize? And 
what is it FOR? They could never figure out a use for VRML in MPEG-4, 
which is why no one has used it in any major products. What would 
MPEG-V-Second-Life bring to the party?


Lawson

