[sldev] Blowing wind

Aimee Trescothick aimee.trescothick at gmail.com
Wed Dec 9 05:06:01 PST 2009


On 9 Dec 2009, at 05:46, Philippe (Merov) Bossut wrote:

> Hi Aimee,
> 
> I haven't looked in the code, just assumed you copied the relevant parts :)
> 
> It seems to me that the intent of the code is to write into "newbuffer" in packets of "sizeof(MIXBUFFERFORMAT_T) == stride" bytes, using "cursamplep" as a sliding pointer. That being the case, having "stride == 4" when "MIXBUFFERFORMAT_T" is defined as S16 seems wrong.
> 
> So the question is: why is "stride" set independently of "sizeof(MIXBUFFERFORMAT_T)"? Are there any situations where those two things need to be different? Looking at how "stride" is set should give you a hint.

MIXBUFFERFORMAT and stride are only ever set in these two places:

#if LL_DARWIN
	typedef S32 MIXBUFFERFORMAT;
#else
	typedef S16 MIXBUFFERFORMAT;
#endif

and ...

#if LL_DARWIN
	stride = sizeof(LLAudioEngine_FMOD::MIXBUFFERFORMAT);
#else
	int mixertype = FSOUND_GetMixer();
	if (mixertype == FSOUND_MIXER_BLENDMODE ||
	    mixertype == FSOUND_MIXER_QUALITY_FPU)
	{
		stride = 4;
	}
	else
	{
		stride = 2;
	}
#endif

So stride only seems to be used as a (broken) hack for the case where MIXBUFFERFORMAT is S16 but one of those two 32-bit mixers is in use: one of them wants S32 samples and the other wants F32.
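
In other words, the write loop, which as you say just slides a pointer through newbuffer, ends up doing roughly this on those mixers (my own sketch with a made-up writeWind(), not the actual viewer code):

typedef short MIXBUFFERFORMAT_T;	// i.e. S16 on the non-Darwin builds

// Reconstruction of the write pattern: put numsamples samples into
// newbuffer, stepping stride bytes per sample.
void writeWind(void *newbuffer, int numsamples, int stride, MIXBUFFERFORMAT_T sample)
{
	char *cursamplep = (char *)newbuffer;
	for (int i = 0; i < numsamples; i++)
	{
		*(MIXBUFFERFORMAT_T *)cursamplep = sample;	// writes sizeof(MIXBUFFERFORMAT_T) == 2 bytes...
		cursamplep += stride;				// ...but steps 4 bytes on those two mixers
	}
}

So each 32-bit sample slot only ever gets half-written, and never in the format the mixer actually expects.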

If I set MIXBUFFERFORMAT to S16 and stride to 4 on my Mac, which should give the same result as a PC using FSOUND_MIXER_BLENDMODE, I get really horrible noise, as I would expect.
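
If the point of stride really is to track the mixer's native sample size, then presumably the write itself would have to change format as well, not just the step size, something along these lines (an untested sketch reusing the names from above; I'm guessing at which mixer wants which format and at the scaling):

	int mixertype = FSOUND_GetMixer();
	if (mixertype == FSOUND_MIXER_QUALITY_FPU)
	{
		// FPU mixer: presumably wants F32 samples in the -1..1 range
		*(F32 *)cursamplep = (F32)sample / 32768.f;
	}
	else if (mixertype == FSOUND_MIXER_BLENDMODE)
	{
		// blend-mode mixer: presumably wants S32 samples
		*(S32 *)cursamplep = (S32)sample * 65536;	// scale 16-bit up to 32-bit range
	}
	else
	{
		// the remaining mixers match MIXBUFFERFORMAT (S16)
		*(S16 *)cursamplep = sample;
	}
	cursamplep += stride;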

Aimee.


