Buffer sizes in OpenSL ES

Hi all,

I’m looking at doing a C++ implementation on top of OpenSL ES, and I need to get hold of the possible buffer sizes to use for the enqueued buffers, preferably without using JNI to access properties like “android.media.property.OUTPUT_FRAMES_PER_BUFFER”.

For sample rates, I suppose I can use SLAudioInputDescriptor and SLAudioOutputDescriptor to get the available rates, and then use 48000 Hz as the default rate (as it is a required rate on Android).
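
For reference, this is roughly how I would set up the 48 kHz default (a minimal sketch; the 16-bit stereo settings are just my own assumption):

[CODE]
#include <SLES/OpenSLES.h>

// PCM format for the buffer-queue source: 16-bit stereo at 48 kHz.
// Note that samplesPerSec is given in milliHertz, which is why
// SL_SAMPLINGRATE_48 expands to 48000000.
SLDataFormat_PCM pcmFormat = {
    SL_DATAFORMAT_PCM,                               // formatType
    2,                                               // numChannels
    SL_SAMPLINGRATE_48,                              // samplesPerSec (mHz)
    SL_PCMSAMPLEFORMAT_FIXED_16,                     // bitsPerSample
    SL_PCMSAMPLEFORMAT_FIXED_16,                     // containerSize
    SL_SPEAKER_FRONT_LEFT | SL_SPEAKER_FRONT_RIGHT,  // channelMask
    SL_BYTEORDER_LITTLEENDIAN                        // endianness
};
[/CODE]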

But how can I get hold of the default and possible buffer sizes? Or do I even need to care?

Regards
Robert

Hi Robert,

The buffer size has little to do with the content of the buffer. The reason for having multiple buffers is so that you don’t have to allocate the memory in one big chunk. In general, pick a buffer size that is suitable for the application and the content being played. The implementation will process the buffers one by one and release each when it is done with it.

So, to answer your question of whether you need to care: yes, you do. But it is not critical and will not break playback. Too small a buffer means you will constantly be filling and submitting buffers; too large a buffer and you may run into memory allocation issues, or your entire content will end up in a single buffer.
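
To make the one-by-one processing concrete, here is a rough sketch of the usual Android buffer-queue pattern (the two-buffer setup and the names are just for illustration, and error handling is omitted):

[CODE]
#include <SLES/OpenSLES.h>
#include <SLES/OpenSLES_Android.h>
#include <cstdint>

// Two application-owned buffers: while the implementation consumes one,
// the application refills the other. The callback fires each time the
// implementation releases a buffer back to us.
constexpr int kFramesPerBuffer = 192;  // application-chosen size
constexpr int kChannels = 2;
static int16_t gBuffers[2][kFramesPerBuffer * kChannels];
static int gNext = 0;

static void bufferCallback(SLAndroidSimpleBufferQueueItf bq, void* /*ctx*/)
{
    int16_t* buf = gBuffers[gNext];
    gNext ^= 1;
    // ... render the next kFramesPerBuffer frames into buf ...
    (*bq)->Enqueue(bq, buf, sizeof(gBuffers[0]));
}
[/CODE]

The implementation dequeues, plays, and releases each buffer in turn, so as long as the callback keeps re-enqueueing, playback is continuous regardless of the size you picked.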

Thanks Erik, that is pretty much what I expected from reading the documentation.

The main reason I asked is that I want minimal latency. For example, a Nexus 6 has a native (ALSA) buffer size of 192 frames, so the ideal thing would be to use that buffer size with OpenSL. With any other buffer size, the OpenSL implementation has to repackage the data somehow to compensate for the size mismatch, inevitably adding to the total latency.

But I think I can manage for now with a buffer size defined in the application.

So, the question now is whether this is something being considered for OpenSL ES, i.e. the possibility of getting hold of the underlying subsystem’s buffer size? Right now, on Android, to get minimal latency one needs to query the Android property “android.media.property.OUTPUT_FRAMES_PER_BUFFER” and then use that buffer size with 2 buffers in the queue. Ideally, the need to query Android via JNI should be avoided.
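
For completeness, this is roughly the JNI detour I would like to avoid (a sketch only; it assumes a valid JNIEnv and an android.content.Context object, and omits all exception handling):

[CODE]
#include <jni.h>
#include <cstdlib>

// Returns the device's native frames-per-buffer, or 0 on failure.
int nativeFramesPerBuffer(JNIEnv* env, jobject context)
{
    // AudioManager am = (AudioManager) context.getSystemService("audio");
    jclass ctxCls = env->GetObjectClass(context);
    jmethodID getSvc = env->GetMethodID(ctxCls, "getSystemService",
        "(Ljava/lang/String;)Ljava/lang/Object;");
    jobject audioMgr = env->CallObjectMethod(context, getSvc,
        env->NewStringUTF("audio"));

    // String s = am.getProperty(AudioManager.PROPERTY_OUTPUT_FRAMES_PER_BUFFER);
    jclass amCls = env->GetObjectClass(audioMgr);
    jmethodID getProp = env->GetMethodID(amCls, "getProperty",
        "(Ljava/lang/String;)Ljava/lang/String;");
    auto value = (jstring)env->CallObjectMethod(audioMgr, getProp,
        env->NewStringUTF("android.media.property.OUTPUT_FRAMES_PER_BUFFER"));
    if (value == nullptr) return 0;

    const char* chars = env->GetStringUTFChars(value, nullptr);
    int frames = std::atoi(chars);
    env->ReleaseStringUTFChars(value, chars);
    return frames;
}
[/CODE]

For the numbers above: at 48 kHz, 192 frames is 4 ms per buffer, so a two-buffer queue puts the queue-side latency at around 8 ms.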

[QUOTE=robiwan;41696]The main reason I asked is that I want minimal latency. For example, a Nexus 6 has a native (ALSA) buffer size of 192 frames, so the ideal thing would be to use that buffer size with OpenSL.[/QUOTE]

Why is it that 192 seems to be the accepted buffer size (e.g. on the Pixel XL), while other phones that have problems with audio report a much higher buffer size, such as 8305 on the Samsung Galaxy S4?
