View Full Version : what prevents OpenML from efficiently providing multiple inputs

12-09-2002, 07:12 AM
OK, after six months I re-read the spec and found a real sticking point: all buffers MUST be sent at the same time for one transcoder. That is, if I want to send a buffer to a transcoder and have the result written into another buffer, I must enqueue both (input and output) in the same send().
OK, that might be fine, but:
-> if I have two input streams (i.e. a multiple-I/O transcoder), each with its own framerate, this method is inefficient: it requires send()ing the same buffer multiple times for the slower stream, if that is even possible (context-dependent compressed buffers can't be resent)
-> if I have multiple independent streams, synchronisation is ugly: the app has to know exactly when each stream has to be played, and it can't pre-buffer across a stream switch (e.g. when I want to join two video streams). Video files like QuickTime have such streams (you can have 10 video streams beginning at 0s, 10s, 1m, ...)
-> another problem with framerate changes: I can enqueue at the highest rate, but what if the output relies on some context?

The problem comes from the fact that buffer messages don't carry the required timestamp/duration options.
In a multiple-stream transcoder this would solve the problem. Example with a merger transcoder (FX):
I enqueue on pipe 1 with relative start 0, duration 0.04s (25fps)
I enqueue on pipe 2 with relative start 1s, duration 10s
I enqueue a lot of buffers on the output

The transcoder writes to the output with a duration equal to the result (in this case the min of the two input durations, i.e. 0.04s) and a computed relative start (in this case 0, then 0.04s, then 0.08s, ...).
When a buffer's duration has been consumed, the buffer is discarded.

/!\ Note that this does NOT require transcoders to be time-aware: a transcoder only has to compare the duration fields. Input buffer 2's duration is decreased by the duration of each consumed input buffer 1 until it drops below 0, at which point the buffer is dequeued.

This can also produce "slow" or "fast" output easily: you only need to modify the duration parameter.

[ December 09, 2002: Message edited by: cityhunter ]