Latency varies on start/stop of Player and Recorder

Dear all,
I am modifying the echo example provided by Google for Android.
What I am doing is:
(1) Start Play
(2) Start Record
(3) Stop Play once my data/tone has finished playing (a few thousand samples)
(4) Stop Record once all my data has been received (a few thousand samples)
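
Roughly, the sequence looks like this (a minimal sketch: playItf and recordItf are the interfaces obtained from the already-realized player and recorder objects, and the *Done flags are set by my buffer-queue callbacks; the waiting loops are simplified here):

#include <SLES/OpenSLES.h>

/* Runs steps (1)-(4) once. playDone/recordDone are set by the player's
   and recorder's buffer-queue callbacks once the last buffer is done. */
void runOnePass(SLPlayItf playItf, SLRecordItf recordItf,
                volatile int *playDone, volatile int *recordDone)
{
    (*playItf)->SetPlayState(playItf, SL_PLAYSTATE_PLAYING);           /* (1) */
    (*recordItf)->SetRecordState(recordItf, SL_RECORDSTATE_RECORDING); /* (2) */

    while (!*playDone) { }   /* wait for the tone to finish playing    */
    (*playItf)->SetPlayState(playItf, SL_PLAYSTATE_STOPPED);           /* (3) */

    while (!*recordDone) { } /* wait until all expected samples arrive */
    (*recordItf)->SetRecordState(recordItf, SL_RECORDSTATE_STOPPED);   /* (4) */
}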

I. I am recording what is being played.
II. What is being played is a short burst of white noise, prefixed and padded with 0x00s.
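
The stimulus is built roughly like this (a sketch; 16-bit mono PCM assumed, and the lengths below are illustrative, not my exact values):

#include <stdint.h>
#include <stdlib.h>
#include <string.h>

#define LEAD_ZEROS  1000               /* 0x00 prefix, in samples  */
#define NOISE_LEN    500               /* white-noise burst        */
#define TAIL_ZEROS  1000               /* 0x00 padding, in samples */
#define TOTAL_LEN  (LEAD_ZEROS + NOISE_LEN + TAIL_ZEROS)

static int16_t stimulus[TOTAL_LEN];

void buildStimulus(void)
{
    memset(stimulus, 0, sizeof stimulus);    /* zero prefix and padding */
    for (int i = 0; i < NOISE_LEN; ++i)      /* crude white noise       */
        stimulus[LEAD_ZEROS + i] = (int16_t)((rand() % 65536) - 32768);
}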

Question:
The very first time I perform the operation above, the difference in play/record sample time is very good, about 60 samples, and it consistently performs this way.
But when I repeat the operation (e.g., via a push button), the difference in record/play sample time is on the order of 1,000 to 2,000 samples.
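
For reference, I measure that offset by locating the first recorded sample whose magnitude clears a noise threshold and comparing it with the known start of the burst in the playback buffer (a sketch; the threshold value and names are just illustrative):

#include <stdint.h>
#include <stdlib.h>

/* Returns the index of the first sample whose magnitude clears the
   threshold, or -1 if the burst is not found. */
int firstOnset(const int16_t *buf, int len, int threshold)
{
    for (int i = 0; i < len; ++i)
        if (abs(buf[i]) > threshold)
            return i;
    return -1;
}

/* offsetSamples = firstOnset(recorded, recLen, 1000) - LEAD_ZEROS;
   First run: ~60 samples. Every run after that: 1,000-2,000 samples. */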

I have tried things like calling Clear on the SLAndroidSimpleBufferQueueItf of both the Player and the Recorder, but it makes no difference.
I quickly looked through the available public methods of SLAndroidSimpleBufferQueueItf, SLRecordItf, and SLObjectItf, but nothing seems to bring the “engine” back to the state it is in when first created/initialized, where I get the low latency.

Any ideas?

After more tests, I discovered that there is no “variable latency”; it is actually old data!
So where is this data coming from? I call the “Clear” method of the Buffer Queue Interface right before “starting” the recorder.
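
My start-record path looks roughly like this (a sketch: recorderBQ is the recorder's SLAndroidSimpleBufferQueueItf obtained via SL_IID_ANDROIDSIMPLEBUFFERQUEUE, and recBuf is the capture buffer I re-enqueue on each run):

#include <stdint.h>
#include <SLES/OpenSLES.h>
#include <SLES/OpenSLES_Android.h>

void startRecording(SLAndroidSimpleBufferQueueItf recorderBQ,
                    SLRecordItf recordItf,
                    int16_t *recBuf, SLuint32 recBufBytes)
{
    (*recorderBQ)->Clear(recorderBQ);            /* flush any queued buffers */
    (*recorderBQ)->Enqueue(recorderBQ, recBuf, recBufBytes);
    (*recordItf)->SetRecordState(recordItf, SL_RECORDSTATE_RECORDING);
}

Yet the first few thousand recorded samples still look like data from the previous run.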

Ideas?
Thank you.

-Saul