Hi,
I'm having a performance problem and I'd like your advice.
I'm computing a bit descriptor. The code works correctly, but the snippet below accounts for about 95% of the total computation time.
First, I have an array of 16 values:
float crown[NB_SECTOR]; // NB_SECTOR = 16
Then, I fill this array with some values read from a 2D image.
This is the step I would expect to be slow. Let's say that at this point, crown is filled.
Then I have a 16-bit descriptor that I pack into an integer.
I set each bit of this descriptor according to whether the corresponding value of crown exceeds a threshold:
int descriptor = 0;
for (int i = 0; i < NB_SECTOR; ++i) {
    if (crown[i] > threshold) {
        descriptor |= (1 << i);
    }
}
As I said, this loop takes 830 microseconds out of the 900 microseconds of total computation time.
How can this loop possibly take that long? I would expect filling crown to be far more expensive.
Thanks,
Vincent