I don't quite understand how real-time measurements are taken.
This may be a related question: https://developer.qualcomm.com/forum/qdn-forums/software/adreno-gpu-prof... Although it seems I get roughly the same (and probably correct) distribution at any frequency, just an unexpected sample count.
I'm trying to read the EGL -> FPS numbers from the device.
I set the frequency to 1 Hz and capture a ~20-second sample.
When I export the data to CSV, I see 92 values for 20 seconds, so the rate cannot actually be 1 Hz.
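To double-check, this is roughly how I estimate the effective sampling rate from the exported CSV (the column name `TimestampMs` is an assumption, adjust it to whatever your export actually contains; the inline data here is just a stand-in for the real file):

```python
import csv
import io

# Stand-in for the exported CSV: 92 samples spread over ~20 s.
# In practice, replace this with open("export.csv") on the real file.
rows = "TimestampMs\n" + "\n".join(str(int(i * 20000 / 92)) for i in range(92))

with io.StringIO(rows) as f:
    ts = [int(r["TimestampMs"]) for r in csv.DictReader(f)]

# Effective rate = intervals between samples / elapsed wall-clock time.
duration_s = (ts[-1] - ts[0]) / 1000.0
rate_hz = (len(ts) - 1) / duration_s
print(f"{len(ts)} samples over {duration_s:.1f} s -> {rate_hz:.2f} Hz")
```

With my capture this comes out near 4.6 Hz, which is why the 1 Hz setting looks wrong to me.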
What is the real measurement frequency? Does it affect precision?
Is it possible to get the exact number of frames drawn for each second?
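For now, the best I can do is bucket the samples by wall-clock second and average them. This is only an estimate of per-second FPS, not an exact frame count, since the tool reports a rate rather than raw frame events (column names `TimestampMs` and `FPS` are again assumptions about the export format):

```python
import csv
import io
from collections import defaultdict

# Fake stand-in for the export: 12 samples at 250 ms spacing.
rows = "TimestampMs,FPS\n" + "\n".join(
    f"{i * 250},{60 if i % 2 else 58}" for i in range(12)
)

# Group FPS samples by the wall-clock second they fall into.
per_second = defaultdict(list)
with io.StringIO(rows) as f:
    for r in csv.DictReader(f):
        per_second[int(r["TimestampMs"]) // 1000].append(float(r["FPS"]))

# Average the samples inside each second -- an estimate, not a frame count.
for sec, vals in sorted(per_second.items()):
    print(sec, round(sum(vals) / len(vals), 1))
```

If there is a way to get the actual frame count per second instead of averaging a sampled rate, that is what I'm after.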