Hello, this question probably belongs on the kernelnewbies list, but I think I will get a more accurate answer here.
I am doing some optimization in kernel video driver code to reduce the latency between the time a buffer is submitted and the time it is displayed.

Userspace:   t1 = gettimeofday()
Kernelspace: t2 = do_gettimeofday()

Using debugfs I copy t2 to userspace and compute the difference: latency = t2 - t1.

Would this be accurate? I know there is a context switch involved, among other things, but this is the method I came up with. Is there a better way?

System: ARM (Android), kernel 3.10
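For reference, here is roughly what I have in mind. This is only a minimal sketch; names such as video_latency, display_ts_us and record_display_time() are placeholders I made up, not the actual driver code.

Kernel side (3.10): record t2 at the point the buffer actually hits the display and expose it through debugfs:

    #include <linux/debugfs.h>
    #include <linux/time.h>

    static u64 display_ts_us;        /* t2 in microseconds */
    static struct dentry *dbg_dir;

    /* called from the driver when the buffer is displayed */
    static void record_display_time(void)
    {
            struct timeval tv;

            do_gettimeofday(&tv);
            display_ts_us = (u64)tv.tv_sec * 1000000ULL + tv.tv_usec;
    }

    static int latency_dbg_init(void)
    {
            dbg_dir = debugfs_create_dir("video_latency", NULL);
            debugfs_create_u64("display_ts_us", 0444, dbg_dir, &display_ts_us);
            return 0;
    }

Userspace side: take t1 just before queueing the buffer, then read t2 back from debugfs and subtract:

    #include <stdio.h>
    #include <inttypes.h>
    #include <sys/time.h>

    int main(void)
    {
            struct timeval tv;
            uint64_t t1, t2 = 0;
            FILE *f;

            gettimeofday(&tv, NULL);
            t1 = (uint64_t)tv.tv_sec * 1000000ULL + tv.tv_usec;

            /* ... queue the buffer and wait for it to be displayed ... */

            f = fopen("/sys/kernel/debug/video_latency/display_ts_us", "r");
            if (!f)
                    return 1;
            if (fscanf(f, "%" SCNu64, &t2) != 1)
                    t2 = 0;
            fclose(f);

            printf("latency: %" PRIu64 " us\n", t2 - t1);
            return 0;
    }

(I am also not sure whether a monotonic clock, e.g. clock_gettime(CLOCK_MONOTONIC) in userspace and ktime_get() in the kernel, would be more appropriate than wall-clock time here, since gettimeofday() can jump while the buffer is in flight.)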

