I have noticed an interesting bug when using TextureView with the Camera. If I set up a TextureView, give it a valid OpenGL texture, and pass that texture to the Camera, everything works great. I can then swap in a simple pixel shader and everything still works fine. But if I swap in a more complex pixel shader, the framerate drops and stuttering appears.
What seems to be happening is that a frame is getting processed out of order. After some hypothesizing, I wondered whether the hardware was being given too much work. After trying a number of things, I found that if I simply slept the GL processing thread for around 75 milliseconds, the framerate jumped back up and the stuttering disappeared. This is an okay hack for now, but I worry that the value is just an arbitrary guess based on my shader, and that it may change with other shaders or on other phones.

My question is: is it possible to know when it is safe to send another frame down to OpenGL without over-taxing the system? (Or is something else going on here?)

I have tried using glFinish and glFlush, and neither helped. I wondered if that was because the camera uses samplerExternalOES, and that somehow affects it?

Note: if I run the same OpenGL code using a SurfaceView instead, I don't get any stuttering. Of course the performance isn't as good, so that may be the reason.

Any help is appreciated.

Thanks,
Brad
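P.S. For concreteness, the hack boils down to something like the sketch below. This is a simplified, non-GL version (the class and method names are made up, and the real loop also updates the SurfaceTexture and issues the draw calls); it sleeps only for whatever remains of a minimum per-frame budget rather than unconditionally, but the 75 ms budget itself is still the arbitrary guess I would like to replace with something principled:

```java
/**
 * Simplified sketch of the sleep hack on the GL processing thread.
 * Sleeps for whatever remains of a minimum per-frame budget, so fast
 * frames are throttled but already-slow frames are not penalized twice.
 * The budget (75 ms in my case) is still an arbitrary, per-device guess.
 */
public class FrameThrottle {
    private final long minFrameMs;
    private long lastFrameEndNanos;

    public FrameThrottle(long minFrameMs) {
        this.minFrameMs = minFrameMs;
        this.lastFrameEndNanos = System.nanoTime();
    }

    /** Call once per frame, before submitting the next frame's GL work. */
    public void awaitNextFrame() throws InterruptedException {
        long elapsedMs = (System.nanoTime() - lastFrameEndNanos) / 1_000_000L;
        long remainingMs = minFrameMs - elapsedMs;
        if (remainingMs > 0) {
            Thread.sleep(remainingMs); // the arbitrary throttle
        }
        lastFrameEndNanos = System.nanoTime();
    }
}
```

In the real code this is called at the top of the render loop, right before updating the camera texture and drawing.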