Hi all!

I'm working on a project involving the transmission and visualization of
equirectangular panoramic video at very low latency. The video is played
in a browser and viewed on an Oculus Rift.

Streaming and visualization are now working well. However, we want to
improve the results by sending in high quality only the region of the
image that the user is currently viewing in the Oculus. We already know
which zone needs to be sent in high quality and which zones can be sent
in low quality to reduce bandwidth.

Is this possible with the pipeline described above? The zones of the
video that we want in high quality may change over time. Can this be
parametrized?

Thanks all, and best regards.
_______________________________________________
ffmpeg-user mailing list
ffmpeg-user@ffmpeg.org
http://ffmpeg.org/mailman/listinfo/ffmpeg-user