Hi all,

    I capture a realtime video stream (with video and audio frames) from a
device (such as a webcam) and decode it into YUV for video and PCM for audio
using the device's playback SDK. I then want to encode this intermediate
uncompressed data and transfer the encoded data via socket to another
server (I CANNOT transfer the uncompressed data directly over the socket
because the raw data is far too large). Can somebody tell me how to encode
these two kinds of uncompressed data while maintaining synchronization
between video and audio? I think the two encoded streams should be muxed
into the same container (such as an AVI container), but so far I have no
idea how to do that. Can someone show me how?
    Thanks in advance!
_______________________________________________
libav-user mailing list
[email protected]
https://lists.mplayerhq.hu/mailman/listinfo/libav-user