Hi all,

I sent this email to ffmpeg-user but just realized that was not the right
mailing list for this question. Sorry for the duplicate if you have already seen it.

I have a program on the server side that generates a series of JPEG files,
and I want to play these files in the client's browser as a video stream at a
desired frame rate (the video should be playing while new JPEG files are
still being generated). I also have a WAV file on hand, and I want it to play
on the client side while the video is streaming.

Is there any way to do this? I have done plenty of research but can't find a
satisfactory solution -- every solution I found requires all of the image
files to be ready before the video is generated. However, I need to play the
video while the images are still coming in.

I understand that ffserver lets me stream the video to the client side if
ffmpeg can provide a "live feed." So my question reduces to: how do I use the
ffmpeg libraries to generate video while the images are coming in?
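
For example, I imagine something along these lines (just a rough sketch;
"generate_jpegs" stands for my image-generating program, "sound.wav" is my
WAV file, and the feed name feed1.ffm is only assumed from the ffserver
examples):

    # pipe JPEG frames into ffmpeg as they are produced and push to an ffserver feed
    generate_jpegs | ffmpeg -f image2pipe -vcodec mjpeg -r 25 -i - \
        -i sound.wav http://localhost:8090/feed1.ffm

Here -f image2pipe should let ffmpeg read JPEG frames from stdin as they
arrive, -r 25 sets the input frame rate, and the URL points at an ffserver
feed (as far as I understand, the output encoding parameters then come from
ffserver.conf). Is this roughly the right direction, or should I be using the
libav* APIs directly?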

I am very new to this area, so a detailed explanation would be greatly
appreciated. Thank you so much!

Jieyun
