Hi all,

I assume this is the right mailing list, since it seems to be the more
programming/API-oriented one. If I am wrong, I apologize in advance; please
let me know the right place to post.

I am looking for an API or example for generating an ffm feed from frame
images. I was able to stream an existing video to ffserver from the command
line and play it in the browser, which is awesome, and only one step away
from what I am looking for.

I have a program that generates a series of JPEG files, and I want to play
these files in the client browser as a video stream (the video should be
playing while new JPEG files are still being generated). I also have a wav
file on hand that I want to play along with the video.

I was also able to programmatically generate video files based on the
examples in the ffmpeg source code, and I know how to stream an existing
video to ffserver via an ffm feed from the command line. However, I wonder
whether there is a way to fully automate this: I cannot wait until a video
is completely finished before sending it to the server with a command-line
call. Ideally I would generate the ffm feed directly from the images, but
unfortunately I couldn't find an example of doing that.
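In case it clarifies what I am after, here is a rough sketch of the kind of
automation I am imagining: piping each JPEG into a long-running ffmpeg
process (image2pipe input) that pushes to the feed, instead of waiting for a
finished file. The feed URL, frame rate, and wav path are just placeholders
from my experiments, and I have not verified this end to end against
ffserver:

```python
import subprocess

def ffmpeg_feed_cmd(feed_url, wav_path=None, fps=25):
    """Build the ffmpeg argv that reads JPEG frames from stdin (image2pipe)
    and pushes them to an ffserver feed URL.  The URL, fps, and wav path
    are placeholders for my local setup."""
    cmd = ["ffmpeg",
           "-f", "image2pipe",          # frames arrive on stdin, back to back
           "-framerate", str(fps),
           "-c:v", "mjpeg",             # the incoming frames are JPEG-encoded
           "-i", "-"]                   # "-" means read from stdin
    if wav_path is not None:
        cmd += ["-i", wav_path]         # second input: the wav soundtrack
    cmd.append(feed_url)                # e.g. http://localhost:8090/feed1.ffm
    return cmd

def stream_frames(frames, feed_url, wav_path=None):
    """Spawn ffmpeg and write each JPEG's bytes to its stdin as it is
    produced, so streaming starts before all frames exist."""
    proc = subprocess.Popen(ffmpeg_feed_cmd(feed_url, wav_path),
                            stdin=subprocess.PIPE)
    for jpeg_bytes in frames:           # any iterable of JPEG byte strings
        proc.stdin.write(jpeg_bytes)
    proc.stdin.close()
    proc.wait()
```

Is something like this a reasonable direction, or is there a cleaner way to
do the same thing directly through the libav* APIs?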

Thanks so much in advance. I am quite new to ffmpeg and multimedia
technology, so any pointers would be deeply appreciated.

Thanks,
Jieyun
_______________________________________________
libav-user mailing list
[email protected]
https://lists.mplayerhq.hu/mailman/listinfo/libav-user