Hello! I have written a C++ class that grabs frames from the color buffer of a running 3D render engine and encodes them into an .mpg file. This works well, but now I have to extend the class so that the video can be streamed over the network instead of being written to the hard drive. Unfortunately, the streaming does not work.
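For context, the per-frame path of the existing (working) file output looks roughly like the sketch below. This is not my literal code: the RGB24 readback format, the fixed encode buffer size, and the name write_video_frame are assumptions for illustration, and error handling is omitted.

// Sketch of the per-frame path (old libav* API). write_video_frame() is
// called once per rendered frame with the pixels grabbed from the color
// buffer; the RGB24 input format and the buffer sizes are assumptions.
#define __STDC_CONSTANT_MACROS   // often needed when using the C headers from C++
extern "C" {
#include <libavformat/avformat.h>
#include <libswscale/swscale.h>
}

void write_video_frame(AVFormatContext *oc, AVStream *st,
                       const uint8_t *rgb, int src_w, int src_h)
{
    AVCodecContext *c = st->codec;

    // Convert the grabbed RGB frame to the encoder's pixel format.
    static SwsContext *sws = sws_getContext(src_w, src_h, PIX_FMT_RGB24,
                                            c->width, c->height, c->pix_fmt,
                                            SWS_BICUBIC, NULL, NULL, NULL);
    static AVFrame *picture = avcodec_alloc_frame();
    static uint8_t *pic_buf = (uint8_t *)av_malloc(
        avpicture_get_size(c->pix_fmt, c->width, c->height));
    avpicture_fill((AVPicture *)picture, pic_buf, c->pix_fmt, c->width, c->height);

    uint8_t *src[1]   = { (uint8_t *)rgb };
    int src_stride[1] = { 3 * src_w };
    sws_scale(sws, src, src_stride, 0, src_h, picture->data, picture->linesize);

    // Encode the frame and hand the packet to the muxer. This part is the
    // same whether the output is a file or a feed.
    static uint8_t outbuf[200000];
    int size = avcodec_encode_video(c, outbuf, sizeof(outbuf), picture);
    if (size > 0) {
        AVPacket pkt;
        av_init_packet(&pkt);
        if (c->coded_frame->pts != AV_NOPTS_VALUE)
            pkt.pts = av_rescale_q(c->coded_frame->pts, c->time_base, st->time_base);
        if (c->coded_frame->key_frame)
            pkt.flags |= PKT_FLAG_KEY;
        pkt.stream_index = st->index;
        pkt.data = outbuf;
        pkt.size = size;
        av_write_frame(oc, &pkt);
    }
}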
I am going to give you a brief description of what I am doing; please tell me whether my approach is generally OK or whether the idea is wrong:

1) I start ffserver (my ffserver.conf has one feed, feed1.ffm, and a few streams such as test1.mpg, test1.avi etc. Nothing fancy).

2) I create a feed (http://localhost:8090/feed1.ffm) with libavformat and libavcodec. I do it like this: I create an AVFormatContext, initialize the AVOutputFormat with guess_format("ffm", "", NULL), add a stream, configure the stream (setting CODEC_TYPE_VIDEO, the width, the height, and so on), find and open an encoder, and finally open the feed with url_fopen(), passing http://localhost:8090/feed1.ffm as the filename (a stripped-down sketch of this setup follows below). I use exactly the same code for writing the .mpg to the hard drive, so I assume the code cannot be completely wrong; I just don't know whether this approach works for streaming.

3) I try to connect to the stream with commands like vlc http://localhost:8090/test1.mpg. It does not look too bad; ffserver, for example, responds with status code 200 (OK). But all the players then print errors like "Connection refused" or similar.

So my questions are: Is my approach right? If it is, what could be the problem? And if my approach is completely wrong, how is streaming done with libavformat and libavcodec? I am really lost, and any help is greatly appreciated!
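To make step 2 concrete, here is a stripped-down sketch of the feed setup. This is not my literal code: the codec (CODEC_ID_MPEG1VIDEO), bit rate, frame rate and GOP size are placeholder values, error handling is omitted, and it assumes the same extern "C" includes as the sketch above plus <cstdio> for snprintf.

// Sketch of step 2: open http://localhost:8090/feed1.ffm as an "ffm" output
// (old guess_format()/url_fopen() API). The values below are placeholders.
AVFormatContext *open_feed(const char *url, int width, int height)
{
    av_register_all();

    AVOutputFormat *fmt = guess_format("ffm", "", NULL);   // the feed format
    AVFormatContext *oc = avformat_alloc_context();
    oc->oformat = fmt;
    snprintf(oc->filename, sizeof(oc->filename), "%s", url);

    // Add and configure the video stream.
    AVStream *st = av_new_stream(oc, 0);
    AVCodecContext *c = st->codec;
    c->codec_type    = CODEC_TYPE_VIDEO;
    c->codec_id      = CODEC_ID_MPEG1VIDEO;   // placeholder codec
    c->bit_rate      = 400000;
    c->width         = width;
    c->height        = height;
    c->time_base.num = 1;
    c->time_base.den = 25;
    c->gop_size      = 12;
    c->pix_fmt       = PIX_FMT_YUV420P;

    // The old API wants av_set_parameters() before av_write_header().
    av_set_parameters(oc, NULL);

    // Find and open the encoder.
    AVCodec *codec = avcodec_find_encoder(c->codec_id);
    avcodec_open(c, codec);

    // Open the feed URL for writing and emit the header.
    url_fopen(&oc->pb, url, URL_WRONLY);
    av_write_header(oc);
    return oc;
}

The only differences from my file-writing path are the "ffm" output format and the http:// URL passed to url_fopen(); the stream setup and the per-frame code are identical to what I use for the .mpg on disk.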
best regards,
flo