Hello to all.

In my application I need to serve a live video stream (it doesn't come
from a file; it is generated on demand). The "getFrame()" part is already
implemented. My problem now is sending the video (video only, no audio)
over the network. It's basically the Comet technique.

Let me explain:

As the HTTP(S) library I am using GNU libmicrohttpd. With it I can
register a callback that sends data to the client transparently.

For example, to send the string "Hello World" to the client indefinitely
(and that is what I need), I create a callback like this:

static ssize_t video_generator(void* cls, uint64_t pos, char* buf, size_t max)
{
  snprintf(buf, max, "Hello World\n"); /* stay within the max bytes allowed */
  return strlen(buf); /* return the number of bytes placed in buf */
}

So, when the server receives the request, the client will keep receiving,
for as long as it stays connected:
Hello World
Hello World
Hello World
Hello World
Hello World
(...)
Hello World
Hello World
(...)

But that is libmicrohttpd, not ffmpeg. Now I need to send a webm
(Matroska + VP8, no audio) stream to the client instead of strings.

If my video source were a file, I could do something like this (at a high level):
static ssize_t video_generator(void* cls, uint64_t pos, char* buf, size_t max)
{
  frame = getEncodedFrameOnDemand();
  copy_frame_into_buffer(frame, buf); /* at most max bytes */
  return length_of_buffer(buf);
}

But how can I do that here? I can't send a bare frame outside of a
container, because webm has a header, etc.

Is this possible with ffmpeg? If so, how? Does anyone on the mailing
list have experience with live streaming over HTTP? Is it possible to
create an AVFormatContext that has no associated file and that doesn't
store all the frames in memory?
_______________________________________________
libav-user mailing list
[email protected]
https://lists.mplayerhq.hu/mailman/listinfo/libav-user
