It's great to see that rtpengine now supports transcoding.  I watched 
rtpproxy for many years, waiting to see when that might happen.

I have some questions.  I could not find on ffmpeg.org specific mention of:

  -advanced audio jitter buffer capability
  -support for advanced RTP-related RFCs, e.g. RFC 8108
  -advanced codec capability, e.g. G.711 Appendix II (DTX and CNG)

In particular, it's not clear to me where the jitter buffer and these RTP 
RFCs are handled -- inside rtpengine or inside ffmpeg.  I don't see any 
mention of a jitter buffer on the rtpengine website or GitHub pages.  In 
general, ffmpeg's focus is on content delivery rather than bidirectional 
real-time communication.  My questions:

1) Is there an rtpengine-ffmpeg software architecture or data flow diagram 
available?

2) Is it possible to connect libraries other than ffmpeg to the "other side" 
of rtpengine?  For example, using the rtpengine interface, send and receive 
a UDP/RTP packet stream to/from a third-party library and let it handle the 
jitter buffer, encoding/decoding, ptime mismatches (transrating), and more?  
(The first sketch after question 3 shows roughly what I mean.)

3) If that's architecturally doable, is there a spec on how rtpengine 
currently interfaces with ffmpeg?  Which APIs are being used?  (I assume the 
command-line interface is not being used.)  Regarding ffmpeg APIs, I found 
this:

  http://dranger.com/ffmpeg/

but maybe there is something more recent, or rtpengine source code we can 
look at.  (The second sketch below shows the newer libavcodec API I have in 
mind.)
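
To make question 2 concrete, here is a rough, purely illustrative sketch of 
the kind of component I have in mind: a stand-alone process that receives an 
RTP stream over plain UDP and hands each payload to a third-party media 
library.  third_party_decode() is hypothetical -- a placeholder for whatever 
library would handle jitter buffering, decoding, transrating, and so on:

  /*
   * Hypothetical stand-alone "media peer": receives RTP over UDP on port
   * 5004, parses the fixed RTP header (RFC 3550), and hands the payload to
   * a stubbed third-party decoder.
   */
  #include <stdio.h>
  #include <stdint.h>
  #include <arpa/inet.h>
  #include <netinet/in.h>
  #include <sys/socket.h>

  /* Placeholder for the third-party library I'm asking about. */
  static void third_party_decode(uint8_t pt, uint16_t seq, uint32_t ts,
                                 const uint8_t *payload, size_t len)
  {
      printf("pt=%u seq=%u ts=%lu payload=%zu bytes\n",
             (unsigned)pt, (unsigned)seq, (unsigned long)ts, len);
  }

  int main(void)
  {
      int sock = socket(AF_INET, SOCK_DGRAM, 0);
      struct sockaddr_in addr = { .sin_family = AF_INET,
                                  .sin_addr.s_addr = htonl(INADDR_ANY),
                                  .sin_port = htons(5004) };
      if (sock < 0 || bind(sock, (struct sockaddr *)&addr, sizeof(addr)) < 0)
          return 1;

      uint8_t buf[1500];
      for (;;) {
          ssize_t n = recv(sock, buf, sizeof(buf), 0);
          if (n < 12)                              /* minimum RTP header */
              continue;
          size_t hdr = 12 + 4 * (buf[0] & 0x0f);   /* skip CSRC list */
          if ((size_t)n <= hdr)
              continue;
          uint8_t  pt  = buf[1] & 0x7f;
          uint16_t seq = ntohs(*(uint16_t *)(buf + 2));
          uint32_t ts  = ntohl(*(uint32_t *)(buf + 4));
          third_party_decode(pt, seq, ts, buf + hdr, (size_t)n - hdr);
      }
  }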
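
And for question 3, I'm not claiming this is how rtpengine is wired 
internally -- it's just a sketch of the current libavcodec decode API 
(avcodec_send_packet()/avcodec_receive_frame(), which replaced the calls 
shown in the dranger tutorial), in case that is the layer rtpengine talks 
to.  The function name and structure are my own; it decodes a single G.711 
A-law payload:

  /*
   * Sketch only: decode one G.711 A-law payload with the newer libavcodec
   * send/receive API.  Error handling trimmed; the ch_layout field assumes
   * ffmpeg >= 5.1.  Returns the number of decoded samples, or -1 on error.
   */
  #include <libavcodec/avcodec.h>

  int decode_pcma_once(const uint8_t *payload, int len)
  {
      const AVCodec *codec = avcodec_find_decoder(AV_CODEC_ID_PCM_ALAW);
      AVCodecContext *ctx = avcodec_alloc_context3(codec);
      ctx->sample_rate = 8000;
      av_channel_layout_default(&ctx->ch_layout, 1);   /* mono */
      if (avcodec_open2(ctx, codec, NULL) < 0)
          return -1;

      AVPacket *pkt  = av_packet_alloc();
      AVFrame *frame = av_frame_alloc();
      pkt->data = (uint8_t *)payload;                  /* not modified */
      pkt->size = len;

      int samples = -1;
      /* This pair replaced avcodec_decode_audio4() from older tutorials. */
      if (avcodec_send_packet(ctx, pkt) == 0 &&
          avcodec_receive_frame(ctx, frame) == 0)
          samples = frame->nb_samples;                 /* PCM in frame->data[0] */

      av_frame_free(&frame);
      av_packet_free(&pkt);
      avcodec_free_context(&ctx);
      return samples;
  }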

Thanks.

-Jeff

