On 2020-08-26 03:29, Jesper Ek wrote:

…I have implemented a libavdevice to capture audio and video from an AJA
Kona card (similar to decklink). I'm then using the HLS muxer to encode my
stream. Now I need the HLS EXT-X-PROGRAM-DATE-TIME tag to specify exactly
when the input source was recorded, so my plan is to forward the "wall
clock time" (from the avdevice implementation) to the hls muxer, and use
that instead of using system time (av_gettime()).

I have a client who would like a wall-clock timestamp, accurate to the millisecond, of precisely when each frame was recorded. What you are describing sounds interesting to me. Maybe my client can use it.


…I can't use PTS, so what is the correct way to pass "wall clock time"
metadata for every frame to the HLS encoder?
metadata for every frame to the HLS encoder?

It seems to me that you might be able to use Presentation Time Stamp (PTS). What is the obstacle to using PTS?

I believe that PTS values in the FFmpeg codebase have type `int64_t` (signed 64-bit integer), and that the MPEG Transport Stream format stores presentation time stamps as unsigned 33-bit values counting ticks of a 90 kHz clock, the same base used by the Program Clock Reference[1]. If you multiply 90,000 ticks/sec * 60 secs/min * 60 mins/hr * 24 hrs/day, there are 7,776,000,000 ticks in 24 hours. That fits in a 33-bit unsigned int, and it certainly fits in a signed 64-bit int. So it seems like your libavdevice could assign PTS values which correspond to the time of day when each frame was captured. There is a fighting chance that this value would survive recording in an MPEG transport stream, and processing by FFmpeg. (But I've not actually done this myself, so I don't know.)
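To make that concrete, here is a minimal, untested sketch of that arithmetic, assuming the capture code uses av_gettime() (microseconds since the Unix epoch) as its wall-clock source; the function name and the choice to wrap at 24 hours are mine, not existing FFmpeg API:

    #include <stdint.h>
    #include <libavutil/time.h>         /* av_gettime() */
    #include <libavutil/mathematics.h>  /* av_rescale() */

    /* 90,000 ticks/sec * 86,400 sec/day = 7,776,000,000 ticks per day */
    #define TICKS_PER_DAY (90000LL * 60 * 60 * 24)

    /* Hypothetical helper: map the current wall-clock time to a
     * time-of-day PTS counted in 90 kHz ticks; the result fits in
     * an unsigned 33-bit value. */
    static int64_t wallclock_to_90khz_pts(void)
    {
        int64_t us    = av_gettime();                    /* microseconds since the epoch (UTC) */
        int64_t ticks = av_rescale(us, 90000, 1000000);  /* microseconds -> 90 kHz ticks */
        return ticks % TICKS_PER_DAY;                    /* keep only the time of day */
    }

If the output stream's time_base were not {1, 90000}, the value would need rescaling (for example with av_rescale_q()) before being assigned to pkt->pts.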


…I noticed that the decklink implementation uses
`av_packet_add_side_data(&pkt, AV_PKT_DATA_STRINGS_METADATA, ...)` to add
timecode information, is this the correct approach?


Earlier this year, there were patches[2] which aimed to put linear timecodes, in a format related to SMPTE 12M, into a structure marked by an `AV_FRAME_DATA_S12M_TIMECODE` value. Maybe that is a place where you could store timecode values. You might want to search for that identifier, and for patches by that developer. I don't know whether they have been merged into the FFmpeg source yet.
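In case it helps to see the shape of that side data, here is a rough, untested sketch of how a capture device might attach one such timecode to an AVFrame, assuming AV_FRAME_DATA_S12M_TIMECODE exists in your FFmpeg version. The 25 fps rate, the zero start frame, and the helper name are example choices of mine, and the four-uint32_t layout is what I understand the libavutil/frame.h documentation to describe:

    #include <libavutil/error.h>     /* AVERROR() */
    #include <libavutil/frame.h>     /* av_frame_new_side_data() */
    #include <libavutil/timecode.h>  /* AVTimecode helpers */

    /* Hypothetical helper: attach an SMPTE 12M timecode for frame number
     * `framenum` as AV_FRAME_DATA_S12M_TIMECODE side data on `frame`. */
    static int attach_s12m_timecode(AVFrame *frame, int framenum)
    {
        AVTimecode tc;
        AVFrameSideData *sd;
        uint32_t *data;
        int ret;

        /* Example values: 25 fps, no drop-frame, timecode starts at frame 0 */
        ret = av_timecode_init(&tc, (AVRational){25, 1}, 0, 0, NULL);
        if (ret < 0)
            return ret;

        sd = av_frame_new_side_data(frame, AV_FRAME_DATA_S12M_TIMECODE,
                                    sizeof(uint32_t) * 4);
        if (!sd)
            return AVERROR(ENOMEM);

        data    = (uint32_t *)sd->data;
        data[0] = 1;  /* how many of the following timecodes are used (1-3) */
        data[1] = av_timecode_get_smpte_from_framenum(&tc, framenum);
        return 0;
    }

Whether the hls muxer, or anything else downstream, actually consumes that side data is a separate question that I can't answer.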

SMPTE 12M refers to a linear timecode format[3]. I understand that this standard has been split into three documents, which are now named:

ST 12-1:2014 - SMPTE Standard - Time and Control Code
ST 12-2:2014 - SMPTE Standard - Transmission of Time Code in the Ancillary Data Space
ST 12-3:2016 - SMPTE Standard - Time Code for High Frame Rate Signals and Formatting in the Ancillary Data Space

They are available for free download from the SMPTE web site[4] until 31 Dec 2020 (extended from 21 July 2020). You might want to get them now, for future reference.

Please do keep this list updated on what you do. I, for one, would like to take a look at it.

Best regards,
     —Jim DeLaHunt, software engineer, Vancouver, Canada

[1] https://en.wikipedia.org/wiki/MPEG_transport_stream#PCR
[2] http://ffmpeg.org/pipermail/ffmpeg-devel/2020-July/265628.html
[3] https://en.wikipedia.org/wiki/SMPTE_timecode
[4] https://www.smpte.org/free-standards-and-publications


