Thanks first, I'll try.
-----Original Message-----
From: libav-user-boun...@ffmpeg.org [mailto:libav-user-boun...@ffmpeg.org] On Behalf Of
Roger Pack
Sent: 13 November 2012 13:07
To: This list is about using libavcodec, libavformat, libavutil, libavdevice
and libavfilter.
Subject: Re: [Libav-user] How can we make two NTP-sync
Qt has a multimedia framework named "Phonon" that may be helpful.
-----Original Message-----
From: libav-user-boun...@ffmpeg.org [mailto:libav-user-boun...@ffmpeg.org]
On Behalf Of Tom Isaacson
Sent: 13 November 2012 13:13
To: Libav-user@ffmpeg.org
Subject: [Libav-user] Displaying video in Windows
I've looked at the source for ffplay
The easiest native approach:
Decoding part:
Read thread: open the file, read packets, and push them to shared audio and
video queues;
Audio thread: read packets from the audio queue, decode, and push them to the
audio playback queue (in PCM format);
Video thread: read packets from the video queue, decode, convert from YUV
I probably should have mentioned that the source is an RTSP server and it's
only video, so audio/video synchronisation is not an issue. What I want to do
is write a Windows 8 Metro app that displays this video stream. Unfortunately
Microsoft doesn't support RTSP (see
http://stackoverflow.com/ques
On 11/13/2012 05:52 AM, Tom Isaacson wrote:
> I probably should have mentioned that the source is an RTSP server and it's
> only video, so audio/video synchronisation is not an issue. What I want to
> do is write a Windows 8 Metro app that displays this video stream.
> Unfortunately
Hi,
I will use dshow to capture a live stream, then save individual frames to a
buffer. I plan to write the frames from the buffer to an AVI file in C++.
How can I save frames to an AVI file?
Thanks!
___
Libav-user mailing list
Libav-user@ffmpeg.org
http
Take a look at the muxing.c example, but instead of the source coming from a
generated frame as in fill_yuv_image(), use the frames coming from your buffer.
Use the same idea for the audio stream.
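A hedged sketch of that substitution, assuming muxing.c's scaffolding
(OutputStream with an allocated ost->frame in the encoder's pixel format);
my_buffer_get_frame() is a hypothetical accessor for your capture buffer, not
an FFmpeg function:

```c
/* Replacement for muxing.c's get_video_frame(): instead of synthesizing a
 * test pattern with fill_yuv_image(), copy the captured picture from your
 * buffer. my_buffer_get_frame() is hypothetical -- substitute however your
 * dshow capture hands you pixel data. */
static AVFrame *get_video_frame(OutputStream *ost)
{
    const uint8_t *src = my_buffer_get_frame();   /* your captured frame */
    if (!src)
        return NULL;                              /* no more frames */

    if (av_frame_make_writable(ost->frame) < 0)
        exit(1);

    /* Assumes the buffer already matches the encoder's pixel format and
     * stride; if it does not (e.g. RGB24 from dshow vs. YUV420P for the
     * encoder), convert with sws_scale() here instead of copying. */
    memcpy(ost->frame->data[0], src,
           (size_t)ost->frame->linesize[0] * ost->frame->height);

    ost->frame->pts = ost->next_pts++;
    return ost->frame;
}
```

The rest of muxing.c (stream setup, encoding, interleaved writing) can stay
as-is.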
-----Original Message-----
From: libav-user-boun...@ffmpeg.org [mailto:libav-user-boun...@ffmpeg.org] On
Behalf
On date Monday 2012-11-12 15:00:22 +0200, Wenpeng Zhou wrote:
> Hi,
>
> I use a USB camera to capture streams. I hope to save streams to an AVI
> file.
Check dshow/video4linux2 devices in libavdevice (doc/indevs.texi, or
the ffmpeg manual). There is nothing special about an input device,
you trea
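As a minimal sketch of opening such a device (the device name "video=USB
Camera" is a placeholder; list your real devices with
`ffmpeg -list_devices true -f dshow -i dummy`):

```c
#include <libavdevice/avdevice.h>
#include <libavformat/avformat.h>

/* Open a dshow capture device exactly like any other input: after this,
 * av_read_frame() on *fmt_ctx yields packets as it would for a file. */
int open_camera(AVFormatContext **fmt_ctx)
{
    int ret;

    avdevice_register_all();   /* registers dshow, video4linux2, etc. */

    const AVInputFormat *ifmt = av_find_input_format("dshow");
    if (!ifmt)
        return AVERROR_DEMUXER_NOT_FOUND;

    /* "video=USB Camera" is a placeholder device name. */
    ret = avformat_open_input(fmt_ctx, "video=USB Camera", ifmt, NULL);
    if (ret < 0)
        return ret;

    return avformat_find_stream_info(*fmt_ctx, NULL);
}
```

On Linux you would pass "video4linux2" and a path like "/dev/video0" instead;
the rest of the code is unchanged, which is the point being made above.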
On date Monday 2012-11-12 14:50:24 -0600, Ron Woods wrote:
> I am trying out the filtering_audio.c example provided with the ffmpeg
> libraries for Windows to extract the audio from a MP4 file, resample to 8 KHz
> and convert from stereo to mono. The example pipes the audio output via
> stdout t
I have tried it on AVI and MOV input files as well -- same result; so, no, it
doesn't seem to depend on the input file.
I am using the same filter technique in another context to transcode audio
from an input movie to a WAV output file, and it works fine there.
Maybe it is something to do with the piping to ffp
On date Tuesday 2012-11-13 16:26:13 -0600, Ron Woods wrote:
> I have tried it on AVI or MOV input file as well -- same result; so, no, it
> doesn't seem to depend on input file.
> I am using same filter technique in another context to transcode audio from
> input movie to WAV output file, and it
Per your recommendation, I tried the following:
filtering_audio test.mp4 > test_audio
ffplay -f s16le -ar 8000 -ac 1 test_audio
but with no better luck than with:
filtering_audio test.mp4 | ffplay -f s16le -ar 8000 -ac 1 -
Where do I search the "trac tickets"?
-Original Mess
Hi, I'm very new to this, so please bear with me if the question is stupid.
In my app there will be more than one decoder instance decoding multiple
files at the same time, so there will be more than one place where I need to
open and do all the initialization before opening video files
an
Hi guys,
I'm trying to use the Matroska container format for H.264 video and AAC audio.
When I run my program, I get the warning 'Codec for stream 0 does not use
global headers but container format requires global headers'.
Does anyone know how to solve this? Thanks
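A common fix, sketched under the assumption that `oc` is your output
AVFormatContext and `enc` the encoder context you are about to open (names
illustrative): when the container requires global headers, set the
global-header flag on the encoder before opening it, so the codec writes its
parameter sets into extradata instead of in-band. The flag was spelled
CODEC_FLAG_GLOBAL_HEADER in 2012-era FFmpeg and is AV_CODEC_FLAG_GLOBAL_HEADER
in current releases.

```c
/* "oc" and "enc" are illustrative names for your AVFormatContext and
 * AVCodecContext. Must run before avcodec_open2(). */
if (oc->oformat->flags & AVFMT_GLOBALHEADER)
    enc->flags |= AV_CODEC_FLAG_GLOBAL_HEADER;
```

With the flag set, the H.264 SPS/PPS and AAC config land in the stream's
extradata, which is what the Matroska muxer expects.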
On tridi, 23 Brumaire, year CCXXI, Tom Isaacson wrote:
> I've looked at the source for ffplay and can see that the video is
> displayed using SDL. But as far as I can tell this doesn't actually use a
> real window in Windows, it just creates a space on the screen then watches
> for mouse events in
Hi,
I'm using ffmpeg to retrieve video and data from a camera.
The camera sends video over RTSP and data over ONVIF.
I call "avformat_open_input" -> "avformat_find_stream_info" and get 2
streams from the camera, and then I just use "av_read_frame" to do what I
want with these 2 streams
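The flow described above, as a minimal sketch (the URL is a placeholder,
handle_video()/handle_data() are hypothetical handlers, and the assumption
that stream 0 is the video stream should really be checked against
codecpar->codec_type):

```c
#include <libavformat/avformat.h>

/* Open the RTSP source, probe its streams, then dispatch each packet by
 * stream index. Error handling is abbreviated for brevity. */
static int read_camera(const char *url)
{
    AVFormatContext *fmt = NULL;
    AVPacket pkt;
    int ret;

    if ((ret = avformat_open_input(&fmt, url, NULL, NULL)) < 0)
        return ret;
    if ((ret = avformat_find_stream_info(fmt, NULL)) < 0)
        goto done;

    /* fmt->nb_streams is 2 for this camera: video plus a data stream. */
    while (av_read_frame(fmt, &pkt) >= 0) {
        if (pkt.stream_index == 0)
            handle_video(&pkt);   /* hypothetical video handler */
        else
            handle_data(&pkt);    /* hypothetical ONVIF/data handler */
        av_packet_unref(&pkt);
    }
done:
    avformat_close_input(&fmt);
    return ret;
}
```

av_find_best_stream() with AVMEDIA_TYPE_VIDEO is the robust way to locate the
video stream index rather than hard-coding 0.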
Hello,
(sorry if this arrives twice - the previous copy was sent from the wrong address)
I am seeing a problem with libavcodec's H.264 decoder, which I need some
help with.
I was using the codec to decode H.264 video in a straightforward way,
and at some point I realized that it fails to decode specific