I’m trying to write a TCP server to receive the output from my webcam:
./ffmpeg -f avfoundation -i "0:0" -f mpeg1video -b:v 800k -r 30 -s 320x240 tcp://localhost:8080
I’m looking to parse out the streamed JPEG images, apply some processing to
them, and then pass them along to another process.
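One note on the command as written: `-f mpeg1video` produces an MPEG-1 video stream, not individual JPEGs. To receive back-to-back JPEG frames over TCP you would typically change the output to `-c:v mjpeg -f mjpeg`. Assuming that change, here is a minimal Python sketch of the receiving side; the port matches the command above, while the `process` hook is hypothetical. Frames are split on the JPEG SOI/EOI markers, which works for ffmpeg's MJPEG output but can misfire on JPEGs that embed thumbnails:

```python
import socket

SOI = b"\xff\xd8"  # JPEG start-of-image marker
EOI = b"\xff\xd9"  # JPEG end-of-image marker

def extract_jpegs(buf):
    """Split complete JPEG frames out of a byte buffer.

    Returns (frames, remainder): a list of complete JPEGs and any
    trailing partial frame to prepend to the next read. Bytes before
    the first SOI marker are discarded.
    """
    frames = []
    while True:
        start = buf.find(SOI)
        if start < 0:
            return frames, b""          # no frame started yet
        end = buf.find(EOI, start + 2)
        if end < 0:
            return frames, buf[start:]  # frame still incomplete
        frames.append(buf[start:end + 2])
        buf = buf[end + 2:]

def process(frame):
    # Hypothetical processing hook: this is where you would transform
    # the JPEG and hand it to the next process (e.g. over a pipe).
    print("got frame of %d bytes" % len(frame))

def serve(port=8080):
    """Accept one TCP connection from ffmpeg and stream out frames."""
    srv = socket.socket()
    srv.bind(("127.0.0.1", port))
    srv.listen(1)
    conn, _ = srv.accept()
    buf = b""
    while True:
        data = conn.recv(65536)
        if not data:
            break
        buf += data
        frames, buf = extract_jpegs(buf)
        for frame in frames:
            process(frame)
```

Start `serve()` first, then run the ffmpeg command pointed at `tcp://localhost:8080`.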
Hi all
We have an audio/video player system currently running on Apple Mac OS X
Mavericks (10.9), Linux and FreeBSD.
The player system is written in the C programming language with SDL2, ffmpeg,
and many other external libraries.
The player system consists of a media segmentation program, a player and
Denis Mysenko writes:
You cannot use 'itsoffset' to delay an audio stream; you need the 'adelay'
audio filter instead. Note that adelay takes its delays in milliseconds, so
e.g. [1:0]adelay=10000|10000[out1] produces a 10-second delayed audio (one
delay value per channel) from stream 1:0 and calls it out1.
So your case could be:
ffmpeg -y -i 20150416203941.broadcast.ts -i 20150416203941.broadcast.ts -filter_
Hi all
I have an mpegts container with one video and two audio streams.
I want to mix the two audio streams. At the same time, the second audio
stream must be delayed 10 seconds to match the first audio stream.
Here is what I have as input
tps@t420:~/video/test$ ffprobe -i 20150416203941.bro