Re: [FFmpeg-user] Reliable audio capture method on Windows? (Stereo Mix does not exist on most computers)

2022-12-23 Thread Roger Pack
On Mon, Nov 7, 2022 at 11:17 PM Joe Smithy  wrote:
>
> Hi,
>
> I need a way to capture output audio that works reliably on most computers
> without having to know anything about the state of said computer.
>
> Searching for ffmpeg how to capture desktop audio only returns variants of
>
> -f dshow -i audio="Stereo Mix (Realtek High Definition Audio)"
>
>
> However, none of the computers I have in the house have any form of "stereo
> mix" recording device
>
> Yes, I did try all of these fixes:
>
> https://www.wintips.org/how-to-enable-stereo-mix-if-not-showing-as-recording-device-in-windows-11-10/
> https://www.hitechwork.com/how-to-restore-missing-stereo-mix-on-windows-10/
> https://answers.microsoft.com/en-us/windows/forum/all/stereo-mix-missing-in-windows-10-did-they-remove/abd60f1a-1ad2-4928-a7a2-b723d84bc5bc
> https://answers.microsoft.com/en-us/windows/forum/all/stereo-mix-not-showing/2b706268-7887-42ec-9064-9418d16e99a8
> https://thedroidguy.com/how-to-restore-missing-stereo-mix-on-windows-10-1147145
> https://appuals.com/how-to-restore-missing-stereo-mix-on-windows-10/
>
> None of them work. Stereo mix is a sound card driver feature that only
> exists on some cards, and only with some drivers. Most random computers you
> come across will not have it, so giving the user an ffmpeg command string
> almost always fails.

"Stereo Mix" is just the name of one possible sound device; names vary
from machine to machine.  Maybe we should make that clearer in the
documentation.  You can enumerate the available devices like this:
ffmpeg -list_devices true -f dshow -i dummy
Thanks.
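Putting that together, a minimal sketch of the workflow (the Realtek device name below is only an illustration; substitute whatever name the listing step actually prints on your machine):

```shell
# 1) List every DirectShow audio/video device on this machine.
ffmpeg -list_devices true -f dshow -i dummy

# 2) Capture from whichever audio device step 1 reported.
#    "Stereo Mix (Realtek High Definition Audio)" is a placeholder name;
#    copy the exact string from the device listing instead.
ffmpeg -f dshow -i audio="Stereo Mix (Realtek High Definition Audio)" -t 10 out.wav
```

Since the name differs per machine, a program driving ffmpeg would parse the step-1 output first rather than hard-coding any device string.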
___
ffmpeg-user mailing list
ffmpeg-user@ffmpeg.org
https://ffmpeg.org/mailman/listinfo/ffmpeg-user

To unsubscribe, visit link above, or email
ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".


Re: [FFmpeg-user] RTMP speed value never reaches 1 (100%), streaming fails in a short time

2022-09-08 Thread Roger Pack
Maybe the network can't keep up with the required bandwidth?  If the log
says "dropping", that's not fatal, FWIW.
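One hedged way to separate the two usual suspects (encoder too slow vs. network too slow) is to stream an already-encoded file with stream copy, so no encoding happens during the push; if speed still stays below 1x, the network is the likely bottleneck. The RTMP URL is the one from the command below in this thread; the file names are hypothetical:

```shell
# Pre-encode a short test clip once, with no real-time pressure.
ffmpeg -i sample-input.ts -c:v libx264 -b:v 8000k -maxrate 8000k -bufsize 11000k \
       -c:a aac -b:a 128k test-8mbit.flv

# Push it in real time (-re) with stream copy: zero encoding load,
# so a speed reading below 1x here points at the network, not the CPU.
ffmpeg -re -i test-8mbit.flv -c copy -f flv rtmp://upstream.server.eu/live/tik2
```

If the copy-only push keeps up but the live pipeline does not, the bottleneck is the capture/encode side instead.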

On Tue, Jun 28, 2022 at 8:06 AM Lordrak  wrote:
>
> Hello,
>
> I am trying to stream RTMP using ffmpeg, but I have a few problems. FPS
> never reaches 25 fps, only 21 for example. So the stream stalls and, in a
> short time (a couple of minutes), the buffer overflows. Ffmpeg is unusable
> for streaming over the rtmp protocol.
>
> In complete scenario i need use tee muxer and send stream to rtmp and
> mpegts udp destination. When i leave only mpegts udp, streaming works
> fine.
>
> Here are three tests:
>
>
>
> ffmpeg-2022.06.12.exe -f dshow -rtbufsize 10
> -pixel_format uyvy422 -s 1920x1080 -r 25 -fflags +genpts
> -i video="Decklink Video Capture (2)":audio="Decklink Audio Capture (2)"
> -vf yadif,fps=25 -map 0:v -map 0:a -codec:a aac -ac 2 -ar 48000
> -b:a 128k -vcodec libx264 -preset veryfast -tune zerolatency -profile:v main
> -g 12 -top 1 -sc_threshold 0 -bufsize 11000k -minrate 8000k -maxrate 8000k
> -b:v 8000k -muxrate 11000k -pix_fmt yuv420p -s 1920x1080 -aspect 16:9
> -flags +ildct+ilme+global_header -streamid 0:481 -streamid 1:482
> -map_metadata -1 -metadata service_provider="TIK BOHUMIN" -metadata 
> service_name="TIK BOHUMIN" -mpegts_pmt_start_pid 480
> -f tee 
> "[f=mpegts:bsfs/v=h264_mp4toannexb:use_fifo=1:onfail=ignore:pkt_size=1316]udp://@239.0.0.51:5000|
> [f=flv:onfail=ignore:flvflags=no_duration_filesize:bsfs/v=h264_mp4toannexb:use_fifo=1]rtmp://upstream.server.eu/live/tik2"
> 2> out1.txt
>
>
>
> ffmpeg-2022.06.12.exe -f dshow -rtbufsize 10 -pixel_format uyvy422
> -s 1920x1080 -r 25 -fflags +genpts -i video="Decklink Video Capture 
> (2)":audio="Decklink Audio Capture (2)"
> -vf yadif,fps=25 -map 0:v -map 0:a -codec:a aac -ac 2 -ar 48000 -b:a 128k
> -vcodec libx264 -preset veryfast -tune zerolatency -profile:v main -g 12 -top 
> 1
> -sc_threshold 0 -bufsize 11000k -minrate 8000k -maxrate 8000k -b:v 8000k
> -muxrate 11000k -pix_fmt yuv420p -s 1920x1080 -aspect 16:9 -flags 
> +ildct+ilme+global_header -streamid 0:481 -streamid 1:482 -map_metadata -1 
> -metadata service_provider="TIK BOHUMIN"
> -metadata service_name="TIK BOHUMIN"
> -mpegts_pmt_start_pid 480   -f tee 
> "[f=mpegts:bsfs/v=h264_mp4toannexb:use_fifo=1:onfail=ignore:pkt_size=1316]udp://@239.0.0.51:5000"
>2> out2.txt
>
>
> ffmpeg-2022.06.12.exe -f dshow -rtbufsize 10 -pixel_format uyvy422
> -s 1920x1080 -r 25 -fflags +genpts
> -i video="Decklink Video Capture (2)":audio="Decklink Audio Capture (2)"
> -vf yadif,fps=25 -map 0:v -map 0:a -codec:a aac -ac 2 -ar 48000 -b:a 128k
> -vcodec libx264 -preset veryfast -tune zerolatency -profile:v main -g 12 -top 
> 1
> -sc_threshold 0 -bufsize 11000k -minrate 8000k -maxrate 8000k -b:v 8000k
> -muxrate 11000k -pix_fmt yuv420p -s 1920x1080 -aspect 16:9 -flags 
> +ildct+ilme+global_header -streamid 0:481 -streamid 1:482
> -map_metadata -1 -metadata service_provider="TIK BOHUMIN" -metadata 
> service_name="TIK BOHUMIN"
> -mpegts_pmt_start_pid 480   -f 
> tee "[f=flv:onfail=ignore:flvflags=no_duration_filesize:bsfs/v=h264_mp4toannexb:use_fifo=1]rtmp://upstream.server.eu/live/tik2"
> 2> out3.txt
>
>
> Here is the output from the latest test, using only rtmp:
> ffmpeg version 2022-06-12-git-4d45f5acbd-full_build-www.gyan.dev Copyright 
> (c) 2000-2022 the FFmpeg developers
>built with gcc 11.3.0 (Rev1, Built by MSYS2 project)
>configuration: --enable-gpl --enable-version3 --enable-static 
> --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv 
> --enable-gnutls --enable-libxml2 --enable-gmp --enable-bzlib --enable-lzma 
> --enable-libsnappy --enable-zlib --enable-librist --enable-libsrt 
> --enable-libssh --enable-libzmq --enable-avisynth --enable-libbluray 
> --enable-libcaca --enable-sdl2 --enable-libdav1d --enable-libdavs2 
> --enable-libuavs3d --enable-libzvbi --enable-librav1e --enable-libsvtav1 
> --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs2 
> --enable-libxvid --enable-libaom --enable-libjxl --enable-libopenjpeg 
> --enable-libvpx --enable-mediafoundation --enable-libass --enable-frei0r 
> --enable-libfreetype --enable-libfribidi --enable-liblensfun 
> --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf 
> --enable-cuda-llvm --enable-cuvid --enable-ffnvcodec --enable-nvdec 
> --enable-nvenc --enable-d3d11va --enable-dxva2 --enable-libmfx 
> --enable-libshaderc --enable-vulkan
>   --enable-libplacebo --enable-opencl --enable-libcdio --enable-libgme 
> --enable-libmodplug --enable-libopenmpt --enable-libopencore-amrwb 
> --enable-libmp3lame --enable-libshine --enable-libtheora --enable-libtwolame 
> --enable-libvo-amrwbenc --enable-libilbc --enable-libgsm 
> --enable-libopencore-amrnb --enable-libopus --enable-libspeex 
> --enable-libvorbis --enable-ladspa --enable-libbs2b --enable-libflite 
> --enable-libmysofa --enable-librubberband --enable-libsoxr 
> --enable-chromaprint
>libavutil  57. 26.100 / 57. 26.100
>   

Re: [FFmpeg-user] Hauppauge WinTV-7164 Analog Composite/S-Video Capture

2022-09-08 Thread Roger Pack
> BUG: The device-specific URLs provided by ffmpeg dshow "-list_devices" for this
> Hauppauge were extremely lengthy device URLs that were apparently not
> recognized by the dshow video= and audio= options.

Examples please? :)


Re: [FFmpeg-user] ffmpeg combining live video from 3 cameras

2022-09-08 Thread Roger Pack
"Maybe" with avisynth and a special input, but in general, donations are
welcome to make the dshow module able to do this :)
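As a possible workaround (untested, and it sidesteps the dshow limitation rather than fixing it): start the three captures as separate concurrent ffmpeg processes so all cameras power up together, record to files, then hstack afterwards. Camera names and sizes are taken from the command in the question below; this is sketched as a POSIX shell script, so a Windows user would need an equivalent batch form:

```shell
# Launch three independent capture processes so all cameras start together.
ffmpeg -f dshow -pix_fmt uyvy422 -video_size 1280x2160 -i video="CAM_0" -c:v libx264 cam0.mkv &
ffmpeg -f dshow -pix_fmt uyvy422 -video_size 1280x2160 -i video="CAM_1" -c:v libx264 cam1.mkv &
ffmpeg -f dshow -pix_fmt uyvy422 -video_size 1280x2160 -i video="CAM_2" -c:v libx264 cam2.mkv &
wait  # block until all three recordings are stopped

# Combine the recordings side by side afterwards.
ffmpeg -i cam0.mkv -i cam1.mkv -i cam2.mkv -filter_complex "hstack=inputs=3" combined.mkv
```

This loses live preview and exact frame sync, so it only helps if an offline merge is acceptable.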

On Mon, Jul 18, 2022 at 10:35 PM ashwin Nair  wrote:
>
> Hi All,
>
> I have 3 USB cameras streaming video at a specific resolution. I am trying
> to merge the videos together using hstack and display using ffplay.
>
> This is the command I am using:
>
> ffmpeg -f dshow -pix_fmt uyvy422 -video_size 1280x2160 -i video="CAM_0" -f
> dshow -pix_fmt uyvy422 -video_size 1280x2160 -i video="CAM_1" -f dshow
> -pix_fmt uyvy422 -video_size 1280x2160 -i video="CAM_2" -filter_complex
> hstack=3 -f nut - | ffplay -
>
> There is a special case in this system. The USB camera system will only
> start the video stream when all 3 cameras are enabled (or started by
> ffmpeg). However, when I use the above command, ffmpeg starts CAM_0 and
> waits for data. As a result, CAM_1 and CAM_2 are not started and the video
> stream does not start.
>
> Is there any way where I can start all 3 inputs simultaneously using ffmpeg
> and then merge them together using hstack?
>
> Thanks!


Re: [FFmpeg-user] MPEG2 Consistent/Same Bit Rate When Deinterlacing

2022-05-19 Thread Roger
Just a quick follow-up on this: searching the archives, it seems I found a
similar post from some time ago on this list.

The "[FFmpeg-user] Re-encode mpeg2 for same quality" 2015 archived response was:

"Use -qscale 2 -mbd 2 (or qscale 1)."


$ ffmpeg -i input.ts -filter:v yadif=1 -qscale 2 -mbd 2 -acodec copy out.mts

The incantation resulted in a file 2-3x the size of the original transport
stream (e.g. *.ts).

However, the quality improved significantly compared with just using the
"-b:v 6500k" incantation, especially noticeable when viewing high-velocity
human movements/exercises.

Shrugs.  I may still perform some tests using the "-bufsize" option.

Any feedback on this would be great to hear.  The media consists of private
VHS-recorded events, captured via an S-Video cable and a Hauppauge capture card.
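For the "-bufsize" test, here is one hedged sketch that reuses the numbers ffprobe reported for the source (6284 kb/s average bitrate, 1835008-byte VBV buffer). The maxrate value is a hypothetical ceiling slightly above the average, not something from the source stream; whether MPEG-2 rate control honors these exactly is something to verify, not a guarantee:

```shell
# Deinterlace while pinning average bitrate and VBV buffer to values
# derived from ffprobe on the original transport stream.
# -maxrate 6500k is an assumed ceiling, chosen just above the average.
ffmpeg -i input.ts -filter:v yadif=1 \
       -b:v 6284k -maxrate 6500k -bufsize 1835k \
       -acodec copy out.mts
```

Comparing this output against the qscale 2 result (file size and visual quality on the fast-motion sections) would show which rate-control mode suits the material better.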


Roger


> On Tue, May 17, 2022 at 10:04:06PM -0400, Roger wrote:
>I'm processing an original (interlaced) Hauppauge Transport Stream (eg. MPEG2) 
>video through the FFMpeg yadif deinterlace filter.  I would like to keep the 
>same original bitrate, preserving quality.  I've read the following, but only 
>seem to have success using "-maxrate 850 -b:v 6284k".
>
>Limiting the output bitrate
>https://trac.ffmpeg.org/wiki/Limiting%20the%20output%20bitrate
>
>ffprobe of original Hauppauge Transport Stream:
>Input #0, mpegts, from 'input.ts':
>  Duration: 00:14:40.51, start: 713.397322, bitrate: 6284 kb/s  <- bitrate
>  Program 1 
>  Stream #0:0[0x3e9]: Video: mpeg2video (Main) ([2][0][0][0] / 0x0002), 
> yuv420p(tv, top first), 720x480 [SAR 8:9 DAR 4:3], 29.97 fps, 29.97 tbr, 90k 
> tbn, 59.94 tbc
>Side data:
>  cpb: bitrate max/min/avg: 850/0/0 buffer size: 1835008 vbv_delay: 
> N/A  <- maxrate
>  Stream #0:1[0x3ea]: Audio: mp2 ([3][0][0][0] / 0x0003), 48000 Hz, stereo, 
> fltp, 384 kb/s
>
>My best effort so far seems to be the following incantation:
>
>$ ffmpeg -i input.ts -filter:v yadif=1 -maxrate 850 -b:v 6284k -acodec 
>copy out.mts
>
>660M input.ts
>726M output.mts
>
>This seems correct to me, expecting the deinterlaced file to be about 1/8-1/4
>larger than the original interlaced file.
>
>As far as using "-crf", this always seemed to produce a highly compressed
>MPEG2 file, very unlike my experience with other video codecs.  Using
>"-bufsize" with the value from ffprobe on input.ts also resulted in a highly
>compressed MPEG2 file.  And to reiterate, specifying the maxrate and bitrate
>from ffprobe on input.ts so far seems to provide almost identical quality to
>the original MPEG2 transport stream.  Can I do better?
>
>As far as naming the output interlaced and deinterlaced files, is there a 
>naming standard specified for the deinterlaced video file?  eg.  
>file_name-int.mts and file_name-deint.mts?
>
>Roger


[FFmpeg-user] MPEG2 Consistent/Same Bit Rate When Deinterlacing

2022-05-17 Thread Roger
I'm processing an original (interlaced) Hauppauge Transport Stream (eg. MPEG2) 
video through the FFMpeg yadif deinterlace filter.  I would like to keep the 
same original bitrate, preserving quality.  I've read the following, but only 
seem to have success using "-maxrate 850 -b:v 6284k".

Limiting the output bitrate
https://trac.ffmpeg.org/wiki/Limiting%20the%20output%20bitrate

ffprobe of original Hauppauge Transport Stream:
Input #0, mpegts, from 'input.ts':
  Duration: 00:14:40.51, start: 713.397322, bitrate: 6284 kb/s  <- bitrate
  Program 1 
  Stream #0:0[0x3e9]: Video: mpeg2video (Main) ([2][0][0][0] / 0x0002), 
yuv420p(tv, top first), 720x480 [SAR 8:9 DAR 4:3], 29.97 fps, 29.97 tbr, 90k 
tbn, 59.94 tbc
Side data:
  cpb: bitrate max/min/avg: 850/0/0 buffer size: 1835008 vbv_delay: N/A 
 <- maxrate
  Stream #0:1[0x3ea]: Audio: mp2 ([3][0][0][0] / 0x0003), 48000 Hz, stereo, 
fltp, 384 kb/s

My best effort so far seems to be the following incantation:

$ ffmpeg -i input.ts -filter:v yadif=1 -maxrate 850 -b:v 6284k -acodec copy 
out.mts

660M input.ts
726M output.mts

This seems correct to me, expecting the deinterlaced file to be about 1/8-1/4
larger than the original interlaced file.

As far as using "-crf", this always seemed to produce a highly compressed
MPEG2 file, very unlike my experience with other video codecs.  Using
"-bufsize" with the value from ffprobe on input.ts also resulted in a highly
compressed MPEG2 file.  And to reiterate, specifying the maxrate and bitrate
from ffprobe on input.ts so far seems to provide almost identical quality to
the original MPEG2 transport stream.  Can I do better?

As far as naming the output interlaced and deinterlaced files, is there a
naming standard for deinterlaced video files?  e.g. file_name-int.mts and
file_name-deint.mts?

Roger


Re: [FFmpeg-user] Difference between FFmpeg Formats/Containers MP4 and H264

2022-05-15 Thread Roger


>> So when specifying, "-f h264", format is assuming a video only container?
>>
>
>I'm not sure if container is the correct term here. It's just the video
>stream itself, which is capable of existing on its own outside of a
>container. Not many streams allow that. DV is one that comes to mind, where
>you can get raw DV streams which can even contain audio streams and other
>data streams interleaved. I've run some tests and have not found a way to
>retain audio while using -f h264; it just retains a video stream. Maybe I'm
>doing something wrong?
>
>I can't think of many reasons why you would want standalone video streams
>outside of a container, so why not just use the mp4 or mkv format as it
>should have greater compatibility with playback tools?

Ah! I figured out why "-f h264" worked for my past scenarios without apparent
problems: those projects contained only video, no audio!

And for other projects with both video and audio, I was likely omitting
"-f h264" and only specifying the video/audio codecs.

Kind of a fluke that my incantation worked without apparent problems!

Glad I now have this ironed out and am now using ffmpeg correctly for
h264/mp4 videos!

I always knew h264 was a codec, until seeing the "-f h264" option in the
ffmpeg manual.  Now I know "-f h264" is for video-only streams, likely
security video without audio?

Roger
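To make the distinction concrete, a small round-trip sketch (file names hypothetical, and it assumes an ffmpeg build with libx264): extract the raw video-only H.264 elementary stream from an MP4, then wrap it back into a container. The bare .h264 file carries no audio, timestamps, or seek index, which is exactly why "-f h264" silently drops audio:

```shell
# Make a tiny synthetic test clip (no external files needed).
ffmpeg -y -f lavfi -i testsrc=duration=1:size=320x240:rate=30 -c:v libx264 input.mp4

# Pull the raw H.264 elementary stream out of the container (video only;
# audio cannot live in a bare .h264 file).
ffmpeg -y -i input.mp4 -c:v copy -an raw.h264

# Wrap the raw stream back into MP4, supplying the frame rate explicitly,
# since a raw elementary stream has no timestamps of its own.
ffmpeg -y -r 30 -f h264 -i raw.h264 -c:v copy remuxed.mp4
```

Running ffprobe on raw.h264 versus remuxed.mp4 shows the difference: the raw file reports only a video stream with no duration metadata, while the MP4 carries full container-level timing.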



Re: [FFmpeg-user] Difference between FFmpeg Formats/Containers MP4 and H264

2022-05-14 Thread Roger
>MP4 can contain a variety of streams, including h264. If you specify h264,
>you just have a raw h264 video stream, that's it.

I've always used -f h264, until I ran into Hauppauge WinTV Version 10 using:

ffmpeg -i "[SOURCE_FILE]" -f mp4 -vcodec libx264 -preset ultrafast -profile:v
main -acodec aac "[DEST_FILE].mp4"

On the flip side, I've never seen a *.h264 file name.

If I'm not mistaken, "-f" specifies the format, i.e. the container of the
video/audio stream.

So when specifying "-f h264", is the format assumed to be a video-only
container?



[FFmpeg-user] Difference between FFmpeg Formats/Containers MP4 and H264

2022-05-14 Thread Roger


What are the differences when specifying the format/container as MP4 or H264?
Are there any differences when additionally specifying the video and audio
codecs?

$ ffmpeg -formats | grep mp4
 D  mov,mp4,m4a,3gp,3g2,mj2  QuickTime / MOV
  E mp4                      MP4 (MPEG-4 Part 14)

MPEG-4 Part 14
https://en.wikipedia.org/wiki/MPEG-4_Part_14
Last published release: January 2020; 2 years ago

$ ffmpeg -formats | grep h264
 DE h264                     raw H.264 video

Advanced Video Coding
https://en.wikipedia.org/wiki/Advanced_Video_Coding
Last published version: 22 August 2021


So what are the differences between the following?

$ ffmpeg -i video-in.ts -f mp4 -vcodec libx264 -preset ultrafast -profile:v 
main -acodec aac video-out.mp4

$ ffmpeg -i video-in.ts -f h264 -vcodec libx264 -preset ultrafast -profile:v 
main -acodec aac video-out.mp4


Roger



Re: [FFmpeg-user] Hauppauge WinTV-7164 Analog Composite/S-Video Capture

2022-05-06 Thread Roger

A small correction on this, as users might end up with unplayable
video/container files from my prior incomplete instructions.

Use an .avi container instead of .yuv!

c:\ ffmpeg -f dshow -crossbar_audio_input_pin_number 5 
-crossbar_video_input_pin_number 2 -i video="Hauppauge WinTV-7164 Analog 
Capture":audio="Hauppauge WinTV-7164 Analog Capture" -c copy test.avi

Example of huffyuv encoding, reducing the raw video to about half its original
size using lossless compression:

c:\ ffmpeg -f dshow -crossbar_audio_input_pin_number 5 
-crossbar_video_input_pin_number 2 -i video="Hauppauge WinTV-7164 Analog 
Capture":audio="Hauppauge WinTV-7164 Analog Capture" -vcodec huffyuv -acodec 
copy test.avi


>
>c:\ ffmpeg -f dshow -crossbar_audio_input_pin_number 5 
>-crossbar_video_input_pin_number 2 -i video="Hauppauge WinTV-7164 Analog 
>Capture":audio="Hauppauge WinTV-7164 Analog Capture" -c copy test.yuv
>
>Now I should be seeing within the captured raw file, stream of:
>stream #0: pixel_format=yuyv422 s=720x480 fps=29.97
>stream #1 s16 48000hz pcm 2 ch 1536 kb/s(?)
>
>Currently, I'm getting some flaky results with an unplayable file using 
>ffplay, etc.
>
>However, directly encoding the file to a compressed format provides a playable 
>file.  So I'm likely still doing something incorrect, and seemingly ffmpeg is 





Re: [FFmpeg-user] Hauppauge WinTV-7164 Analog Composite/S-Video Capture

2022-05-03 Thread Roger
compatible input pins: 0 2 3 4 6 7 8
[dshow @ 02229f7a5100]   Crossbar Output pin 1: "Audio Decoder" related output pin: 0 current input pin: -1 compatible input pins: 1 5 9
[dshow @ 02229f7a5100]   Crossbar Input pin 0 - "Video Tuner" related input pin: 1
[dshow @ 02229f7a5100]   Crossbar Input pin 1 - "Audio Tuner" related input pin: 0
[dshow @ 02229f7a5100]   Crossbar Input pin 2 - "Video Composite" related input pin: 5
[dshow @ 02229f7a5100]   Crossbar Input pin 3 - "S-Video" related input pin: 5
[dshow @ 02229f7a5100]   Crossbar Input pin 4 - "Video AUX" related input pin: 5
[dshow @ 02229f7a5100]   Crossbar Input pin 5 - "Audio Line" related input pin: 4
[dshow @ 02229f7a5100]   Crossbar Input pin 6 - "Video Composite" related input pin: 9
[dshow @ 02229f7a5100]   Crossbar Input pin 7 - "S-Video" related input pin: 9
[dshow @ 02229f7a5100]   Crossbar Input pin 8 - "Video AUX" related input pin: 9
[dshow @ 02229f7a5100]   Crossbar Input pin 9 - "Audio Line" related input pin: 8
video=Hauppauge WinTV-7164 Analog Capture: Immediate exit requested

Taking a closer look at the "Crossbar"-titled inputs: for me, only these pin
numbers worked for the Hauppauge 2250 card.

So, assuming ffmpeg uses the first tuner on the dual tuner card (eg. 
-video_device_number=0), the following provides both audio/video raw streams:

c:\ ffmpeg -f dshow -crossbar_audio_input_pin_number 5 
-crossbar_video_input_pin_number 2 -i video="Hauppauge WinTV-7164 Analog 
Capture":audio="Hauppauge WinTV-7164 Analog Capture" -c copy test.yuv

Now I should be seeing within the captured raw file, stream of:
stream #0: pixel_format=yuyv422 s=720x480 fps=29.97
stream #1 s16 48000hz pcm 2 ch 1536 kb/s(?)

Currently, I'm getting some flaky results with an unplayable file using ffplay, 
etc.

However, directly encoding the file to a compressed format provides a playable
file.  So I'm likely still doing something incorrect, and ffmpeg seems far
less forgiving of my incantations on Windows than on Linux.  Using ffmpeg on
Linux, I rarely, if ever, get unplayable rawvideo files.

TIP: For those using Windows, install and use Cygwin rather than the Windows
terminal.  The Bash shell is far easier to work with, thanks to its history
and syntax.

BUG: The device-specific URLs provided by ffmpeg dshow "-list_devices" for this
Hauppauge were extremely lengthy device URLs that were apparently not
recognized by the dshow video= and audio= options.


Reference:
FFmpeg Devices Documentation - dshow
https://ffmpeg.org/ffmpeg-devices.html#dshow



> On Tue, May 03, 2022 at 01:43:29PM -0400, Roger wrote:
>
>I've found, after searching Google, the following on Linux pulls the entire 
>video+audio stream successfully:
>
>$ cat /dev/video0 | ffplay -
>
>or,
>
>$ ffmpeg -i file:/dev/video0
>
>Now to see if doing so within Windows does the same, likely omitting the "-f 
>dshow", similar to omitting the "-f v4l2" on Linux.
>
>
>
>
>- Forwarded message from Roger  -
>
>Subject: Hauppauge WinTV-7164 Analog Composite/S-Video Capture
>Date: Sat, 30 Apr 2022 14:50:31 -0400
>From: Roger 
>To: ffmpeg-user@ffmpeg.org
>
>Seems I'm plagued by the same bug as described here:
>https://stackoverflow.com/questions/19113197/ffmpeg-directshow-capture-2-pins
>
>
>Hauppauge WinTV-7164 Analog Composite/S-Video Capture
>Using this device trying to capture composite/s-video from VCR/VHS media.
>
>dshow device: "Hauppauge WinTV-7164 Analog Capture"
>
>(NTSC Model 2250)
>
>
>This Hauppauge PCIe device, likely many Hauppuage video capture cards,
>capture a video stream already combined with an audio stream.  The bug
>and fix posted to stackoverflow is similar.
>
>When specifying the following, either with/without ~Audio (pin 1), I
>do get an audio str

Re: [FFmpeg-user] Hauppauge WinTV-7164 Analog Composite/S-Video Capture

2022-05-03 Thread Roger

I've found, after searching Google, the following on Linux pulls the entire 
video+audio stream successfully:

$ cat /dev/video0 | ffplay -

or,

$ ffmpeg -i file:/dev/video0

Now to see if doing so within Windows does the same, likely omitting the "-f 
dshow", similar to omitting the "-f v4l2" on Linux.
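For the Linux side, the explicit v4l2 form is probably safer than cat'ing the device node, since ffmpeg then negotiates the pixel format and frame size with the driver itself. The device path /dev/video0 matches the commands above, but the ALSA audio device name in the second command is a hypothetical addition:

```shell
# Explicit v4l2 capture; ffmpeg negotiates formats with the driver.
ffmpeg -f v4l2 -i /dev/video0 -vcodec huffyuv capture.avi

# The same, with audio from an ALSA device (hw:0 is an assumed name;
# check "arecord -l" for the real one).
ffmpeg -f v4l2 -i /dev/video0 -f alsa -i hw:0 \
       -vcodec huffyuv -acodec pcm_s16le capture-av.avi
```

This mirrors the dshow invocation on Windows, with "-f v4l2" and "-f alsa" standing in for "-f dshow".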




- Forwarded message from Roger  -

Subject: Hauppauge WinTV-7164 Analog Composite/S-Video Capture
Date: Sat, 30 Apr 2022 14:50:31 -0400
From: Roger 
To: ffmpeg-user@ffmpeg.org

Seems I'm plagued by the same bug as described here:
https://stackoverflow.com/questions/19113197/ffmpeg-directshow-capture-2-pins


Hauppauge WinTV-7164 Analog Composite/S-Video Capture
Using this device trying to capture composite/s-video from VCR/VHS media.

dshow device: "Hauppauge WinTV-7164 Analog Capture"

(NTSC Model 2250)


This Hauppauge PCIe device, like many Hauppauge video capture cards, captures
a video stream already combined with an audio stream.  The bug and fix posted
to stackoverflow are similar.

When specifying the following, either with/without ~Audio (pin 1), I
do get an audio stream detected, but the audio stream is null/silent:

D:\record-ffmpeg>ffplay -f dshow -rtbufsize 500M -video_size 720x480
-framerate 29.97 -audio_pin_name ~Audio -show_video_device_dialog
false -i video="Hauppauge WinTV-7164 Analog Capture":audio="Hauppauge
WinTV-7164 Analog Capture"

D:\record-ffmpeg>ffplay -f dshow -video_size 352x288 -i
video="Hauppauge WinTV-7164 Analog Capture":audio="Hauppauge
WinTV-7164 Analog Capture"
ffplay version 5.0.1-full_build-www.gyan.dev Copyright (c) 2003-2022
the FFmpeg developers
  built with gcc 11.2.0 (Rev7, Built by MSYS2 project)
  configuration: --enable-gpl --enable-version3 --enable-static
--disable-w32threads --disable-autodetect --enable-fontconfig
--enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp
--enable-bzlib --enable-lzma --enable-libsnappy --enable-zlib
--enable-librist --enable-libsrt --enable-libssh --enable-libzmq
--enable-avisynth --enable-libbluray --enable-libcaca --enable-sdl2
--enable-libdav1d --enable-libdavs2 --enable-libuavs3d
--enable-libzvbi --enable-librav1e --enable-libsvtav1 --enable-libwebp
--enable-libx264 --enable-libx265 --enable-libxavs2 --enable-libxvid
--enable-libaom --enable-libopenjpeg --enable-libvpx
--enable-mediafoundation --enable-libass --enable-frei0r
--enable-libfreetype --enable-libfribidi --enable-liblensfun
--enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf
--enable-cuda-llvm --enable-cuvid --enable-ffnvcodec --enable-nvdec
--enable-nvenc --enable-d3d11va --enable-dxva2 --enable-libmfx
--enable-libshaderc --enable-vulkan --enable-libplacebo
--enable-opencl --enable-libcdio --enable-libgme --enable-libmodplug
--enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame
--enable-libshine --enable-libtheora --enable-libtwolame
--enable-libvo-amrwbenc --enable-libilbc --enable-libgsm
--enable-libopencore-amrnb --enable-libopus --enable-libspeex
--enable-libvorbis --enable-ladspa --enable-libbs2b --enable-libflite
--enable-libmysofa --enable-librubberband --enable-libsoxr
--enable-chromaprint
  libavutil  57. 17.100 / 57. 17.100
  libavcodec 59. 18.100 / 59. 18.100
  libavformat59. 16.100 / 59. 16.100
  libavdevice59.  4.100 / 59.  4.100
  libavfilter 8. 24.100 /  8. 24.100
  libswscale  6.  4.100 /  6.  4.100
  libswresample   4.  3.100 /  4.  3.100
  libpostproc56.  3.100 / 56.  3.100
[dshow @ 020e9bb04f00] Could not find audio only device with name
[Hauppauge WinTV-7164 Analog Capture] among source devices of type
audio.
[dshow @ 020e9bb04f00] Searching for audio device within video
devices for Hauppauge WinTV-7164 Analog Capture
Input #0, dshow, from 'video=Hauppauge WinTV-7164 Analog
Capture:audio=Hauppauge WinTV-7164 Analog Capture':
  Duration: N/A, start: 3738.527811, bitrate: 1536 kb/s
  Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 352x288,
29.97 fps, 29.97 tbr, 1k tbn
  Stream #0:1: Audio: pcm_s16le, 48000 Hz, 2 channels, s16, 1536 kb/s
3740.13 A-V: -0.010 fd=   0 aq=0KB vq=0KB sq=0B f=0/0

D:\record-ffmpeg>


It's possible the fix was a hack/workaround applied only for certain devices,
such as "AJA Capture Source".  Many Hauppauge capture cards should be added
as well.

Sometimes right after booting into Windows 10, I may get ffplay to
play audio only with a blue screen.  Subsequent ffplay executions drop
audio, with a blue screen, until either Hauppauge's WinTV10 or
VirtualDub64 are subsequently executed, likely properly initializing
the video capture card.  Subsequent ffplay executions show video
properly but again without audio, only stating the video/audio stream
specifications.  Nor is the audio stream recorded to file using
ffmpeg.

- End forwarded message -



[FFmpeg-user] Hauppauge WinTV-7164 Analog Composite/S-Video Capture

2022-04-30 Thread Roger
Seems I'm plagued by the same bug as described here:
https://stackoverflow.com/questions/19113197/ffmpeg-directshow-capture-2-pins


Hauppauge WinTV-7164 Analog Composite/S-Video Capture
Using this device trying to capture composite/s-video from VCR/VHS media.

dshow device: "Hauppauge WinTV-7164 Analog Capture"

(NTSC Model 2250)


This Hauppauge PCIe device, like many Hauppauge video capture cards, captures
a video stream already combined with an audio stream.  The bug and fix posted
to stackoverflow are similar.

When specifying the following, either with/without ~Audio (pin 1), I
do get an audio stream detected, but the audio stream is null/silent:

D:\record-ffmpeg>ffplay -f dshow -rtbufsize 500M -video_size 720x480
-framerate 29.97 -audio_pin_name ~Audio -show_video_device_dialog
false -i video="Hauppauge WinTV-7164 Analog Capture":audio="Hauppauge
WinTV-7164 Analog Capture"

D:\record-ffmpeg>ffplay -f dshow -video_size 352x288 -i
video="Hauppauge WinTV-7164 Analog Capture":audio="Hauppauge
WinTV-7164 Analog Capture"
ffplay version 5.0.1-full_build-www.gyan.dev Copyright (c) 2003-2022
the FFmpeg developers
  built with gcc 11.2.0 (Rev7, Built by MSYS2 project)
  configuration: --enable-gpl --enable-version3 --enable-static
--disable-w32threads --disable-autodetect --enable-fontconfig
--enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp
--enable-bzlib --enable-lzma --enable-libsnappy --enable-zlib
--enable-librist --enable-libsrt --enable-libssh --enable-libzmq
--enable-avisynth --enable-libbluray --enable-libcaca --enable-sdl2
--enable-libdav1d --enable-libdavs2 --enable-libuavs3d
--enable-libzvbi --enable-librav1e --enable-libsvtav1 --enable-libwebp
--enable-libx264 --enable-libx265 --enable-libxavs2 --enable-libxvid
--enable-libaom --enable-libopenjpeg --enable-libvpx
--enable-mediafoundation --enable-libass --enable-frei0r
--enable-libfreetype --enable-libfribidi --enable-liblensfun
--enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf
--enable-cuda-llvm --enable-cuvid --enable-ffnvcodec --enable-nvdec
--enable-nvenc --enable-d3d11va --enable-dxva2 --enable-libmfx
--enable-libshaderc --enable-vulkan --enable-libplacebo
--enable-opencl --enable-libcdio --enable-libgme --enable-libmodplug
--enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame
--enable-libshine --enable-libtheora --enable-libtwolame
--enable-libvo-amrwbenc --enable-libilbc --enable-libgsm
--enable-libopencore-amrnb --enable-libopus --enable-libspeex
--enable-libvorbis --enable-ladspa --enable-libbs2b --enable-libflite
--enable-libmysofa --enable-librubberband --enable-libsoxr
--enable-chromaprint
  libavutil  57. 17.100 / 57. 17.100
  libavcodec 59. 18.100 / 59. 18.100
  libavformat59. 16.100 / 59. 16.100
  libavdevice59.  4.100 / 59.  4.100
  libavfilter 8. 24.100 /  8. 24.100
  libswscale  6.  4.100 /  6.  4.100
  libswresample   4.  3.100 /  4.  3.100
  libpostproc56.  3.100 / 56.  3.100
[dshow @ 020e9bb04f00] Could not find audio only device with name
[Hauppauge WinTV-7164 Analog Capture] among source devices of type
audio.
[dshow @ 020e9bb04f00] Searching for audio device within video
devices for Hauppauge WinTV-7164 Analog Capture
Input #0, dshow, from 'video=Hauppauge WinTV-7164 Analog
Capture:audio=Hauppauge WinTV-7164 Analog Capture':
  Duration: N/A, start: 3738.527811, bitrate: 1536 kb/s
  Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 352x288,
29.97 fps, 29.97 tbr, 1k tbn
  Stream #0:1: Audio: pcm_s16le, 48000 Hz, 2 channels, s16, 1536 kb/s
3740.13 A-V: -0.010 fd=   0 aq=0KB vq=0KB sq=0B f=0/0

D:\record-ffmpeg>


It's possible the fix was a hack/workaround applied only to certain
devices such as "AJA Capture Source".  Many Hauppauge capture cards
should be added as well.

Sometimes right after booting into Windows 10, ffplay may play audio
only, with a blue screen.  Subsequent ffplay runs drop the audio, still
with a blue screen, until either Hauppauge's WinTV10 or VirtualDub64 is
run, which likely initializes the video capture card properly.  After
that, subsequent ffplay runs show video correctly but still without
audio, only reporting the video/audio stream specifications.  The audio
stream is not recorded to file by ffmpeg either.
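
For anyone reproducing this, it may help to enumerate the dshow devices and their supported pins/formats first (a sketch; the device name is the one from the log above and will differ per machine):

```shell
ffmpeg -list_devices true -f dshow -i dummy
ffmpeg -list_options true -f dshow -i video="Hauppauge WinTV-7164 Analog Capture"
```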
___
ffmpeg-user mailing list
ffmpeg-user@ffmpeg.org
https://ffmpeg.org/mailman/listinfo/ffmpeg-user

To unsubscribe, visit link above, or email
ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".


Re: [FFmpeg-user] Fluorescent White Balance Video Filters

2022-04-27 Thread Roger

Chapter 2.28 uses filter_complex without white spaces or colons?

$ ffmpeg -i 20220421_163910-meet.mp4 -i image_with_clut-meet-edited-clut.png 
-filter_complex [0]eq=brightness=0.06[a];[a][1]haldclut -y 
20220421_163910-meet-edited.mp4

Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '20220421_163910-meet.mp4': 

   
...
Filter eq has an unconnected output 

   
bash: [a][1]haldclut: command not found



Notice: "[a][1]haldclut: command not found"

Including white space for filter_complex:

$ ffmpeg -i 20220421_163910-meet.mp4 -i image_with_clut-meet-edited-clut.png 
-filter_complex '[0]eq=brightness=0.06 [a];[a] [1] haldclut' -y 20220421_1
63910-meet-edited.mp4

man ffmpeg-filters

Looks like your syntax omits the single quotes around the filter_complex option 
AND is missing white spaces and/or colons.

Not sure if "[a];[a]" should have white spaces.
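
A quick way to see why the quotes matter, independent of ffmpeg: without them, the shell splits the filtergraph at the `;`. A minimal sketch:

```shell
# Unquoted, the shell treats ';' as a command separator, so
# "[a][1]haldclut" gets run as a separate command ("command not found").
# Quoted, the whole graph reaches ffmpeg as one -filter_complex argument:
graph='[0]eq=brightness=0.06[a];[a][1]haldclut'
echo "$graph"
```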



For sake of simplicity, here's what I have:

/* OWN MODIFIED LUT OR WHITE BALANCE FILTER */

Snip or extract the portion of video requiring white balance, color or 
brightness corrections.
$ ffmpeg -ss 590 -i 20220421_163910.mp4 -t 867 -codec copy 
20220421_163910-meet.mp4

Extract a frame of the (VHS) video requiring white balance or color/brightness 
correction, creating a CLUT-embedded extracted image (this is the frame at 
"11:00" for this specific video):

$ ffmpeg -ss 11:00 -i 20220421_163910.mp4 -f lavfi -i haldclutsrc=8 
-filter_complex 
"[1]format=pix_fmts=rgb48be[a];[a][0]xstack=inputs=2:layout=0_0|w0_0" -frames 1 
-y image_with_clut-meet.png

Open image_with_clut-meet.png within Gimp, edit Color > Color Temperature 
and/or Color > Levels > Output Levels, export as "16bpc RGB" 
image_with_clut-meet-edited.png.

Crop the CLUT (512:512) from the Gimp exported image.
$ ffmpeg -i image_with_clut-meet-edited.png -vf crop=512:512:0:0 -y 
image_with_clut-meet-edited-clut.png

Apply the resulting CLUT using FFmpeg's filter_complex.
TIP: Use -acodec copy, as this is only modifying the video stream.
TODO: Brightness reduction for color retention?
TODO: White space was added to filter_complex; colons could likely be used as 
well, but white spacing is better for readability.

$ ffmpeg -i 20220421_163910-meet.mp4 -i image_with_clut-meet-edited-clut.png 
-filter_complex '[0]eq=brightness=0.06 [a];[a] [1] haldclut' -y 
20220421_163910-meet-edited.mp4





Re: [FFmpeg-user] Fluorescent White Balance Video Filters

2022-04-27 Thread Roger
> On Wed, Apr 20, 2022 at 07:35:22AM +0200, Michael Koch wrote:
>
>You can make your own LUT, as described in chapter 2.28 in my book.
>-- Extract one frame from your video.
>-- Insert a haldclut in a corner of the image, or use xstack to attach it to
>the side.
>-- Use the program of your choice to correct the colors in this image. When
>done, save it lossless as 16-bit PNG.
>-- Use FFmpeg to separate the haldclut from the image.
>-- Apply the LUT to the whole video.
>
>Michael

Chapter 2.27, p59;

Step 3) "The extracted image is opened in GIMP." (e.g. image.png?)

Should mention the file name, else the reader has to turn back a page, e.g.  
image.png.

What if the video is interlaced?  The extracted image will be interlaced!



Step 4) "The color table will be opened in GIMP, selected with "Select all" and 
copied with ctrl-c."

I would imagine this color table is clut.png from Step 2?

"Select All" and "CTRL-C" are very ambiguous!  For me, this does nothing.

Oh, now I see, the instructions should read: "From within Gimp, open both 
images, image.png and clut.png; right-click the clut.png Gimp tab, click 
"Select > All", then click "Edit > Copy", and paste the clut.png image into the 
image.png Gimp tab."  Subsequently, I noticed the extracted image.png is 
640x480 while the CLUT is 512x512, covering almost the entire extracted image 
from the 640x480 video.


As for using the Windows batch file syntax instead of generic scripting, I have 
my "Windows batch file to Unix/Linux script file" glasses on.  Straining one 
eye to the sky while holding my foot also seems to help!

Here's how many other books format commands, rather than being tied to Windows 
or Unix/Linux formats:

To extract frame 51.47 from 20220421_163910.mp4 video, saving to image.png 
file:
$ ffmpeg -ss 51.47 -i 20220421_163910.mp4 -frames 1 -y image.png

Generate an identity Hald CLUT stream with eight levels:
$ ffmpeg -f lavfi -i haldclutsrc=8 -frames 1 -pix_fmt rgb48be clut.png


Chapter 2.28 p61;
Rather than "The above workflow can be simplified as follows:", use "Chapter 
2.27 can be condensed into a one-line FFmpeg incantation: ..."?

Instead of "Step 2: This image is now processed in GIMP", use "Step 2: Now open 
the Image_with_CLUT.png within Gimp ... "

" .. then exported with the same file name as 16-bit
PNG ... "?  This is another point of confusion.  Most readers expect a suffixed 
file name such as "image_with_clut-edited.png" (regardless of case), as reusing 
the same file name lacks clear separation between the performed tasks.  For 
example, most experienced users will try performing each scripted task without 
reading the instructions.


I think I've got this now; in layman's terms, just start at Chapter 2.28 when 
creating a modified CLUT, as you initially stated.

What I prefer, and what is likely one of the most popular edits/modifications 
nowadays, is automatic white balance or a specific white balance correction; 
the second most popular/important option is likely correcting age-related 
color fading, brightness, etc.





Re: [FFmpeg-user] Issues deinterlacing DirectShow input with ffplay

2022-04-26 Thread Roger Pack
Without yadif it comes in at 59.94i ?

On Sat, Apr 23, 2022 at 1:36 PM Alex  wrote:
>
> On Tuesday, April 19th, 2022 at 2:03 AM, Roger Pack  
> wrote:
> > Can you replicate it not using dshow?
>
> Not in any way that I know of. I recorded raw output from the capture card
> (using -c copy) to an AVI, and when playing the file with ffplay the yadif
> filter works fine (also when piping the file through stdin).


Re: [FFmpeg-user] Fluorescent White Balance Video Filters

2022-04-20 Thread Roger
>> Found through Google,
>> https://lutify.me/free-white-balance-correction-luts-for-everyone/
>> Download: Free White Balance Correction LUTs
>> Lutify-me-Free-White-Balance-Correction-3D-LUTs.zip
>> 
>> 2800 Kelvin - 3200 Kelvin - 0.34 CTO.cube
>> 2800 Kelvin - 4300 Kelvin - 0.95 CTO.cube
>> 2800 Kelvin - 5500 Kelvin - 1.34 CTO.cube
>> 2800 Kelvin - 6500 Kelvin - 1.56 CTO.cube
>> 3200 Kelvin - 2800 Kelvin - 0.34 CTB.cube
>> 3200 Kelvin - 4300 Kelvin - 0.61 CTO.cube
>> 3200 Kelvin - 5500 Kelvin - 1.00 CTO.cube
>> 3200 Kelvin - 6500 Kelvin - 1.21 CTO.cube
>> 4300 Kelvin - 2800 Kelvin - 0.95 CTB.cube
>> 4300 Kelvin - 3200 Kelvin - 0.61 CTB.cube
>> 4300 Kelvin - 5500 Kelvin - 0.39 CTO.cube
>> 4300 Kelvin - 6500 Kelvin - 0.60 CTO.cube
>> 5500 Kelvin - 2800 Kelvin - 1.34 CTB.cube
>> 5500 Kelvin - 3200 Kelvin - 1.00 CTB.cube
>> 5500 Kelvin - 4300 Kelvin - 0.39 CTB.cube
>> 5500 Kelvin - 6500 Kelvin - 0.21 CTO.cube
>> 6500 Kelvin - 2800 Kelvin - 1.56 CTB.cube
>> 6500 Kelvin - 3200 Kelvin - 1.21 CTB.cube
>> 6500 Kelvin - 4300 Kelvin - 0.60 CTB.cube
>> 6500 Kelvin - 5500 Kelvin - 0.21 CTB.cube
...
>> Are these LUTS my best options?  Or does the paid subscription offer better?
>> Or are there other white balance/color correcting LUTS elsewhere?
>

Think I figured it out: the Kelvin cube files listed above seem to be a 
compilation of gradual, generic tint LUT/cube files.

>You can make your own LUT, as described in chapter 2.28 in my book.
>-- Extract one frame from your video.
>-- Insert a haldclut in a corner of the image, or use xstack to attach it to
>the side.
>-- Use the program of your choice to correct the colors in this image. When
>done, save it lossless as 16-bit PNG.
>-- Use FFmpeg to separate the haldclut from the image.
>-- Apply the LUT to the whole video.
>
>Michael

Sorry, I had a little difficulty understanding the text within Chapter 2.28; I 
just couldn't scan and understand it easily.  Although I thoroughly understand 
SH/Bash, the variable assignments for simple filenames also slowed reading, as 
I had to scan/reference back to the top of the page for the definition/intent 
of each variable used.  It's easier for the reader to see "input_video.ext" and 
"output_video.ext", or similarly named variables such as $_input_video.ext and 
$_output_video.ext.  Note: a prefixed underscore within SH/Bash variable names 
isolates them from any possible conflicts; I heard of this syntax on the Bash 
mailing list in the context of C-style defines in SH/Bash.  However, for the 
sake of easy readability for readers, it's probably best to use simple 
non-isolated variables.

Some of the terminology used in this chapter is above my head, but I've been 
looking to do something exactly as you described with my Nikon D5600 when 
taking raw photos, attaining results similar to its Capture NX-D/Studio.

I've flagged this Email and will hopefully return to it when I get more free 
time.  I can waste a full day on these larger tasks, especially on first-time 
workflows.

Don't worry, I did figure out how to apply LUTs, either from the chapter or 
from additional Google searching for ffmpeg Stack Exchange examples, etc.

Roger



Re: [FFmpeg-user] Fluorescent White Balance Video Filters

2022-04-19 Thread Roger
> On Tue, Apr 19, 2022 at 06:32:39PM +, Carl Zwanzig wrote:
>On 4/19/2022 11:07 AM, Roger wrote:
>> I noticed the colorcorrect filter, for use with augmenting the Red and Blue
>> channels.
>
>Touching on that- in most video systems green is the fixed* level and R/B are
>adjusted to that. Makes sense that ffmpeg does the same.
>
>*not separately adjustable
>
>z!

Ah!  That makes sense as I was manually playing with RGB linear values, and 
noticed a pattern.



Re: [FFmpeg-user] Fluorescent White Balance Video Filters

2022-04-19 Thread Roger
> On Tue, Apr 19, 2022 at 08:25:17PM +0200, Michael Koch wrote:
>Am 19.04.2022 um 03:34 schrieb Roger:
>> I'm not finding much about fixing video having incorrect white balance, more
>> specifically video with incorrect or forgotten fluorescent white balance
>> setting during recording.
...
>> 
>> Question, what is the proper method of applying such a missing white balance
>> fluorescent filter to a video file using ffmpeg?
>
>I would do that with a color-look-up-table. The procedure is described
>step-by-step in chapter 2.27 and an easier simplified version is described in
>chapter 2.28 in my book:
>http://www.astro-electronic.de/FFmpeg_Book.pdf
>
>Michael

No stranger to these color tables for correcting white balance, as my Nikon 
DSLR, and likely most recent DSLR cameras, use correcting color profiles 
these days!

However, it seems there are few options for video files.

Found through Google,
https://lutify.me/free-white-balance-correction-luts-for-everyone/
Download: Free White Balance Correction LUTs
Lutify-me-Free-White-Balance-Correction-3D-LUTs.zip

2800 Kelvin - 3200 Kelvin - 0.34 CTO.cube
2800 Kelvin - 4300 Kelvin - 0.95 CTO.cube
2800 Kelvin - 5500 Kelvin - 1.34 CTO.cube
2800 Kelvin - 6500 Kelvin - 1.56 CTO.cube
3200 Kelvin - 2800 Kelvin - 0.34 CTB.cube
3200 Kelvin - 4300 Kelvin - 0.61 CTO.cube
3200 Kelvin - 5500 Kelvin - 1.00 CTO.cube
3200 Kelvin - 6500 Kelvin - 1.21 CTO.cube
4300 Kelvin - 2800 Kelvin - 0.95 CTB.cube
4300 Kelvin - 3200 Kelvin - 0.61 CTB.cube
4300 Kelvin - 5500 Kelvin - 0.39 CTO.cube
4300 Kelvin - 6500 Kelvin - 0.60 CTO.cube
5500 Kelvin - 2800 Kelvin - 1.34 CTB.cube
5500 Kelvin - 3200 Kelvin - 1.00 CTB.cube
5500 Kelvin - 4300 Kelvin - 0.39 CTB.cube
5500 Kelvin - 6500 Kelvin - 0.21 CTO.cube
6500 Kelvin - 2800 Kelvin - 1.56 CTB.cube
6500 Kelvin - 3200 Kelvin - 1.21 CTB.cube
6500 Kelvin - 4300 Kelvin - 0.60 CTB.cube
6500 Kelvin - 5500 Kelvin - 0.21 CTB.cube

Yea, horrendously named file names!

$ ffmpeg -i 3.avi -vf lut3d=/tmp/lut/Lutify.me\ Free\ White\ Balance\ 
Correction\ 3D\ LUTs/3D\ LUTs/6500\ Kelvin\ -\ 5500\ Kelvin\ -\ 0.21\ CTB.cube 
-crf 27 -preset veryfast -c:a copy test2.mp4


On my second try, I tried "6500 Kelvin - 5500 Kelvin - 0.21" and this is very 
close to RawTherapee's Philips TL85 fluorescent white balance setting results, 
if not better than RawTherapee!  Still not absolutely perfect, likely due more 
to VHS tape degradation or re-recording degradation, as this section/session of 
the VHS tape was a recording of a recording of a recording during the 1990s.

For reference, for those wondering how I digitized without a time-correcting 
device: I used ffmpeg on the compressed original video captures to fix the 
video/audio sync problem, which was trial and error, subsequently using the 
yadif deinterlace filter with the double-rate (send_field) option:
 -filter:v "setpts=0.8015*PTS,yadif=1"
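
Spelled out as a full command, that resync-plus-deinterlace step might look like this (file names are placeholders and the 0.8015 factor is specific to my capture; the audio is left untouched while the video timestamps are compressed to catch up with it):

```shell
ffmpeg -i capture-original.avi \
       -filter:v "setpts=0.8015*PTS,yadif=1" \
       -c:a copy output-resynced.mp4
```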

Are these LUTS my best options?  Or does the paid subscription offer better?  
Or are there other white balance/color correcting LUTS elsewhere?




Re: [FFmpeg-user] Fluorescent White Balance Video Filters

2022-04-19 Thread Roger
For those wondering about the fluorescent color settings, for filtering 
different standard fluorescent lighting scenarios within photography or videos:

Color temperature white points Wikipedia
https://en.wikipedia.org/wiki/Template:Color_temperature_white_points

This includes my video containing "F10 Philips TL85, Ultralume 50", having an 
overly green saturation or green mask.

Looks like I'll need to somehow convert those CIE color space values to RGB?

I noticed the colorcorrect filter, for use with augmenting the Red and Blue 
channels.  Again, there doesn't appear to be much said about colorcorrect when 
searching via Google.
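
As a starting point for the CIE-to-RGB question: this is not from the thread, but a well-known rough curve fit (Tanner Helland's) maps a correlated color temperature in Kelvin straight to sRGB. A sketch for deriving tint gains, not a colorimetrically exact CIE xy -> XYZ -> sRGB conversion:

```python
import math

def kelvin_to_rgb(temp_k):
    """Rough sRGB approximation of a black-body color at temp_k Kelvin
    (Tanner Helland's curve fit); valid roughly for 1000K-40000K."""
    t = temp_k / 100.0
    # Red channel: flat at 255 up to ~6600K, then falls off
    if t <= 66:
        r = 255.0
    else:
        r = 329.698727446 * ((t - 60) ** -0.1332047592)
    # Green channel: logarithmic rise, then power-law fall
    if t <= 66:
        g = 99.4708025861 * math.log(t) - 161.1195681661
    else:
        g = 288.1221695283 * ((t - 60) ** -0.0755148492)
    # Blue channel: zero for very warm light, flat at 255 above ~6600K
    if t >= 66:
        b = 255.0
    elif t <= 19:
        b = 0.0
    else:
        b = 138.5177312231 * math.log(t - 10) - 305.0447927307
    clamp = lambda x: max(0, min(255, int(round(x))))
    return clamp(r), clamp(g), clamp(b)

print(kelvin_to_rgb(6600))  # near-white
print(kelvin_to_rgb(2800))  # warm/orange: red dominates, blue is low
```

The resulting RGB triplets could then feed red/blue gains for something like the colorcorrect filter mentioned above.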
  


Re: [FFmpeg-user] Issues deinterlacing DirectShow input with ffplay

2022-04-18 Thread Roger Pack
On Mon, Apr 11, 2022 at 2:50 PM Alex  wrote:
>
> On Sunday, April 10th, 2022 at 1:18 AM, Roger Pack  
> wrote:
> > Input frame rate is still the same both ways?
>
> Yes, 29.97 fps either way (didn't realize that it's covered in the video;
> the log files should have everything though).

Can you replicate it not using dshow?


Re: [FFmpeg-user] Change device setting from CLI?

2022-04-18 Thread Roger Pack
On Sun, Apr 10, 2022 at 7:28 AM Steve Russell  wrote:
>
> I bought an HD webcam during the pandemic and it's been pretty good except 
> the colours were a bit off. I finally decided to try and do something about 
> it. The app I use most with it (Zoom) doesn't offer any way to tweak the 
> camera settings so, after a search, I found that FFmpeg can do the job. If I 
> run this command:
>
> ffmpeg-5.0.1-full_build\bin\ffmpeg -f dshow -show_video_device_dialog true -i 
> video="HD Web Camera"
>
> I get a dialog that lets me change several settings. Simply dropping the 
> saturation down to about 60 does the trick but the change doesn't stick - I 
> assume Zoom is resetting the camera when it starts up. To avoid having to 
> manually set the saturation as above, it would be great to have a command 
> line to change the setting. I've experimented with:
>
> ffmpeg-5.0.1-full_build\bin\ffmpeg -f dshow -i video="HD Web Camera" -vf 
> eq=saturation=0.6 output
>
> but this runs without changing the appearance in Zoom. I'm guessing that this 
> is because I'm putting the filter in between the camera and "output" rather 
> than changing the device settings directly.
>
> I've dug around in the documentation but find it somewhat overwhelming and 
> don't really know where to start looking. Any pointers gratefully received!

There are some filters that support settings but apparently it's hit or miss
https://stackoverflow.com/a/27931102/32453
you can try the video_device_load and video_device_save params, if
they work it works, otherwise you're kind of stuck.
Good luck!
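
If those params do work for the camera, the usage might look something like this (an untested sketch; "mysettings" is a hypothetical file name for the saved device state):

```shell
# Open the dialog, tweak saturation, then save the device state to a file
ffmpeg -f dshow -show_video_device_dialog true -video_device_save mysettings \
       -i video="HD Web Camera" -f null -
# Later, reload the saved state before capturing
ffmpeg -f dshow -video_device_load mysettings -i video="HD Web Camera" -f null -
```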


[FFmpeg-user] Fluorescent White Balance Video Filters

2022-04-18 Thread Roger
I'm not finding much about fixing video having incorrect white balance, more 
specifically video with an incorrect or forgotten fluorescent white balance 
setting during recording.

I have a very old VHS recorded tape without a fluorescent white balance 
applied, and as such a green mask throughout the video.

I started examining one frame of the video, luckily having distinct red, white, 
and blue colors, and was viewing the frame within RawTherapee and tinkering with 
white balance settings.  Lo and behold, after applying the "Philips TL85 
fluorescent" (designated as "F10 - Philips TL85") white balance preset profile, 
the red/green/blue values all appeared exact!

However, I cannot find this preset profile within RawTherapee's installed 
files, nor do the *.pp3 sidecar files indicate specific values, hinting the 
values are either hard-coded or buried further within the installed system 
files; case-insensitive fgrep searching still fails to find the specifics of 
this filter.

Question, what is the proper method of applying such a missing white balance 
fluorescent filter to a video file using ffmpeg?

I'm surprised there's so little mentioned via a Google search, as users are 
always mucking up the white balance settings while recording video or taking 
photos, unless using a color/gray card.


Roger





Re: [FFmpeg-user] Issues deinterlacing DirectShow input with ffplay

2022-04-09 Thread Roger Pack
Input frame rate is still the same both ways?

On Fri, Aug 13, 2021 at 10:06 PM Alex via ffmpeg-user
 wrote:
>
> I'm using ffplay as a live preview for my capture card, and I'm trying to use
> the yadif filter in mode 1 (send_field) to deinterlace the 59.94i input to
> 59.94 fps.
>
> Using this command works as expected:
>
>   ffplay -f dshow -i video="SA7160 PCI, Analog 01 Capture" -vf yadif=1
>
> However, if I try the same command with an audio device included, the frame
> rate appears capped at 29.97 fps as if I were using mode 0 (send_frame):
>
>   ffplay -f dshow -i video="SA7160 PCI, Analog 01 Capture":audio="SA7160 PCI,
>   Analog 01 WaveIn" -vf yadif=1
>
> Why does the filter only work properly when there is no audio device on the
> input? Is there a workaround for this or is it a bug?
>
> Here's a screen recording as a demonstration: https://streamable.com/p0v9rk
> Console output is attached.
>
> Alex


Re: [FFmpeg-user] ADD REAL TIME STAMP TO RTSP STREAM SEGMENT

2020-04-28 Thread Roger Pack
On Fri, Apr 17, 2020 at 5:38 PM Michael Glenn Williams
 wrote:
>
> We have interest in being able to read timestamps in protocol headers also.
>
> Do we have a doc that specifies which protocols in the video stream ffmpeg
> supports with a timecode, and what format of the timecode is?

Probably depends on the protocol.  ffprobe comes to mind.
My hunch/guess is it differs between .mp4 and .ts or whatnot (container).
Maybe you can pull it out of the RTSP stream, but you'd be parsing raw bytes?
GL!
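
For poking at what timestamps a given container actually exposes, ffprobe can dump them (a sketch; "input.mp4" stands in for your recorded segment or stream URL):

```shell
ffprobe -show_entries format=start_time:stream=codec_type,start_time \
        -of json input.mp4
```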

Re: [FFmpeg-user] ADD REAL TIME STAMP TO RTSP STREAM SEGMENT

2020-04-17 Thread Roger Pack
On Tue, Sep 3, 2019 at 3:01 PM Alejandro Escudero
 wrote:
>
> Hi,
>
> I am trying to get a RTSP stream from an IP Camera and save that stream as 
> several  MP4 segments, but i need that each file segment name gets the real 
> time stamp of the video stream. (If I use the -strftime 1, it gets the time 
> of the local machine but i need the real rtsp time). How can I get the Real 
> Time Stamp?
>
> I am using this command:
>
>
> ffmpeg -rtsp_transport tcp -i 
> "rtsp://admin:p...@myddns.dyndns.org:554/cam/realmonitor?channel=1&subtype=0" 
> -f segment -segment_time 5 -c copy  OUT%d.mp4


Maybe if not mp4 it will use the capture ts?
Maybe -copyts 
https://ffmpeg-user.ffmpeg.narkive.com/fzTrnfHX/getting-precise-start-time-from-wall-clock-for-capture
?
Maybe -use_source_wallclock_as_timestamps
https://lists.ffmpeg.org/pipermail/ffmpeg-user/2016-March/031250.html
Maybe https://stackoverflow.com/questions/51085133/does-pts-have-to-start-at-0
GL!

Re: [FFmpeg-user] delay time in live ultrasound converter

2020-04-17 Thread Roger Pack
On Thu, Aug 22, 2019 at 3:16 PM Michael Koch
 wrote:
>
> Hello Paul,
>
> > ffplay and using pipe gives you huge delay. By using mpv and filtergraph
> > directly you would get much lesser delay.
> > Default delay introduced by this filter is the one set by win_size in
> > number of samples, which is by default 4096.
> > If you need less delay even than this one and can not use lower win_size
> > because of lesser precision use amultiply solution which
> > inherently have 0 delay.
>
> In my test the FFT method has a shorter delay time than the amultiply
> method.
> I just found out that the delay can be minimized by setting
> -audio_buffer_size to a very small value (10ms).
> Delay is now about 0.5 seconds. Short enough to see and hear the bats
> simultaneously.

See also https://trac.ffmpeg.org/wiki/DirectShow#BufferingLatency
GL! :)
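
The -audio_buffer_size trick mentioned above, spelled out (the device name is illustrative and the value is in milliseconds):

```shell
ffplay -f dshow -audio_buffer_size 10 \
       -i audio="Microphone (Realtek High Definition Audio)"
```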

Re: [FFmpeg-user] DVD EIA-608 to ASS from stdin with custom style

2020-04-01 Thread Roger Pack
Does it use cc_dec for the decoder?

On Fri, Sep 20, 2019 at 12:06 AM CGar  wrote:
>
> When extracting EIA-608 captions from a DVD to an ASS file I am
> unsure how to  specify the style. I've seen snippets online about a
> force_style option but it seems to be about hardcoding them or
> specifying a subtitle file as input. Since I'm reading from stdin I'm
> unsure where or how to write the relevant part.
>
> My command line is as follows:
> [...] | ffmpeg -f lavfi -i 'movie=pipe\\:0[out0+subcc]' -map s sub_out.ass
>
> Where or how would I add something like:
> force_style=BorderStyle=1,Outline=2
>
> --
> -CGar-

Re: [FFmpeg-user] Ffmpeg is a Windows codec?

2020-03-30 Thread Roger Pack
On Thu, Sep 19, 2019 at 11:37 AM jamie marchant
 wrote:
>
> Is their a Windows codec available? I want to play Indeo video 3 files,
> which FFPlay can do but I want to play it through 'Video for
> Windows"(Windows NT/10 version). That way I can run an old piece of
> multimedia software.

possibly LAV filters or ffdshow tryouts (latter seems defunct now? dang...)
GL!

Re: [FFmpeg-user] "Invalid argument" when running ffmpeg.exe with a pipe as video input

2020-03-26 Thread Roger Pack
Same version of ffmpeg works on other computers?  I suppose you could
drill into the code and try and see what it's doing with pipes...

On Fri, Oct 18, 2019 at 8:02 AM Alessandro Santos
 wrote:
>
> I am creating a c# application to record videos H264.
>
> The workflow of my application is like this:
> 1) My application connects to a webcam using DirectShow and presents the 
> frames in the UI;
> 2) When recording the app starts a FFmpeg process that accepts a video input 
> from a pipe;
> 3) After that in my app a NamedPipeServerStream is created and the frames are 
> wrote there.
>
> Till now, I never had a problem with this method, but yesterday I started 
> having problems in a new computer. The problem seems to be related with the 
> creation of the pipe.
>
> This is the command I am currently running:
>
> -loglevel debug -thread_queue_size 1024 -framerate 30 -f rawvideo -pix_fmt 
> bgra -video_size 848x480 -use_wallclock_as_timestamps true -i 
> \\.\pipe\28e1b96a-c6ea-43f2-88a0-2e7e4bfb48ed -framerate 25 -vcodec libx264 
> -crf 23 -pix_fmt yuv420p -preset ultrafast -framerate 30  
> "2032787211637070049780945982.mp4"
>
> And these are the logs I got from running the command:
>
> ffmpeg version N-91715-gd71dfc087b Copyright (c) 2000-2018 the FFmpeg 
> developers
>   built with gcc 8.2.1 (GCC) 20180813
>   configuration: --enable-gpl --enable-version3 --enable-sdl2 
> --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass 
> --enable-libbluray --enable-libfreetype --enable-libmp3lame 
> --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg 
> --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr 
> --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack 
> --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 
> --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab 
> --enable-libvorbis --enable-libvo-amrwbenc --enable-libmysofa 
> --enable-libspeex --enable-libxvid --enable-libaom --enable-libmfx 
> --enable-amf --enable-ffnvcodec --enable-cuvid --enable-d3d11va 
> --enable-nvenc --enable-nvdec --enable-dxva2 --enable-avisynth
>   libavutil  56. 19.100 / 56. 19.100
>   libavcodec 58. 27.100 / 58. 27.100
>   libavformat58. 17.103 / 58. 17.103
>   libavdevice58.  4.101 / 58.  4.101
>   libavfilter 7. 26.100 /  7. 26.100
>   libswscale  5.  2.100 /  5.  2.100
>   libswresample   3.  2.100 /  3.  2.100
>   libpostproc55.  2.100 / 55.  2.100
> Splitting the commandline.
> Reading option '-loglevel' ... matched as option 'loglevel' (set logging 
> level) with argument 'debug'.
> Reading option '-thread_queue_size' ... matched as option 'thread_queue_size' 
> (set the maximum number of queued packets from the demuxer) with argument 
> '1024'.
> Reading option '-framerate' ... matched as AVOption 'framerate' with argument 
> '30'.
> Reading option '-f' ... matched as option 'f' (force format) with argument 
> 'rawvideo'.
> Reading option '-pix_fmt' ... matched as option 'pix_fmt' (set pixel format) 
> with argument 'bgra'.
> Reading option '-video_size' ... matched as AVOption 'video_size' with 
> argument '848x480'.
> Reading option '-use_wallclock_as_timestamps' ... matched as AVOption 
> 'use_wallclock_as_timestamps' with argument 'true'.
> Reading option '-i' ... matched as input url with argument 
> '\\.\pipe\28e1b96a-c6ea-43f2-88a0-2e7e4bfb48ed'.
> Reading option '-framerate' ... matched as AVOption 'framerate' with argument 
> '25'.
> Reading option '-vcodec' ... matched as option 'vcodec' (force video codec 
> ('copy' to copy stream)) with argument 'libx264'.
> Reading option '-crf' ... matched as AVOption 'crf' with argument '23'.
> Reading option '-pix_fmt' ... matched as option 'pix_fmt' (set pixel format) 
> with argument 'yuv420p'.
> Reading option '-preset' ... matched as AVOption 'preset' with argument 
> 'ultrafast'.
> Reading option '-framerate' ... matched as AVOption 'framerate' with argument 
> '30'.
> Reading option '2032787211637070049780945982.mp4' ... matched as output url.
> Finished splitting the commandline.
> Parsing a group of options: global .
> Applying option loglevel (set logging level) with argument debug.
> Successfully parsed a group of options.
> Parsing a group of options: input url 
> \\.\pipe\28e1b96a-c6ea-43f2-88a0-2e7e4bfb48ed.
> Applying option thread_queue_size (set the maximum number of queued packets 
> from the demuxer) with argument 1024.
> Applying option f (force format) with argument rawvideo.
> Applying option pix_fmt (set pixel format) with argument bgra.
> Successfully parsed a group of options.
> Opening an input file: \\.\pipe\28e1b96a-c6ea-43f2-88a0-2e7e4bfb48ed.
> [rawvideo @ 017f0409aec0] Opening 
> '\\.\pipe\28e1b96a-c6ea-43f2-88a0-2e7e4bfb48ed' for reading
> [file @ 017f0409c800] Setting default whitelist 'file,crypto'
> \\.\pipe\28e1b96a-c6ea-43f2-88a0-2e7e4bfb48ed: Invalid argument
>
> The really strange pa

Re: [FFmpeg-user] screen cast windows 10 desktop with audio

2020-03-26 Thread Roger Pack
Screen cast it to where?

On Mon, Mar 16, 2020 at 1:52 PM Lance Bermudez  wrote:
>
> How do i screen cast a windows 10 desktop with audio?
>
> PS C:\Users\lance> ffmpeg -list_devices true -f dshow -i dummy
>
>   ffmpeg version 4.2.2 Copyright (c) 2000-2019 the FFmpeg
> developers
>   built with gcc 9.2.1 (GCC) 20200122
>   configuration: --disable-static --enable-shared --enable-gpl
> --enable-version3 --enable-sdl2 --enable-fontconfig --enable-gnutls
> --enable-iconv --enable-libass --enable-libdav1d --enable-libbluray
> --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb
> --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus
> --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libtheora
> --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp
> --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg
> --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab
> --enable-libvorbis --enable-libvo-amrwbenc --enable-libmysofa
> --enable-libspeex --enable-libxvid --enable-libaom --enable-libmfx
> --enable-amf --enable-ffnvcodec --enable-cuvid --enable-d3d11va
> --enable-nvenc --enable-nvdec --enable-dxva2 --enable-avisynth
> --enable-libopenmpt
>   libavutil      56. 31.100 / 56. 31.100
>   libavcodec     58. 54.100 / 58. 54.100
>   libavformat    58. 29.100 / 58. 29.100
>   libavdevice    58.  8.100 / 58.  8.100
>   libavfilter     7. 57.100 /  7. 57.100
>   libswscale      5.  5.100 /  5.  5.100
>   libswresample   3.  5.100 /  3.  5.100
>   libpostproc    55.  5.100 / 55.  5.100
> [dshow @ 03261400] DirectShow video devices (some may be both video and
> audio devices)
> [dshow @ 03261400]  "HD WebCam"
> [dshow @ 03261400] Alternative name
> "@device_pnp_\\?\usb#vid_04f2&pid_b452&mi_00#7&a554dbd&0&#{65e8773d-8f56-11d0-a3b9-00a0c9223196}\global"
> [dshow @ 03261400] DirectShow audio devices
> [dshow @ 03261400]  "Microphone (Realtek High Definition Audio)"
> [dshow @ 03261400] Alternative name
> "@device_cm_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\wave_{634C03E2-5FAF-41B6-8FDE-76EA6186F1B1}"
>
> I have been using this.
> PS C:\Users\lance> ffmpeg -rtbufsize 1500M -f gdigrab -framerate ntsc
> -draw_mouse 1 -probesize 20M -video_size 1366x768 -thread_queue_size 512 -i
> desktop -f dshow -thread_queue_size 512 -i audio="Microphone (Realtek High
> Definition Audio)" -preset ultrafast -r 15 -vcodec libx264 -acodec aac -ac
> 1 -ar 44100 -async 15 "C:\Users\lance\Videos\Captures\out13.mp4"
>
> ffmpeg version
> 4.2.2 Copyright (c) 2000-2019 the FFmpeg developers
>   built with gcc 9.2.1 (GCC) 20200122
>   configuration: --disable-static --enable-shared --enable-gpl
> --enable-version3 --enable-sdl2 --enable-fontconfig --enable-gnutls
> --enable-iconv --enable-libass --enable-libdav1d --enable-libbluray
> --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb
> --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus
> --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libtheora
> --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp
> --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg
> --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab
> --enable-libvorbis --enable-libvo-amrwbenc --enable-libmysofa
> --enable-libspeex --enable-libxvid --enable-libaom --enable-libmfx
> --enable-amf --enable-ffnvcodec --enable-cuvid --enable-d3d11va
> --enable-nvenc --enable-nvdec --enable-dxva2 --enable-avisynth
> --enable-libopenmpt
>   libavutil      56. 31.100 / 56. 31.100
>   libavcodec     58. 54.100 / 58. 54.100
>   libavformat    58. 29.100 / 58. 29.100
>   libavdevice    58.  8.100 / 58.  8.100
>   libavfilter     7. 57.100 /  7. 57.100
>   libswscale      5.  5.100 /  5.  5.100
>   libswresample   3.  5.100 /  3.  5.100
>   libpostproc    55.  5.100 / 55.  5.100
> [gdigrab @ 039b1c00] Capturing whole desktop as 1366x768x32 at (0,0)
> Input #0, gdigrab, from 'desktop':
>   Duration: N/A, start: 1584387532.163434, bitrate: 1006131 kb/s
> Stream #0:0: Video: bmp, bgra, 1366x768, 1006131 kb/s, 29.97 fps, 29.25
> tbr, 1000k tbn, 1000k tbc
> Guessed Channel Layout for Input Stream #1.0 : stereo
> Input #1, dshow, from 'audio=Microphone (Realtek High Definition Audio)':
>   Duration: N/A, start: 123897.214000, bitrate: 1411 kb/s
> Stream #1:0: Audio: pcm_s16le, 44100 Hz, stereo, s16, 1411 kb/s
> File 'C:\Users\lance\Videos\Captures\out13.mp4' already exists. Overwrite ?
> [y/N] y
> Stream mapping:
>   Stream #0:0 -> #0:0 (bmp (native) -> h264 (libx264))
>   Stream #1:0 -> #0:1 (pcm_s16le (native) -> aac (native))
> Press [q] to stop, [?] for help
> [libx264 @ 03a27500] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX
> BMI1 SlowPshufb
> [libx264 @ 03a27500] profile High 4:4:4 Predictive, level 3.2, 4:4:4, 8-bit
> [libx264 @ 03a27500] 264 - core 159 - H.264/MPEG-4 AVC codec - Copyleft

Re: [FFmpeg-user] Unable To Transcode 8 sources simultaneously Real Time In One Instance

2019-06-07 Thread Roger Pack
Yeah, if you are exhausting a huge buffer like that it means you are
getting way behind in the real-time aspect, yes?

On Mon, May 13, 2019 at 4:50 PM Gabriel Balaich  wrote:
>
> >
> > 2G seems like a very large buffer...
> >
>
> Even when in the context of attempting to transcode 9k worth of footage
> real-time?
> ___
> ffmpeg-user mailing list
> ffmpeg-user@ffmpeg.org
> https://ffmpeg.org/mailman/listinfo/ffmpeg-user
>
> To unsubscribe, visit link above, or email
> ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".

Re: [FFmpeg-user] Detect Frozen video

2019-05-13 Thread Roger Pack
On Wed, Nov 28, 2018 at 4:46 AM José María Infanzón  wrote:
>
> Hi All, I'm streaming a live channel and I want to use ffmpeg to monitor
> the stream, what I need to check is when the image is frozen. Is there a
> way yo achieve this? I've read that I can use blend function, but not sure
> how.

ffmpeg freezes? Wonder what causes the freeze...
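For what it's worth, newer ffmpeg builds (4.2 and later) ship a freezedetect filter that was made for exactly this kind of monitoring; a sketch along these lines (the stream URL and thresholds are placeholders) logs when the picture stops changing:

```shell
# freezedetect logs lavfi.freezedetect.freeze_start / freeze_end when
# successive frames differ by less than the noise threshold.
#   n = noise tolerance (here -60 dB), d = minimum freeze duration in seconds.
ffmpeg -i rtmp://example.com/live/stream \
  -vf "freezedetect=n=-60dB:d=2" \
  -map 0:v:0 -f null - 2>&1 | grep freezedetect
```

The `-f null -` output discards the decoded video, so this runs as a pure monitor without writing a file.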

Re: [FFmpeg-user] Unable To Transcode 8 sources simultaneously Real Time In One Instance

2019-05-13 Thread Roger Pack
 -rtbufsize 2147.48M
2GB? maybe it's just filling right up...

On Mon, Jan 14, 2019 at 12:10 PM Gabriel Balaich  wrote:
>
> I've decided to post this same question on Superuser in an attempt to get
> more eyes on the issue:
> https://superuser.com/questions/1394234/unable-to-transcode-8-sources-in-one-instance-real-time
>
> If I receive any possible solutions from here or there I will update both
> parties.
>
> Any insight would be greatly appreciated.
>
> >

Re: [FFmpeg-user] does ffmpeg support ffserver

2019-05-13 Thread Roger Pack
It lists a few streaming options here
https://trac.ffmpeg.org/wiki/StreamingGuide FWIW.

On Sat, Apr 6, 2019 at 8:21 AM qw  wrote:
>
> Hi,
>
>
> Thanks for your help.
>
>
> Have you used python-librtmp to transfer rtmp stream?
>
>
> Thanks
>
>
> Regards
>
>
> Andrew
>
> At 2019-04-01 06:13:19, "Michael Shaffer"  wrote:
> >Here is the script I made for restarting the cameras.. Could alter it so it
> >lets you switch streams etc..
> >
> >
> >import tkinter as tk
> >from tkinter import *
> >import mysql.connector
> >import subprocess
> >from subprocess import Popen, PIPE, STDOUT, call
> >import threading
> >import re
> >import time
> >
> >top = tk.Tk()
> >t = None
> >
> ># Enable auto restart for individual cameras
> >cam1 = 1
> >cam2 = 1
> >cam3 = 1
> >cam4 = 1
> >cam5 = 1
> >
> ># Time of last restart
> >c1lastRestartTime = time.time()
> >c2lastRestartTime = time.time()
> >c3lastRestartTime = time.time()
> >c4lastRestartTime = time.time()
> >c5lastRestartTime = time.time()
> >
> >c2UptimeTotal = 0
> >
> >c1RecordUptime = 0
> >c2RecordUptime = 0
> >c3RecordUptime = 0
> >c4RecordUptime = 0
> >c5RecordUptime = 0
> >
> ># Counts to keep track of how many times cameras restart
> >cam1RestartCount = 1
> >cam2RestartCount = 1
> >cam3RestartCount = 1
> >cam4RestartCount = 1
> >cam5RestartCount = 1
> >
> ># Disabled by default, enabled using button
> >autoRestart = 0
> >
> >def Start1():
> >  global c1lastRestartTime
> >  global cam1RestartCount
> >  timeNow = time.time()
> >  delay = timeNow - c1lastRestartTime
> >  print(delay)
> >  if delay > 20:
> >c1lastRestartTime = time.time()
> >cam1RestartCount = cam1RestartCount + 1
> >addRecord(1)
> >Stop1()
> >myUrl='xterm -geometry 80x25+50+20 -fg green -hold -e
> >/home/dell/ffmpeg1 -rtsp_transport tcp -thread_queue_size 5096 -i \"rtsp://
> >admin:pw@192.168.1.64:554\" -rtbufsize 1000M -f lavfi -f dshow -f alsa -i
> >default -c:a libmp3lame -ab 128k -ar 44100 -c:v copy -f flv \"rtmp://
> >a.rtmp.youtube.com/live2/\"'
> >subprocess.Popen(myUrl, shell=True)
> >print("Camera 1 started.")
> >def Start2():
> >  global c2lastRestartTime
> >  global cam2RestartCount
> >  timeNow = time.time()
> >  delay = timeNow - c2lastRestartTime
> >  print(delay)
> >  if delay > 20:
> >c2lastRestartTime = time.time()
> >cam2RestartCount = cam2RestartCount + 1
> >addRecord(2)
> >Stop2()
> >myUrl='xterm -geometry 80x25+50+380 -fg green -hold -e
> >/home/dell/ffmpeg2 -rtsp_transport tcp -thread_queue_size 5096 -i \"rtsp://
> >admin:pw@192.168.1.102:554/VideoInput/1/h264/1\" -f lavfi -f dshow
> >-rtbufsize 500M -f alsa -i default -c:a libmp3lame -ab 128k -ar 44100 -c:v
> >copy -threads 0 -bufsize 512k -f flv \"rtmp://
> >a.rtmp.youtube.com/live2/xxx\"'
> >subprocess.Popen(myUrl, shell=True)
> >print("Camera 2 started.")
> >
> >def Start3():
> >  global c3lastRestartTime
> >  global cam3RestartCount
> >  timeNow = time.time()
> >  delay = timeNow - c3lastRestartTime
> >  print(delay)
> >  if delay > 20:
> >c3lastRestartTime = time.time()
> >cam3RestartCount = cam3RestartCount + 1
> >addRecord(3)
> >Stop3()
> >myUrl='xterm -geometry 80x25+50+730 -fg green -hold -e
> >/home/dell/ffmpeg3 -rtsp_transport tcp -thread_queue_size 5096 -i \"rtsp://
> >admin:pw@192.168.1.103:554/VideoInput/1/h264/1\" -f lavfi -f dshow
> >-rtbufsize 500M -f alsa -i default -c:a libmp3lame -ab 128k -ar 44100 -c:v
> >copy -threads 0 -bufsize 512k -f flv \"rtmp://
> >a.rtmp.youtube.com/live2/x\"'
> >subprocess.Popen(myUrl, shell=True)
> >print("Camera 3 started.")
> >
> >def Start4():
> >  global c4lastRestartTime
> >  global cam4RestartCount
> >  timeNow = time.time()
> >  delay = timeNow - c4lastRestartTime
> >  print(delay)
> >  if delay > 20:
> >c4lastRestartTime = time.time()
> >cam4RestartCount = cam4RestartCount + 1
> >addRecord(4)
> >Stop4()
> >myUrl='xterm -geometry 80x25+560+20 -fg green -hold -e
> >/home/dell/ffmpeg4 -rtsp_transport tcp -thread_queue_size 5096 -i \"rtsp://
> >admin:pw@192.168.1.104:554/VideoInput/1/h264/1\" -rtbufsize 500M -f lavfi
> >-f dshow -f alsa -i default -c:a libmp3lame -ab 128k -ar 44100 -c:v copy -f
> >flv \"rtmp://a.rtmp.youtube.com/live2/xx\"'
> >subprocess.Popen(myUrl, shell=True)
> >print("Camera 4 started.")
> >
> >def Start5():
> >  global c5lastRestartTime
> >  global cam5RestartCount
> >  timeNow = time.time()
> >  delay = timeNow - c5lastRestartTime
> >  print(delay)
> >  if delay > 20:
> >c5lastRestartTime = time.time()
> >cam5RestartCount = cam5RestartCount + 1
> >addRecord(5)
> >Stop5()
> >myUrl='xterm -geometry 80x25+560+380 -fg green -hold -e
> >/home/dell/ffmpeg5 -rtsp_transport tcp -thread_queue_size 5096 -i \"rtsp://
> >admin:pw@192.168.1.105:554/VideoInput/1/h264/1\" -f lavfi -f dshow
> >-rtbufsize 500M -f alsa -i default -c:a libmp3lame -ab 128k -ar 44100 -c:v
> >cop
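The five StartN functions in the quoted script differ only in camera number, xterm geometry, and stream URLs; a parameterized sketch like the following (URLs, geometry, and stream keys are hypothetical placeholders) removes the duplication while keeping the same restart-throttling behavior:

```python
import subprocess
import time

# Hypothetical per-camera settings; RTSP/RTMP URLs are placeholders.
CAMERAS = {
    1: {"geometry": "80x25+50+20",  "rtsp": "rtsp://admin:pw@192.168.1.64:554",
        "rtmp": "rtmp://a.rtmp.youtube.com/live2/KEY1"},
    2: {"geometry": "80x25+50+380", "rtsp": "rtsp://admin:pw@192.168.1.102:554",
        "rtmp": "rtmp://a.rtmp.youtube.com/live2/KEY2"},
}
last_restart = {n: 0.0 for n in CAMERAS}
restart_count = {n: 0 for n in CAMERAS}

def build_command(num):
    """Assemble the xterm/ffmpeg command line for camera `num`."""
    cam = CAMERAS[num]
    return (f'xterm -geometry {cam["geometry"]} -fg green -hold -e '
            f'ffmpeg -rtsp_transport tcp -thread_queue_size 5096 '
            f'-i "{cam["rtsp"]}" -c:v copy -f flv "{cam["rtmp"]}"')

def start(num, min_delay=20):
    """Restart camera `num`, unless it restarted within the last min_delay s."""
    now = time.time()
    if now - last_restart[num] > min_delay:
        last_restart[num] = now
        restart_count[num] += 1
        subprocess.Popen(build_command(num), shell=True)
        print(f"Camera {num} started.")
```

Adding a sixth camera then becomes one more dictionary entry instead of another copy-pasted function.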

Re: [FFmpeg-user] "DirectShow video devices (some may be both video and audio devices)"

2019-05-13 Thread Roger Pack
DirectShow video devices (some may be both video
and audio devices)

Maybe? I think you can see the pins if you run with -loglevel verbose.
Good luck!
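Concretely (a command sketch; the device name below is machine-specific), -list_options prints each pin's supported resolutions and formats, which can show whether a card exposes audio through its video device:

```shell
# First enumerate all DirectShow devices:
ffmpeg -list_devices true -f dshow -i dummy

# Then list the pins/formats of one device (substitute your own device name):
ffmpeg -f dshow -list_options true -i video="AVerMedia HD Capture GC573 1"

# -loglevel verbose during capture also prints which pin gets selected:
ffmpeg -loglevel verbose -f dshow -i video="AVerMedia HD Capture GC573 1" out.mkv
```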

On Mon, Apr 29, 2019 at 8:00 PM Gabriel Balaich  wrote:
>
> Quick question regarding calling devices via dshow in FFmpeg,
>
> I'm trying to capture and Avermedia GC573, audio and video, but when I list
> devices there appears to be no audio device that would pair with the GC573.
>
> Here's me listing devices:
> PS C:\Users\Jordan> ffmpeg -list_devices true -f dshow -i dummy
> ffmpeg version N-93087-g2b8458fcc5 Copyright (c) 2000-2019 the FFmpeg
> developers
>   built with gcc 8.2.1 (GCC) 20181201
>   configuration: --enable-gpl --enable-version3 --enable-sdl2
> --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass
> --enable-libdav1d --enable-libbluray --enable-libfreetype
> --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb
> --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy
> --enable-libsoxr --enable-libtheora --enable-libtwolame --enable-libvpx
> --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265
> --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp
> --enable-libvidstab --enable-libvorbis --enable-libvo-amrwbenc
> --enable-libmysofa --enable-libspeex --enable-libxvid --enable-libaom
> --enable-libmfx --enable-amf --enable-ffnvcodec --enable-cuvid
> --enable-d3d11va --enable-nvenc --enable-nvdec --enable-dxva2
> --enable-avisynth --enable-libopenmpt
>   libavutil      56. 26.100 / 56. 26.100
>   libavcodec     58. 46.100 / 58. 46.100
>   libavformat    58. 26.100 / 58. 26.100
>   libavdevice    58.  6.101 / 58.  6.101
>   libavfilter     7. 48.100 /  7. 48.100
>   libswscale      5.  4.100 /  5.  4.100
>   libswresample   3.  4.100 /  3.  4.100
>   libpostproc    55.  4.100 / 55.  4.100
> [dshow @ 01688ade9f00] DirectShow video devices (some may be both video
> and audio devices)
> [dshow @ 01688ade9f00]  "Game Capture HD60 Pro (Video) (#01)"
> [dshow @ 01688ade9f00] Alternative name
> "@device_pnp_\\?\pci#ven_12ab&dev_0380&subsys_00061cfa&rev_00#4&33186293&0&00e8#{65e8773d-8f56-11d0-a3b9-00a0c9223196}\{6f814be9-9af6-43cf-9249-c03401000222}"
> [dshow @ 01688ade9f00]  "AVerMedia HD Capture GC573 1"
> [dshow @ 01688ade9f00] Alternative name
> "@device_pnp_\\?\pci#ven_1461&dev_0054&subsys_57301461&rev_00#4&3174068&0&00e0#{65e8773d-8f56-11d0-a3b9-00a0c9223196}\{adef4cb5-1401-4177-84ee-fe8b26c13a5b}"
> [dshow @ 01688ade9f00] DirectShow audio devices
> [dshow @ 01688ade9f00]  "SPDIF/ADAT (1+2) (RME Fireface UC)"
> [dshow @ 01688ade9f00] Alternative name
> "@device_cm_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\wave_{AADE0540-0E9D-4CFC-B16E-1E52492511CE}"
> [dshow @ 01688ade9f00]  "Game Capture HD60 Pro (Audio) (#01)"
> [dshow @ 01688ade9f00] Alternative name
> "@device_pnp_\\?\pci#ven_12ab&dev_0380&subsys_00061cfa&rev_00#4&33186293&0&00e8#{33d9a762-90c8-11d0-bd43-00a0c911ce86}\{6f814be9-9af6-43cf-9249-c03401000322}"
> [dshow @ 01688ade9f00]  "ADAT (5+6) (RME Fireface UC)"
> [dshow @ 01688ade9f00] Alternative name
> "@device_cm_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\wave_{31381BEE-58DA-47F2-BEFE-7D8A59C3E6BC}"
> [dshow @ 01688ade9f00]  "SPDIF coax. (RME Fireface UC)"
> [dshow @ 01688ade9f00] Alternative name
> "@device_cm_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\wave_{41CF9FF0-5B17-4620-BDA9-4CA0239F66BF}"
> [dshow @ 01688ade9f00]  "ADAT (3+4) (RME Fireface UC)"
> [dshow @ 01688ade9f00] Alternative name
> "@device_cm_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\wave_{67F06B32-DCFE-46D4-AACB-5344C542555E}"
> [dshow @ 01688ade9f00]  "Analog (5+6) (RME Fireface UC)"
> [dshow @ 01688ade9f00] Alternative name
> "@device_cm_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\wave_{87EA8908-4B10-4D7A-BC87-E1FD14EA99DB}"
> [dshow @ 01688ade9f00]  "Analog (7+8) (RME Fireface UC)"
> [dshow @ 01688ade9f00] Alternative name
> "@device_cm_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\wave_{9B92B138-51D3-419B-A3B7-F09596E0F3A7}"
> [dshow @ 01688ade9f00]  "Analog (3+4) (RME Fireface UC)"
> [dshow @ 01688ade9f00] Alternative name
> "@device_cm_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\wave_{C4A7F11E-A89D-4D2E-9C88-1CF3D70D5ABD}"
> [dshow @ 01688ade9f00]  "Analog (1+2) (RME Fireface UC)"
> [dshow @ 01688ade9f00] Alternative name
> "@device_cm_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\wave_{CE2E8C0E-5D8C-4EE4-9E3D-AE3B02D6DD1D}"
> [dshow @ 01688ade9f00]  "ADAT (7+8) (RME Fireface UC)"
> [dshow @ 01688ade9f00] Alternative name
> "@device_cm_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\wave_{E4072ABB-0A99-4973-9B89-C223017BEB85}"
> dummy: Immediate exit requested
>
> As can be seen there is only a video source for the GC573, but upon further
> examination I noticed the line "DirectShow video devices (some may be both
> video and audio devices)". So I just listed the GC573 a

Re: [FFmpeg-user] Cannot receive audio/video from Unreal WebRTC DirectShow Source filter

2018-10-27 Thread Roger Pack
On Sat, Oct 27, 2018 at 5:54 AM Roger Pack  wrote:
>
> On Thu, Sep 20, 2018 at 12:08 PM Maxim Ershtein  wrote:
> >
> > Hello,
> >
> > I am trying to record VP9/Opus -encoded live stream into .mkv file. The
> > live stream is being encoded by Chrome browser and sent via WebRTC to
> > Unreal Media Server.
> >
> > Unreal Media Server allows installing WebRTC DirectShow Source filter:
> > http://umediaserver.net/components/index.html
> >
> > This filter allows connecting to Unreal Media Server and receiving the
> > original stream.
> > It works fine in GraphEdit - I can render and play the stream.
> >
> > But my goal is to record it with ffmpeg to .mkv file with no transcoding.
> > So I am doing:
> >
> > *ffmpeg -f dshow -show_video_device_dialog true -i video="Unreal WebRTC
> > Client Source":audio="Unreal WebRTC Client Source" -c:a copy -c:v copy
> > c:\temp\output.mkv*
> >
> > According to documentation, it should work.
> > Notice that I am doing  *-show_video_device_dialog true*
> > This is because you need to initialize the filter, like all Unreal source
> > filters, with the URL where to pull the stream from. This URL is not
> > getting persisted anywhere, so each time you instantiate this filter, you
> > need to initialize it with the URL. So the dialog allows you to input the
> > URL.
> > But when I do it, ffmpeg reports *"Could not get media type"*.
> >
> > *What am I doing wrong?*
>
> full uncut console output please?  Also you might be able to put it in
> a gif file and play that through avisynth...

also output of ffmpeg -f dshow -list_devices true -i dummy

I wasn't aware ffmpeg could read arbitrary dshow filters?

Re: [FFmpeg-user] Cannot receive audio/video from Unreal WebRTC DirectShow Source filter

2018-10-27 Thread Roger Pack
On Thu, Sep 20, 2018 at 12:08 PM Maxim Ershtein  wrote:
>
> Hello,
>
> I am trying to record VP9/Opus -encoded live stream into .mkv file. The
> live stream is being encoded by Chrome browser and sent via WebRTC to
> Unreal Media Server.
>
> Unreal Media Server allows installing WebRTC DirectShow Source filter:
> http://umediaserver.net/components/index.html
>
> This filter allows connecting to Unreal Media Server and receiving the
> original stream.
> It works fine in GraphEdit - I can render and play the stream.
>
> But my goal is to record it with ffmpeg to .mkv file with no transcoding.
> So I am doing:
>
> *ffmpeg -f dshow -show_video_device_dialog true -i video="Unreal WebRTC
> Client Source":audio="Unreal WebRTC Client Source" -c:a copy -c:v copy
> c:\temp\output.mkv*
>
> According to documentation, it should work.
> Notice that I am doing  *-show_video_device_dialog true*
> This is because you need to initialize the filter, like all Unreal source
> filters, with the URL where to pull the stream from. This URL is not
> getting persisted anywhere, so each time you instantiate this filter, you
> need to initialize it with the URL. So the dialog allows you to input the
> URL.
> But when I do it, ffmpeg reports *"Could not get media type"*.
>
> *What am I doing wrong?*

full uncut console output please?  Also you might be able to put it in
a gif file and play that through avisynth...

> Again, using the same dialog in GraphEdit works fine and I can render pins
> and play.
>
> I am also looking for some automation of that dialog - how do I pass that
> URL to the filter using command line? The filter readme says:
> "Configuration parameters can be provided via property page or via
> IFileSourceFilter interface exposed by the filter. When configured via
> IFileSourceFilter interface, initialization string needs to be provided as
> first parameter of IFileSourceFilter::Load()
> method."
> So, I guess, ffmpeg could check if IFileSourceFilter interface is supported
> and then call a Load function with  command line - supplied parameter. If
> ffmpeg doesn't support that, any ideas on how to pass the URL to the filter
> programmatically?

Good idea...though it might take some time.  Funding would help :)

> Note that this is a pretty generic situation - I might have 10 ffmpeg
> instances loading this filter and each one is going to use a different URL.
> So the URL is not something static and cannot sit in registry or some
> config file. So same problem with other network source filters from Unreal,
> but I don't need others, as ffmpeg can receive rtsp, rtmp and mpeg-ts
> streams by itself.
> But ffmpeg, of course, cannot receive browser-encoded WebRTC streams with
> VP9/Opus encodings. Hence I am trying to use Unreal WebRTC DirectShow
> Source filter.

Re: [FFmpeg-user] Max rtbufsize Via dshow

2018-09-19 Thread Roger Pack
On Fri, Sep 14, 2018 at 10:33 AM Gabriel Balaich  wrote:
>
> Up until a few days ago I had been running two capture cards, 1x4K card and
> 1x1080P card, I would encode both streams with a
> GTX 1080 no dropped frames no issues. But just recently I replaced the
> 1080P capture card with another 4K capture card to prep
> for a new camera and can't get things working gracefully. On my first test
> encoding 2x4K60 streams simultaneously I dropped
> frames starting the recording *and *ending the recording, but nothing
> in-between.
>
> Errors displayed:
> [dshow @ 01499bb17180] real-time buffer [Video (00 Pro Capture HDMI
> 4K+)] [video input] too full or near too full (62% of
> size: 214748 [rtbufsize parameter])! frame dropped!
> [dshow @ 0149944e7080] real-time buffer [AVerMedia HD Capture GC573 1]
> [video input] too full or near too full (62% of
> size: 214748 [rtbufsize parameter])! frame dropped!
>
> The obvious answer, according to the warning, would be increasing rtbufsize
> but I seem to have hit the cap... If I try to increase
> rtbufsize passed 2147.48M I get another error:
> [dshow @ 0250df6c7080] Value 30.00 for parameter
> 'rtbufsize' out of range [0 - 2.14748e+09]
> [dshow @ 0250df6c7080] Error setting option rtbufsize to value 3000M.
> video=AVerMedia HD Capture GC573 1:audio=SPDIF/ADAT (1+2) (RME Fireface
> UC): Result too large
>
> Is this a baked in limitation of FFmpeg or dshow? And if it is, why impose
> said limitation?

The limit is "max signed INT" I believe (~2 billion bytes).
Anyway this message typically means your system isn't keeping up with
encoding.  I'd think nvenc would be able to handle it but maybe not?
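The 2147.48M figure matches the signed 32-bit ceiling exactly: rtbufsize is parsed into a 32-bit integer, and ffmpeg's size suffixes are decimal (M = 10^6), so the arithmetic works out as follows:

```python
# rtbufsize is bounded by the signed 32-bit integer range.
INT32_MAX = 2**31 - 1           # 2147483647 bytes
print(INT32_MAX)                # 2147483647

# With ffmpeg's decimal "M" suffix (1M = 1e6 bytes), the ceiling reads as:
print(INT32_MAX / 1e6)          # 2147.483647  -> the "2147.48M" cap

# 3000M = 3e9 bytes overflows the range [0 - 2.14748e+09] from the error text:
print(3000e6 > INT32_MAX)       # True
```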

> Since then I've tried messing with other parts of my command and found that
> by re-arranging my inputs / outputs I can prevent
> frame drops upon starting a recording but I still can't end a recording
> without dropping a bunch of frames.
>
> Anyone know why re-arranging my inputs / outputs would prevent frames from
> dropping when starting a recording or affect
> anything? Is it possible to bypass the rtbufsize cap? Any other ideas?
>
> Here is my full command, Ignore the blank numbers next to -ss, will be used
> to sync outputs when everything is working:
> ffmpeg -y -hide_banner -thread_queue_size  -indexmem 
> -guess_layout_max 0 -f dshow -rtbufsize 2147.48M `
> -i audio="Analog (1+2) (RME Fireface UC)" `
> -thread_queue_size  -indexmem  -guess_layout_max 0 -f dshow
> -rtbufsize 2147.48M `
> -i audio="ADAT (5+6) (RME Fireface UC)" `
> -thread_queue_size  -indexmem  -r 25 -f lavfi -rtbufsize 2147.48M
> -i color=c=black:s=50x50 `
> -thread_queue_size  -indexmem  -guess_layout_max 0 -f dshow
> -video_size 3840x2160 -rtbufsize 2147.48M `
> -framerate 60 -pixel_format nv12 -i video="AVerMedia HD Capture GC573
> 1":audio="SPDIF/ADAT (1+2) (RME Fireface UC)" `
> -thread_queue_size  -indexmem  -guess_layout_max 0 -f dshow
> -video_size 3840x2160 -rtbufsize 2147.48M `
> -framerate 60 -pixel_format nv12 -i video="Video (00 Pro Capture HDMI
> 4K+)":audio="ADAT (3+4) (RME Fireface UC)" `
> -map 2,0 -map 0 -c:v libx264 -r 25 -rc-lookahead 50 -forced-idr 1
> -sc_threshold 0 -flags +cgop `
> -force_key_frames "expr:gte(t,n_forced*2)" -preset ultrafast -pix_fmt nv12
> -b:v 16K -minrate 16K -maxrate 16K -bufsize 16k `
> -c:a aac -ar 44100 -b:a 384k -ac 2 -af "aresample=async=250" -vsync 1 -ss
> 00:00:00.000 `
> -max_muxing_queue_size  -f segment -segment_time 600 -segment_wrap 9
> -reset_timestamps 1 `
> -segment_format_options max_delay=0
> C:\Users\djcim\Videos\Main\Discord\Discord%02d.ts `
> -map 2,1 -map 1 -c:v libx264 -r 25 -rc-lookahead 50 -forced-idr 1
> -sc_threshold 0 -flags +cgop `
> -force_key_frames "expr:gte(t,n_forced*2)" -preset ultrafast -pix_fmt nv12
> -b:v 16K -minrate 16K -maxrate 16K -bufsize 16k `
> -c:a aac -ar 44100 -b:a 384k -ac 2 -af "aresample=async=250" -vsync 1 -ss
> 00:00:00.000 `
> -max_muxing_queue_size  -f segment -segment_time 600 -segment_wrap 9
> -reset_timestamps 1 `
> -segment_format_options max_delay=0
> C:\Users\djcim\Videos\Main\Soundboard\Soundboard%02d.ts `
> -map 3:0,3:1 -map 3:1 -c:v h264_nvenc -r 60 -rc-lookahead 120 -forced-idr 1
> -strict_gop 1 -sc_threshold 0 -flags +cgop `
> -force_key_frames "expr:gte(t,n_forced*2)" -preset: llhp -pix_fmt nv12 -b:v
> 250M -minrate 250M -maxrate 250M -bufsize 250M `
> -c:a aac -ar 44100 -b:a 384k -ac 2 -af "pan=mono|c0=c0,
> aresample=async=250" -vsync 1 -ss 00:00:00.000 `
> -max_muxing_queue_size  -f segment -segment_time 600 -segment_wrap 9
> -reset_timestamps 1 `
> -segment_format_options max_delay=0
> C:\Users\djcim\Videos\Main\Camera\Camera%02d.ts `
> -map 4:0,4:1 -map 4:1 -c:v h264_nvenc -r 60 -rc-lookahead 120 -forced-idr 1
> -strict_gop 1 -sc_threshold 0 -flags +cgop `
> -force_key_frames "expr:gte(t,n_forced*2)" -preset: llhp -pix_fmt nv12 -b:v
> 250M -minrate 250M -maxrate 250M -bufsize 250M `
> -c:a aac -ar 44100 -b:a 384k

Re: [FFmpeg-user] Reducing Latency while streaming desktop

2018-08-29 Thread Roger Pack
https://trac.ffmpeg.org/wiki/StreamingGuide#Latency may be of some use to you.
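Beyond the wiki page, the usual levers (a sketch; bitrates and GOP length are example values, and actual gains depend on the network and decoder) are keeping B-frames and lookahead off on the sender and shrinking the probe buffers on the receiver:

```shell
# Sender: -tune zerolatency already disables B-frames/lookahead; a short GOP
# and capped rate control keep the muxer from buffering ahead:
ffmpeg -f gdigrab -framerate 25 -i desktop \
  -vcodec libx264 -pix_fmt yuv420p -tune zerolatency -preset ultrafast \
  -g 25 -bf 0 -b:v 3M -maxrate 3M -bufsize 500k \
  -f mpegts "udp://236.0.0.1:2000?pkt_size=1316"

# Receiver: minimize probing/analysis so playback starts on the first frames:
ffplay -probesize 32 -analyzeduration 0 -fflags nobuffer -sync ext \
  udp://236.0.0.1:2000
```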
On Sat, Aug 25, 2018 at 12:24 PM Thomas Glanzmann  wrote:
>
> Hello,
> I would like to use ffmpeg to stream my windows 7 desktop to windows 10
> desktops on the same broadcast domain. Last week I used the following two 
> lines
> for 5 days, it worked flawlessly but the latency was around 400ms. I would 
> like
> to get below 50ms on a switched 1 Gbit ethernet network that has a
> latency of 0.3ms. Do you have any recommendations that I can try?
>
> I used one of the following two lines to capture my Windows 7 desktop:
>
> ffmpeg -f gdigrab -framerate 25 -i desktop -vcodec libx264 -pix_fmt yuv420p 
> -tune zerolatency -preset ultrafast -f mpegts udp://236.0.0.1:2000
> ffmpeg -y -loglevel warning -f dshow -i video="screen-capture-recorder" 
> -framerate 25  -vcodec libx264 -pix_fmt yuv420p -tune zerolatency -preset 
> ultrafast -f mpegts udp://236.0.0.1:2000
>
> I used the following two lines to play:
>
> mpv.exe udp://236.0.0.1:2000 --no-cache --untimed --no-demuxer-thread
> ffplay -probesize 32 -sync ext udp://236.0.0.1:2000
>
> I used nightly builds both for ffmpeg and mpv.
>
> Cheers,
> Thomas

Re: [FFmpeg-user] Decoding frames as fast as possible

2018-08-08 Thread Roger Pack
On Tue, Jul 24, 2018 at 2:50 PM, Évan  wrote:
> On Mon, Jul 23, 2018 at 8:45 AM, Roger Pack wrote:
>> Does ffprobe take 20s?
>
> I just tested and "ffprobe.exe -show_entries frame=pkt_pts_time
> bbb_sunflower_1080p_30fps_normal.mp4" does take time to complete.

I also saw this recently:
https://stackoverflow.com/questions/51547517/using-ffprobe-to-get-number-of-keyframes-in-raw-avi-file-without-processing-en#comment90288838_51547517
(hint: consider asking on superuser and/or av.stackexchange)
Anyway my current hypothesis is that ffmpeg has no way to "skip
reading" input as it were...
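One partial workaround (a sketch; it still demuxes every packet, but skips decoding non-key frames, which can be much faster on some inputs) is ffprobe's -skip_frame nokey:

```shell
# Count only keyframes instead of decoding every frame:
ffprobe -v error -select_streams v:0 -skip_frame nokey \
  -count_frames -show_entries stream=nb_read_frames \
  -of default=nokey=1:noprint_wrappers=1 input.mp4

# Or dump the timestamps of keyframes only:
ffprobe -v error -select_streams v:0 -skip_frame nokey \
  -show_entries frame=pkt_pts_time -of csv=p=0 input.mp4
```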

> I'm using ffprobe version 4.0 [...] built with gcc 7.3.0 (GCC), x64, win32
> zeranoe version.
> From my test, ffprobes takes like 3 minutes and a half when printing the
> result to the console and a minute and a half when output is redirected to a
> file.
>
> My test video is "Big Buck Bunny" 1080p 30FPS MP4 version, md5 sum
> babbe51b47146187a66632daa791484b.
>
> I can send you my test program but I used ffmpeg (and ffprobe) source code
> to write it.
> Moreover, all CPUs are used at 100% when "building the index" so my program
> seems efficient CPU-side.
>
> Regards
>
> PS : I'm not subscribed to the list so please CC me.

Re: [FFmpeg-user] Anyone having success capturing hours of 4k video, reliably and with low loss, using ffmpeg?

2018-08-08 Thread Roger Pack
On Tue, Aug 7, 2018 at 3:15 PM, Gabriel Balaich  wrote:
>>
>> You are missing the question.
>>
>> I am not asking for help to diagnose that simple experiment. I am asking
>> if others have had experience with the top-level goal: capturing hours
>> of 4k video, reliably and with low loss, using ffmpeg.
>>
>> I can use whatever capture card, operating system, CPU, and graphics
>> card are necessary to build a system that works, and costs less than
>> what Vendor A would charge us for their closed system.
>>
>
> I capture 4K60 and 1080P60 simultaneously using one instance of FFmpeg
> with no artifacting (ever) or perceivable loss in quality for 7-8 hours at
> a time
> consistently.
>
> No crazy hardware - a few capture cards (HD60 Pro + Magewell Pro Capture
> HDMI 4K Plus), 6800K, 16GB of RAM, GTX 1050 (handles encoding), and
> an Intel 600P NVME SSD for storage.

Nice.  What codec and/or command line?

Also if anybody's desperate for a "software" solution I might be able
to provide a new video codec that "uses a lot of space but is
screaming fast" let me know if x264 ultrafast isn't fast enough.
Cheers!

Re: [FFmpeg-user] How to build static fontconfig with ffmpeg?

2018-08-08 Thread Roger Pack
On Tue, Jul 24, 2018 at 12:02 AM, qw  wrote:
> Hi,
>
>
>>Wonder if it should be
>>or use another configure command : ./configure --disable-shared
>>--enable-static LDFLAGS="-L/usr/local/lib" LIBS="-lfreetype -lharfbuzz
>> -lpng"
>>
>>If that doesn't work maybe the circular dep. with harfbuzz and free
>>type isn't trivial?

I wonder if you can specify them circular style, like:

-lfreetype -lharfbuzz -lfreetype (or some combination)

If not, it sounds like you found a workaround.  Either that, or remove
the circular aspect.
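With a GNU toolchain there is also a linker-level way to handle the cycle; a sketch (the group flags are an assumption about your linker, not from the original thread):

```shell
# --start-group/--end-group make GNU ld re-scan the enclosed archives
# until no new symbols resolve, handling the freetype <-> harfbuzz cycle
# without repeating -lfreetype by hand.
./configure --disable-shared --enable-static \
  LDFLAGS="-L/usr/local/lib" \
  LIBS="-Wl,--start-group -lfreetype -lharfbuzz -Wl,--end-group -lpng"
```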

> It doesn't work. I combine libfreetype.a, libharfbuzz.a and libpng16.a into 
> new libfreetype.a, and can succeed in building fontconfig by linking to the 
> new libfreetype.a, i.e.
>
>
>
> ar x libfreetype.a && ar x libharfbuzz.a && ar x libpng16.a && ar cru 
> libfreetype.new.a ./*.o && ranlib libfreetype.new.a
>
>
>
> Is there another method to build fontconfig? Is it possible to build 
> fontconfig by only running configure and make?
>
>
>
> Thanks!
>
>
> Regards
>
>
> Andrew
>
>
> At 2018-07-23 22:43:44, "Roger Pack"  wrote:
>>Wonder if it should be
>>or use another configure command : ./configure --disable-shared
>>--enable-static LDFLAGS="-L/usr/local/lib" LIBS="-lfreetype -lharfbuzz
>> -lpng"
>>
>>If that doesn't work maybe the circular dep. with harfbuzz and free
>>type isn't trivial?
>>
>>On Tue, Jul 17, 2018 at 7:20 AM, qw  wrote:
>>> Hi,
>>>
>>>
>>> My environment is Centos 7.4.
>>>
>>>
>>> I want to build static fontconfig and build ffmpeg with static fontconfig. 
>>> I succeed in building static 
>>> libpng/freetype2/harfbuzz/vidstab/fribidi/expat, but I fail to build static 
>>> fontconfig. The following is my building steps:
>>>
>>>
>>> 1) build libpng-1.6.34.tar.gz
>>> ./autogen.sh
>>> ./configure --disable-shared --enable-static
>>> make && make install
>>>
>>>
>>> 2) build freetype-2.9.1.tar.bz2
>>> ./configure --disable-shared --enable-static --with-png=no 
>>> --with-harfbuzz=no
>>> make && make install
>>>
>>>
>>> 3) build harfbuzz-1.7.6.tar.bz2
>>> ./configure --prefix=/usr/local/ --disable-shared --enable-static 
>>> --with-freetype=yes
>>> make && make install
>>>
>>>
>>> 4) rebuild freetype2
>>> make clean && make distclean
>>> ./configure --disable-shared --enable-static --with-png=yes 
>>> --with-harfbuzz=yes
>>> make && make install
>>>
>>>
>>> 5) build georgmartius-vid.stab-v1.1.0-0-g60d65da.tar.gz
>>> cmake -G "Unix Makefiles" -DBUILD_SHARED_LIBS=0
>>> make && make install
>>>
>>>
>>> 6) build fribidi-1.0.1.tar.gz
>>> ./bootstrap
>>> ./configure --disable-shared --enable-static
>>> make && make install
>>>
>>>
>>> 7) build expat-2.2.5.tar.bz2
>>> ./configure --disable-shared --enable-static
>>> make && make install
>>>
>>>
>>> 8) build fontconfig-2.13.0.tar.bz2
>>> ./configure --disable-shared --enable-static
>>> make
>>>
>>>
>>> Then, make reports error messages as below:
>>> /usr/local/lib/libfreetype.a(sfnt.o): In function `Load_SBit_Png':
>>> sfnt.c:(.text+0x3c49): undefined reference to `png_create_read_struct'
>>> sfnt.c:(.text+0x3c63): undefined reference to `png_create_info_struct'
>>> sfnt.c:(.text+0x3c8b): undefined reference to `png_set_longjmp_fn'
>>> sfnt.c:(.text+0x3cb6): undefined reference to `png_destroy_read_struct'
>>> sfnt.c:(.text+0x3d06): undefined reference to `png_set_read_fn'
>>> sfnt.c:(.text+0x3d1b): undefined reference to `png_read_info'
>>>
>>>
>>> or use another configure command : ./configure --disable-shared 
>>> --enable-static LDFLAGS="-L/usr/local/lib" LIBS="-lharfbuzz -lfreetype 
>>> -lpng"
>>> Then, make reports error messages as below:
>>> /usr/local/lib/libfreetype.a(autofit.o): In function `af_face_globals_free':
>>> autofit.c:(.text+0x2d52): undefined reference to `hb_font_destroy'
>>> autofit.c:(.text+0x2d5b): undefined reference to `hb_buffer_destroy'
>>> /usr/local/lib/libfreetype.a(autofit.o): 

Re: [FFmpeg-user] FFplay network stream with low latency

2018-08-07 Thread Roger Pack
https://trac.ffmpeg.org/wiki/StreamingGuide#Latency
may be useful to you...
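For reference, the usual low-latency playback knobs from that guide, combined for ffplay (input URL is a placeholder):

```shell
# nobuffer/low_delay reduce demuxer and decoder buffering; tiny
# probesize/analyzeduration values cut stream-probing startup delay;
# -framedrop drops late frames instead of letting latency accumulate.
ffplay -fflags nobuffer -flags low_delay \
  -probesize 32 -analyzeduration 0 \
  -framedrop udp://239.0.0.1:1234
```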

On Sun, Aug 5, 2018 at 7:41 PM, Elliott Balsley
 wrote:
> I discovered an option in picamera python module to include the framerate in 
> the SPS headers, so now I will use that instead of raspivid.  Plus it’s more 
> flexible, allowing for more advanced text overlays.
> Now ffplay detects it correctly as 30 tbr, although it shows 60 tbc and it 
> shows different numbers for fps like 28.83 or 29.25 each time I run it.  Why 
> is that?  Anyway, it still has the same 10 seconds latency, with or without 
> ffmpeg up front.
>

Re: [FFmpeg-user] Anyone having success capturing hours of 4k video, reliably and with low loss, using ffmpeg?

2018-08-07 Thread Roger Pack
On Wed, Jul 25, 2018 at 2:34 PM, Jim DeLaHunt  wrote:
>
> On 2018-07-23 10:25, Rafael Lima wrote:
>>>
>>>   They set up a 4k camera on top of a building (have electricity, but
>>> limited internet),
>>
>>
>> 4K on limited internet? is just in my mind that those two words doesn't
>> fit
>> together?
>>
>>
>> What do you mean by " capture 6-12 hours of 4k 29.92fps video from that
>> camera"? Is the camera streaming the video somehow and you just need to
>> store it?
>
>
> Thank you for your reply, Rafael.
>
> I'm sorry if my original message wasn't clear. I understand why "4K" and
> "limited internet" might not fit together. The missing part is "enough
> terabytes of reasonably fast SSD storage on the server to which the camera
> is attached to hold the video".
>
> So yes, I "just" need to store it. And I need to store every frame of 4K at
> 29.92fps. And I need the capture system to not run out of memory, or crash,
> during the 12-hour session.  And if an individual part of the capture system
> could crash more often than, say, during 1% of the 12-hour sessions, then I
> need some kind of redundancy to allow another system to capture if the
> primary system has failed.
>
> On 2018-07-23 10:25, Rafael Lima wrote:
>>
>> ...If it is your only limitation are the [bandwith] and storage as
>> ffmpeg doesn't need to process nothing and it could be done with any
>> stream
>> saver program
>
>
> This sounds reassuring, as if this isn't such a hard task after all. But I
> don't hear you saying that you know of someone who has done it.
>
> On 2018-07-23 00:42, Roger Pack wrote:
>>
>> It "should" work assuming your transcoding/disk can keep up with realtime
>> ...
>
>
> Thank you, Roger. I am hearing you say "should work", not "did work, for a
> real-life case I know".

Try it out and let us know.  Another thing that might help is some
kind of GPU encoding (e.g. nvenc).
I'm not sure if libx264 even on "ultrafast" will be able to handle it, GL!
-Roger-

>
> On 2018-07-23 05:42, Another Sillyname wrote:
>>
>> Yes, but honestly I think you [Roger Pack] are over simplifying it..
>>
>> For example, are you intending to capturing scenes with high activity
>> that requires a much higher bandwidth and transcoding capability? ...
>> The capability exists and as you've stated the existing solutions can
>> be crazy expensive, but if you're going to homebrew a solution you
>> need to do some work/testing to make sure what YOU want to capture can
>> be done within YOUR budget.
>
>
> Thank you, "Another Sillyname". This is what I'm concerned with: that it
> might work, might not, and the only way to gather data is to try my own
> experiments.
>
> My goal with the question was to limit the range of possibilities, by having
> someone come forward with concrete experiences. If I hear, "we tried it, it
> was crazy hard", maybe I should tell my boss to pay for the crazy expensive
> commercial solutions. If I hear, "we tried it, it was pretty easy", maybe I
> encourage my boss to let me homebrew a solution.
>
> In either case, we will have to test it.
>
> Thank you all for your replies,
>—Jim DeLaHunt, Vancouver, Canada
>
> --
>
> --
> --Jim DeLaHunt, j...@jdlh.com http://blog.jdlh.com/
> (http://jdlh.com/)
>   multilingual websites consultant
>
>   355-1027 Davie St, Vancouver BC V6E 4L2, Canada
>  Canada mobile +1-604-376-8953
>

Re: [FFmpeg-user] Decoding frames as fast as possible

2018-07-23 Thread Roger Pack
Does ffprobe take 20s?
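If the full decode loop is the bottleneck, packet-level metadata may be enough for the seek index; a sketch (the file name is a placeholder):

```shell
# Reads container metadata without decoding frames, so it is usually
# far faster than a decode loop.  Prints one CSV line per video packet
# with its pts_time and flags; a "K" in flags marks a keyframe.
ffprobe -v error -select_streams v:0 \
  -show_entries packet=pts_time,flags \
  -of csv input.mp4
```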

On Tue, Jul 3, 2018 at 2:23 AM, Evan  wrote:
> Hi everyone,
>
> I'm developping a media player featuring advanced video/frame processing.
> To be able to seek to any particular frame, I need to know, for each frame,
> its presentation time and if available, if it's a key frame or not. The
> latter is optional, really.
>
> To do that I build an "index" once the media is valid and opened.
> It's a basically a loop, fetching a packet, checking if it's the video
> stream I'm interested in, feeding the codec until he can produces a frame
> or reaching the end of the file.
>
> This is perfectly working but it's way too long for me, like 20 seconds for
> a ~800MB mp4/h264 file.
> I'm considering the following options:
> - building the index in the background while playing the media: I guess
> I'll have to copy the decoder context to not mix both processes but I
> expect having playback (stuttering) issues.
> - use multiple threads: from what I read, I can only use one thread for a
> stream. I may eventually use two threads, one for video and one for audio
> but I want to speed up video decoding first.
> - use the GPU to decode frames faster: while it sounds like a good idea, I
> have to add two constraints: I'm on Windows 7 (and soon, Windows 10) and I
> want a generic approach so not H.264-only.
> - decode only packets: it's tempting but I think I will either loose
> timestamp information or precision (because of I, P, B frames.
> - use an option (av_dict_set) to do a dummy frame decoding: is there
> anything like that?
> - use reference counting to avoid costly frame allocations: I already tried
> that but didn't see any difference.
>
> For now, I will measure time elapsed in each piece of code in order to
> pinpoint lengthy operations.
> If you guys have any hint, I'd be glad to hear them.
>
> Thanks.

Re: [FFmpeg-user] How to build static fontconfig with ffmpeg?

2018-07-23 Thread Roger Pack
Wonder if it should be
or use another configure command : ./configure --disable-shared
--enable-static LDFLAGS="-L/usr/local/lib" LIBS="-lfreetype -lharfbuzz
 -lpng"

If that doesn't work maybe the circular dep. with harfbuzz and free
type isn't trivial?

On Tue, Jul 17, 2018 at 7:20 AM, qw  wrote:
> Hi,
>
>
> My environment is Centos 7.4.
>
>
> I want to build static fontconfig and build ffmpeg with static fontconfig. I 
> succeed in building static libpng/freetype2/harfbuzz/vidstab/fribidi/expat, 
> but I fail to build static fontconfig. The following is my building steps:
>
>
> 1) build libpng-1.6.34.tar.gz
> ./autogen.sh
> ./configure --disable-shared --enable-static
> make && make install
>
>
> 2) build freetype-2.9.1.tar.bz2
> ./configure --disable-shared --enable-static --with-png=no --with-harfbuzz=no
> make && make install
>
>
> 3) build harfbuzz-1.7.6.tar.bz2
> ./configure --prefix=/usr/local/ --disable-shared --enable-static 
> --with-freetype=yes
> make && make install
>
>
> 4) rebuild freetype2
> make clean && make distclean
> ./configure --disable-shared --enable-static --with-png=yes 
> --with-harfbuzz=yes
> make && make install
>
>
> 5) build georgmartius-vid.stab-v1.1.0-0-g60d65da.tar.gz
> cmake -G "Unix Makefiles" -DBUILD_SHARED_LIBS=0
> make && make install
>
>
> 6) build fribidi-1.0.1.tar.gz
> ./bootstrap
> ./configure --disable-shared --enable-static
> make && make install
>
>
> 7) build expat-2.2.5.tar.bz2
> ./configure --disable-shared --enable-static
> make && make install
>
>
> 8) build fontconfig-2.13.0.tar.bz2
> ./configure --disable-shared --enable-static
> make
>
>
> Then, make reports error messages as below:
> /usr/local/lib/libfreetype.a(sfnt.o): In function `Load_SBit_Png':
> sfnt.c:(.text+0x3c49): undefined reference to `png_create_read_struct'
> sfnt.c:(.text+0x3c63): undefined reference to `png_create_info_struct'
> sfnt.c:(.text+0x3c8b): undefined reference to `png_set_longjmp_fn'
> sfnt.c:(.text+0x3cb6): undefined reference to `png_destroy_read_struct'
> sfnt.c:(.text+0x3d06): undefined reference to `png_set_read_fn'
> sfnt.c:(.text+0x3d1b): undefined reference to `png_read_info'
>
>
> or use another configure command : ./configure --disable-shared 
> --enable-static LDFLAGS="-L/usr/local/lib" LIBS="-lharfbuzz -lfreetype -lpng"
> Then, make reports error messages as below:
> /usr/local/lib/libfreetype.a(autofit.o): In function `af_face_globals_free':
> autofit.c:(.text+0x2d52): undefined reference to `hb_font_destroy'
> autofit.c:(.text+0x2d5b): undefined reference to `hb_buffer_destroy'
> /usr/local/lib/libfreetype.a(autofit.o): In function `af_shaper_get_coverage':
> autofit.c:(.text+0x7544): undefined reference to `hb_font_get_face'
> autofit.c:(.text+0x756d): undefined reference to `hb_ot_tags_from_script'
> autofit.c:(.text+0x759e): undefined reference to `hb_set_create'
> autofit.c:(.text+0x75bb): undefined reference to 
> `hb_ot_layout_collect_lookups'
> autofit.c:(.text+0x75c3): undefined reference to `hb_set_is_empty'
> autofit.c:(.text+0x75ee): undefined reference to `hb_set_destroy'
> autofit.c:(.text+0x75f8): undefined reference to `hb_set_destroy'
> autofit.c:(.text+0x7602): undefined reference to `hb_set_destroy'
>
>
> How to build static fontconfig?
>
>
> Thanks!
>
>
> Regards
>
>
> Andrew
>

Re: [FFmpeg-user] FFMPEG configure FAILD

2018-07-23 Thread Roger Pack
At the end:

/tmp/ffconf.QPyKsvYV/test.c:1:23: error: lame/lame.h: No such file or directory

You probably need to set CPATH to your include dir and LIBRARY_PATH
to your lib dir.
ref: 
https://github.com/rdp/ffmpeg-windows-build-helpers/blob/master/cross_compile_ffmpeg.sh#L2140
GL!
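A sketch of what that might look like for the non-root lame prefix from this thread (paths assumed from the original post; note LD_LIBRARY_PATH should point at the lib dir, not bin):

```shell
# CPATH is searched by gcc for headers, LIBRARY_PATH for libraries at
# link time, and LD_LIBRARY_PATH for shared libraries at run time.
export CPATH=/app/user/lame/include
export LIBRARY_PATH=/app/user/lame/lib
export LD_LIBRARY_PATH=/app/user/lame/lib:$LD_LIBRARY_PATH
./configure --prefix=/app/user/ffmpeg --enable-shared --enable-libmp3lame
```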

On Fri, Jul 20, 2018 at 3:55 AM, 盛金鹿  wrote:
> HI:
>
> 1.I can’t use root to install it.
>
> 2.I install lame  not in "/user/local/lib"  and to other directory.
> for example:/app/user/lame
> then I setting PATH and LD_LIBRARY_PATH
> export LASM_HOME=/app/user/lame
> export PATH=$PATH:$LASM_HOME/bin
>
>
> export LD_LIBRARY_PATH=$LASM_HOME/bin:$LD_LIBRARY_PATH
> In .bash_profile
> 3. When I run commond
>  ./configure --prefix=/app/user/ffmpeg --enable-shared --enable-libmp3lame 
> That has error :ERROR: libmp3lame >= 3.98.3 not found If you think configure 
> made a mistake, make sure you are using the latest version from Git.  If the 
> latest version fails, report the problem to the ffmpeg-user@ffmpeg.org 
> mailing list or IRC #ffmpeg on irc.freenode.net. Include the log file 
> "ffbuild/config.log" produced by configure as this will help solve the 
> problem.
>
> I hope you can help me.
>
>
> Thanks!

Re: [FFmpeg-user] Best option for live stream

2018-07-23 Thread Roger Pack
This page https://trac.ffmpeg.org/wiki/StreamingGuide
may be useful to ya.
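To address the min/max bitrate question directly, a hedged variant of the original command (all numbers illustrative): -maxrate plus -bufsize cap the encoder's output for a constrained WAN link, and "ultrafast"/"zerolatency" trade some quality for much less CPU.

```shell
# -b:v sets the target bitrate; -maxrate/-bufsize bound short-term
# spikes so the stream stays within the WAN link's capacity.
ffmpeg -f mpegts \
  -i http://username:password@192.168.0.14:4445/stream/channelid/14 \
  -c:v libx264 -preset ultrafast -tune zerolatency \
  -b:v 1000k -maxrate 1000k -bufsize 2000k \
  -c:a aac -ar 48000 -b:a 48k \
  -f flv rtmp://localhost/myapp/mystream
```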

On Fri, Jul 20, 2018 at 5:20 AM, hans gerte  wrote:
> Hi all.
> need some advise on parameters for live streaming.
> This is currently my string, it takes the stream from tvheaedned:
> need to lower the bitstream as im sending the stream over wan. but the
> problem that i have, is that it uses a lot of cpu and sometimes the picture
> isn't smooth.. have seen some using minbitrate and maxbittrate. but dont
> really understand what and how to use it. plus others ar using crt 24 or
> fast..slow and so on
>
> so i like to get your opinion or suggestion on what your using for
> livestream.
>
> ffmpeg -f mpegts -i
> http://username:password@192.168.0.14:4445/stream/channelid/14 -vcodec
> libx264 -vb 1000k -acodec aac -ar 48000 -ab 48k -f flv
> rtmp://localhost/myapp/mystream

Re: [FFmpeg-user] Anyone having success capturing hours of 4k video, reliably and with low loss, using ffmpeg?

2018-07-23 Thread Roger Pack
It "should" work assuming your transcoding/disk can keep up with realtime
...

On Thu, Jun 28, 2018 at 3:50 PM, Jim DeLaHunt 
wrote:

> On 2018-06-28 14:21, Carl Eugen Hoyos wrote:
>
> 2018-06-28 21:58 GMT+02:00, Jim DeLaHunt :
>>
>> We tried a simple experiment. We set up a 4K camera in the
>>> office. A straightforward ffmpeg invocation did capture a
>>> couple of minutes of video. But there were nasty artifacts
>>>
>> Command line and complete, uncut console outupt missing.
>>
>> In this case, you should also tell us which operating system
>> you have, which cpu and which graphics-card (if applicable).
>>
>
> You are missing the question.
>
> I am not asking for help to diagnose that simple experiment. I am asking
> if others have had experience with the top-level goal: capturing hours of
> 4k video, reliably and with low loss, using ffmpeg.
>
> I can use whatever capture card, operating system, CPU, and graphics card
> are necessary to build a system that works, and costs less than what Vendor
> A would charge us for their closed system.
>
> --
> --Jim DeLaHunt, j...@jdlh.com http://blog.jdlh.com/ (
> http://jdlh.com/)
>   multilingual websites consultant
>
>   355-1027 Davie St, Vancouver BC V6E 4L2, Canada
>  Canada mobile +1-604-376-8953

Re: [FFmpeg-user] 4K 60Hz Directshow Video Capture

2018-06-09 Thread Roger Pack
Just enter it manually as you're doing; I think it's a signed int.
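In other words, spell the value out; if the option really is a signed 32-bit int, its ceiling is 2147483647 (sketch based on the command earlier in this thread):

```shell
# "INT_MAX" is a C constant, not something the option parser expands;
# 2147483647 (~2 GiB) is the largest signed 32-bit value.
ffmpeg -f dshow -video_size 3840x2160 -rtbufsize 2147483647 \
  -pixel_format bgr24 -i video="MZ0380 PCI, Analog 01 Capture" \
  -c:v libx264rgb -preset ultrafast -crf 0 out.avi
```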

On 3/2/18, Alex P  wrote:
> Thank you for that suggestion Roger, I can't believe I forgot about the
> preset options.
>
> Executing the following on a RAM disk gets me about .9x performance, letting
> me capture about 5 seconds worth of video before the buffers overflow and I
> lose frames. Not perfect but enough for what I need. I think this is the
> best I'm going to get unless the mfg makes a better driver and adds yuv444p
> support or I get a better CPU. I really appreciate for everyone's input.
>
> ffmpeg -f dshow -video_size 3840x2160 -framerate 6/1001 -rtbufsize
> 21000  -pixel_format bgr24 -i video="MZ0380 PCI, Analog 01 Capture" -c:v
> libx264rgb -preset ultrafast -crf 0 -pix_fmt bgr24 -t 00:00:10 -r 6/1001
> out.avi
>
> One more question, what is the command to use the maximum buffer size?
> -rtbufsize INT_MAX doesn't work. Thanks.
>
> -Original Message-
> From: ffmpeg-user [mailto:ffmpeg-user-boun...@ffmpeg.org] On Behalf Of Roger
> Pack
> Sent: Tuesday, February 27, 2018 8:03 PM
> To: FFmpeg user questions
> Subject: Re: [FFmpeg-user] 4K 60Hz Directshow Video Capture
>
> consider also libx264 "ultrafast" preset, GL!
>
> On Tue, Feb 13, 2018 at 7:57 AM, Alex P  wrote:
>
>> I think I've figured it out. When I use nv12 or yuv420p as the input
>> and output pixel format, I get x1 performance. If I use bgr24/rgb24 as
>> the input and yuv444p as the output, I get around x0.3.
>>
>> But even when I use bgr0 for the input and output, I get less than x1.
>> Does anyone know what exactly bgr0 is? I can't find any information
>> about it in my googling.
>>
>> In your testing James, what was the pixel format?
>>
>> -Original Message-
>> From: ffmpeg-user [mailto:ffmpeg-user-boun...@ffmpeg.org] On Behalf Of
>> James Girotti
>> Sent: Monday, February 12, 2018 7:03 PM
>> To: FFmpeg user questions
>> Subject: Re: [FFmpeg-user] 4K 60Hz Directshow Video Capture
>>
>> >
>> > ffmpeg -f dshow -video_size 3840x2160 -framerate 6/1001
>> > -rtbufsize
>> > 21 -pixel_format bgr24 -i video="MZ0380 PCI, Analog 01 Capture"
>> > -c:v h264_nvenc -preset lossless -f null - Gives me the same error
>> >
>>
>> That's surprising, I can get about 200fps using file-based/ramdisk
>> "-c:v h264_nvenc -preset -lossless". Have you also tried "-c:v
>> hevc_nvenc -preset lossless"? What's the encoding FPS that you're
>> getting? You technically shouldn't be able get much more than 60fps as
>> that's what your capture card is supplying. Can you monitor the "Video
>> Engine Utilization" during encoding? In linux it's listed in the
>> nvidia-settings GUI or "nvidia-smi dmon" on the CLI will show enc/dec%.
>>
>>
>> > ffmpeg -f dshow -video_size 3840x2160 -framerate 6/1001
>> > -rtbufsize
>> > 21 -pixel_format bgr24 -i video="MZ0380 PCI, Analog 01 Capture"
>> > -c:v rawvideo -f null -
>> > Gets me nearly x1 performance when executing from a ram disk but
>> >
>> > ffmpeg -f dshow -video_size 3840x2160 -framerate 6/1001
>> > -rtbufsize
>> > 21 -pixel_format bgr24 -i video="MZ0380 PCI, Analog 01 Capture"
>> > -c:v rawvideo raw.nut
>> > Only gets me x0.5 and the buffer overflows.
>>
>>
>> > Is there a way of accelerating rawvideo decoding? Would using my
>> > colleagues 1080 make a difference? Thanks.
>>
>>
>> I think raw-video is already decoded. So no way/need to accelerate that.
>> You might try a different pix_fmt from your capture card while using
>> hw-encoding, but you'd have to test. I don't know the internals, i.e.
>> when the pixel format is converted during hw-encoding. So it might
>> make a difference.
>>
>> Changing pixel formats might be a concern if you are trying to achieve
>> "100% lossless" capture. I've read that yuv444p should be sufficient
>> colorspace for bgr24.
>>
>> There isn't a lot of info out there on encoding speed differences
>> based on GPU models. It's a complex subject, but from what I have
>> observed the ASIC is tied to the GPU clock (I have observed that GPU
>> clock speed increases as ASIC load increases). If that's true, then a
>> GTX 1080, with it's higher max clock, could have faster encoding, but I
>> have no data to back that up only.

Re: [FFmpeg-user] Support request - configuring ffmpeg for YouTube Live 60fps

2018-06-09 Thread Roger Pack
On 4/25/18, Stuart Porter  wrote:
> Hi there,
>
> Firstly, forgive any rookie mistakes – I’m new here and not a seasoned
> support blog user. Neither am I a developer of any sort.
>
> I’m working with a developer though and we are trying to configure ffmpeg
> to stream live from a video capture device capturing at 60fps (AverMedia
> Live Gamer HD2) with audio, to YouTube Live using rtmp. The machine I want
> to stream from is Windows7 64-bit with Intel Core i5-6400 CPU @ 2.70GHz
>
> I’m happy to use either h264 or vp9 codec. And have tried configuring
> numerous examples taken from previous posts but nothing works.
>
> e.g. for VP9 I have tried
>
>  ffmpeg -f dshow -video_size 1920x1080 -r 60 -i video="AVerMedia Live Gamer
> HD 2":audio="Line (3- AVerMedia Live Gamer H" -r 30 -g 60 -s 1920x1080
> -quality good -speed 5 -threads 16 -row-mt 1 -tile-columns 3
> -frame-parallel 1 -qmin 4 -qmax 48 -b:v 7800k -c:v libvpx-vp9 -b:a 128k
> -c:a libopus -f webm rtmp://a.rtmp.youtube.com/live2/[my key]
>
> results in a string of failures such as
>
> [dshow @ 003d4dc0] real-time buffer [AVerMedia Live Gamer HD 2]
> [video input] too full or near too full (272% of size: 3041280 [rtbufsize
> parameter])! frame dropped!
>
> We have tried adjusting the quality, speed, framerate and bandwidth but get
> very similar results each time.

vpx is, I think, not a "fast" encoder, making it tricky to get up to
realtime.  Typically people use libx264 preset ultrafast or the like
to save on encode speed (or, as you mentioned, adjust framerate, frame
size, etc.)
Or maybe vpx has some ultrafast-style preset option, dunno.
GL!
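For what it's worth, libvpx-vp9 does expose realtime controls; a hedged rework of the original command (speed values illustrative, stream key elided as in the post):

```shell
# -deadline realtime with a high -cpu-used is vpx's closest analogue
# to x264's "ultrafast"; -row-mt 1 enables row-based multithreading.
ffmpeg -f dshow -video_size 1920x1080 -r 60 \
  -i video="AVerMedia Live Gamer HD 2" \
  -c:v libvpx-vp9 -deadline realtime -cpu-used 8 -row-mt 1 \
  -b:v 7800k -c:a libopus -b:a 128k \
  -f webm rtmp://a.rtmp.youtube.com/live2/[my key]
```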

Re: [FFmpeg-user] can't capture from audio speakers, but well from microphone

2018-06-09 Thread Roger Pack
On 3/10/18, Carl Zwanzig  wrote:
> On 3/10/2018 6:01 AM, Phil Forneur wrote:
>> I can't capture from audio speakers, but well from microphone.
>
> Speakers/audio-out aren't considered an input device.
>
> If you're trying to capture what's being played out those speakers, you need
> software to intercept the data on it's way to the output device and direct
> it elsewhere (look for the 'Windows WASAPI' device, see
> http://manual.audacityteam.org/man/tutorial_recording_computer_playback_on_windows.html).

You may be able to install a dshow device to "capture what you hear",
as it were.
Humbly: https://github.com/rdp/virtual-audio-capture-grabber-device
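Once installed, that device typically registers under the name "virtual-audio-capturer"; verify with device enumeration first, since names can vary:

```shell
# List dshow devices, then record the "what you hear" feed.
ffmpeg -list_devices true -f dshow -i dummy
ffmpeg -f dshow -i audio="virtual-audio-capturer" captured.wav
```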

Re: [FFmpeg-user] Any Actual Solution to "Past Duration Too Large"?

2018-06-05 Thread Roger Pack
There are a few tracs on it, and this:
https://superuser.com/questions/902661/ffmpeg-past-duration-0-xx-too-large
Seems to be a bug somehow to me but I never looked closely enough.

On Mon, Apr 16, 2018 at 7:50 PM, Gabriel Balaich 
wrote:

> Hey all, I'm having an issue with the "Past duration too large" warning
> being spammed in my console.
>
> Suggestions range from silencing the console to applying the "fps=xxx"
> filter, but when I apply the fps filter it adds stutter to my
> video, and silencing my console isn't really an actual solution as I lose
> visibility of everything else. For example I can't view what
> segment I'm on, or other warnings and error messages that may be important.
> Oddly enough I have found that doubling my input framerate and then calling
> the actual framerate I want before the output gets rid
> of the warning completely:
>
> ffmpeg -y -thread_queue_size  -indexmem  -guess_layout_max 0 -f
> dshow -video_size 3440x1440 -rtbufsize 2147.48M ^
> -framerate 200 -pixel_format nv12 -i video="Video (00 Pro Capture HDMI
> 4K+)":audio="SPDIF/ADAT (1+2) (RME Fireface UC)" ^
> -map 0:0,0:1 -map 0:1 -c:v h264_nvenc -r 100 -rc-lookahead 200 -forced-idr
> 1 -strict_gop 1 -sc_threshold 0 -flags +cgop ^
> -force_key_frames expr:gte(t,n_forced*2) -preset: llhp -pix_fmt nv12 -b:v
> 250M -minrate 250M -maxrate 250M -bufsize 250M ^
> -c:a aac -ar 44100 -b:a 384k -ac 2 -af "atrim=0.035, asetpts=PTS-STARTPTS,
> aresample=async=250" -vsync 1 -ss 00:00:01.096 ^
> -max_muxing_queue_size  -f segment -segment_time 600 -segment_wrap 9
> -reset_timestamps 1 ^
> C:\Users\djcim\Videos\PC\PC\PC%02d.ts
>
> As you can see I have -framerate on the input at 200 and -r on the output
> at 100, I have absolutely no idea why this would negate,
> hide, or prevent the warning message. And while this has solved my issue
> with this specific video input device, it doesn't work with
> just any video input device. The Magewell is a really high end card and
> allows 200 fps recording at certain resolutions, and while it
> doesn't actually support 200 fps at the resolution I am recording I believe
> that it's ability to do so at other resolutions is why ffmpeg
> is allowing me to set that option in the first place. When I try to do the
> same thing with my Elgato capture card it throws an error
> and won't even let me start recording, as one would expect:
>
> ffmpeg -y -thread_queue_size  -indexmem  -guess_layout_max 0 -f
> dshow -video_size 1920x1080 -rtbufsize 2147.48M ^
> -framerate 120 -pixel_format yuyv422 -i video="Game Capture HD60 Pro
> (Video) (#01)":audio="ADAT (5+6) (RME Fireface UC)" ^
> -map 0:0,0:1 -map 0:1 -c:v h264_nvenc -r 60 -rc-lookahead 120 -forced-idr 1
> -strict_gop 1 -sc_threshold 0 -flags +cgop ^
> -force_key_frames expr:gte(t,n_forced*2) -preset: llhp -pix_fmt yuv420p
> -b:v 40M -minrate 40M -maxrate 40M -bufsize 40M ^
> -c:a aac -ar 44100 -b:a 384k -ac 2 -af "pan=mono|c0=c0, adelay=120|120,
> aresample=async=250" -vsync 1 ^
> -max_muxing_queue_size  -f segment -segment_time 600 -segment_wrap 9
> -reset_timestamps 1 ^
> C:\Users\djcim\Videos\PC\Camera\CPC%02d.ts
>
> So in the end I have to change -framerate back to 60 in-which the "Past
> duration too large" error is spammed at an alarming, and
> sometimes even console breaking rate. From what I gather looking at the
> source code the error occurs when the delta between
> input PTS and output PTS is less than -.06. I'm not entirely sure what that
> entails but would there be a way for me to adjust either
> the output or input PTS so the threshold is never crossed? I've tried a few
> things with setpts but nothing seems to work, I just
> really don't have a concrete understanding of PTS and how to adjust them.
>
> With my experience and limited knowledge it would seem the only solutions I
> have are:
> A) Dispose of my Elgato capture card and purchase another Magewell so I can
> double the input framerate.
> B) Modify the if statement in the source code and compile FFmpeg myself.
>
> Magewell capture cards are extremely expensive, and while I really wish I
> could just double the framerate and be done with it, it is
> pretty clear that that is not a conventional or "actual" solution. And
> constantly modifying the source code and compiling everything
> myself every time I want to update sounds tiresome. I am fairly confident
> there is a solution to this but I lack the knowledge, any
> help or advice would be much appreciated. Keep in mind that I need to
> maintain a constant integer framerate so I can force
> keyframes at correct intervals, keep segments at consistent lengths, and
> maintain a/v synchronization. Basically, if possible, I'd like
> to keep all my existing options but add something to stop, or prevent the
> warning message without silencing my console.
>
> Thanks for any help
> ___
> ffmpeg-user mailing list
> ffmpeg-user@ffmpeg.org
> http://ffmpeg.org/mailman/list

Re: [FFmpeg-user] ffmpeg and two (2) logitech 920 webcams

2018-06-02 Thread Roger Pack
On Sat, May 19, 2018 at 4:34 PM, Brenda Spinner 
wrote:

> I am using the following line to preview my webcam:
>
> ffmpeg -s 1280x720 -framerate 30 -pix_fmt yuv420p -rtbufsize 100MB -f dshow
> -i video="Logitech Cam#1" -c:v copy -an -f sdl "WebCAM Preview"
>
> this line works fine with cam #1, however when try it with cam #2, I get
> the error:
>
> [dshow @ 002d8920] Could not set video options video=Logitech
> Cam#2: I/O error
>

Try it without the framerate/pix_fmt options...maybe the capabilities
differ?


Also -loglevel debug might help
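You can also compare what each camera actually claims to support; something
like this should list the modes per device (device names taken from your post):

```
ffmpeg -list_options true -f dshow -i video="Logitech Cam#1"
ffmpeg -list_options true -f dshow -i video="Logitech Cam#2"
```

If the second camera doesn't list 1280x720 at 30 fps in yuv420p, that would
explain the I/O error.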
___
ffmpeg-user mailing list
ffmpeg-user@ffmpeg.org
http://ffmpeg.org/mailman/listinfo/ffmpeg-user

To unsubscribe, visit link above, or email
ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".

Re: [FFmpeg-user] 4K 60Hz Directshow Video Capture

2018-02-27 Thread Roger Pack
consider also libx264 "ultrafast" preset, GL!
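For reference, a capture command along those lines might look like this — the
device name is from the thread, but the pixel format and buffer size are
assumptions you'd adjust:

```
# -qp 0 with libx264 gives lossless output; ultrafast keeps CPU cost down.
# Pixel format and rtbufsize here are guesses, not from the original command.
ffmpeg -f dshow -video_size 3840x2160 -framerate 30 -pixel_format nv12 \
  -rtbufsize 2147483647 -i video="MZ0380 PCI, Analog 01 Capture" \
  -c:v libx264 -preset ultrafast -qp 0 out.mkv
```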

On Tue, Feb 13, 2018 at 7:57 AM, Alex P  wrote:

> I think I've figured it out. When I use nv12 or yuv420p as the input and
> output pixel format, I get x1 performance. If I use bgr24/rgb24 as the
> input and yuv444p as the output, I get around x0.3.
>
> But even when I use bgr0 for the input and output, I get less than x1.
> Does anyone know what exactly bgr0 is? I can't find any information about
> it in my googling.
>
> In your testing James, what was the pixel format?
>
> -Original Message-
> From: ffmpeg-user [mailto:ffmpeg-user-boun...@ffmpeg.org] On Behalf Of
> James Girotti
> Sent: Monday, February 12, 2018 7:03 PM
> To: FFmpeg user questions
> Subject: Re: [FFmpeg-user] 4K 60Hz Directshow Video Capture
>
> >
> > ffmpeg -f dshow -video_size 3840x2160 -framerate 6/1001 -rtbufsize
> > 21 -pixel_format bgr24 -i video="MZ0380 PCI, Analog 01 Capture"
> > -c:v h264_nvenc -preset lossless -f null - Gives me the same error
> >
>
> That's surprising, I can get about 200fps using file-based/ramdisk "-c:v
> h264_nvenc -preset -lossless". Have you also tried "-c:v hevc_nvenc -preset
> lossless"? What's the encoding FPS that you're getting? You technically
> shouldn't be able get much more than 60fps as that's what your capture card
> is supplying. Can you monitor the "Video Engine Utilization" during
> encoding? In linux it's listed in the nvidia-settings GUI or "nvidia-smi
> dmon" on the CLI will show enc/dec%.
>
>
> > ffmpeg -f dshow -video_size 3840x2160 -framerate 6/1001 -rtbufsize
> > 21 -pixel_format bgr24 -i video="MZ0380 PCI, Analog 01 Capture"
> > -c:v rawvideo -f null -
> > Gets me nearly x1 performance when executing from a ram disk but
> >
> > ffmpeg -f dshow -video_size 3840x2160 -framerate 6/1001 -rtbufsize
> > 21 -pixel_format bgr24 -i video="MZ0380 PCI, Analog 01 Capture"
> > -c:v rawvideo raw.nut
> > Only gets me x0.5 and the buffer overflows.
>
>
> > Is there a way of accelerating rawvideo decoding? Would using my
> > colleagues 1080 make a difference? Thanks.
>
>
> I think raw-video is already decoded. So no way/need to accelerate that.
> You might try a different pix_fmt from your capture card while using
> hw-encoding, but you'd have to test. I don't know the internals, i.e. when
> the pixel format is converted during hw-encoding. So it might make a
> difference.
>
> Changing pixel formats might be a concern if you are trying to achieve
> "100% lossless" capture. I've read that yuv444p should be sufficient
> colorspace for bgr24.
>
> There isn't a lot of info out there on encoding speed differences based on
> GPU models. It's a complex subject, but from what I have observed the ASIC
> is tied to the GPU clock (I have observed that GPU clock speed increases as
> ASIC load increases). If that's true, then a GTX 1080, with it's higher max
> clock, could have faster encoding, but I have no data to back that up.
> ___
> ffmpeg-user mailing list
> ffmpeg-user@ffmpeg.org
> http://ffmpeg.org/mailman/listinfo/ffmpeg-user
>
> To unsubscribe, visit link above, or email ffmpeg-user-requ...@ffmpeg.org
> with subject "unsubscribe".
>
___
ffmpeg-user mailing list
ffmpeg-user@ffmpeg.org
http://ffmpeg.org/mailman/listinfo/ffmpeg-user

To unsubscribe, visit link above, or email
ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".

Re: [FFmpeg-user] Encoding 4K 60Hz lossless from a capture card

2017-12-15 Thread Roger Pack
You are still performing encoding:

"(native) -> (h264)"

On 12/12/17, Alex Pizzi  wrote:
> Windows 10 64-bit
> Ryzen 7
> GTX 1080
> 32GB RAM
>
> Hi all,
>
> I'm trying to encode 4K 30/60Hz video in a lossless format from a 4K
> capture card and everything I've tried gives me a similar error as in the
> linked image, [real-time buffer too full or near too full frame dropped]
>
> https://cloud.githubusercontent.com/assets/4932401/22171307/ef5c9864-df58-11e6-8821-4b74ce3f32d0.png
>
> This is the command I've tried most recently:
>
> ffmpeg.exe -f dshow -video_size 3840x2160 -framerate 30 -pixel_format bgr24
> -rtbufsize INT_MAX -i video="MZ0380 PCI, Analog 01 Capture" -vf fps=30
> out%d.BMP
>
> With the images dumped to a 10G RAM disk or 850 EVO. I'm doing this to skip
> the encoding step.
>
> I get the same error when encoding with h265 lossless and NVENC h265
> lossless.
>
> I need the video to be lossless as it will be used to test hardware h265
> encoders.
>
> Video source is a 4K Blu-ray.
>
> Any help would be greatly appreciated. Thank you.
>
> -Alex P
> ___
> ffmpeg-user mailing list
> ffmpeg-user@ffmpeg.org
> http://ffmpeg.org/mailman/listinfo/ffmpeg-user
>
> To unsubscribe, visit link above, or email
> ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".
___
ffmpeg-user mailing list
ffmpeg-user@ffmpeg.org
http://ffmpeg.org/mailman/listinfo/ffmpeg-user

To unsubscribe, visit link above, or email
ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".

Re: [FFmpeg-user] Batch convert JPG to Animated GIF's - Where to start?

2017-11-17 Thread Roger Pack
It appears ffmpeg can create animated GIFs:
https://stackoverflow.com/questions/3688870/create-animated-gif-from-a-set-of-jpeg-images

In terms of a script to pass ffmpeg the right parameters, that's
unfortunately probably on you, mate.
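As a rough starting point, a shell loop over a CSV could generate the ffmpeg
calls. This dry-run sketch only builds the concat list files and prints the
commands it would run; all file names are invented for illustration:

```shell
# Each CSV row is "output.gif,frame1.jpg,frame2.jpg,..." (names made up here).
cat > gifs.csv <<'EOF'
out1.gif,a.jpg,b.jpg,c.jpg
out2.gif,d.jpg,e.jpg
EOF

while IFS=, read -r out frames; do
  list="${out%.gif}.txt"
  # one "file <name>" line per frame, the format the concat demuxer reads
  printf 'file %s\n' $(echo "$frames" | tr ',' ' ') > "$list"
  # the real invocation would be along the lines of:
  #   ffmpeg -f concat -i "$list" -vf "fps=10,scale=320:-1" "$out"
  echo "ffmpeg -f concat -i $list -vf fps=10,scale=320:-1 $out"
done < gifs.csv
```

Speed and loop count could be extra CSV columns mapped onto the fps filter and
the -loop option.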

On 11/17/17, colink  wrote:
> I have very little experience with command line interface but can follow
> instructions or edit a working file to suit my needs.
>
> Is there any way to batch create GIF's from multiple images?
>
> Ideally I would like to create a CSV file or Excel file where each row would
> list the files to include in the animated GIF (plus if necessary parameters
> like speed and number of cycles).
>
> Files could be on my hard drive or on a remote server (CSV, would include
> URL for file location)
>
> Can I do this with FFMPEG? Does any script exist?
>
> Where can I find more information.
>
> Willing to consider any other solution with FFMPEG or any other application
> that allows me to create large numbers of Animated GIF's in an automated
> way.
>
> Thanks ColinK
>
>
>
> --
> Sent from: http://www.ffmpeg-archive.org/
> ___
> ffmpeg-user mailing list
> ffmpeg-user@ffmpeg.org
> http://ffmpeg.org/mailman/listinfo/ffmpeg-user
>
> To unsubscribe, visit link above, or email
> ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".
___
ffmpeg-user mailing list
ffmpeg-user@ffmpeg.org
http://ffmpeg.org/mailman/listinfo/ffmpeg-user

To unsubscribe, visit link above, or email
ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".

Re: [FFmpeg-user] ffmpeg on windows seems to ignore -to

2017-10-24 Thread Roger Pack
Does it work in Linux? What version of ffmpeg are you using?
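As a workaround, if -to is being ignored you can give a duration with -t
instead — 00:18:22 to 00:18:44 is 22 seconds:

```
# -ss before -i seeks quickly; -t takes a duration rather than an end point.
ffmpeg -ss 00:18:22 -i input.mov -t 22 -c copy clip10.mov
```

With -c copy the cut still lands on the nearest keyframe, so the clip may start
slightly early.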

On 10/20/17, Kevin Duffey  wrote:
> Hi all,
> So I am using this simple command on Windows 10 (64-bit) command to cut out
> a small clip:
> ffmpeg -i input.mov -ss 00:18:22.0 -to 00:18:44.0 -c copy clip10.mov
>
> This *should* give me a 22 second clip.
> Instead, it seems to delay for a couple of minutes.. and when I use D
> option, I see a ton of
> cur_dts is invalid (this is harmless if it occurs once at the start per
> stream)
> Last message repeated 341 times
>
> I have tried various combos, including
> ffmpeg -i input.mov -ss 00:18:22.000 -to 00:18:44.000 -c copy clip10.mov
> ffmpeg -i input.mov -ss 00:18:22 -to 00:18:44 -c copy clip10.mov
> ffmpeg -i input.mov -ss 00:18:22 -t 00:18:44 -c copy clip10.mov
> as well as
> ffmpeg -ss 00:18:00 -i input.mov -ss 00:18:22 -to 00:18:44 -c copy
> clip10.mov
>
> and other variances with the .000, .0, and so forth.
> From some recent Stack Overflow posts and other bits online, I don't see that
> I am doing anything wrong.
> The -t option typically specifies a duration, whereas -to specifies the
> end point at which to stop the clip.
> This is with ffmpeg latest (3.4).
> Any help would be appreciated.
> ___
> ffmpeg-user mailing list
> ffmpeg-user@ffmpeg.org
> http://ffmpeg.org/mailman/listinfo/ffmpeg-user
>
> To unsubscribe, visit link above, or email
> ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".
___
ffmpeg-user mailing list
ffmpeg-user@ffmpeg.org
http://ffmpeg.org/mailman/listinfo/ffmpeg-user

To unsubscribe, visit link above, or email
ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".

Re: [FFmpeg-user] Take webcam picture with minimal delay

2017-10-24 Thread Roger Pack
Maybe overwrite a jpg file "over and over" and just grab a copy of it
on demand when you need it?
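A sketch of that idea with the v4l2 source from your command — the image2
muxer's -update option keeps rewriting a single file, which the HTTP handler
can then serve on demand:

```
# -update 1 overwrites current.jpg with the newest frame instead of
# writing a numbered sequence; the camera stays initialized the whole time.
ffmpeg -f video4linux2 -video_size 1920x1080 -input_format yuyv422 \
  -i /dev/video0 -qscale:v 1 -update 1 -y current.jpg
```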

On 9/20/17, m...@stefan-kleeschulte.de  wrote:
> Hi everyone!
>
> I want to take a picture from a webcam (to send it in a http server's
> response). I already found a command to take a picture and write it to
> stdout (from where I can grab it in the http server):
>
> ffmpeg -f video4linux2 -video_size 1920x1080 -input_format yuyv422 -i
> /dev/video0 -f image2 -frames:v 1 -qscale:v 1 pipe:1
>
> The only drawback is that it takes about 3 seconds to get the picture.
> The delay comes from ffmpeg (not from the server/network), probably
> because it needs to wait for the webcam to initialize.
>
> Now my idea is to somehow keep the webcam active and grab the current
> video frame whenever a http request comes in. But I do not know how to
> do this with ffmpeg (on linux). Can I use one instance of ffmpeg to
> capture the webcam video and stream it *somewhere*, and then use a
> second instance of ffmpeg to extract the current frame of that stream?
> Or is there a better way to do it?
>
> Thanks!
> Stefan Kleeschulte
> ___
> ffmpeg-user mailing list
> ffmpeg-user@ffmpeg.org
> http://ffmpeg.org/mailman/listinfo/ffmpeg-user
>
> To unsubscribe, visit link above, or email
> ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".
___
ffmpeg-user mailing list
ffmpeg-user@ffmpeg.org
http://ffmpeg.org/mailman/listinfo/ffmpeg-user

To unsubscribe, visit link above, or email
ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".

Re: [FFmpeg-user] How to have a secure UDP BroadCast

2017-10-04 Thread Roger Pack
I don't think UDP can have passwords, can it?
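Plain udp:// carries no authentication at all. If your ffmpeg build includes
libsrt, the srt protocol supports an encryption passphrase, which may be closer
to what you want — the address and passphrase below are placeholders:

```
# Assumes an ffmpeg build with libsrt; the receiver must use the same
# passphrase (SRT requires it to be at least 10 characters).
ffmpeg -f dshow -video_size 1280x720 -framerate 30 \
  -i video="Video (00-0 Pro Capture Quad HDMI)" \
  -c:v libx264 -preset ultrafast -f mpegts \
  "srt://192.168.1.12:9000?passphrase=CorrectHorseBattery"
```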

On 10/2/17, hassan shatnawi  wrote:
> Dears,
>
> How to have a secure connection with username and password while
> broadcasting "UDP"
>
> I've use this command
>
> ffmpeg -f dshow -video_size 1280x720 -rtbufsize 702000K -framerate 30
> -i video="Video (00-0 Pro Capture Quad HDMI)" -r 30 -threads 4 -vcodec
> libx264 -crf 0 -preset ultrafast -f mpegts
> "udp://username:password@192.168.1.12:"
>
> but it didn't work can you advice me plz.
>
> Thanks in advance.
>
> *Hassan Shatnawi*
> Software Developer
>
> *Software Development Department*
>
>
>
> *Haupshy Establishment for Electronics*
>
> Al Madina Al Monawara St. bld 31 | P.O.Box 6875 Amman 8, Jordan.
>
> *T: *+962 6 5820 927* | F: *+962 6 5820 926* | M: *+962 78 5792 096
>
> *E*: has...@haupshy.com *|* www.haupshy.com
> ___
> ffmpeg-user mailing list
> ffmpeg-user@ffmpeg.org
> http://ffmpeg.org/mailman/listinfo/ffmpeg-user
>
> To unsubscribe, visit link above, or email
> ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".
___
ffmpeg-user mailing list
ffmpeg-user@ffmpeg.org
http://ffmpeg.org/mailman/listinfo/ffmpeg-user

To unsubscribe, visit link above, or email
ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".

Re: [FFmpeg-user] Record H264 webcam with DirectShow

2017-08-28 Thread Roger Pack
hmm wonder what types those pins are broadcasting :|
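One thing worth trying: dshow has a video_pin_name option for selecting a pin
explicitly. Both pins here report the name "Capture", so it may not
disambiguate them, but it shows the mechanism:

```
# video_pin_name picks the dshow pin by name; with two pins both named
# "Capture" this may still grab the first one.
ffmpeg -f dshow -video_pin_name Capture -vcodec h264 -video_size 640x480 \
  -framerate 30 -i video="Microsoft LifeCam Front" -c:v copy out1.mp4
```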

On 7/29/17, Aviv Hurvitz  wrote:
> Now with debug output:
>
>>ffmpeg -y -f dshow -video_size 640x480 -vcodec h264 -framerate 30 -report
> -loglevel trace -rtbufsize 100M  -i video="Microsoft LifeCam Front" -vcodec
> copy -copyinkf  out1.mp4
> ffmpeg started on 2017-07-29 at 11:22:40
> Report written to "ffmpeg-20170729-112240.log"
> ffmpeg version N-86848-g03a9e6f Copyright (c) 2000-2017 the FFmpeg
> developers
>   built with gcc 7.1.0 (GCC)
>   configuration: --enable-gpl --enable-version3 --enable-cuda
> --enable-cuvid --enable-d3d11va --enable-dxva2 --enable-libmfx
> --enable-nvenc --enable-avisynth --enable-bzlib --enable-fontconfig
> --enable-frei0r --enable-gnutls --enable-iconv --enable-libass
> --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libfreetype
> --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug
> --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb
> --enable-libopenh264 --enable-libopenjpeg --enable-libopus --enable-librtmp
> --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libtheora
> --enable-libtwolame --enable-libvidstab --enable-libvo-amrwbenc
> --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp
> --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid
> --enable-libzimg --enable-lzma --enable-zlib
>   libavutil  55. 68.100 / 55. 68.100
>   libavcodec 57.102.100 / 57.102.100
>   libavformat57. 76.100 / 57. 76.100
>   libavdevice57.  7.100 / 57.  7.100
>   libavfilter 6. 95.100 /  6. 95.100
>   libswscale  4.  7.101 /  4.  7.101
>   libswresample   2.  8.100 /  2.  8.100
>   libpostproc54.  6.100 / 54.  6.100
> Splitting the commandline.
> Reading option '-y' ... matched as option 'y' (overwrite output files) with
> argument '1'.
> Reading option '-f' ... matched as option 'f' (force format) with argument
> 'dshow'.
> Reading option '-video_size' ... matched as AVOption 'video_size' with
> argument '640x480'.
> Reading option '-vcodec' ... matched as option 'vcodec' (force video codec
> ('copy' to copy stream)) with argument 'h264'.
> Reading option '-framerate' ... matched as AVOption 'framerate' with
> argument '30'.
> Reading option '-report' ... matched as option 'report' (generate a report)
> with argument '1'.
> Reading option '-loglevel' ... matched as option 'loglevel' (set logging
> level) with argument 'trace'.
> Reading option '-rtbufsize' ... matched as AVOption 'rtbufsize' with
> argument '100M'.
> Reading option '-i' ... matched as input url with argument 'video=Microsoft
> LifeCam Front'.
> Reading option '-vcodec' ... matched as option 'vcodec' (force video codec
> ('copy' to copy stream)) with argument 'copy'.
> Reading option '-copyinkf' ... matched as option 'copyinkf' (copy initial
> non-keyframes) with argument '1'.
> Reading option 'out1.mp4' ... matched as output url.
> Finished splitting the commandline.
> Parsing a group of options: global .
> Applying option y (overwrite output files) with argument 1.
> Applying option report (generate a report) with argument 1.
> Applying option loglevel (set logging level) with argument trace.
> Successfully parsed a group of options.
> Parsing a group of options: input url video=Microsoft LifeCam Front.
> Applying option f (force format) with argument dshow.
> Applying option vcodec (force video codec ('copy' to copy stream)) with
> argument h264.
> Successfully parsed a group of options.
> Opening an input file: video=Microsoft LifeCam Front.
> [dshow @ 02643500] Selecting pin Capture on video
> [dshow @ 02643500] Could not RenderStream to connect pins
> video=Microsoft LifeCam Front: I/O error
>
>
>
> On Fri, Jul 28, 2017 at 8:06 PM, Aviv Hurvitz 
> wrote:
>
>> I’m on a Surface Pro 3 running Windows 10. Trying to record an mp4 from a
>> native H.264 stream.
>>
>> The H.264 video sources are on the second “pin” and I suspect that’s the
>> problem. But the first and second pins have the same name, so I don’t know
>> how to specify them. I was able to record from the first “pin” by removing
>> the “-vcodech264”.
>> Here are commands:
>>
>>
>>
>>
>>
>> >ffmpeg -list_options true -f dshow -i video="Microsoft LifeCam Front"
>>
>>
>>
>> ffmpeg version N-86848-g03a9e6f Copyright (c) 2000-2017 the FFmpeg
>> developers
>>
>> built with gcc 7.1.0 (GCC)
>>
>> configuration: --enable-gpl --enable-version3 --enable-cuda --enable-cuvid
>> --enable-d3d11va --enable-dxva2 --enable-libmfx --enable-nvenc
>> --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r
>> --enable-gnutls --enable-iconv --enable-libass --enable-libbluray
>> --enable-libbs2b --enable-libcaca --enable-libfreetype --enable-libgme
>> --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmp3lame
>> --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenh264
>> --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libsnap

Re: [FFmpeg-user] Record H264 webcam with DirectShow

2017-07-28 Thread Roger Pack
output with -loglevel debug?

On 7/28/17, Aviv Hurvitz  wrote:
> I’m on a Surface Pro 3 running Windows 10. Trying to record an mp4 from a
> native H.264 stream.
>
> The H.264 video sources are on the second “pin” and I suspect that’s the
> problem. But the first and second pins have the same name, so I don’t know
> how to specify them. I was able to record from the first “pin” by removing
> the “-vcodech264”.
> Here are commands:
>
>
>
>
>
>>ffmpeg -list_options true -f dshow -i video="Microsoft LifeCam Front"
>
>
>
> ffmpeg version N-86848-g03a9e6f Copyright (c) 2000-2017 the FFmpeg
> developers
>
> built with gcc 7.1.0 (GCC)
>
> configuration: --enable-gpl --enable-version3 --enable-cuda --enable-cuvid
> --enable-d3d11va --enable-dxva2 --enable-libmfx --enable-nvenc
> --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r
> --enable-gnutls --enable-iconv --enable-libass --enable-libbluray
> --enable-libbs2b --enable-libcaca --enable-libfreetype --enable-libgme
> --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmp3lame
> --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenh264
> --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libsnappy
> --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame
> --enable-libvidstab --enable-libvo-amrwbenc --enable-libvorbis
> --enable-libvpx --enable-libwavpack --enable-libwebp
> --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid
> --enable-libzimg --enable-lzma --enable-zlib
>
> libavutil 55. 68.100 / 55. 68.100
>
> libavcodec 57.102.100 / 57.102.100
>
> libavformat 57. 76.100 / 57. 76.100
>
> libavdevice 57. 7.100 / 57. 7.100
>
> libavfilter 6. 95.100 / 6. 95.100
>
> libswscale 4. 7.101 / 4. 7.101
>
> libswresample 2. 8.100 / 2. 8.100
>
> libpostproc 54. 6.100 / 54. 6.100
>
> [dshow @ 00df6ce0] DirectShow video device options (from video
> devices)
>
> [dshow @ 00df6ce0] Pin "Capture" (alternative pin name "Capture")
>
> [dshow @ 00df6ce0] pixel_format=yuyv422 min s=640x360 fps=15 max
> s=640x360 fps=30
>
> [dshow @ 00df6ce0] pixel_format=yuyv422 min s=640x480 fps=15 max
> s=640x480 fps=30
>
> [dshow @ 00df6ce0] pixel_format=yuyv422 min s=480x270 fps=15 max
> s=480x270 fps=30
>
> [dshow @ 00df6ce0] pixel_format=yuyv422 min s=424x240 fps=15 max
> s=424x240 fps=30
>
> [dshow @ 00df6ce0] pixel_format=yuyv422 min s=320x240 fps=15 max
> s=320x240 fps=30
>
> [dshow @ 00df6ce0] pixel_format=yuyv422 min s=320x180 fps=15 max
> s=320x180 fps=30
>
> [dshow @ 00df6ce0] pixel_format=yuyv422 min s=160x120 fps=15 max
> s=160x120 fps=30
>
> [dshow @ 00df6ce0] pixel_format=yuyv422 min s=848x480 fps=15 max
> s=848x480 fps=30
>
> [dshow @ 00df6ce0] pixel_format=yuyv422 min s=1920x1080 fps=15 max
> s=1920x1080 fps=30
>
> [dshow @ 00df6ce0] pixel_format=yuyv422 min s=1280x720 fps=15 max
> s=1280x720 fps=30
>
> [dshow @ 00df6ce0] pixel_format=yuyv422 min s=960x540 fps=15 max
> s=960x540 fps=30
>
> [dshow @ 00df6ce0] pixel_format=yuyv422 min s=2592x1944 fps=15 max
> s=2592x1944 fps=15
>
> [dshow @ 00df6ce0] pixel_format=yuyv422 min s=2592x1728 fps=15 max
> s=2592x1728 fps=15
>
> [dshow @ 00df6ce0] pixel_format=yuyv422 min s=1296x864 fps=15 max
> s=1296x864 fps=30
>
> [dshow @ 00df6ce0] Pin "Capture" (alternative pin name "Capture")
>
> [dshow @ 00df6ce0] vcodec=h264 min s=1920x1080 fps=15 max
> s=1920x1080 fps=30
>
> [dshow @ 00df6ce0] vcodec=h264 min s=1280x720 fps=15 max s=1280x720
> fps=30
>
> [dshow @ 00df6ce0] vcodec=h264 min s=960x540 fps=15 max s=960x540
> fps=30
>
> [dshow @ 00df6ce0] vcodec=h264 min s=848x480 fps=15 max s=848x480
> fps=30
>
> [dshow @ 00df6ce0] vcodec=h264 min s=640x480 fps=15 max s=640x480
> fps=30
>
> [dshow @ 00df6ce0] vcodec=h264 min s=640x360 fps=15 max s=640x360
> fps=30
>
> [dshow @ 00df6ce0] vcodec=h264 min s=480x270 fps=15 max s=480x270
> fps=30
>
> [dshow @ 00df6ce0] vcodec=h264 min s=424x240 fps=15 max s=424x240
> fps=30
>
> [dshow @ 00df6ce0] vcodec=h264 min s=320x240 fps=15 max s=320x240
> fps=30
>
> [dshow @ 00df6ce0] vcodec=h264 min s=320x180 fps=15 max s=320x180
> fps=30
>
> [dshow @ 00df6ce0] vcodec=h264 min s=160x120 fps=15 max s=160x120
> fps=30
>
> video=Microsoft LifeCam Front: Immediate exit requested
>
>
>
>ffmpeg -y -f dshow -video_size 640x480 -vcodec h264 -framerate 30
> -report -rtbufsize 100M -i video="Microsoft LifeCam Front" -vcodec copy
> -copyinkf out1.mp4
>
>
>
> ffmpeg started on 2017-07-28 at 19:51:23
>
> Report written to "ffmpeg-20170728-195123.log"
>
> ffmpeg version N-86848-g03a9e6f Copyright (c) 2000-2017 the FFmpeg
> developers
>
> built with gcc 7.1.0 (GCC)
>
> configuration: --enable-gpl --enable-version3 --enable-cuda --enable-cuvid
> --enable-d3d11va --enable-dxva2 --enable-libmfx --enable-nvenc
> 

Re: [FFmpeg-user] Screen Capture - Windows 10

2017-06-01 Thread Roger Pack
Maybe turn off graphics acceleration for your browser?

On 6/1/17, Ron Barnes  wrote:
> Hello all,
>
>
>
> I have been trying for days to figure out how to capture the video from my
> desktop and save it as a playable file.  I've been successful to a point.
>
> I am able to capture audio and video from my desktop as far as opening and
> using apps, moving my mouse and what not, but when I open a web browser and
> go to google or CNN or Yahoo or Fox News or anywhere for that matter, the
> Browser Window header is present (Firefox or IE) and the mouse pointer but
> the Browser Window with the content is a black screen.  I believe it is
> because Microsoft no longer supports Direct Show and or my video card is not
> compatible with FFmpeg or the feed to the browser is somehow separate from
> the video I can see on the screen.  I'm clueless here.
>
>
>
> Does anyone see where I may have gone south in the script below or the
> hardware I'm using?  If possible, would someone offer an alternate script I
> can try?
>
>
>
> Here is my script.  It has been put together using google searches as I'm
> very new to FFmpeg.
>
>
>
> ffmpeg -y -rtbufsize 2000M -f gdigrab -thread_queue_size 1024 -probesize 50M
> -r 30 -video_size 1920x1080 -draw_mouse 1 -i desktop -f dshow -i
> audio="Stereo Mix (Realtek High Definition Audio)" -c:v libx264 -r 30
> -preset ultrafast -tune zerolatency -crf 25 -pix_fmt yuv420p -c:a aac
> -strict -2 -ac 2 -b:a 96k -y "I:\Captured\TestAV.mp4"
>
>
>
>
>
>
>
> Please let me start off with some information regarding my system.  I have a
> 4GHZ/8 core AMD CPU with 128GB RAM, leveraging a 2TB Samsung SSD.  My Net
> connection is FIOS Gigabit service.  1GB down and 850MB up.  My Graphics
> card is NVIDIA GeForce GTX 560 with 2GB of memory.
>
> As you can see below I have only three direct show devices.  A webcam,
> webcam microphone and Stereo Mix.  There is no video device.
>
>
>
> I:\Captured>ffmpeg -list_devices true -f dshow -i dummy
>
> ffmpeg version N-86310-g220b24c Copyright (c) 2000-2017 the FFmpeg
> developers
>
>   built with gcc 7.1.0 (GCC)
>
>   configuration: --enable-gpl --enable-version3 --enable-cuda --enable-cuvid
> --enable-d3d11va --enable-dxva2 --enable-libmfx --enable-nvenc
> --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r
> --enable-gnutls --enable-iconv --enable-libass --enable-libbluray
> --enable-libbs2b --enable-libcaca --enable-libfreetype --enable-libgme
> --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmp3lame
> --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenh264
> --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libsnappy
> --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame
> --enable-libvidstab --enable-libvo-amrwbenc --enable-libvorbis
> --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264
> --enable-libx265 --enable-libxavs --enable-libxvid --enable-libzimg
> --enable-lzma --enable-zlib
>
>   libavutil  55. 63.100 / 55. 63.100
>
>   libavcodec 57. 96.101 / 57. 96.101
>
>   libavformat57. 72.101 / 57. 72.101
>
>   libavdevice57.  7.100 / 57.  7.100
>
>   libavfilter 6. 90.100 /  6. 90.100
>
>   libswscale  4.  7.101 /  4.  7.101
>
>   libswresample   2.  8.100 /  2.  8.100
>
>   libpostproc54.  6.100 / 54.  6.100
>
> [dshow @ 00d36c00] DirectShow video devices (some may be both video
> and audio devices)
>
> [dshow @ 00d36c00]  "Logitech HD Pro Webcam C920"
>
> [dshow @ 00d36c00] Alternative name
> "@device_pnp_\\?\usb#vid_046d&pid_082d&mi_00#7&4c735ef&0&#{65e8773d-8f56
> -11d0-a3b9-00a0c9223196}\{bbefb6c7-2fc4-4139-bb8b-a58bba724083}"
>
> [dshow @ 00d36c00] DirectShow audio devices
>
> [dshow @ 00d36c00]  "Microphone (HD Pro Webcam C920)"
>
> [dshow @ 00d36c00] Alternative name
> "@device_cm_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\wave_{CC06F875-05FB-40A6-
> 8E75-977962237BDA}"
>
> [dshow @ 00d36c00]  "Stereo Mix (Realtek High Definition Audio)"
>
> [dshow @ 00d36c00] Alternative name
> "@device_cm_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\wave_{A4989FE2-26C5-4354-
> 8237-19F262B7BF79}"
>
> dummy: Immediate exit requested
>
>
>
> Regards,
>
>
>
> -Ron B
>
> ___
> ffmpeg-user mailing list
> ffmpeg-user@ffmpeg.org
> http://ffmpeg.org/mailman/listinfo/ffmpeg-user
>
> To unsubscribe, visit link above, or email
> ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".
___
ffmpeg-user mailing list
ffmpeg-user@ffmpeg.org
http://ffmpeg.org/mailman/listinfo/ffmpeg-user

To unsubscribe, visit link above, or email
ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".

Re: [FFmpeg-user] gdigrab just capturing a still frame when using window title

2017-05-17 Thread Roger Pack
On 5/12/17, Kenn Sippell  wrote:
> Hello!  I am attempting to use ffmpeg to record some video games. In this
> scenario I'm recording Dota 2 from my PC (working great) and from my laptop
> (having some problems).
>
> ffmpeg -f gdigrab -i desktop output.flv
>
> This command works fine on both my PC and my laptop.
>
> ffmpeg -f gdigrab -i title="Dota 2" output.flv
>
> This command works fine on my PC but does not work on my laptop.  On my
> laptop, I'm often seeing either a) the game window is transparent, and I
> can see the desktop or other windows beneath it, or b) a single random
> frame from the game. In all cases the mouse is capturing accurately.
>
> My laptop has two graphic cards:
>
>- Intel HD Graphics 4600
>- nVidia GeForce GTX 860M
>
> I thought this might just be a manifestation of this sort of issue --
> https://obsproject.com/forum/threads/laptop-black-screen-when-capturing-read-here-first.5965
> But I've tried configuring ffmpeg.exe as described with no results.
>
> Is this a well understood issue or one I can work around in gdigrab?
> Is there an alternative to gdigrab that can capture the game in Windows?
> I'm very performance conscious for my scenario (minimize CPU), is there an
> alternative to gdigrab that is less CPU intense?

Does grabbing the whole desktop and then cropping help?
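For example, gdigrab can capture a fixed region of the desktop instead of a
window title — the offsets and size below are placeholders for wherever the
game window actually sits:

```
# -offset_x/-offset_y position the capture region on the desktop;
# adjust to the game window's location and dimensions.
ffmpeg -f gdigrab -framerate 30 -offset_x 0 -offset_y 0 \
  -video_size 1920x1080 -i desktop output.flv
```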
___
ffmpeg-user mailing list
ffmpeg-user@ffmpeg.org
http://ffmpeg.org/mailman/listinfo/ffmpeg-user

To unsubscribe, visit link above, or email
ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".

Re: [FFmpeg-user] Question about CBR bit rate settings

2017-04-19 Thread Roger Pack
I think you're right, the tutorial is probably outdated.
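For what it's worth, one common way to request true CBR from libx264 today is
equal -b:v/-minrate/-maxrate plus HRD signalling, dropping -crf (which
conflicts with a fixed bitrate). This is a sketch, not a drop-in for your full
command:

```
# nal-hrd=cbr makes x264 signal and pad to a constant video bitrate;
# the audio bitrate is set separately with -b:a.
ffmpeg -i in.mp4 -c:v libx264 -b:v 96k -minrate 96k -maxrate 96k \
  -bufsize 198k -x264-params nal-hrd=cbr -c:a aac -b:a 24k out.flv
```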

On 4/15/17, Gil Yoder  wrote:
> According to Korbel in *FFmpeg Basics*, "To set the constant bit rate for
> the output, three parameters must have the same value: bitrate (-b option),
> minimal rate (-minrate) and maximal rate (-maxrate)" (p. 62). That makes
> sense as far as it goes, but it is a bit unclear in my mind. It appears to
> me that -minrate and -maxrate effect only the video output. Is that right?
>
>
>
> If so, -b:v should be used instead of -b, along with -maxrate and -minrate
> to set the CBR of the video stream if the output has both audio and video.
> Is that correct?
>
>
>
> In such, the total bit rate will be the sum of the -b:v and -b:a options
> plus the demuxing overhead. Please, correct me if that's not right.
>
>
>
> I ask because I'm trying to construct an argument to produce a low bitrate
> live stream. So far I've come up with the following argument:
>
>
>
> ffmpeg -hide_banner -f dshow -sample_rate 11025 -video_size qvga -i
> video="HP Deluxe Webcam KQ246AA":audio="Microphone (Logitech USB Headset)"
> -f flv -r 15 -c:v libx264 -crf 30 -preset veryfast -maxrate 96k -minrate 96k
> -b:v 96k -bufsize 198k -pix_fmt yuv420p -g 30 -c:a aac -b:a 24k -metadata
> copyright="Copyright 2017 OABS" -metadata artist="Gil Yoder"
> "rtmp://***..***/live3/yoder?u=gil&e=1492253342&h=32628F7B51A3943A9D57FE
> 1B9D389328"
>
>
>
> This is the output for a short session:
>
>
>
> Guessed Channel Layout for Input Stream #0.1 : stereo
>
> Input #0, dshow, from 'video=HP Deluxe Webcam KQ246AA:audio=Microphone
> (Logitech USB Headset)':
>
>   Duration: N/A, start: 342651.162000, bitrate: N/A
>
> Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 320x240, 30
> fps, 30 tbr, 1k tbn, 1k tbc
>
> Stream #0:1: Audio: pcm_s16le, 11025 Hz, stereo, s16, 352 kb/s
>
> Stream mapping:
>
>   Stream #0:0 -> #0:0 (rawvideo (native) -> h264 (libx264))
>
>   Stream #0:1 -> #0:1 (pcm_s16le (native) -> aac (native))
>
> Press [q] to stop, [?] for help
>
> [libx264 @ 025760a0] using cpu capabilities: MMX2 SSE2Fast SSSE3
> SSE4.2 AVX
>
> [libx264 @ 025760a0] profile High, level 1.2
>
> [libx264 @ 025760a0] 264 - core 148 r2762 90a61ec - H.264/MPEG-4 AVC
> codec - Copyleft 2003-2017 - http://www.videolan.org/x264.html - options:
> cabac=1 ref=1 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=2 psy=1
> psy_rd=1.00:0.00 mixed_ref=0 me_range=16 chroma_me=1 trellis=0 8x8dct=1
> cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=0 threads=7
> lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0
> bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0
> direct=1 weightb=1 open_gop=0 weightp=1 keyint=30 keyint_min=3 scenecut=40
> intra_refresh=0 rc_lookahead=10 rc=crf mbtree=1 crf=30.0 qcomp=0.60 qpmin=0
> qpmax=69 qpstep=4 vbv_maxrate=96 vbv_bufsize=198 crf_max=0.0 nal_hrd=none
> filler=0 ip_ratio=1.40 aq=1:1.00
>
> Output #0, flv, to
> 'rtmp://***..***/live3/yoder?u=gil&e=1492253342&h=32628F7B51A3943A9D57FE
> 1B9D389328':
>
>   Metadata:
>
> copyright   : Copyright 2017 OABS
>
> artist  : Gil Yoder
>
> encoder : Lavf57.67.100
>
> Stream #0:0: Video: h264 (libx264) ([7][0][0][0] / 0x0007),
> yuv420p(progressive), 320x240, q=-1--1, 96 kb/s, 15 fps, 1k tbn, 15 tbc
>
> Metadata:
>
>   encoder : Lavc57.86.103 libx264
>
> Side data:
>
>   cpb: bitrate max/min/avg: 96000/0/96000 buffer size: 198000 vbv_delay:
> -1
>
> Stream #0:1: Audio: aac (LC) ([10][0][0][0] / 0x000A), 11025 Hz, stereo,
> fltp, 24 kb/s
>
> Metadata:
>
>   encoder : Lavc57.86.103 aac
>
> [flv @ 0259f240] Failed to update header with correct
> duration.7.2kbits/s dup=0 drop=3 speed=1.02x
>
> [flv @ 0259f240] Failed to update header with correct filesize.
>
> frame=  936 fps=7.6 q=-1.0 Lsize=1354kB time=00:02:04.86 bitrate=
> 88.8kbits/s dup=0 drop=3 speed=1.02x
>
> video:949kB audio:363kB subtitle:0kB other streams:0kB global headers:0kB
> muxing overhead: 3.117937%
>
> [libx264 @ 025760a0] frame I:32Avg QP:25.15  size: 17474
>
> [libx264 @ 025760a0] frame P:253   Avg QP:30.90  size:  1266
>
> [libx264 @ 025760a0] frame B:651   Avg QP:35.10  size:   141
>
> [libx264 @ 025760a0] consecutive B-frames:  7.2%  0.2%  0.3% 92.3%
>
> [libx264 @ 025760a0] mb I  I16..4:  2.3%  5.2% 92.5%
>
> [libx264 @ 025760a0] mb P  I16..4:  0.3%  0.4%  0.3%  P16..4: 33.4%
> 9.0%  7.5%  0.0%  0.0%skip:49.1%
>
> [libx264 @ 025760a0] mb B  I16..4:  0.0%  0.0%  0.0%  B16..8:  3.5%
> 1.7%  0.2%  direct: 3.8%  skip:90.7%  L0:27.9% L1:42.5% BI:29.6%
>
> [libx264 @ 025760a0] 8x8 transform intra:8.1% inter:14.6%
>
> [libx264 @ 025760a0] coded y,uvDC,uvAC intra: 92.4% 95.3% 88.1%
> inter: 6.0% 6.4% 0.1%
>
> [libx264 @ 025760a0] i16 v,h,dc,p: 50% 20% 18% 13%
>
> [libx264 @ 00

Re: [FFmpeg-user] Unsynced audio while recording from several sources

2017-04-19 Thread Roger Pack
Do the async or vsync options help?
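A sketch of where those options would go, applied to the command quoted below (device names are the ones from that command; note that -async 1 only pads/trims each audio stream once at the start, which matches the fixed initialization offsets described — this is an assumption, not a tested fix):

```shell
# Enumerate your devices first if the names differ:
#   ffmpeg -list_devices true -f dshow -i dummy
# -async 1 aligns the start of each audio stream with its reported
# timestamps; it does not stretch audio continuously.
ffmpeg -f dshow -i video="Logitech HD Webcam C615" \
       -f dshow -i audio="Microphone (HD Webcam C615)" \
       -f dshow -i audio="Microphone Array (Realtek High Definition Audio)" \
       -map 0:v -map 1:a -map 2:a -async 1 \
       -c:v libx264 -preset ultrafast out.mkv
```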

On 4/4/17, Valeriy Shevchuk  wrote:
> Hello, everyone.
> I'm trying to use FFmpeg to record video and 3 audio sources and use it to
> generate 3 different video files - each file should contain the same video
> stream but the different audio stream. The problem is that I got audio sync
> issues. The first audio stream is synced perfectly, but the second one has
> 1 sec lag, and the third one has like 2 sec lag.
>
> I've made a few tests so far and it seems that root cause of the issue is
> initialization time of video/audio devices. So, one device is already
> recording something but the second is still being opened and so on. I've
> tried to change input devices order and after that audio streams still have
> the same issue BUT if before 2nd and 3rd audio streams were some time ahead
> of video, after reordering they became to lag after the audio (audio for
> the same event appears with some delay). So this test confirms my version
> about device initialization times.
>
> But the question still, why the first audio stream is synchronized
> properly, while other 2 are not. and also, how could I overcome this
> issues? Any workarounds and ideas are highly appreciated.
>
> Here is FFmpeg command I'm using and it's output.
>
> ffmpeg.exe -f dshow -video_size 1920x1080 -i video="Logitech HD Webcam
>> C615"  -f dshow -i audio="Microphone (HD Webcam C615)" -f dshow -i
>> audio="Microphone Array (Realtek High Definition Audio)"
>> -filter_complex "[1:a]volume=1[a1];[2:a]volume=1[a2]" -vf
>> scale=h=1080:force_original_aspect_ratio=decrease -vcodec libx264
>> -pix_fmt yuv420p -crf 23 -preset ultrafast -acodec aac -vbr 5 -threads 0
>> -map v:0 -map [a1] -map [a2] -f tee
>> "[select=\'v,a:0\']C:/Users/vshevchu/Desktop/123/111/111_jjj1.avi|
>> [select=\'v,a:1\']C:/Users/vshevchu/Desktop/123/111/111_jjj2.avi"
>
>
> Here is the output:
> https://pastebin.com/A3FL05My
>
> PS. Actually, the issue is exactly the same when I'm not using "tee" muxer
> but writing all the audio streams to one container. So, "tee" isn't a
> suspect.
> ___
> ffmpeg-user mailing list
> ffmpeg-user@ffmpeg.org
> http://ffmpeg.org/mailman/listinfo/ffmpeg-user
>
> To unsubscribe, visit link above, or email
> ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".
___
ffmpeg-user mailing list
ffmpeg-user@ffmpeg.org
http://ffmpeg.org/mailman/listinfo/ffmpeg-user

To unsubscribe, visit link above, or email
ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".

Re: [FFmpeg-user] compile ffmpeg missing Xlib, or indevs is empty

2017-01-16 Thread Roger Pack
On 1/14/17, 密思刘  wrote:
> ffmpeg-3.2.2/b4$../configure --disable-ffplay --disable-ffprobe
> --disable-ffserver --disable-doc --disable-htmlpages --disable-manpages
> --disable-podpages --disable-txtpages --disable-logging --disable-protocols
> --enable-protocol='file,data,pipe' --disable-encoders --enable-encoder=gif
> --disable-decoders --enable-decoder='gif,rawvideo' --disable-outdevs
> --disable-filters --enable-filter='scale,zscale' --disable-muxers
> --enable-muxer='gif,rawvideo' --disable-demuxers
> --enable-demuxer='gif,rawvideo' --disable-hwaccels --disable-parsers
> --disable-bsfs --disable-indevs --enable-indev='dshow,x11grab'
> --extra-libs=-static --extra-cflags=--static --disable-sdl2
> --disable-bzlib  --disable-vdpau --disable-xvmc --disable-nvenc
> --enable-gpl --enable-nonfree  --enable-x11grab
> ERROR: Xlib not found
>
> If you think configure made a mistake, make sure you are using the latest
> version from Git.  If the latest version fails, report the problem to the
> ffmpeg-user@ffmpeg.org mailing list or IRC #ffmpeg on irc.freenode.net.
> Rerun configure with logging enabled (do not use --disable-logging), and
> include the log this produces with your report.

--enable-indev='dshow,x11grab'

dshow on Linux won't work, since DirectShow is a Windows-only framework; drop it from --enable-indev. The "Xlib not found" error itself means configure couldn't find the X11 development headers that x11grab needs.
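A hedged sketch of a Linux-friendly variant (package names assume Debian/Ubuntu; the rest of the original configure flags would stay as they were):

```shell
# Install the X11 development headers so configure can find Xlib,
# then enable only x11grab (dshow is Windows/DirectShow-only):
sudo apt-get install libx11-dev libxext-dev libxfixes-dev
./configure --disable-indevs --enable-indev=x11grab --enable-x11grab --enable-gpl
```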
___
ffmpeg-user mailing list
ffmpeg-user@ffmpeg.org
http://ffmpeg.org/mailman/listinfo/ffmpeg-user

To unsubscribe, visit link above, or email
ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".

Re: [FFmpeg-user] Help with -video_size

2017-01-09 Thread Roger Pack
On 1/6/17, Tim Hiles  wrote:
> Hi Moritz, Roger
>
> Hi Moritz
>
>>
>> >
>> > Can you be a bit more clear? Are you trying to record the full 1366x768
>> > and resize it to 1024x768, or are you trying to capture only 1024x768
>> > of your screen?
>>
>
> I wanted to resize to 1024x768. Not capture part of the screen.
>
> If you only want a segment of your desktop, you can let dshow capture
>> the complete 1366x768, and have ffmpeg crop the video to the part you
>> want to preserve. Insert (e.g.) "-vf crop=x=0:y=0:w=1024:h=768".
>
>
> Again, I don't want to crop it, however your suggestion reminded me of the
> scaling filter, I see there is -vf scale=1024:-1 The question I would have
> though is whether you believe filters create more work for ffmpeg or the
> computer as opposed to regular parameters such as -video_size? In other
> words, is a filter considered re-encoding where -video_size wouldn't be.
> Either way, I plan on testing with the filter.

Theoretically you could tell the capture filter to "downscale" it for
you, which might save a little CPU.  I wouldn't imagine that resizing
is awful, but I don't know which algorithm ffmpeg uses by default
(swscale has a lot of different scaling algorithms), so it's worth
profiling/benchmarking.
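For benchmarking, swscale's algorithm can be chosen explicitly via the scale filter's flags option (a sketch; bilinear and lanczos are two ends of the cost/quality spectrum):

```shell
# Pick the scaler explicitly: bilinear is cheapest, lanczos sharpest.
# scale=1024:-2 keeps the aspect ratio and forces an even height,
# which yuv420p/x264 requires.
ffmpeg -f dshow -i video="screen-capture-recorder" \
       -vf "scale=1024:-2:flags=bilinear" \
       -c:v libx264 -preset ultrafast -pix_fmt yuv420p out.mp4
```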

> Hi Roger,
>
>>
>> >   By default, it captures the "full screen" of the main desktop monitor
>> >   [...] To configure it differently, run the provided "configuration
>> >   setup utilities/shortcuts" or adjust registry settings before
>> >   starting a run (advanced users only) [...]
>>
>> I think it might respect video_size as well but I'm not certain, and
>> definitely not sure if it i truncates versus scaled or what not,
>> anyway you could try it.  Yes, I'm not sure and I wrote it  LOL, guess
>> it's been awhile.
>
>
> I noticed the code hasn't been updated in awhile on the git page. Have you
> stopped development?

Haven't worked on it in awhile.

> I'm curious, was the initial intention of you writing
> this program, a response of ffmpeg not having GDIgrab or disliking the
> results at the time?

If gdigrab had existed at the time I probably wouldn't have created
it.  Its initial impetus was that VLC couldn't capture screen + audio
at the same time (and possibly still can't LOL).

> If you disliked the results, would you say GDIgrab
> has improved?  Would you recommend using GDIgrab? Is that something that
> ffmpeg devs are consistently developing?

FFmpeg seems more efficient than other things I'd used; that's why I use it.

> And if so, is this why you've
> stopped development of screen capture recorder? My concern I suppose is
> will screen-capture-recorder be unusable as ffmpeg develops. Let me make it
> clear there is no judgement here either way, I just want to know for my own
> future development as I like the results of screen-capture-recorder.

I suppose the biggest danger is if Windows changes its dshow stuff
altogether, which it hasn't yet, but who knows at some point.
Cheers!
___
ffmpeg-user mailing list
ffmpeg-user@ffmpeg.org
http://ffmpeg.org/mailman/listinfo/ffmpeg-user

To unsubscribe, visit link above, or email
ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".

Re: [FFmpeg-user] Help with -video_size

2017-01-06 Thread Roger Pack
On 1/6/17, Moritz Barsnick  wrote:
> On Fri, Jan 06, 2017 at 00:14:31 -0800, Tim Hiles wrote:
>> I've tried recording my screen set at 1366 x 768 and specified -video_size
>> to 1024x768. The command did create a recording however it did not change
>> the size. The output is below (further comments after the output).
>
> Can you be a bit more clear? Are you trying to record the full 1366x768
> and resize it to 1024x768, or are you trying to capture only 1024x768
> of your screen?
>
>> ffmpeg -rtbufsize 1000M -f dshow -i
>> video="screen-capture-recorder":audio="Internal Mic (IDT High Definiti"
>> -flags +global_header -vcodec libx264 -pix_fmt  yuv420p -video_size
>> 1024x768 -preset ultrafast -acodec aac -ac 1 -ar 22050 "Sessiontest2.mp4"
>
> Since -video_size is an input option, it won't work this way. If you
> want to tell the input to use this size, you need to place it *before*
> the respective "-i"; which you did in your second example:
>
>> ...which is switching -video_size parameter before the codec and apply it
>> to my command this is what I get.
>>
>> ffmpeg -rtbufsize 1000M -f dshow -video_size 1024x768 -i
>> video="screen-capture-recorder":audio="Internal Mic (IDT High Definiti"
>> -flags +global_header -vcodec libx264 -pix_fmt yuv420p -video_size
>> 1024x768 -preset ultrafast -acodec aac -ac 1 -ar 22050 "Sessiontest2.mp4"
>
> The second "-video_size" has no effect, as mentioned above.

Yeah, I suppose in a perfect world FFmpeg would at least *complain*
about unused parameters, but it is as Moritz described.

>> [dshow @ 00558fe0] Could not set video options
>> video=screen-capture-recorder:audio=Internal Mic (IDT High Definiti: I/O
>> error
>
> As the documentation says: "If the device does not support the
> requested options, it will fail to open."
>
> I would guess that the "-video_size" option is not supported by
> screen-capture-recorder. Reading about it does confirm this:
>
> https://github.com/rdp/screen-capture-recorder-to-video-windows-free
> See the section "Configuration":
>
>   By default, it captures the "full screen" of the main desktop monitor
>   [...] To configure it differently, run the provided "configuration
>   setup utilities/shortcuts" or adjust registry settings before
>   starting a run (advanced users only) [...]

I think it might respect video_size as well, but I'm not certain, and
definitely not sure whether it truncates or scales or what not;
anyway, you could try it.  Yes, I'm not sure and I wrote it, LOL;
guess it's been a while.
___
ffmpeg-user mailing list
ffmpeg-user@ffmpeg.org
http://ffmpeg.org/mailman/listinfo/ffmpeg-user

To unsubscribe, visit link above, or email
ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".

Re: [FFmpeg-user] Audio slowly getting out of sync

2016-11-22 Thread Roger Pack
On 11/21/16, Gabor Alsecz  wrote:
> Dear All,
>
> I have an issue when encoding HLS stream from an RTSP mjpeg video stream +
> USB audio via dshow.
> At the beginning audio & video are in sync but slowly but steadily the
> sound drifting out of sync. Video seems faster, approximately each 10
> minutes the audio late +1 second.

I wonder if the RTSP stream has timestamps that are "getting out of sync".
If you want to see whether the dshow side is drifting, run with
"-loglevel debug"; it spits out tons of timing info.

Anyway, there are some "vsync"/"async" options; see if either helps.
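Concretely, the two suggestions look something like this, applied to the command quoted below (the mic name comes from that command; adding -vsync cfr and -async 1 is my assumption of a starting point, not a tested cure — -async 1 in particular only aligns the start of the audio, so it may not fix a slow drift):

```shell
# 1) Dump per-packet timestamps to see which input drifts:
ffmpeg -loglevel debug -f dshow -i audio="Microphone (Trust Gaming Microphone)" \
       -t 30 -f null - 2> timing.log

# 2) Retry the original pipeline with explicit sync options:
#    -vsync cfr duplicates/drops video frames to hold a constant rate.
ffmpeg -f dshow -i audio="Microphone (Trust Gaming Microphone)" \
       -pix_fmt bgr24 -video_size 1280x720 -framerate 25 -f rawvideo -i - \
       -vsync cfr -async 1 -b:a 160k -ar 44100 -pix_fmt yuv420p \
       -hls_time 5 -hls_list_size 0 -f hls out.m3u8
```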

> I am using the following command:
> ffmpeg -y -loglevel info -f dshow -i audio="Microphone (Trust Gaming
> Microphone)" -pix_fmt bgr24 -video_size 1280x720 -framerate 25 -f rawvideo
>   -i "-" -b:a 160k -ar 44100 -pix_fmt yuv420p -profile:v baseline -r 25 -g
> 50 -hls_time 5 -hls_list_size 0 -hls_segment_filename file%03d.ts -f hls
> out.m3u8
>
>
> As you can see, this is rawvideo, feeding by memory bitmap "images" via
> piping (from C#) + capture raw audio via dshow.
>
> Any idea what/how to solve this issue? Thanks!
>
>
> br,
> Gabor

Re: [FFmpeg-user] Single Audio stream to multiple FFmpeg process?

2016-11-08 Thread Roger Pack
I think two processes can record from the same mic, can't they? (But
not video, though maybe that's possible now in Windows 10, I don't
know; otherwise you could split it in dshow, or with pipes like you
were saying, "maybe", LOL.)
http://betterlogic.com/roger/2013/05/directshow-webcam-splitter/
GL!
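A pipe-based sketch of the fan-out idea (untested assumption; uses bash process substitution, so on Windows it would need msys/Cygwin/WSL, and "Your Mic" is a placeholder device name):

```shell
# One capture process writes raw WAV to stdout; tee duplicates it to
# two downstream ffmpeg readers, each encoding independently.
ffmpeg -f dshow -i audio="Your Mic" -f wav - | tee \
    >(ffmpeg -i - -c:a aac out1.m4a) \
    >(ffmpeg -i - -c:a libmp3lame out2.mp3) > /dev/null
```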

On 11/8/16, Gabor Alsecz  wrote:
> Dear All,
>
> Is there any way - on Windows - to share a dshow USB microphone device to
> provide the same audio stream to two different FFmpeg process/pipe "same
> time"? Or this is not possible because of device exclusive usage by Windows
> or like that?
>
> Thanks!
>
>
> br,
> Gabor

Re: [FFmpeg-user] ffmpeg3 crashes on sending rtsp

2016-09-15 Thread Roger Pack
On 9/13/16, Carl Eugen Hoyos  wrote:
> 2016-09-13 2:25 GMT+02:00 Shi Qiu :
>
>> ffmpeg3 only crash when I send rtsp.
>
> Can you test if this is a regression since 00e122bc / works
> with 4873952f?
> (Ticket #5844)

See also http://ffmpeg.org/pipermail/ffmpeg-user/2016-September/033610.html
and maybe (?) 
https://lists.libav.org/pipermail/libav-devel/2016-September/079184.html

Re: [FFmpeg-user] screen capture

2016-09-02 Thread Roger Pack
You install this:
https://github.com/rdp/screen-capture-recorder-to-video-windows-free
Another option is to use "gdigrab" as the input format.
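Both routes, side by side (gdigrab ships with FFmpeg, so there is nothing extra to install for it):

```shell
# dshow route, after installing screen-capture-recorder:
ffmpeg -f dshow -i video="screen-capture-recorder" output.flv

# built-in gdigrab route, grabs the whole desktop:
ffmpeg -f gdigrab -framerate 30 -i desktop output.flv
```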

On 9/2/16, Peter Balazovic  wrote:
> I want to capture the screen for example:
>
> ffmpeg -f dshow -i video="screen-capture-recorder" output.flv
>
> seems to me missing "screen-capture-recorder", how can I have
> "screen-capture-recorder"?
>
> C:\Users\Downloads\ffmpeg-20160901-be07c25-win64-static\ffmpeg-20160901-be07c25-win64-static\bin>ffmpeg
> -list_devices true -f dshow -i dummy
> ffmpeg version N-81516-gbe07c25 Copyright (c) 2000-2016 the FFmpeg
> developers
>   built with gcc 5.4.0 (GCC)
>   configuration: --enable-gpl --enable-version3 --disable-w32threads
> --enable-dxva2 --enable-libmfx --enable-nvenc --enable-avisynth
> --enable-bzlib --enable-lib
> ebur128 --enable-fontconfig --enable-frei0r --enable-gnutls
> --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b
> --enable-libcaca --enable-libfree
> type --enable-libgme --enable-libgsm --enable-libilbc
> --enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb
> --enable-libopencore-amrwb --enable-lib
> openh264 --enable-libopenjpeg --enable-libopus --enable-librtmp
> --enable-libschroedinger --enable-libsnappy --enable-libsoxr
> --enable-libspeex --enable-libtheor
> a --enable-libtwolame --enable-libvidstab --enable-libvo-amrwbenc
> --enable-libvorbis --enable-libvpx --enable-libwavpack
> --enable-libwebp --enable-libx264 --ena
> ble-libx265 --enable-libxavs --enable-libxvid --enable-libzimg
> --enable-lzma --enable-decklink --enable-zlib
>   libavutil  55. 29.100 / 55. 29.100
>   libavcodec 57. 54.101 / 57. 54.101
>   libavformat57. 48.101 / 57. 48.101
>   libavdevice57.  0.102 / 57.  0.102
>   libavfilter 6. 58.100 /  6. 58.100
>   libswscale  4.  1.100 /  4.  1.100
>   libswresample   2.  1.100 /  2.  1.100
>   libpostproc54.  0.100 / 54.  0.100
> [dshow @ 0067a400] DirectShow video devices (some may be both
> video and audio devices)
> [dshow @ 0067a400]  "Integrated Webcam"
> [dshow @ 0067a400] Alternative name
> "@device_pnp_\\?\usb#vid_1bcf&pid_2984&mi_00#7&2a1a095c&0&#{65e8773d-8f56-11d0-a3b9-00a0c9223196}\global"
> [dshow @ 0067a400] DirectShow audio devices
> [dshow @ 0067a400]  "Microphone (10- Logitech USB He"
> [dshow @ 0067a400] Alternative name
> "@device_cm_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\Microphone (10-
> Logitech USB He"
> [dshow @ 0067a400]  "Microphone Array (Realtek High "
> [dshow @ 0067a400] Alternative name
> "@device_cm_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\Microphone Array
> (Realtek High "
> dummy: Immediate exit requested

Re: [FFmpeg-user] (no subject)

2016-07-20 Thread Roger Pack
On 7/15/16, Tim Hiles  wrote:
> Hi all,
>
> Seeing odd behavior with screen capture recorder.
>
> This always works on win 7. No problems. Even when resolution is set to
> 1920 x 1080
>
> c:\ffmpeg\ffmpeg\bin\ffmpeg.exe -f dshow -i
> video="screen-capture-recorder":audio="Microphone Array (Realtek High
> Definition Audio)" -vcodec libx264 -pix_fmt yuv420p -preset ultrafast
> -acodec pcm_s16le -ac 1 -ar 22050 C:\CMT\output.mkv
>
> But it isn't working on this win 10 computer with the resolution set to
> 1920 x 1080.  What happens is that it actually crops the image down to
> 1536x864

gdigrab may work better for you; I think it handles DPI scaling.
See also
https://github.com/rdp/screen-capture-recorder-to-video-windows-free/issues/50
for possible workarounds.

Re: [FFmpeg-user] Webcam RTMP stream - video problems

2016-06-22 Thread Roger Pack
On 6/21/16, hdt1...@tom.com  wrote:
> Hello,
>
> I'm trying to capture audio and video from a webcam and stream it via RTMP
> to a Red5 media server, where others can then view the live stream.
> When I do it from a Flash applet (Publisher Demo from the Red5
> distribution), it works without problems.
> When I use ffmpeg, trying to stream *both* audio and video causes stuttering
> and delayed video playback.
> The video stream will be lagging behind what I do in front of the camera by
> approximately half a second, and it doesn't look "fluid".


What if you record to a local file?
Or do you just mean there is an increase in latency?
If it's latency, I've done a write-up of what I know here:
https://trac.ffmpeg.org/wiki/StreamingGuide
Cheers/enjoy!
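The usual latency-reducing knobs from that guide, applied to the command quoted below (switching nellymoser to AAC here is my assumption — keep nellymoser if the Flash player requires it; the device string is copied from the original command):

```shell
# ultrafast + zerolatency disables B-frames and lookahead;
# -g 30 keeps keyframe intervals short so playback can start sooner.
ffmpeg -f dshow -video_size 640x480 -framerate 15 \
       -i "video=USB Video Device:audio=Microphone (4- USB Audio Device" \
       -c:v libx264 -preset ultrafast -tune zerolatency -g 30 \
       -c:a aac -ac 1 -ar 22050 \
       -f flv rtmp://my-server-url/oflaDemo/test
```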

>
> My command line and environment:
> OS: Windows 7 x64
> FFmpeg version: N-58308-ge502783
> Command line:
> ffmpeg -f dshow -video_size 640x480 -framerate 15 -i "video=USB Video
> Device:audio=Microphone (4- USB Audio Device"^
> -r 15/1 -s 640x480^
> -acodec nellymoser -ac 1 -ar 22050^
> -aq 30^
> -q 28^
> -threads 0^
> -f flv rtmp://my-server-url/oflaDemo/test
>
> I'm using the Nellymoser codec because that's what the Flash applet uses - I
> also tried MP3 but then the delay is even worse.
> I've played around with the quality settings quite a bit, and some seem to
> result in a little less delay, but I couldn't find a convincing setting yet.
> If instead of streaming I write to a local flv file, it works great. If I
> disable the audio stream, it also works without problems.
> I don't think network bandwidth is an issue, as the server is connected via
> LAN.
>
> Do you have any suggestions regarding parameters/configuration, or what I
> could look into? Are other codecs more likely to work without problems?
>
> Hi,have you resolved this problem?and how to resolve it?Thanks.
>
>
>
>
>
>
>
> Best Regards,
> Niko
>
> Console output:
>
> Guessed Channel Layout for  Input Stream #0.1 : stereo
> Input #0, dshow, from 'video=USB Video Device:audio=Microphone (4- USB Audio
> Device':
>   Duration: N/A, start: 26089.932000, bitrate: 1411 kb/s
> Stream #0:0: Video: mjpeg, yuvj422p(pc), 640x480, 15 tbr, 1k tbn, 15
> tbc
> Stream #0:1: Audio: pcm_s16le, 44100 Hz, stereo, s16, 1411 kb/s
> [swscaler @ 00301600] deprecated pixel format used, make sure you
> did set range correctly
> Output #0, flv, to 'rtmp://fb10dtools-dev/oflaDemo/test':
>   Metadata:
> encoder : Lavf55.21.100
> Stream #0:0: Video: flv1 (flv) ([2][0][0][0] / 0x0002), yuv420p,
> 640x480, q=2-31, 200 kb/s, 1k tbn, 15 tbc
> Stream #0:1: Audio: nellymoser ([6][0][0][0] / 0x0006), 22050 Hz, mono,
> flt, 128 kb/s
> Stream mapping:
>   Stream #0:0 -> #0:0 (mjpeg -> flv)
>   Stream #0:1 -> #0:1 (pcm_s16le -> nellymoser)
> Press [q] to stop, [?] for help
> [nellymoser @ 00370280] Queue input is backward in time
>
>
>
>
> hdt1...@tom.com

Re: [FFmpeg-user] Help, cannot connect to my webcam (message : Could not RenderStream to connect pins, I/O error)

2016-06-15 Thread Roger Pack
What is your output of list_options for that device?
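For reference, that query looks like this, using the device name from your -list_devices output below:

```shell
# Prints every pin/format/resolution/framerate combination the camera offers:
ffmpeg -list_options true -f dshow -i video="Microsoft LifeCam VX-6000"
```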

On 5/10/16, 김찬  wrote:
> Yes, I could use the webcam to record the screen containing the webcam
> video.(using another solution, which could probably use ffmpeg inside).
> Chan Kim
>
> -Original Message-
> From: ffmpeg-user [mailto:ffmpeg-user-boun...@ffmpeg.org] On Behalf Of Roger
> Pack
> Sent: Tuesday, May 10, 2016 10:57 PM
> To: FFmpeg user questions 
> Subject: Re: [FFmpeg-user] Help, cannot connect to my webcam (message :
> Could not RenderStream to connect pins, I/O error)
>
> can other programs use your lifecam?
>
> On 5/10/16, 김찬  wrote:
>> Thanks Carl,
>> Here is the output. I gave the recording command (ffmpeg -f dshow -i
>> video="..." test.mp4) while I was streaming the web cam video using
>> LifeCam player.(displaying the camera on my notebook).
>>
>> D:\ffmpeg-20160508-git-38eeb85-win64-static\bin
>>>ffmpeg -list_devices true -f dshow -i dummy
>> ffmpeg version N-79883-g38eeb85 Copyright (c) 2000-2016 the FFmpeg
>> developers
>>   built with gcc 5.3.0 (GCC)
>>   configuration: --enable-gpl --enable-version3 --disable-w32threads
>> --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r
>> --enable-gnutls --enable-iconv --enable-libass --enable-libbluray
>> --enable-libbs2b --enable-libcaca --enable-libfreetype --enable-libgme
>> --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmfx
>> --enable-libmp3lame --enable-libopencore-amrnb
>> --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus
>> --enable-librtmp --enable-libschroedinger --enable-libsnappy
>> --enable-libsoxr --enable-libspeex --enable-libtheora
>> --enable-libtwolame --enable-libvidstab --enable-libvo-amrwbenc
>> --enable-libvorbis --enable-libvpx --enable-libwavpack
>> --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs
>> --enable-libxvid --enable-libzimg --enable-lzma --enable-decklink
>> --enable-zlib
>>   libavutil  55. 24.100 / 55. 24.100
>>   libavcodec 57. 40.100 / 57. 40.100
>>   libavformat57. 36.100 / 57. 36.100
>>   libavdevice57.  0.101 / 57.  0.101
>>   libavfilter 6. 45.100 /  6. 45.100
>>   libswscale  4.  1.100 /  4.  1.100
>>   libswresample   2.  0.101 /  2.  0.101
>>   libpostproc54.  0.100 / 54.  0.100
>> [dshow @ 004ca940] DirectShow video devices (some may be both
>> video and audio devices) [dshow @ 004ca940]  "Microsoft
>> LifeCam VX-6000"
>> [dshow @ 004ca940] Alternative name
>> "@device_pnp_\\?\usb#vid_045e&pid_00f4&mi_00#7&1628b7e9&0&#{65e8773d-8f56-11d0-a3b9-00a0c9223196}\global"
>> [dshow @ 004ca940]  "Webcam SC-13HDL11624N"
>> [dshow @ 004ca940] Alternative name
>> "@device_pnp_\\?\usb#vid_2232&pid_1024&mi_00#7&2dcaa785&0&#{65e8773d-8f56-11d0-a3b9-00a0c9223196}\global"
>> [dshow @ 004ca940] DirectShow audio devices [dshow @
>> 004ca940]  "마이크(2- Microsoft LifeCam VX-600"
>> [dshow @ 004ca940] Alternative name
>> "@device_cm_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\마이크(2- Microsoft
>> LifeCam VX-600"
>> [dshow @ 004ca940]  "마이크(Realtek High Definition Aud"
>> [dshow @ 004ca940] Alternative name
>> "@device_cm_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\마이크(Realtek High
>> Definition Aud"
>> dummy: Immediate exit requested
>>
>> D:\ffmpeg-20160508-git-38eeb85-win64-static\bin
>>>ffmpeg -f dshow -i video="Microsoft LifeCam VX-6000" test.mp4
>> ffmpeg version N-79883-g38eeb85 Copyright (c) 2000-2016 the FFmpeg
>> developers
>>   built with gcc 5.3.0 (GCC)
>>   configuration: --enable-gpl --enable-version3 --disable-w32threads
>> --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r
>> --enable-gnutls --enable-iconv --enable-libass --enable-libbluray
>> --enable-libbs2b --enable-libcaca --enable-libfreetype --enable-libgme
>> --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmfx
>> --enable-libmp3lame --enable-libopencore-amrnb
>> --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus
>> --enable-librtmp --enable-libschroedinger --enable-libsnappy
>> --enable-libsoxr --enable-libspeex --enable-libtheora
>> --enable-libtwolame --enable-libvidstab --enable-libvo-amrwbenc
>> --enable-libvorbis --enable-libvpx --enable-libwavpack
>> --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs
>> --enable-libxvid --enable-

Re: [FFmpeg-user] Using ffmpeg to stream to or from Blackmagic Decklink Card

2016-06-14 Thread Roger Pack
On 6/13/16, davood afshari  wrote:
> Hello,
> I want to use blackmagic cards with ffmpeg to stream their input but there
> is a problem.
> When I use "H264 Pro" Encoder as a "decklink" or "dshow" input, I see this
> error in command line output of ffmpeg. Device is ok and I can Use it with
> other applications like MXPTiny or MXLight.
> Here is the command and output:
> *>ffmpeg -f dshow -list_devices true -i dummy*
> ffmpeg version 3.0.1 Copyright (c) 2000-2016 the FFmpeg developers
>   built with gcc 5.3.0 (GCC)
>   configuration: --enable-gpl --enable-version3 --disable-w32threads
> --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r
> --enable-gnutls --enable-iconv --enable-libass --enable-libbluray
> --enable-libbs2b --enable-libcaca --enable-libdcadec --enable-libfreetype
> --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug
> --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb
> --enable-libopenjpeg --enable-libopus --enable-librtmp
> --enable-libschroedinger --enable-libsoxr --enable-libspeex
> --enable-libtheora --enable-libtwolame --enable-libvidstab
> --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx
> --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265
> --enable-libxavs --enable-libxvid --enable-libzimg --enable-lzma
> --enable-decklink --enable-zlib
>   libavutil  55. 17.103 / 55. 17.103
>   libavcodec 57. 24.102 / 57. 24.102
>   libavformat57. 25.100 / 57. 25.100
>   libavdevice57.  0.101 / 57.  0.101
>   libavfilter 6. 31.100 /  6. 31.100
>   libswscale  4.  0.100 /  4.  0.100
>   libswresample   2.  0.101 /  2.  0.101
>   libpostproc54.  0.100 / 54.  0.100
> [dshow @ 005f5272ad40] DirectShow video devices (some may be both video
> and audio devices)
> [dshow @ 005f5272ad40]  "Decklink Video Capture"
> [dshow @ 005f5272ad40] Alternative name
> "@device_sw_{860BB310-5D01-11D0-BD3B-00A0C911CE86}\{44A8B5C7-13B6-4211-BD40-35B629D9E6DF}"
> [dshow @ 005f5272ad40]  "screen-capture-recorder"
> [dshow @ 005f5272ad40] Alternative name
> "@device_sw_{860BB310-5D01-11D0-BD3B-00A0C911CE86}\{4EA69364-2C8A-4AE6-A561-56E4B5044439}"
> [dshow @ 005f5272ad40] DirectShow audio devices
> [dshow @ 005f5272ad40]  "Microphone (High Definition Audio Device)"
> [dshow @ 005f5272ad40] Alternative name
> "@device_cm_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\wave_{CBA42E33-151B-4975-B9A5-68B2552152F4}"
> [dshow @ 005f5272ad40]  "virtual-audio-capturer"
> [dshow @ 005f5272ad40] Alternative name
> "@device_sw_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\{8E146464-DB61-4309-AFA1-3578E927E935}"
> [dshow @ 005f5272ad40]  "Decklink Audio Capture"
> [dshow @ 005f5272ad40] Alternative name
> "@device_sw_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\{AAA22F7E-5AA0-49D9-8C8D-B52B1AA92EB7}"
> dummy: Immediate exit requested
>
>
> and when I use "Decklink Video Capture" to identify its options, here is
> the output:
> *>ffmpeg -f dshow -list_options true -i video="Decklink Video Capture"*
> ffmpeg version 3.0.1 Copyright (c) 2000-2016 the FFmpeg developers
>   built with gcc 5.3.0 (GCC)
>   configuration: --enable-gpl --enable-version3 --disable-w32threads
> --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r
> --enable-gnutls --enable-iconv --enable-libass --enable-libbluray
> --enable-libbs2b --enable-libcaca --enable-libdcadec --enable-libfreetype
> --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug
> --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb
> --enable-libopenjpeg --enable-libopus --enable-librtmp
> --enable-libschroedinger --enable-libsoxr --enable-libspeex
> --enable-libtheora --enable-libtwolame --enable-libvidstab
> --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx
> --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265
> --enable-libxavs --enable-libxvid --enable-libzimg --enable-lzma
> --enable-decklink --enable-zlib
>   libavutil  55. 17.103 / 55. 17.103
>   libavcodec 57. 24.102 / 57. 24.102
>   libavformat57. 25.100 / 57. 25.100
>   libavdevice57.  0.101 / 57.  0.101
>   libavfilter 6. 31.100 /  6. 31.100
>   libswscale  4.  0.100 /  4.  0.100
>   libswresample   2.  0.101 /  2.  0.101
>   libpostproc54.  0.100 / 54.  0.100
> [dshow @ 0036677bad40] Unable to BindToObject for Decklink Video Capture
> [dshow @ 0036677bad40] Could not find video device with name [Decklink
> Video Capture] among source devices of type video.
> video=Decklink Video Capture: I/O error

Hmm, I have run into that BindToObject failure only once, and it was a
freak situation where a DLL had been removed; probably not your case.
Are you sure the other programs are creating dshow graphs?
Anyway, FFmpeg also has a "native" decklink input device (which I know
little about); maybe that will work better.

https://www.ffmpeg.org/ffmpeg-devices.html#decklink
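A sketch of the native route (the device name below is the one from your dshow listing; the native input may report a different name, so list first — the "@N" mode index is whatever -list_formats prints):

```shell
# Enumerate what the native decklink input sees:
ffmpeg -f decklink -list_devices 1 -i dummy
# List the modes a given device supports:
ffmpeg -f decklink -list_formats 1 -i "Decklink Video Capture"
# Then capture, selecting a mode by its printed index:
ffmpeg -f decklink -i "Decklink Video Capture@9" -c:v libx264 out.mkv
```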

This thread

Re: [FFmpeg-user] Any way to get better use of the cpu?

2016-06-03 Thread Roger Pack
On 2/23/16, Carl Eugen Hoyos  wrote:
> Roger Pack  gmail.com> writes:
>
>> I'm referring more to the fact that if I output to two
>> outputs, from the same ffmpeg instance, in essence, this:
>>
>> ffmpeg -i input output1 output2
>>
>> takes twice as long as running these two in parallel:
>>
>> ffmpeg -i input output1
>> ffmpeg -i input output2
>
> The tee muxer fixes this issue.

Hmm... it seems the tee muxer is mostly for encoding once, then
outputting the same encoded content to several destinations?  In this
particular instance I'm trying to encode multiple streams
simultaneously "in parallel" so that I can use more cores (assuming
each encoding uses at most 2 cores, which seems to be the case here).
https://gist.github.com/rdp/0a152670b742053a2a69e8d54e613b9a was my
attempt at it.
I suppose it just isn't possible today, and I may file a trac feature
request for it sometime.
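For anyone searching later, the tee-muxer form I was referring to (one encode,
the identical output fanned out to several destinations) is roughly:

```shell
# encode once, write the same encoded stream to a file and a udp destination
ffmpeg -i input.ts -c:v libx264 -c:a aac -f tee \
  "out1.mkv|[f=mpegts]udp://10.0.0.1:1234"
```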
Cheers!
-roger-

Re: [FFmpeg-user] dvb_teletext -> .srt fails?

2016-06-01 Thread Roger Pack
On 1/15/16, Marton Balint  wrote:
>
> On Fri, 15 Jan 2016, Roger Pack wrote:
>
>> similar to
>> https://lists.ffmpeg.org/pipermail/ffmpeg-user/2015-June/027029.html
>> and
>>
>> https://trac.ffmpeg.org/ticket/3025
>>
>> My "quest" is to convert an dvb_teletext stream to some human readable
>> (ex: .srt) format.
>>
>> Ex: using this file:
>>
>> https://trac.ffmpeg.org/attachment/ticket/3025/sbs.2.5M.ts
>>
>> I don't seem to be having much luck (DVBT stream).  Any feedback welcome
>> :)
>> -roger-
>
> That DVB teletext stream does not seem to contain any subtitles. Try to
> upload a bigger part somewhere if you are sure there should be subtitles.
>
> You can check with ffprobe if there are any actual decoded subtitles:
>
> ./ffprobe -of xml -select_streams s -show_frames

OK thank you for the hint, seems you were correct, that file has no subtitles.
If I try it with a file that does (ex:
https://sourceforge.net/projects/mplayer-edl/files/Roger60.ts/download)
then it does work.
Curiously, ffprobe showed "no packets" at first; then I realized that
it itself needed to be compiled with --enable-libzvbi, after which it
showed the packet timestamps at least.
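For completeness, the commands I ended up with were roughly these (both tools
built with --enable-libzvbi; -txt_format is the libzvbi teletext decoder
option, as I understand it):

```shell
# check whether the teletext stream carries any decodable subtitle packets
ffprobe -of xml -select_streams s -show_frames Roger60.ts

# decode the teletext pages as text and write them out as .srt
ffmpeg -txt_format text -i Roger60.ts -map 0:s:0 out.srt
```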
Thanks!

Re: [FFmpeg-user] I can't find any video devices on the Windows 7 source system

2016-05-30 Thread Roger Pack
On 5/28/16, John Lewis  wrote:
> If that is software and It didn't come with the Operating system or
> ffmepg, the Streaming Guide didn't tell me about it.
OK, I attempted to add some more links to the documentation about it.
> What is the difference between gdigrab and dshow?


Let me know if the documentation is still insufficient so I can adjust it.
Thanks!
> On 05/28/2016 02:38 AM, Roger Pack wrote:
>> Did you install a screen capturing dshow device (ex:
>> screen-capture-recorder)?
>> Also see gdigrab.
>> GL!
>>
>> On 5/27/16, John Lewis  wrote:
>>> I would like to stream a desktop from one computer running Windows 7 to
>>> another to  on a LAN running Debian 8 to show an application that is on
>>> the screen from the first computer through to the second computer that
>>> is connected to a projector. The application can't easily run on the
>>> Debian system connected to the projector because the software wasn't
>>> built for the operating system.
>>>
>>> I can't find any video devices on the Windows source system. I have
>>> command output attached.
>>>

Re: [FFmpeg-user] Can FFMPEG output as a directshow device viewable by other windows applications ?

2016-05-27 Thread Roger Pack
As far as I know no such filter exists, though many requests have been
made; perhaps it's time to do something there :)

On 9/19/15, Jeffrey Suckow  wrote:
> I looked at ffshow.
>
> I am not a programmer.
>
> Regarding the deinterlace process, ffshow has everything I need.
>
> The question is what program can grab a video from my capture card, process
> using ffdshow and then output as a virtual webcam ?
>
> Jeff
>
>
> On Thu, Sep 17, 2015 at 7:37 PM, Roger Pack  wrote:
>
>> You may be able to use ffdshow for it [?]
>>
>> On 9/16/15, Jeffrey Suckow  wrote:
>> > I am grabbing an interlace video signal (1080i) with a video capture
>> > card
>> > compatible Directshow (Blackmagic Intensity Pro 4k).
>> >
>> > The video signal is then picked up by a machine vision software.
>> >
>> > The interlace signal is causing problems. I would like to use FFMPEG to
>> > deinterlace the signal on the fly and output as a directshow stream that
>> > can be pickup by the machine vision software (like if it was a webcam).
>> >
>> > Is this possible ?
>> >
>> > Thanks.
>> > Jeff

Re: [FFmpeg-user] I can't find any video devices on the Windows 7 source system

2016-05-27 Thread Roger Pack
Did you install a screen capturing dshow device (ex: screen-capture-recorder)?
Also see gdigrab.
GL!
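If you go the gdigrab route, a rough sketch would be to capture the desktop
and push it to the Debian box over the LAN (the address below is made up):

```shell
# grab the whole desktop and stream it as mpegts over udp
ffmpeg -f gdigrab -framerate 30 -i desktop -c:v libx264 -preset ultrafast \
  -tune zerolatency -f mpegts udp://192.168.1.50:1234
```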

On 5/27/16, John Lewis  wrote:
> I would like to stream a desktop from one computer running Windows 7 to
> another to  on a LAN running Debian 8 to show an application that is on
> the screen from the first computer through to the second computer that
> is connected to a projector. The application can't easily run on the
> Debian system connected to the projector because the software wasn't
> built for the operating system.
>
> I can't find any video devices on the Windows source system. I have
> command output attached.
>

Re: [FFmpeg-user] nvenc ffmpeg

2016-05-16 Thread Roger Allen
On Mon, May 16, 2016 at 2:50 AM, Аз Есмь  wrote:
> i've got the same problem. GeForce GT 610, nVidia driver 361.42, Kubuntu 
> 16.04.

Supported GPUs are listed here:
https://developer.nvidia.com/nvidia-video-codec-sdk#gpulist which for
GeForce cards can be summarized as "Kepler & Maxwell GPUs only".

According to https://en.wikipedia.org/wiki/GeForce_600_series the GT
610 is a GF119 or Fermi card.  That came out prior to the Kepler cards
that first included NVENC.
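A quick sanity check is to see which nvenc encoders the build exposes and try
a throwaway encode (note the encoder was later renamed h264_nvenc in newer
ffmpeg versions, so adjust the name to your build):

```shell
# list any nvenc encoders compiled into this build
ffmpeg -hide_banner -encoders | grep nvenc

# throwaway encode to the null muxer; fails fast if the GPU lacks NVENC
ffmpeg -f lavfi -i testsrc=duration=1:size=1280x720:rate=30 \
  -c:v nvenc -f null -
```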

Re: [FFmpeg-user] nvenc ffmpeg

2016-05-15 Thread Roger Allen
On Sat, May 14, 2016 at 5:36 AM, gofrane  wrote:
>
>  ffmpeg -y -i input.mp4  -vcodec  nvenc -b:v 5M  -acodec copy OUTPUT.mp4*
>
snip

> [nvenc @ 0x2721540] No NVENC capable devices found
>

This seems like the likely error to focus on.  Are you sure your
NVIDIA GPU has NVENC?  Do you have recent drivers installed?

Re: [FFmpeg-user] Help, cannot connect to my webcam (message : Could not RenderStream to connect pins, I/O error)

2016-05-10 Thread Roger Pack
Can other programs use your LifeCam?
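Also worth checking which modes the camera advertises, and then pinning one
explicitly (the size and rate below are just examples; use whatever the list
shows):

```shell
# see which sizes and pixel formats the webcam claims to support
ffmpeg -f dshow -list_options true -i video="Microsoft LifeCam VX-6000"

# then request one of the advertised modes explicitly
ffmpeg -f dshow -video_size 640x480 -framerate 30 \
  -i video="Microsoft LifeCam VX-6000" test.mp4
```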

On 5/10/16, 김찬  wrote:
> Thanks Carl,
> Here is the output. I gave the recording command (ffmpeg -f dshow -i
> video="..." test.mp4) while I was streaming the web cam video using LifeCam
> player.(displaying the camera on my notebook).
>
> D:\ffmpeg-20160508-git-38eeb85-win64-static\bin
>>ffmpeg -list_devices true -f dshow -i dummy
> ffmpeg version N-79883-g38eeb85 Copyright (c) 2000-2016 the FFmpeg
> developers
>   built with gcc 5.3.0 (GCC)
>   configuration: --enable-gpl --enable-version3 --disable-w32threads
> --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r
> --enable-gnutls --enable-iconv --enable-libass --enable-libbluray
> --enable-libbs2b --enable-libcaca --enable-libfreetype --enable-libgme
> --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmfx
> --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb
> --enable-libopenjpeg --enable-libopus --enable-librtmp
> --enable-libschroedinger --enable-libsnappy --enable-libsoxr
> --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab
> --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx
> --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265
> --enable-libxavs --enable-libxvid --enable-libzimg --enable-lzma
> --enable-decklink --enable-zlib
>   libavutil  55. 24.100 / 55. 24.100
>   libavcodec 57. 40.100 / 57. 40.100
>   libavformat57. 36.100 / 57. 36.100
>   libavdevice57.  0.101 / 57.  0.101
>   libavfilter 6. 45.100 /  6. 45.100
>   libswscale  4.  1.100 /  4.  1.100
>   libswresample   2.  0.101 /  2.  0.101
>   libpostproc54.  0.100 / 54.  0.100
> [dshow @ 004ca940] DirectShow video devices (some may be both video
> and audio devices)
> [dshow @ 004ca940]  "Microsoft LifeCam VX-6000"
> [dshow @ 004ca940] Alternative name
> "@device_pnp_\\?\usb#vid_045e&pid_00f4&mi_00#7&1628b7e9&0&#{65e8773d-8f56-11d0-a3b9-00a0c9223196}\global"
> [dshow @ 004ca940]  "Webcam SC-13HDL11624N"
> [dshow @ 004ca940] Alternative name
> "@device_pnp_\\?\usb#vid_2232&pid_1024&mi_00#7&2dcaa785&0&#{65e8773d-8f56-11d0-a3b9-00a0c9223196}\global"
> [dshow @ 004ca940] DirectShow audio devices
> [dshow @ 004ca940]  "마이크(2- Microsoft LifeCam VX-600"
> [dshow @ 004ca940] Alternative name
> "@device_cm_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\마이크(2- Microsoft LifeCam
> VX-600"
> [dshow @ 004ca940]  "마이크(Realtek High Definition Aud"
> [dshow @ 004ca940] Alternative name
> "@device_cm_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\마이크(Realtek High
> Definition Aud"
> dummy: Immediate exit requested
>
> D:\ffmpeg-20160508-git-38eeb85-win64-static\bin
>>ffmpeg -f dshow -i video="Microsoft LifeCam VX-6000" test.mp4
> ffmpeg version N-79883-g38eeb85 Copyright (c) 2000-2016 the FFmpeg
> developers
>   built with gcc 5.3.0 (GCC)
>   configuration: --enable-gpl --enable-version3 --disable-w32threads
> --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r
> --enable-gnutls --enable-iconv --enable-libass --enable-libbluray
> --enable-libbs2b --enable-libcaca --enable-libfreetype --enable-libgme
> --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmfx
> --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb
> --enable-libopenjpeg --enable-libopus --enable-librtmp
> --enable-libschroedinger --enable-libsnappy --enable-libsoxr
> --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab
> --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx
> --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265
> --enable-libxavs --enable-libxvid --enable-libzimg --enable-lzma
> --enable-decklink --enable-zlib
>   libavutil  55. 24.100 / 55. 24.100
>   libavcodec 57. 40.100 / 57. 40.100
>   libavformat57. 36.100 / 57. 36.100
>   libavdevice57.  0.101 / 57.  0.101
>   libavfilter 6. 45.100 /  6. 45.100
>   libswscale  4.  1.100 /  4.  1.100
>   libswresample   2.  0.101 /  2.  0.101
>   libpostproc54.  0.100 / 54.  0.100
> [dshow @ 0041a9a0] Could not RenderStream to connect pins
> video=Microsoft LifeCam VX-6000: I/O error
>
> D:\ffmpeg-20160508-git-38eeb85-win64-static\bin
>
> -Original Message-
> From: ffmpeg-user [mailto:ffmpeg-user-boun...@ffmpeg.org] On Behalf Of Carl
> Eugen Hoyos
> Sent: Tuesday, May 10, 2016 3:00 AM
> To: ffmpeg-user@ffmpeg.org
> Subject: Re: [FFmpeg-user] Help, cannot connect to my webcam (message :
> Could not RenderStream to connect pins, I/O error)
>
> 김찬  etri.re.kr> writes:
>
>> to see the web cam is recognized by ffmpeg. (It read "Microsoft
>> LifeCam VX-6000")
>>
>> So I endtered to record the screen,
>>
>> > ffmpeg -f dshow -i video="Microsoft LifeCam VX-6000" test.mp4
>
> Please provide your failing command line together with the complete, uncut
> console output, use copy-paste, do not attach scr

Re: [FFmpeg-user] named pipes ffmpeg

2016-05-04 Thread Roger Pack
https://trac.ffmpeg.org/ticket/986 appears to be the "canonical" named
pipe page; beyond that I've never tried it, sorry.
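From that ticket, the pipe path apparently has to be given in full
\\.\pipe\ form; untested by me, but something like:

```shell
# have ffmpeg write matroska into an already-created named pipe
ffmpeg -f dshow -i video="AVerMedia SD 1 Capture" -c:v rawvideo \
  -f matroska \\.\pipe\tmp_pipe1

# and read it back elsewhere
ffplay -f matroska -i \\.\pipe\tmp_pipe1
```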

On 5/3/16, Robin Stevens  wrote:
> I tried pipes in different formats/syntaxis.
>
> Previously I used the multipurpose-encoder from Datastead to setup a
> rtsp-stream. After some tests it became clear that this software is piping
> video into ffmpeg. So I looked at the console log of ffmpeg from the
> multipurpose-encoder and there was listed that ffmpeg gets his stream from
> "\\.\pipe\7EB76D4B9CD3464C93908CCB9E3CE1CF".
>
> So I closed the ffmpeg executable that was executed from the
> multipurpose-encoder and tried to connect with the same build/executable to
> that pipe. The only thing I get was the error message "Invalid argument".
> Then I tried to instantiate my own named pipe in windows with the same
> syntax as Datastead is doing that. Like "\\.\pipe\tmp_pipe1".
>
> When I tried to open the pipe with ffplay I get the following message:
> "pipe:\\.\pipe\tmp_pipe1: Cannot allocate memory"
>
> When I tried to open the pipe with ffmpeg I get the following message:
> "[IMGUTILS @ 005af780] Picture size 0x0 is invalid
> pipe:\\.\pipe\tmp_pipe1: Invalid argument"
>
> When I try to write to the pipe with ffmpeg, it prints the console with a
> bunch of data and after a few seconds it isn't responding anymore.
>
> I know the "pipe" protocol is compiled with this version of ffmpeg and while
> the software of Datastead is executing ffmpeg with a named pipe as input
> parameter it seems that is has to work.
>
> Do I need a different kind of pipe here? Or do have to add protocol
> parameters to write to this pipe? Does anyone experience with named pipes?
>
> Kind regards,
>
> Robin
>
> -Oorspronkelijk bericht-
> Van: ffmpeg-user [mailto:ffmpeg-user-boun...@ffmpeg.org] Namens Roger Pack
> Verzonden: dinsdag 3 mei 2016 3:43
> Aan: FFmpeg user questions 
> Onderwerp: Re: [FFmpeg-user] named pipes ffmpeg
>
> On 4/29/16, Robin Stevens  wrote:
>> I am having trouble to get ffmpeg write to a named pipe in windows. I
>> know it's possible to write to a anonymous pipe with the command:
>>
>> ffmpeg.exe -vsync passthrough -f dshow -i video="AVerMedia SD 1
>> Capture":audio="AVerMedia SD Audio Cap 1 (AVerM" -vcodec rawvideo -f
>> matroska -
>>
>> And I can read the pipe with ffplay.exe -i -
>>
>> But is it possible to write to a named pipe?
>>
>> I tried to by creating a named pipe with the following c# code:
>>
>> NamedPipeServerStream p_from_ffmpeg = new
>> NamedPipeServerStream("tmp_pipe1",
>> PipeDirection.In,
>> 1,
>> PipeTransmissionMode.Byte,
>> PipeOptions.WriteThrough,
>> 1000, 1000);
>>
>> I checked if the pipe really exist with the pipelist.exe from
>> https://technet.microsoft.com/en-us/sysinternals/dd581625.aspx.
>>
>> Then I tried to write the output data from ffmpeg to the pipe by using
>> the following command:
>>
>> ffmpeg.exe -vsync passthrough -f dshow -i video="AVerMedia SD 1
>> Capture":audio="AVerMedia SD Audio Cap 1 (AVerM" -vcodec rawvideo -f
>> matroska tmp_pipe1
>>
>> and the following:
>>
>> ffmpeg.exe -vsync passthrough -f dshow -i video="AVerMedia SD 1
>> Capture":audio="AVerMedia SD Audio Cap 1 (AVerM" -vcodec rawvideo -f
>> matroska pipe:1 > tmp_pipe1
>>
>> Both commands are adding a file named "tmp_pipe1" to the directory of
>> the ffmpeg executable while the anonymous pipe isn't adding a file and
>> is working perfectly for me. The only problem is that I would like to
>> deinterlace, record with a codec and render/view more than one
>> channel. So I thought to run one instance of ffmpeg and pipe the
>> outputs to multiple instances of ffplay. I don't know whether it's
>> possible to write the output of ffmpeg to a named pipe in windows. Does
>> someone has an answer?
>
> I think you have to name it something like \\pipe_name or some odd...

Re: [FFmpeg-user] screen_capture_recorder to record 2nd monitor

2016-05-02 Thread Roger Pack
You can use the crop filter, or set screen-capture-recorder's capture
boundaries itself (in the registry; see its README).
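For example, if the second monitor is 1920x1080 and sits to the right of a
1920-wide primary (adjust the numbers to your layout), the crop route would
look like:

```shell
# crop the combined desktop capture down to the 2nd monitor's region
ffmpeg -f dshow -i video="screen-capture-recorder" \
  -vf "crop=1920:1080:1920:0" -vcodec libx264 -pix_fmt yuv420p out.mkv
```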
HTH.
-roger-

On 4/25/16, Tim Hiles  wrote:
> I use exactly what they suggest on the screen_capture_recorder ffmpeg faq
> to record desktop.
>
> ffmpeg.exe -f dshow -i video="screen-capture-recorder":audio=%Device%
> -vcodec libx264 -pix_fmt yuv420p -preset ultrafast -acodec pcm_s16le -ac 1
> -ar 22050 -t %Duration% out.mkv
>
> works fine. But it records everything. I specifically only want to record
> what's on the 2nd monitor.
>
> Windows 7 64 bit.
>
> Anyone have any ideas?

Re: [FFmpeg-user] ffmpage win7 screen record cursor is abnormal

2016-05-02 Thread Roger Pack
So is the mouse the wrong size, or the wrong pointer, or what here?

On 4/28/16, Ni Wesley  wrote:
> Hi guys,
>
> I hit a problem when recording screen on window 7 using ffmpeg.
>
> Snapshot here: [image: enter image description here]
> 
>
> So, you guys see the big red point is recorded mouse cursor.
>
> I am using PPT's default pencil, so, actually, cursor is a very small
> point. But it becomes so big when play.
>
> Command I use:
>
> ffmpeg.exe -y -rtbufsize 500M -f gdigrab -framerate 5  -draw_mouse 1
> -i desktop -f dshow -i audio=%s -af "highpass=f=200, lowpass=f=3000"
> -c:v libx264 -r 5 -preset medium -tune zerolatency -crf 35 -pix_fmt
> yuv420p -c:a libvo_aacenc -ac 2 -b:a 48k  -fs 50M  -movflags
> +faststart
>
> Anyone hit this before?
>
> Thanks.

Re: [FFmpeg-user] named pipes ffmpeg

2016-05-02 Thread Roger Pack
On 4/29/16, Robin Stevens  wrote:
> I am having trouble to get ffmpeg write to a named pipe in windows. I know
> it's possible to write to a anonymous pipe with the command:
>
> ffmpeg.exe -vsync passthrough -f dshow -i video="AVerMedia SD 1
> Capture":audio="AVerMedia SD Audio Cap 1 (AVerM" -vcodec rawvideo -f
> matroska -
>
> And I can read the pipe with ffplay.exe -i -
>
> But is it possible to write to a named pipe?
>
> I tried to by creating a named pipe with the following c# code:
>
> NamedPipeServerStream p_from_ffmpeg = new NamedPipeServerStream("tmp_pipe1",
> PipeDirection.In,
> 1,
> PipeTransmissionMode.Byte,
> PipeOptions.WriteThrough,
> 1000, 1000);
>
> I checked if the pipe really exist with the pipelist.exe from
> https://technet.microsoft.com/en-us/sysinternals/dd581625.aspx.
>
> Then I tried to write the output data from ffmpeg to the pipe by using the
> following command:
>
> ffmpeg.exe -vsync passthrough -f dshow -i video="AVerMedia SD 1
> Capture":audio="AVerMedia SD Audio Cap 1 (AVerM" -vcodec rawvideo -f
> matroska tmp_pipe1
>
> and the following:
>
> ffmpeg.exe -vsync passthrough -f dshow -i video="AVerMedia SD 1
> Capture":audio="AVerMedia SD Audio Cap 1 (AVerM" -vcodec rawvideo -f
> matroska pipe:1 > tmp_pipe1
>
> Both commands are adding a file named "tmp_pipe1" to the directory of the
> ffmpeg executable while the anonymous pipe isn't adding a file and is
> working perfectly for me. The only problem is that I would like to
> deinterlace, record with a codec and render/view more than one channel. So I
> thought to run one instance of ffmpeg and pipe the outputs to multiple
> instances of ffplay. I don't know whether it's possible to write the output
> of ffmpeg to a named pipe in windows. Does someone has an answer?

I think you have to name it something like
\\.\pipe\pipe_name
or some such...

Re: [FFmpeg-user] Fwd: screen_capture_recorder to record 2nd monitor

2016-05-02 Thread Roger Pack
You can unsubscribe here:
http://ffmpeg.org/mailman/listinfo/ffmpeg-user
HTH.
-roger-

On 4/27/16, G A  wrote:
> stop sending me this, goddamn this is 6th fucking time.
>
>> Begin forwarded message:
>>
>> From: Tim Hiles 
>> Subject: [FFmpeg-user] screen_capture_recorder to record 2nd monitor
>> Date: April 25, 2016 at 17:01:27 PDT
>> To: FFmpeg user questions 
>> Reply-To: FFmpeg user questions 
>>
>> I use exactly what they suggest on the screen_capture_recorder ffmpeg faq
>> to record desktop.
>>
>> ffmpeg.exe -f dshow -i video="screen-capture-recorder":audio=%Device%
>> -vcodec libx264 -pix_fmt yuv420p -preset ultrafast -acodec pcm_s16le -ac
>> 1
>> -ar 22050 -t %Duration% out.mkv
>>
>> works fine. But it records everything. I specifically only want to record
>> what's on the 2nd monitor.
>>
>> Windows 7 64 bit.
>>
>> Anyone have any ideas?

Re: [FFmpeg-user] Blackmagic card issue capturing

2016-03-03 Thread Roger Pack
Full command line and uncut console output of a failing run, please?
-roger-

On 3/3/16, Christian Bianchini  wrote:
> Yes, by using vsync it works fine, who knows why.. maybe because I dont
> capture audio.
>
> On 2 March 2016 at 16:12, Roger Pack  wrote:
>
>> On 3/2/16, Christian Bianchini  wrote:
>> > I have ran with log level and this is the result when it successfully
>> > capture at 60 FPS:
>> >
>> > frame=  153 fps= 61 q=0.0 size=   51752kB time=00:00:02.33
>> > bitrate=181487.1kbits
>> > dshow passing through packet of type video size  4147200 timestamp
>> 26343746
>> > orig
>> >  timestamp 26078449 graph timestamp 26343718 diff 265269 Decklink Video
>> > Capture
>> > dshow passing through packet of type video size  4147200 timestamp
>> 26509532
>> > orig
>> >  timestamp 26245282 graph timestamp 26509509 diff 264227 Decklink Video
>> > Capture
>> > ap
>> >
>> > This is when it fails
>> >
>> >
>> > frame=2 fps=0.0 q=0.0 size=   0kB time=00:00:00.00 bitrate=N/A
>> > dup=0 dro
>> > dshow passing through packet of type video size  4147200 timestamp
>> > 716543710
>> > 73 orig timestamp 716543716327540 graph timestamp 71654371073 diff
>> > -5438667
>> > Decklink Video Capture
>> > *** dropping frame 2 from stream 0 at ts 0
>> > dshow passing through packet of type video size  4147200 timestamp
>> > 716543710
>> > 73 orig timestamp 716543716494373 graph timestamp 71654371073 diff
>> > -5605500
>> > Decklink Video Capture
>> > *** dropping frame 2 from stream 0 at ts 0
>> > dshow passing through packet of type video size  4147200 timestamp
>> > 716543710
>> > 73 orig timestamp 716543716661206 graph timestamp 71654371073 diff
>> > -5772333
>> > Decklink Video Capture
>> > *** dropping frame 2 from stream 0 at ts 0
>> > dshow passing through packet of type video size  4147200 timestamp
>> > 716543710
>> > 73 orig timestamp 716543716828040 graph timestamp 71654371073 diff
>> > -5939167
>> > Decklink Video Capture
>> >
>> >
>> > Does it makes any sense? is this the capture card failure?
>>
>> So reproducibly and reliably it always says "dropping frame 2" when it
>> is failing?
>>
>> I don't know about vsync, it might be OK
>
>
>
> --
> ---
> Hardware & Software Developer
> christ...@bianchini.ch 
> www.max246.ch


Re: [FFmpeg-user] Blackmagic card issue capturing

2016-03-02 Thread Roger Pack
On 3/2/16, Christian Bianchini  wrote:
> I have ran with log level and this is the result when it successfully
> capture at 60 FPS:
>
> frame=  153 fps= 61 q=0.0 size=   51752kB time=00:00:02.33
> bitrate=181487.1kbits
> dshow passing through packet of type video size  4147200 timestamp 26343746
> orig
>  timestamp 26078449 graph timestamp 26343718 diff 265269 Decklink Video
> Capture
> dshow passing through packet of type video size  4147200 timestamp 26509532
> orig
>  timestamp 26245282 graph timestamp 26509509 diff 264227 Decklink Video
> Capture
> ap
>
> This is when it fails
>
>
> frame=2 fps=0.0 q=0.0 size=   0kB time=00:00:00.00 bitrate=N/A
> dup=0 dro
> dshow passing through packet of type video size  4147200 timestamp
> 716543710
> 73 orig timestamp 716543716327540 graph timestamp 71654371073 diff
> -5438667
> Decklink Video Capture
> *** dropping frame 2 from stream 0 at ts 0
> dshow passing through packet of type video size  4147200 timestamp
> 716543710
> 73 orig timestamp 716543716494373 graph timestamp 71654371073 diff
> -5605500
> Decklink Video Capture
> *** dropping frame 2 from stream 0 at ts 0
> dshow passing through packet of type video size  4147200 timestamp
> 716543710
> 73 orig timestamp 716543716661206 graph timestamp 71654371073 diff
> -5772333
> Decklink Video Capture
> *** dropping frame 2 from stream 0 at ts 0
> dshow passing through packet of type video size  4147200 timestamp
> 716543710
> 73 orig timestamp 716543716828040 graph timestamp 71654371073 diff
> -5939167
> Decklink Video Capture
>
>
> Does it makes any sense? is this the capture card failure?

So reproducibly and reliably it always says "dropping frame 2" when it
is failing?

I don't know about vsync, it might be OK


Re: [FFmpeg-user] Blackmagic card issue capturing

2016-03-01 Thread Roger Pack
On 3/1/16, Christian Bianchini  wrote:
> I have got the ffmpeg version N-78598-g98a0053  and trying to capturing
> from a Blackmagic card, which sometimes works and sometimes doesnt.
>
> ffmpeg.exe -y -f dshow -video_size 1920x1080 -pixel_format uyvy422
> -rtbufsize 2100 -framerate 59.94 -i "video=Decklink Video Capture" -codec:v
> libx264 -preset ultrafast -an -crf 0 test.mkv
>
> The issue is a not recording at 60 FPS but at  0.. 2...1.3.0.. and the
> file doesnt get big.
>
> After I stop and restart the process, it captures at 60FPS without issues
> and the video is great, but I dont get why I need to restart 3-4 times to
> get the record working.
>
>
> this the output that I get when it doesnt work:

Appears to not be receiving input.  If you wait a while, does it still never work?
What if you add -loglevel verbose (to the failing runs)?
Unfortunately nothing comes to mind...


Re: [FFmpeg-user] Any way to get better use of the cpu?

2016-02-17 Thread Roger Pack
On 1/30/16, Carl Eugen Hoyos  wrote:
> Roger Pack  gmail.com> writes:
>
>> ffmpeg -i input.ts -map 0:p:1344 1344.mp4 -map 0:p:1345
>> 1345.mp4 -map 0:p:1346 1346.mp4
>
> Which encoder are you using?

libx264

> You do know that this is not x264-feature-requests, don't you?

Yes.

>> Is there any tricks to get more cpu "utilization" here
>
> The tricks lead to slower conversion time.

I'm referring more to the fact that if I output to two outputs, from
the same ffmpeg instance, in essence, this:

ffmpeg -i input output1 output2

takes twice as long as running these two in parallel:

ffmpeg -i input output1
ffmpeg -i input output2

It appears to me that the encoders are working serially (at least in
this instance, which is libx264 -preset ultrafast).  In the first
example it uses around 250% CPU; in the second, around 500% CPU.
Would be nice to be able to have the first use 500% cpu, if that makes sense.
I assume it's just a feature request.
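The parallel workaround spelled out (shell syntax; on Windows you would launch
each with start instead of &):

```shell
# two independent ffmpeg processes, so each encode gets its own threads
ffmpeg -i input.ts -map 0:p:1344 -c:v libx264 -preset ultrafast 1344.mp4 &
ffmpeg -i input.ts -map 0:p:1345 -c:v libx264 -preset ultrafast 1345.mp4 &
wait
```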
Cheers!


Re: [FFmpeg-user] Capture Webcam to Broadcasting Program

2016-02-16 Thread Roger Pack
On 2/16/16, James Stortz  wrote:
> ​Good day,
>
> I want to use my c920 in Wirecast on a Windows 8.1 desktop.
>
> (This was the line of webcams from Logitech that had notorious issues. A
> thread pointed out FFmpeg support via DirectShow.)
>
> I can now successfully output--to video files! But, how could I output
> directly to my broadcasting program?

https://trac.ffmpeg.org/wiki/StreamingGuide may be helpful
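As a rough starting point from that guide (the device name and rtmp URL below
are placeholders for whatever your broadcasting program or server expects):

```shell
# capture the webcam via dshow and publish it to an RTMP ingest point
ffmpeg -f dshow -i video="Logitech HD Pro Webcam C920" \
  -c:v libx264 -preset veryfast -tune zerolatency -pix_fmt yuv420p \
  -f flv rtmp://localhost/live/stream
```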


Re: [FFmpeg-user] [ffmpeg + screen-capture-recorder] - Targeting a specific Windows session with the command line ?

2016-02-04 Thread Roger Pack
gdigrab has a "window" option FWIW
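i.e. something along these lines (the title must match the target window's
title bar exactly; the one below is just an example):

```shell
# capture a single window by title instead of the whole desktop
ffmpeg -f gdigrab -framerate 30 -i title="Untitled - Notepad" out.mp4
```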

On 2/2/16, JuliaM  wrote:
> Hi
>
> I'm using ffmpeg + screen-capture-recorder to record my own and student
> work.
>
> For my situation, I use a Remote desktop to follow their work (to resume I
> open/use an Admin session on the student computer while he's working to
> check if he's doing everything good - so 2 sessions on 1 computer).
>
> Now I would like to know if it's possible (with specific arguments or other
> ways) to use ffmeg/screen-capture-recorder to record the student session
> while I'm on his Admin session ? *Targeting a specific Windows session with
> the command line ?*
>
> Right now I'm using this command line = /*ffmpeg -f dshow -i
> video="screen-capture-recorder" -f dshow -ac 2 -i
> audio="virtual-audio-capturer" -ar 48000 -acodec libmp3lame -ab 192k -r 30
> -vcodec libx264 -crf 18 -preset ultrafast -f mpegts task1.mpg*/
>
> Do you have any advice?
>
> Thanks for your feedback.
>


Re: [FFmpeg-user] dshow: Multi channel Video Capture Cards for FFmpeg

2016-02-03 Thread Roger Pack
It probably will be, if it has normal dshow interfaces (sorry for the
late reply).

On 10/19/15, Andrew Blake  wrote:
> Hello experts!
>
> Sorry, I'm a bit of a newbie with ffmpeg so please bear with me! I'm trying
> to convert multiple PAL video camera streams to images (16+ streams), and
> I'm wondering how I can tell whether a video capture card will be supported
> by FFmpeg through DirectShow, and if there is some way to find out before I
> buy the card, install it and run: ffmpeg -list_devices true -f dshow -i dummy
> Is there a way to check a video capture card somehow before buying it,
> or can I assume that ffmpeg/dshow is supported by most multichannel video
> capture cards? A little googling suggests that the NUUO cards are supported?
>
> So sorry if this isn't the right forum for this question,
>
> Thank you!


[FFmpeg-user] Any way to get better use of the cpu?

2016-01-29 Thread Roger Pack
Hello.
I'm attempting to decode and "re-encode" live TV.
I noticed that when I attempt to transcode say, 6 streams at a time,
ffmpeg seems to end up using "at most" 50% of the cpu available on the
given box.

The basic syntax I'm using is

ffmpeg -i input.ts -map 0:p:1344 1344.mp4 -map 0:p:1345 1345.mp4 -map
0:p:1346 1346.mp4

So multiple outputs that come from different parts of the same input
(it's not a file, it's a live TS capture in this particular case).

I've noticed that the speed indicator ends up, in my instance, saying
something like

frame=  696 fps= 16 q=-1.0 Lq=-1.0 q=-1.0 q=-1.0 q=-1.0 q=-1.0 q=-1.0
size=N/A time=00:00:27.84 bitrate=N/A dup=217 drop=0 speed=0.645x

(full command line and somewhat cleaned up console output:
https://gist.github.com/rdp/51448985bfa48a1f72a2 )

However, if I look at cpu utilization, it hovers right around 50%
consistently (two cores out of 4 basically), regardless of "-threads"
setting applied.

I'm able to capture the incoming stream "fast enough" (for instance,
if I only transcode a few substreams, everything comes in fine and speed
keeps up), and if I test "decode only" (all streams), it handles it
fine (speed=3.5x).

Are there any tricks to get more cpu "utilization" here, or is this more
of something to be implemented in a google summer of code or whatnot
(assuming some speedup is possible)?
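As a workaround sketch (under the untested assumption that the live input can be fed to several readers, e.g. via tee or a named pipe), one ffmpeg process per output would at least give each encode its own thread pool:

```shell
# Hypothetical workaround: one ffmpeg process per program stream so the
# encoders don't all share a single process's thread pool. The program
# IDs and input name are placeholders for this capture setup.
for p in 1344 1345 1346; do
  ffmpeg -i input.ts -map 0:p:"$p" "$p".mp4 &
done
wait   # block until all background encodes finish
```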

In trying to create a reproducible sample, I came up with this (not
sure if it's the same problem or not):

https://trac.ffmpeg.org/raw-attachment/ticket/3025/sbs.2.5M.ts

run this, you'll see that it uses around 200% cpu regardless of cores available:

./ffmpeg_g -ignore_unknown -y -i <(while cat sbs.2.5M.ts; do :; done) \
  -map 0:p:817 -c:v libx264 -preset ultrafast -sn 817.mp4 \
  -map 0:p:818 -c:v libx264 -preset ultrafast -sn 818.mp4 \
  -map 0:p:820 -c:v libx264 -preset ultrafast -sn 820.mp4 \
  -map 0:p:821 -c:v libx264 -preset ultrafast -qp 0 -sn 821.mp4 \
  -map 0:p:830 -c:v libx264 -preset ultrafast -sn 830.mp4 \
  -map 0:p:832 -c:v libx264 -preset ultrafast -qp 0 -sn 832.mp4 \
  -map 0:p:819 -c:v libx264 -preset ultrafast -qp 0 -sn 819.mp4 \
  -ignore_unknown -loglevel info

Here's a smaller example, possibly related, uses 600% (out of 800%) on
a different box:

./ffmpeg -i <(while cat sbs.2.5M.ts; do :; done) -map p:817 -y -sn 817.mp4 -ignore_unknown

Any theories out there as to why it ends up using just 200% cpu? Or
any ideas/workarounds to be able to transcode multiple at the same
time from the same source and use all the cpu available?

Thanks!
-roger-


[FFmpeg-user] dvb_teletext -> .srt fails?

2016-01-15 Thread Roger Pack
similar to https://lists.ffmpeg.org/pipermail/ffmpeg-user/2015-June/027029.html
and

https://trac.ffmpeg.org/ticket/3025

My "quest" is to convert a dvb_teletext stream to some human-readable
(ex: .srt) format.

Ex: using this file:

https://trac.ffmpeg.org/attachment/ticket/3025/sbs.2.5M.ts

I don't seem to be having much luck (DVBT stream).  Any feedback welcome :)
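Another variant worth trying (a sketch only; the subtitle stream specifier is a guess for this sample):

```shell
# Sketch: decode the teletext stream as plain text via libzvbi, with
# -fix_sub_duration closing the open-ended page durations that teletext
# subtitles tend to have. "0:s:0" (first subtitle stream) is a guess.
ffmpeg -txt_format text -fix_sub_duration -i sbs.2.5M.ts -map 0:s:0 -y out.srt
```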
-roger-

ffmpeg.exe   -txt_format text -i c:\users\rdp\Downloads\sbs.2.5M.ts
-map 0:i:0x2c -y out.srt
ffmpeg version N-77844-g62051f3 Copyright (c) 2000-2016 the FFmpeg developers
  built with gcc 5.3.0 (GCC)
  configuration: --arch=x86 --target-os=mingw32 --cross-prefix=/Users/rdp2/dev/ffmpeg-windows-build-helpers2/sandbox/cross_compilers/mingw-w64-i686/bin/i686-w64-mingw32- --pkg-config=pkg-config --disable-w32threads --enable-gpl --enable-libsoxr --enable-fontconfig --enable-libass --enable-libutvideo --enable-libbluray --enable-iconv --enable-libtwolame --extra-cflags=-DLIBTWOLAME_STATIC --enable-libzvbi --enable-libcaca --enable-libmodplug --extra-libs=-lstdc++ --extra-libs=-lpng --enable-libvidstab --enable-libx265 --enable-decklink --extra-libs=-loleaut32 --enable-libx264 --enable-libxvid --enable-libmp3lame --enable-version3 --enable-zlib --enable-librtmp --enable-libvorbis --enable-libtheora --enable-libspeex --enable-libopenjpeg --enable-gnutls --enable-libgsm --enable-libfreetype --enable-libopus --enable-frei0r --enable-filter=frei0r --enable-libvo-aacenc --enable-bzlib --enable-libxavs --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libvo-amrwbenc --enable-libschroedinger --enable-libvpx --enable-libilbc --enable-libwavpack --enable-libwebp --enable-libgme --enable-dxva2 --enable-libdcadec --enable-avisynth --enable-gray --enable-libopenh264 --extra-libs=-lpsapi --extra-cflags= --enable-static --disable-shared --prefix=/Users/rdp2/dev/ffmpeg-windows-build-helpers2/sandbox/cross_compilers/mingw-w64-i686/i686-w64-mingw32 --enable-runtime-cpudetect
  libavutil      55. 12.100 / 55. 12.100
  libavcodec     57. 22.100 / 57. 22.100
  libavformat    57. 21.101 / 57. 21.101
  libavdevice    57.  0.100 / 57.  0.100
  libavfilter     6. 23.100 /  6. 23.100
  libswscale      4.  0.100 /  4.  0.100
  libswresample   2.  0.101 /  2.  0.101
  libpostproc    54.  0.100 / 54.  0.100
[mpegts @ 054a3860] PES packet size mismatch
Last message repeated 8 times
[mpegts @ 054a3860] probed stream 2 failed
[mpeg2video @ 05564b00] Invalid frame dimensions 0x0, or possibly
encountered data before first frame.
[mpeg2video @ 05d52560] Invalid frame dimensions 0x0, or possibly
encountered data before first frame.
[mpeg2video @ 0556b7a0] Invalid frame dimensions 0x0, or possibly
encountered data before first frame.
[mpeg2video @ 055724a0] Invalid frame dimensions 0x0, or possibly
encountered data before first frame.
[mpeg2video @ 05d52560] Invalid frame dimensions 0x0, or possibly
encountered data before first frame.
[mpeg2video @ 0556b7a0] Invalid frame dimensions 0x0, or possibly
encountered data before first frame.
[mpeg2video @ 05564b00] Invalid frame dimensions 0x0, or possibly
encountered data before first frame.
[mpeg2video @ 05d52560] Invalid frame dimensions 0x0, or possibly
encountered data before first frame.
[mpeg2video @ 0556b7a0] Invalid frame dimensions 0x0, or possibly
encountered data before first frame.
[mpeg2video @ 05564b00] Invalid frame dimensions 0x0, or possibly
encountered data before first frame.
[mpeg2video @ 0556b7a0] Invalid frame dimensions 0x0, or possibly
encountered data before first frame.
[mpeg2video @ 05d52560] Invalid frame dimensions 0x0, or possibly
encountered data before first frame.
[mpeg2video @ 055724a0] Invalid frame dimensions 0x0, or possibly
encountered data before first frame.
[mpeg2video @ 05564b00] Invalid frame dimensions 0x0, or possibly
encountered data before first frame.
[mpeg2video @ 055724a0] Invalid frame dimensions 0x0, or possibly
encountered data before first frame.
[mpeg2video @ 05d52560] Invalid frame dimensions 0x0, or possibly
encountered data before first frame.
[mpeg2video @ 055724a0] Invalid frame dimensions 0x0, or possibly
encountered data before first frame.
[mpeg2video @ 0556b7a0] Invalid frame dimensions 0x0, or possibly
encountered data before first frame.
[mpeg2video @ 05d52560] Invalid frame dimensions 0x0, or possibly
encountered data before first frame.
Last message repeated 1 times
[mpeg2video @ 0556b7a0] Invalid frame dimensions 0x0, or possibly
encountered data before first frame.
[mpegts @ 054a3860] DTS 5653752687 < 14243683679 out of order
[mpeg2video @ 05564b00] Invalid frame dimensions 0x0, or possibly
encountered data before first frame.
[mpeg2video @ 055724a0] Invalid frame dimensions 0x0, or possibly
encountered data before first frame.
[mpeg2video @ 0556b7a0] Invalid frame dimensions 0x0, or possibly
encountered data before first frame.
[mpeg2video @ 055724a0] Invalid frame dimensions 0x0, or possibly
encountered data 
