Hi All,

I have 3 USB cameras streaming video at a specific resolution. I am trying
to merge the three videos side by side using hstack and display the result
with ffplay.

This is the command I am using:

ffmpeg -f dshow -pix_fmt uyvy422 -video_size 1280x2160 -i video="CAM_0" ^
       -f dshow -pix_fmt uyvy422 -video_size 1280x2160 -i video="CAM_1" ^
       -f dshow -pix_fmt uyvy422 -video_size 1280x2160 -i video="CAM_2" ^
       -filter_complex hstack=3 -f nut - | ffplay -

There is a special case in this system: the USB camera system only starts
streaming video once all 3 cameras have been enabled (i.e. started by
ffmpeg). However, with the above command, ffmpeg opens CAM_0 first and
waits for data from it. As a result, CAM_1 and CAM_2 are never started and
the video stream never begins.

Is there any way to start all 3 inputs simultaneously with ffmpeg and then
merge them together using hstack?
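
For reference, a rough workaround I have been considering (untested, and
the UDP ports and encoder settings below are only placeholders) is to open
each camera in its own ffmpeg process, so that all three dshow devices are
started independently, and then combine the three local streams in a fourth
process. I would prefer a single-command solution if one exists:

:: start each camera in its own process (placeholder ports/encoder, untested)
start "" /b ffmpeg -f dshow -pix_fmt uyvy422 -video_size 1280x2160 -i video="CAM_0" ^
    -c:v libx264 -preset ultrafast -tune zerolatency -f mpegts udp://127.0.0.1:5000
start "" /b ffmpeg -f dshow -pix_fmt uyvy422 -video_size 1280x2160 -i video="CAM_1" ^
    -c:v libx264 -preset ultrafast -tune zerolatency -f mpegts udp://127.0.0.1:5001
start "" /b ffmpeg -f dshow -pix_fmt uyvy422 -video_size 1280x2160 -i video="CAM_2" ^
    -c:v libx264 -preset ultrafast -tune zerolatency -f mpegts udp://127.0.0.1:5002

:: combine the three local streams and display
ffmpeg -i udp://127.0.0.1:5000 -i udp://127.0.0.1:5001 -i udp://127.0.0.1:5002 ^
    -filter_complex hstack=3 -f nut - | ffplay -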

Thanks!