Hi,

I'm trying my luck with a GPU-only chain to stabilize a video with 
deshake_opencl. Sounds easy enough, right? This is the command line I'm using:

ffmpeg -init_hw_device vaapi=va:/dev/dri/renderD128 \
        -init_hw_device opencl=oc@va \
        -filter_hw_device oc \
        -vaapi_device /dev/dri/renderD128 \
        -hwaccel vaapi -hwaccel_output_format vaapi \
        -i test.mp4 \
        -vf "hwmap=derive_device=opencl,deshake_opencl,hwmap=derive_device=vaapi:reverse=1" \
        -c:a copy -c:v h264_vaapi -y test_deshake.mp4

The filter chain assembles without problems, so it seems to be valid, but once 
it starts running I get the following runtime error:

[out#0/mp4 @ 0x20ddbd00] Starting thread...
[hwmap @ 0x7ff69c003280] Filter input: vaapi, 1920x1080 (512).
[AVHWFramesContext @ 0x7ff6980ab800] Map surface 0x2.
[AVHWFramesContext @ 0x7ff6980ab800] Map QSV/VAAPI surface 0x2 to OpenCL.
[h264 @ 0x20ece040] nal_unit_type: 1(Coded slice of a non-IDR picture), nal_ref_idc: 1
[h264 @ 0x20ece040] Param buffer (type 0, 672 bytes) is 0x3.
[h264 @ 0x20ece040] Param buffer (type 1, 240 bytes) is 0x2.
[h264 @ 0x20ece040] Slice 0 param buffer (3128 bytes) is 0x1.
[h264 @ 0x20ece040] Slice 0 data buffer (104301 bytes) is 0.
[h264 @ 0x20ece040] Decode to surface 0.
[hwmap @ 0x7ff69c003280] Filter output: opencl, 1920x1080 (512).
[h264_vaapi @ 0x20d93540] Input frame: 1920x1080 (0).
[h264_vaapi @ 0x20d93540] Pick nothing to encode next - need more input for timestamps.
[vf#0:0 @ 0x20d85c80] Error while filtering: Cannot allocate memory
[vf#0:0 @ 0x20d85c80] Task finished with error code: -12 (Cannot allocate memory)

I've tried the usual tricks, like limiting the number of extra frames and 
reducing the number of threads, but it doesn't help.
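For reference, these are roughly the variants I tried (only the changed options 
are shown; the specific values 8 and 1 are just examples I picked):

```shell
# Variant 1: ask the decoder to allocate extra surfaces in its hardware
# frame pool (-extra_hw_frames is an input option, so it goes before -i).
ffmpeg -init_hw_device vaapi=va:/dev/dri/renderD128 \
        -init_hw_device opencl=oc@va \
        -filter_hw_device oc \
        -hwaccel vaapi -hwaccel_output_format vaapi \
        -extra_hw_frames 8 \
        -i test.mp4 \
        -vf "hwmap=derive_device=opencl,deshake_opencl,hwmap=derive_device=vaapi:reverse=1" \
        -c:a copy -c:v h264_vaapi -y test_deshake.mp4

# Variant 2: reduce threading to limit the number of in-flight frames
# holding surfaces at the same time.
ffmpeg -threads 1 -filter_threads 1 \
        -init_hw_device vaapi=va:/dev/dri/renderD128 \
        -init_hw_device opencl=oc@va \
        -filter_hw_device oc \
        -hwaccel vaapi -hwaccel_output_format vaapi \
        -i test.mp4 \
        -vf "hwmap=derive_device=opencl,deshake_opencl,hwmap=derive_device=vaapi:reverse=1" \
        -c:a copy -c:v h264_vaapi -y test_deshake.mp4
```

Both variants fail with the same ENOMEM error as above.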

Are there any additional tricks I could try?

Regards,
Matthias

_______________________________________________
ffmpeg-user mailing list -- [email protected]
To unsubscribe send an email to [email protected]
