>> That means, for YUV420P (I420) with 12 bits per pixel to RGB32,
>> src_stride[3] = {320*12/8, 320*12/8, 320*12/8};
>> dst_stride[3] = {4*dstwidth, 4*dstwidth, 4*dstwidth};
>>
>> src[3] = {AVFrame->linesize, NULL, NULL};
>>
>>
> no, i think you have the wrong end of the stick:
>
> AVFrame * srcFrame, *dstFrame; // src frame was decoded earlier, dst frame
> has been allocated already
> SwsContext * context; // previously created scaler context
>
> then
>
> sws_scale(context, srcFrame->data, srcFrame->linesize, 0, height,
> dstFrame->data, dstFrame->linesize);
Sorry, I should have said: I'm actually using swscale in a custom
program, with raw buffers rather than AVFrame, like in swscale-example.c
(http://cekirdek.pardus.org.tr/~ismail/ffmpeg-docs/swscale-example_8c-source.html).
The code there is somewhat like
uint8_t *src[3];
uint8_t *dst[3];
int srcStride[3], dstStride[3];

for (i = 0; i < 3; i++) {
    srcStride[i] = srcW * 4;
    dstStride[i] = dstW * 4;
    src[i] = (uint8_t*) malloc(srcStride[i] * srcH);
    dst[i] = (uint8_t*) malloc(dstStride[i] * dstH);
}
// ...
SwsContext *context; // ...
sws_scale(context, src, srcStride, 0, srcH, dst, dstStride);
But I had overlooked that in that example there are three planes for
src and dst. I actually have a YUV420P buffer (from a DirectShow
webcam) and a raw RGBA dst buffer, and I'm not sure how that YUV420P
src is organized in planes. I had just successfully swscaled BGR24 to
RGB32, but I'm stuck here.
Thanks, Juan
_______________________________________________
libav-user mailing list
[email protected]
https://lists.mplayerhq.hu/mailman/listinfo/libav-user