Dear libav users,
I'd be grateful for any pointers, hints or tips anyone can give me;
I've been struggling with this problem for a month now!
My goal is to write DNX36 video essence created from a stream of
RGB48 planar video data. The steps I'm following are:
1. Crop the 48-bit planar RGB video from 2048 x 1152 to 1920 x 1080.
2. Convert to YUV422p by taking the most significant byte of each
pixel from each RGB plane, performing the standard calculations to
create YUV, and copying the Y value to its plane for every pixel and
the U and V values to their planes every other pixel.
3. Pass the resulting YUV422p frame to avcodec_encode_video() with a
context that's been set up and the codec successfully opened (a rough
sketch of the sort of setup I mean follows this list).
4. The resulting length of the encoded bytes is correct and I write
the data to a file.
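To make step 3 concrete, here's roughly the kind of context setup and
encode call I mean, using the old avcodec_encode_video() API. This is
only a minimal sketch: the frame size, time base and bit rate are my
assumptions for "DNxHD 36" at 1080p25 rather than values I've verified.

#include <libavcodec/avcodec.h>
#include <libavutil/mem.h>

/* Minimal sketch: open the DNxHD encoder and push one YUV422P frame
 * through avcodec_encode_video().  The frame size, time base and bit
 * rate are assumptions for "DNxHD 36" at 1080p25.                    */
static int encode_one_frame(uint8_t *yuv, uint8_t *outbuf, int outbuf_size)
{
    avcodec_register_all();

    AVCodec *codec = avcodec_find_encoder(CODEC_ID_DNXHD);
    if (!codec)
        return -1;

    AVCodecContext *ctx = avcodec_alloc_context();
    ctx->width     = 1920;
    ctx->height    = 1080;
    ctx->time_base = (AVRational){1, 25};  // assuming 25 fps material
    ctx->pix_fmt   = PIX_FMT_YUV422P;      // DNxHD wants planar 4:2:2
    ctx->bit_rate  = 36000000;             // assumed target for DNX36

    if (avcodec_open(ctx, codec) < 0)
        return -1;                         // encoder rejected the parameters

    /* Point the AVFrame at the three planes of the YUV422P buffer */
    AVFrame *frame = avcodec_alloc_frame();
    frame->data[0]     = yuv;                                           // Y plane
    frame->data[1]     = yuv + ctx->width * ctx->height;                // Cb plane
    frame->data[2]     = frame->data[1] + ctx->width * ctx->height / 2; // Cr plane
    frame->linesize[0] = ctx->width;
    frame->linesize[1] = ctx->width / 2;
    frame->linesize[2] = ctx->width / 2;

    int size = avcodec_encode_video(ctx, outbuf, outbuf_size, frame);

    av_free(frame);
    avcodec_close(ctx);
    av_free(ctx);
    return size;  // number of encoded bytes, or negative on error
}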
After writing the encoded data out, I manually wrap the file in a
QuickTime movie and watch a mostly green screen with some lines in it.
I'm supposed to see a mostly white screen with a clapper board in it.
I'm finding it really hard to test. I can't think of a suitable way
to test my calculations. I've gone through the first dozen pixels
from the RGB files and calculated their YUV equivalents using a
spreadsheet. These match the first few values I can output. At HD
res over 25 frames, I just can't check all the values manually. Does
anyone have any suggestions?
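To make the spreadsheet comparison concrete, the check boils down to
something like this standalone snippet (a minimal sketch; the 16-bit
input values are made up, and it uses the same integer approximation of
the conversion equations as my code further down):

#include <stdio.h>
#include <stdint.h>

/* Standalone single-pixel sanity check: convert one 16-bit RGB sample to
 * 8-bit YUV using the same integer approximations as the routine below.
 * The input values are made up.                                          */
int main(void)
{
    uint16_t r16 = 0xF2A0, g16 = 0xE511, b16 = 0xD0FF;  // hypothetical first pixel

    int r = r16 >> 8, g = g16 >> 8, b = b16 >> 8;  // drop the low-order byte

    int y  = (( 66 * r + 129 * g +  25 * b) >> 8) + 16;
    int cb = ((-38 * r -  74 * g + 112 * b) >> 8) + 128;
    int cr = ((112 * r -  94 * g -  18 * b) >> 8) + 128;

    printf("R=%d G=%d B=%d  ->  Y=%d Cb=%d Cr=%d\n", r, g, b, y, cb, cr);
    return 0;
}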
Is there a video player capable of displaying YUV422p essence?
Viewing the output frame by frame on my Mac would be ideal.
The following is my code for doing this transformation. As I'm trying
to be as efficient as possible, I'm cropping the image as I process
each pixel. The image cropping is just a temporary hack until I can
get this to work; then I'll convert the full frame to YUV422p and
figure out how sws_scale() works.
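For what it's worth, here is the sort of thing I expect the sws_scale()
version to look like (a rough sketch only: PIX_FMT_RGB48LE is packed
RGB, and as far as I can tell there's no planar 48-bit RGB pixel
format, so my planar source would first need interleaving into one
packed buffer):

#include <libswscale/swscale.h>

/* Rough sketch: full-frame packed RGB48 -> planar YUV422P via swscale.
 * dst[0..2] point at the Y, Cb and Cr planes and dst_stride should be
 * { width, width / 2, width / 2 }.                                     */
static void rgb48_to_yuv422p(const uint8_t *rgb_packed, int width, int height,
                             uint8_t *dst[3], int dst_stride[3])
{
    struct SwsContext *sws = sws_getContext(width, height, PIX_FMT_RGB48LE,
                                            width, height, PIX_FMT_YUV422P,
                                            SWS_BILINEAR, NULL, NULL, NULL);
    if (!sws)
        return;

    const uint8_t *src[4] = { rgb_packed, NULL, NULL, NULL };
    int src_stride[4]     = { width * 6, 0, 0, 0 };  // 3 components * 2 bytes each

    sws_scale(sws, src, src_stride, 0, height, dst, dst_stride);
    sws_freeContext(sws);
}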
Apologies for the Objective-C method signature, but the rest is pretty
much pure C. I'd appreciate comments/feedback, as it's the first time
I've written anything in C (I'm a Java programmer) and the first time
I've played with video.
/* Crops the image to the initialised targetWidth and targetHeight
 * whilst converting the 48-bit planar RGB image to 16-bit YUV422p.
 */
- (void) planarRGBp48:(uint8_t*) src toYUV422p16: (uint8_t*) yuv withWidth: (int) width andHeight: (int) height
{
    int r, g, b, r1, g1, b1;   // r1, g1, b1 are used to calculate the averaged Cb and Cr values
    uint8_t *p, *yuvcopy;      // copy pointers, used to advance through memory
    int planeLength, redPos, greenPos, bluePos, cbPos, crPos;  // positions in memory of the start of the different image planes
    int lum, cb, cr;           // variables to store the calculated YUV values

    planeLength = width * height * 2;  // e.g. the length of the red plane, at 2 bytes per pixel

    redPos = 1;  // little-endian order; make this 0 if it's big endian and adjust the ones below too
    greenPos = planeLength + 1;
    bluePos = planeLength * 2 + 1;
    cbPos = targetWidth;
    crPos = targetWidth + (targetWidth / 2);

    p = src;        // copy of the pointer to the memory we're reading, where the RGB image is
    yuvcopy = yuv;  // copy of the pointer to the area of memory we're going to put the luma values
    // Does the image need cropping?
    int diffWidth = width - targetWidth;
    int diffHeight = height - targetHeight;
    int firstPixelOfCroppedImage = diffHeight / 2 * width * 2;
    int lastPixelOfCroppedImage = (height - diffHeight) * width * 2 - 1;

    // RGB is a planar format, so it's all the red first, then the green, then the blue.
    //
    // Also, it's 16 bits per colour per pixel, e.g. the first two bytes in 'src' are for the first red pixel.
    //
    // Need to convert the 16-bit value to 8 bits, so we'll just drop the lowest-order bits.
    // Assuming byte order is little endian, as this was written for a Mac in the first place and that's what Intel uses.
    // So for the first pixel, we drop src[0] and use src[1] instead.
    int pixelCount;
    for (pixelCount = 0; pixelCount < planeLength - 4; pixelCount += 4)  // this loop processes 4 bytes (two pixels) at a time
    {
        // Begin crop check
        if (pixelCount < firstPixelOfCroppedImage || pixelCount > lastPixelOfCroppedImage)
        {
            p += 4;    // move on two src pixels
            continue;  // need to skip this pixel as it's in the cropped area
        }
        if (pixelCount % width < diffWidth || pixelCount % width > width - diffWidth)
        {
            p += 4;    // move on two src pixels
            continue;  // need to skip this pixel as it's in the cropped area
        }
        // End crop check
        // Get red, green and blue values from the src material
        r = p[redPos];
        g = p[greenPos];
        b = p[bluePos];

        // Copy these values to the variables that we'll use for 2-pixel averaging
        r1 = r;
        g1 = g;
        b1 = b;

        // Calculate the luma value for the first pixel
        lum = ((66 * r + 129 * g + 25 * b) >> 8) + 16;
        yuvcopy[0] = lum;

        p += 2;     // move on to the next src pixel
        yuvcopy++;  // move the luma pointer on to the next byte in memory

        // Get red, green and blue values from the src material for the next pixel
        r = p[redPos];
        g = p[greenPos];
        b = p[bluePos];

        // Add the second set of RGB values to the first set and divide by 2 so we have an average
        r1 = (r1 + r) / 2;
        g1 = (g1 + g) / 2;
        b1 = (b1 + b) / 2;

        lum = ((66 * r + 129 * g + 25 * b) >> 8) + 16;
        yuvcopy[0] = lum;

        cb = ((-38 * r1 - 74 * g1 + 112 * b1) >> 8) + 128;
        yuvcopy[cbPos] = cb;
        cr = ((112 * r1 - 94 * g1 - 18 * b1) >> 8) + 128;
        yuvcopy[crPos] = cr;

        p += 2;     // move on to the next src pixel
        yuvcopy++;  // move the luma pointer on to the next byte in the 'yuv' planar memory
    }

    NSLog(@"Last pixel processed was %d", pixelCount);
}
Essentially, my query is: does anyone know of a media player that runs
on a Mac and is capable of playing raw YUV422p essence?
Kind Regards,
Chris.