Am 09.11.12 11:07, schrieb Achim Breidenbach:
Hi Christophe,

I refer to the code that you posted earlier:

  if (vImageScale_ARGB8888(&buffer, &vDestBuffer, NULL, 0))
One thing that won't work is your vImage scaling call, because you are using the ARGB variant (4 bytes per pixel) but passing an RGB (3 bytes per pixel) buffer as vDestBuffer. I think this is why you are getting all those colors: the scaler writes 4-byte pixels into a buffer laid out for 3-byte pixels, which won't work.
Maybe you should post your code again here, so we can have a closer look at 
your current implementation.

Hello Achim,

thank you for your offer. See my code below.
However, in the meantime I have already made sure that my destination buffer is derived from a bitmap representation with 4 bytes per pixel.

These settings below for newRep make no difference to the image format in vDestBuffer, though. Whatever I set, I always get the same result.

   ///////////////////////////////////////////////////////
    // we got the image locked now, read out pixel data ...
    ///////////////////////////////////////////////////////
    vImage_Buffer           buffer;
    vImage_Buffer           vDestBuffer;


    NSBitmapImageRep* newRep = [[[NSBitmapImageRep alloc] initWithBitmapDataPlanes:NULL
                                    pixelsWide:48
                                    pixelsHigh:50
                                 bitsPerSample:8
                               samplesPerPixel:4
                                      hasAlpha:YES // actually not, but this makes no difference
                                      isPlanar:NO
                                colorSpaceName:NSDeviceRGBColorSpace
                                  bitmapFormat:0
                                   bytesPerRow:0
                                  bitsPerPixel:32] autorelease];

    ///////////////////////////////////////////////////////
    // Set up the vImage buffer for the source image
    ///////////////////////////////////////////////////////
    buffer.data     = (void*)[imageToUse bufferBaseAddress];
    buffer.rowBytes = [imageToUse bufferBytesPerRow];
    buffer.width    = [imageToUse bufferPixelsWide];
    buffer.height   = [imageToUse bufferPixelsHigh];


    ///////////////////////////////////////////////////////
    // Set up the vImage buffer for the destination
    ///////////////////////////////////////////////////////
    vDestBuffer.data = [newRep bitmapData];
    vDestBuffer.height = [newRep pixelsHigh];
    vDestBuffer.width = [newRep pixelsWide];
    vDestBuffer.rowBytes = [newRep bytesPerRow];

    ///////////////////////////////////////////////////////
    // do the scale
    ///////////////////////////////////////////////////////
    if (vImageScale_ARGB8888(&buffer, &vDestBuffer, NULL, 0))
    {

        [imageToUse unlockBufferRepresentation];

        return NO;

    }



To me, this looks as if the format of the source image "imageToUse" were planar, with premultiplied alpha? However, the image used as input is a normal 32-bit PNG file with no transparency.




Here is the whole thing again:

- (BOOL)execute:(id <QCPlugInContext>)context atTime:(NSTimeInterval)time withArguments:(NSDictionary *)arguments
{

    if ([connectSocket isDisconnected] || [self didValueForInputKeyChange:@"inputPort"] || [self didValueForInputKeyChange:@"inputHost"])
    {
        // stop listening for incoming connections
        [connectSocket disconnect];

        if (![connectSocket connectToHost:self.inputIpAddressPixelmaster onPort:self.inputPort error:nil])
            NSLog(@"could not connect to host, port");
    }



    // get the image to use
    id<QCPlugInInputImageSource> imageToUse = self.inputImage;

    CGColorSpaceRef colorSpace = (CGColorSpaceGetModel([imageToUse imageColorSpace]) == kCGColorSpaceModelRGB ? [imageToUse imageColorSpace] : [context colorSpace]);

    // DEBUG
    NSLog(@"Colorspace: %@", colorSpace);
    NSLog(@"Buffer pixelformat: %@", [imageToUse bufferPixelFormat]);
    NSLog(@"Bounds: %@", NSStringFromRect([imageToUse imageBounds])); // imageBounds returns an NSRect, so it needs NSStringFromRect rather than %@ directly
    // </DEBUG>


    if (![imageToUse lockBufferRepresentationWithPixelFormat:QCPlugInPixelFormatBGRA8
                                                  colorSpace:colorSpace
                                                   forBounds:[imageToUse imageBounds]])
    {

        NSLog(@"Locking of image failed.");
        return NO;

    }

    ///////////////////////////////////////////////////////
    // we got the image locked now, read out pixel data ...
    ///////////////////////////////////////////////////////
    vImage_Buffer           buffer;
    vImage_Buffer           vDestBuffer;


    NSBitmapImageRep* newRep = [[[NSBitmapImageRep alloc] initWithBitmapDataPlanes:NULL
                                    pixelsWide:48
                                    pixelsHigh:50
                                 bitsPerSample:8
                               samplesPerPixel:4
                                      hasAlpha:YES
                                      isPlanar:NO
                                colorSpaceName:NSDeviceRGBColorSpace
                                  bitmapFormat:0
                                   bytesPerRow:0
                                  bitsPerPixel:32] autorelease];

    ///////////////////////////////////////////////////////
    // Set up the vImage buffer for the source image
    ///////////////////////////////////////////////////////
    buffer.data     = (void*)[imageToUse bufferBaseAddress];
    buffer.rowBytes = [imageToUse bufferBytesPerRow];
    buffer.width    = [imageToUse bufferPixelsWide];
    buffer.height   = [imageToUse bufferPixelsHigh];


    ///////////////////////////////////////////////////////
    // Set up the vImage buffer for the destination
    ///////////////////////////////////////////////////////
    vDestBuffer.data = [newRep bitmapData];
    vDestBuffer.height = [newRep pixelsHigh];
    vDestBuffer.width = [newRep pixelsWide];
    vDestBuffer.rowBytes = [newRep bytesPerRow];

    ///////////////////////////////////////////////////////
    // do the scale
    ///////////////////////////////////////////////////////
    if (vImageScale_ARGB8888(&buffer, &vDestBuffer, NULL, 0))
    {

        [imageToUse unlockBufferRepresentation];

        return NO;

    }

    ///////////////////////////////////////////////////////
    // read out pixeldata from the scaled down image
    ///////////////////////////////////////////////////////
    NSMutableData* imgData = [[NSMutableData alloc] init];

    for (int y = 0; y < vDestBuffer.height; y++)
    {

        for (int x = 0; x < vDestBuffer.width; x++)
        {

            // 4 bytes per BGRA pixel, so the pixel starts at x * 4, not x
            long currentPixelOffset = y * vDestBuffer.rowBytes + x * 4;

            // note: with BGRA8 these three bytes are B, G, R --
            // swap them if the receiver expects R, G, B
            [imgData appendBytes:((unsigned char*)vDestBuffer.data + currentPixelOffset) length:3];

        }

    }




    ///////////////////////////////////////////////////////
    // Protocol setup from hereon in order to send data
    // for that to happen, we need to build the protocol
    ///////////////////////////////////////////////////////
    NSLog(@"connected %d", [connectSocket isConnected]);

    ///////////////////////////////////////////////////////
    // DEBUG: fixed data length (BAD! to be fixed)
    ///////////////////////////////////////////////////////
    int lengthByte = 7235;

    ///////////////////////////////////////////////////////
    // set up the sendBuffer
    ///////////////////////////////////////////////////////
    NSMutableData* sendBuffer = [[[NSMutableData alloc] init] autorelease];
    // retain apparently fixed my issue with the crash


    // build the header
    NSString* header = @"pixmaster-prot-vers1";

    [sendBuffer appendData:[header dataUsingEncoding:NSASCIIStringEncoding]];



    //databytes per protocol

    Byte lengthBytes[2] = {28, 67};

    [sendBuffer appendBytes:lengthBytes length:2];


    // typeByte
    Byte typeByte = 4;

    [sendBuffer appendBytes:&typeByte length:1];



    // TBA MESSAGE ID (=A COUNTER)
    counter++;
    Byte messageID[2] = { counter/256, counter % 256};
    [sendBuffer appendBytes:messageID length:2];

    Byte filterEnable = 0;
    [sendBuffer appendBytes:&filterEnable length:1];

    Byte fixImagesEnable = 0;
    [sendBuffer appendBytes:&fixImagesEnable length:1];


    // Byte 27 = 50 in the recorded stream; is this the fps in Hz?
    Byte reserved[5] = {0,0,0,0,0};
    [sendBuffer appendBytes:reserved length:5];


    // now append image data!
    [sendBuffer appendBytes:[imgData bytes] length:[imgData length]];

    [imgData release];
    [imageToUse unlockBufferRepresentation];


    // now end the buffer
    NSString* endString = @"end";
    [sendBuffer appendData:[endString dataUsingEncoding:NSASCIIStringEncoding]];

    NSData* data = [[NSData dataWithData:sendBuffer] retain]; // note: this retain is never balanced, so the data leaks every frame
    [connectSocket writeData:data  withTimeout:-1 tag:0];

    //[sendBuffer release];



    return YES;

}

--
Christophe Leske
multimedial.de

----------------------------------------
www.multimedial.de - [email protected]
Hohler Strasse 17 - 51645 Gummersbach
+49(0)2261-99824540 // +49(0)177-2497031
----------------------------------------

_______________________________________________
Do not post admin requests to the list. They will be ignored.
Quartzcomposer-dev mailing list      ([email protected])
Help/Unsubscribe/Update your Subscription:
https://lists.apple.com/mailman/options/quartzcomposer-dev/archive%40mail-archive.com

This email sent to [email protected]
