I have the following Python script that generates a 32-bit float TIFF file and 
then reads that file back. I am chasing what appears to be an error when I 
write the file. Running the same script on an OS X build, I successfully read 
back the same data I wrote out, but when I run it on Ubuntu, the data read 
back is nowhere near correct — I see no trend resembling the data I wrote. 
Both machines are running libtiff 4.0.3. Would anyone know what accounts for 
this?
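For what it's worth, when float data comes back looking like noise on one platform but not another, the first thing I would rule out is a byte-order mix-up. A TIFF file declares its byte order in the first two bytes of the header ("II" for little-endian, "MM" for big-endian), so you can check what each platform actually wrote using only the stdlib. A minimal sketch (the helper name `tiff_byte_order` is mine, not part of any library):

```python
import struct

def tiff_byte_order(path_or_bytes):
    """Return 'little' or 'big' from a TIFF file's byte-order mark.

    Accepts a filename or the raw header bytes. A valid TIFF header is
    'II' or 'MM' followed by the number 42 in the matching byte order.
    """
    if isinstance(path_or_bytes, bytes):
        header = path_or_bytes[:4]
    else:
        with open(path_or_bytes, "rb") as f:
            header = f.read(4)
    if header[:2] == b"II" and struct.unpack("<H", header[2:4])[0] == 42:
        return "little"
    if header[:2] == b"MM" and struct.unpack(">H", header[2:4])[0] == 42:
        return "big"
    raise ValueError("not a TIFF header: %r" % (header,))
```

Running this against the file written on each machine would at least tell you whether the two platforms are producing headers that agree.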

import array
import os

import OpenImageIO as oiio

def test(path, resolution=1024, minValue=0.0, maxValue=1.0):

    # ImageOutput.create infers the file format from the path's extension
    ramp = oiio.ImageOutput.create(path)

    spec = oiio.ImageSpec()
    spec.set_format( oiio.FLOAT )
    spec.width = resolution
    spec.height = 1
    spec.nchannels = 3

    ramp.open(path, spec, oiio.Create)

    data = array.array("f", "\0" * (spec.width * spec.height * spec.nchannels * 4))
    for i in range(resolution):
        value = float(i)/(resolution-1) * (maxValue - minValue) + minValue
        print value 
        data[i*spec.nchannels +0] = value
        data[i*spec.nchannels +1] = value
        data[i*spec.nchannels +2] = value

    #WRITE THE IMAGE OUT
    ramp.write_image(spec.format, data)
    ramp.close()

    #READ THE IMAGE I JUST WROTE
    ramp = oiio.ImageInput.open( path )

    spec = ramp.spec()
    width = spec.width
    height = spec.height
    channels = spec.nchannels

    # Ask for 32-bit floats regardless of the file's native data type
    data = ramp.read_image(oiio.FLOAT)

    for value in data:
        print value
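Incidentally, the symptom described above — values with no resemblance at all to the ramp, rather than a constant offset or scale — is exactly what byte-swapped float32 data looks like, since the exponent bits end up in the wrong place. A quick stdlib demonstration (not part of the script, just an illustration of how badly a swap mangles a float):

```python
import struct

value = 0.5
# Pack as a little-endian float32, then misinterpret the same
# four bytes as a big-endian float32:
raw = struct.pack("<f", value)
swapped = struct.unpack(">f", raw)[0]
print("%r -> %r" % (value, swapped))
```

The result is a subnormal number around 1e-43 — garbage with no visible relationship to the input, much like what you are seeing on Ubuntu.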
_______________________________________________
Oiio-dev mailing list
[email protected]
http://lists.openimageio.org/listinfo.cgi/oiio-dev-openimageio.org