Frank Bhattacharyya wrote:
BTW: we had some trouble integrating the h264 encoder into the codec
server, and the TI codec FAQ ( http://wiki.davincidsp.com/index.php/FAQ )
indicates that
H264ENC.alg.dataSection = "DDR";
H264ENC.alg.udataSection = "DDR";
should be set in the server cfg file, but IMO it should be
H264ENC.alg.dataSection = "DDR";
H264ENC.alg.udataSection = "DDR";
H264ENC.alg.codeSection = "DDR";
and more than that: shouldn’t it be DDR2? I think my knowledge of that
huge TI universe is far too limited ;) .
That depends on what you named that section in your server tconf. For
example:
{
    comment: "DDR2: off-chip memory for code and data",
    name:    "DDR2",
    base:    0x8FA00000,
    len:     0x00400000,
    space:   "code/data"
},
/* H264DEC codec configuration */
H264DEC.alg.watermark = false;
H264DEC.alg.codeSection = "DDR2";
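Applied to the encoder question above, the three placement directives
from the FAQ would then all name that same segment. A sketch, assuming
your memory map calls the segment "DDR2" as in the tconf fragment:

```
/* H264ENC codec configuration: place code and data off-chip.
   The section names must match the segment name in the tconf
   memory map -- "DDR2" here, not "DDR". */
H264ENC.alg.dataSection  = "DDR2";
H264ENC.alg.udataSection = "DDR2";
H264ENC.alg.codeSection  = "DDR2";
```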
Calling
// setup
imgenc_params.size = sizeof(IIMGENC1_Params);
imgenc_params.maxHeight = 480;
imgenc_params.maxWidth = 720;
imgenc_params.maxScans = 3;
imgenc_params.dataEndianness = XDM_BYTE;
imgenc_params.forceChromaFormat = XDM_YUV_422ILE;
// call
iencHandle = IMGENC1_create(codecengine, encoderName,
&imgenc_params );
always fails (returns NULL). If I instead use
iencHandle = IMGENC1_create(codecengine, encoderName, NULL);
a non-NULL handle is returned, but every subsequent
IMGENC1_control/IMGENC1_process call fails.
I have attached a trace log of a sample run (CE_DEBUG=2, DSKT2
trace enabled).
XDM_YUV_422ILE is not listed as a supported output format for that codec
(p. 37 of the user guide/datasheet). It also states that maxScans should
be set to XDM_DEFAULT for that codec. I would certainly try playing
around with those params. I have not used any of the encode codecs;
however, I do recall that some of the params required by the MPEG4 and
H264 decode codecs were not clearly documented as required.
_______________________________________________
Davinci-linux-open-source mailing list
Davinci-linux-open-source@linux.davincidsp.com
http://linux.davincidsp.com/mailman/listinfo/davinci-linux-open-source