On Wed, Feb 12, 2014 at 11:20 AM, Sinthuja Ragendran <[email protected]> wrote:

> Hi Inosh,
>
> It's a bug; I came across this issue recently and have already created the
> JIRA [1]. Since the branch was frozen at the time, I couldn't commit the fix
> immediately, but I committed it yesterday in r196670.
>

That's great.

>
> [1] https://wso2.org/jira/browse/BAM-1430
>
> Thanks,
> Sinthuja.
>
>
> On Wed, Feb 12, 2014 at 11:12 AM, Inosh Goonewardena <[email protected]> wrote:
>
>> Hi Sinthuja,
>>
>>
>> On Tue, Feb 11, 2014 at 7:18 PM, Sinthuja Ragendran <[email protected]> wrote:
>>
>>> Hi Aruna,
>>>
>>>
>>>
>>> On Tue, Feb 11, 2014 at 6:47 PM, Aruna Karunarathna <[email protected]> wrote:
>>>
>>>> Hi all,
>>>>
>>>> I am trying to execute a hive script to extract data from BAM to MySQL
>>>> database. Following is the hive script I am using.
>>>>
>>>>
>>>> drop table BambooEmmaTable;
>>>>
>>>> drop table SummarizedBambooEmmaTable;
>>>>
>>>> CREATE EXTERNAL TABLE IF NOT EXISTS BambooEmmaTable (key STRING,
>>>> payload_reportID STRING ,payload_packageName STRING,
>>>> payload_numberOfPackages STRING , payload_numberOfClasses STRING,
>>>> payload_numberOfMethods STRING , payload_numberOfBlocks STRING ,
>>>> payload_numberOfLines    STRING,
>>>> payload_numberOfFiles STRING, payload_allClassCoverage STRING,
>>>> payload_allMethodCoverage STRING , payload_allBlockCoverage STRING,
>>>> payload_allLineCoverage     STRING,
>>>>  payload_classCoverage STRING, payload_methodCoverage STRING,
>>>> payload_blockCoverage STRING, payload_lineCoverage STRING) STORED BY
>>>> 'org.apache.hadoop.hive.cassandra.CassandraStorageHandler' WITH
>>>> SERDEPROPERTIES (
>>>> "wso2.carbon.datasource.name" = "WSO2BAM_CASSANDRA_DATASOURCE",
>>>> "cassandra.cf.name" = "Bamboo_Emma_Stream",
>>>> "cassandra.columns.mapping" = ":key,payload_reportID ,
>>>> payload_packageName  , payload_numberOfPackages   ,
>>>> payload_numberOfClasses  , payload_numberOfMethods   ,
>>>> payload_numberOfBlocks   , payload_numberOfLines ,
>>>> payload_numberOfFiles  , payload_allClassCoverage  ,
>>>> payload_allMethodCoverage   , payload_allBlockCoverage  ,
>>>> payload_allLineCoverage       ,
>>>> payload_classCoverage  , payload_methodCoverage  ,
>>>> payload_blockCoverage  , payload_lineCoverage" );
>>>>
>>>>
>>>> CREATE EXTERNAL TABLE IF NOT EXISTS
>>>>
>>>> SummarizedBambooEmmaTable( payload_reportID STRING ,
>>>> payload_packageName STRING, payload_numberOfPackages STRING ,
>>>> payload_numberOfClasses STRING,
>>>> payload_numberOfMethods STRING , payload_numberOfBlocks STRING ,
>>>> payload_numberOfLines    STRING,
>>>> payload_numberOfFiles STRING, payload_allClassCoverage STRING,
>>>> payload_allMethodCoverage STRING , payload_allBlockCoverage STRING,
>>>> payload_allLineCoverage     STRING,
>>>>  payload_classCoverage STRING, payload_methodCoverage STRING,
>>>> payload_blockCoverage STRING, payload_lineCoverage STRING )
>>>>
>>>> STORED BY 'org.wso2.carbon.hadoop.hive.jdbc.storage.JDBCStorageHandler'
>>>>
>>>>     TBLPROPERTIES (
>>>>
>>>>                 'mapred.jdbc.driver.class' = 'com.mysql.jdbc.Driver',
>>>>
>>>>                 'mapred.jdbc.url' = 'jdbc:mysql://localhost/greg',
>>>>
>>>>                 'mapred.jdbc.username' = 'root',
>>>>
>>>>                 'mapred.jdbc.password' = '123',
>>>>
>>>>                 'hive.jdbc.update.on.duplicate'= 'true',
>>>>
>>>>                 'hive.jdbc.primary.key.fields'='payload_reportID ,
>>>> payload_packageName',
>>>>
>>>>                 'hive.jdbc.table.create.query' = 'CREATE TABLE
>>>> BambooEmmaSummary ( payload_reportID varchar(50) , payload_packageName
>>>> varchar(200)  ,
>>>>                  payload_numberOfPackages varchar(50)  ,
>>>> payload_numberOfClasses varchar(50) ,
>>>> payload_numberOfMethods varchar(50)  , payload_numberOfBlocks
>>>> varchar(50)  , payload_numberOfLines    varchar(50) ,
>>>> payload_numberOfFiles varchar(50) , payload_allClassCoverage
>>>> varchar(50) ,  payload_allMethodCoverage varchar(50)  ,
>>>> payload_allBlockCoverage varchar(50) ,
>>>> payload_allLineCoverage     varchar(50) ,   payload_classCoverage
>>>> varchar(50) , payload_methodCoverage varchar(50) , payload_blockCoverage
>>>> varchar(50) ,
>>>> payload_lineCoverage varchar(50), PRIMARY KEY(payload_reportID,
>>>> payload_packageName ) )');
>>>>
>>>>
>>>> insert overwrite table SummarizedBambooEmmaTable SELECT
>>>> payload_reportID   ,  payload_packageName , payload_numberOfPackages   ,
>>>> payload_numberOfClasses  , payload_numberOfMethods   ,
>>>> payload_numberOfBlocks
>>>> , payload_numberOfLines , payload_numberOfFiles  ,
>>>> payload_allClassCoverage  ,  payload_allMethodCoverage   ,
>>>> payload_allBlockCoverage  , payload_allLineCoverage      ,
>>>> payload_classCoverage  , payload_methodCoverage  ,
>>>> payload_blockCoverage  , payload_lineCoverage  FROM BambooEmmaTable
>>>>
>>>>
>>>>
>>>> However, I am getting the following exception when I run the Hive
>>>> script.
>>>>
>>>> [2014-02-11 18:37:33,279] ERROR
>>>> {org.wso2.carbon.hadoop.hive.jdbc.storage.db.DBOperation} -  Failed to
>>>> write data to database
>>>> java.sql.SQLException: No value specified for parameter 2
>>>>     at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1086)
>>>>     at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:989)
>>>>     at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:975)
>>>>     at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:920)
>>>>     at
>>>> com.mysql.jdbc.PreparedStatement.checkAllParametersSet(PreparedStatement.java:2594)
>>>>     at
>>>> com.mysql.jdbc.PreparedStatement.fillSendPacket(PreparedStatement.java:2569)
>>>>     at
>>>> com.mysql.jdbc.PreparedStatement.fillSendPacket(PreparedStatement.java:2493)
>>>>     at
>>>> com.mysql.jdbc.PreparedStatement.executeQuery(PreparedStatement.java:2260)
>>>>     at
>>>> org.wso2.carbon.hadoop.hive.jdbc.storage.db.DBOperation.isRowExisting(DBOperation.java:149)
>>>>     at
>>>> org.wso2.carbon.hadoop.hive.jdbc.storage.db.DBOperation.writeToDB(DBOperation.java:59)
>>>>     at
>>>> org.wso2.carbon.hadoop.hive.jdbc.storage.db.DBRecordWriter.write(DBRecordWriter.java:35)
>>>>     at
>>>> org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:589)
>>>>     at
>>>> org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:471)
>>>>     at
>>>> org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:762)
>>>>     at
>>>> org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:84)
>>>>     at
>>>> org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:471)
>>>>     at
>>>> org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:762)
>>>>     at
>>>> org.apache.hadoop.hive.ql.exec.TableScanOperator.processOp(TableScanOperator.java:83)
>>>>     at
>>>> org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:471)
>>>>     at
>>>> org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:762)
>>>>     at
>>>> org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:531)
>>>>     at
>>>> org.apache.hadoop.hive.ql.exec.ExecMapper.map(ExecMapper.java:143)
>>>>     at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
>>>>     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:435)
>>>>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:371)
>>>>     at
>>>> org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:211)
>>>>
>>>>
>>>> However, if I change 'hive.jdbc.primary.key.fields'='payload_reportID
>>>> , payload_packageName' to a single key value, the script seems to work
>>>> fine.
>>>>
>>>
>>> Remove the spaces between the primary key fields and try again, i.e.,
>>> 'hive.jdbc.primary.key.fields'='payload_reportID,payload_packageName'.
>>>
>>
>> I think we need to fix this (or at least document it). Otherwise someone
>> else will run into the same thing again. The current docs also show spaces
>> between the fields. WDYT?
>>
>>
>>> Note that you need to drop the Hive JDBC table (SummarizedBambooEmmaTable
>>> above) for the change to take effect during the next execution.
>>>
>>> Thanks,
>>> Sinthuja.
>>>
>>>>
>>>> Any ideas on what I'm doing wrong here? Any help or suggestions would be
>>>> highly appreciated.
>>>>
>>>>
>>>>
>>>> Regards,
>>>> --
>>>>
>>>> * Aruna Sujith Karunarathna* | Software Engineer
>>>> WSO2, Inc | lean. enterprise. middleware.
>>>> #20, Palm Grove, Colombo 03, Sri Lanka
>>>> Mobile: +94 71 9040362 | Work: +94 112145345
>>>> Email: [email protected] | Web: www.wso2.com
>>>>
>>>>
>>>> _______________________________________________
>>>> Dev mailing list
>>>> [email protected]
>>>> http://wso2.org/cgi-bin/mailman/listinfo/dev
>>>>
>>>>
>>>
>>>
>>> --
>>> *Sinthuja Rajendran*
>>> Software Engineer <http://wso2.com/>
>>> WSO2, Inc.:http://wso2.com
>>>
>>> Blog: http://sinthu-rajan.blogspot.com/
>>> Mobile: +94774273955
>>>
>>>
>>>
>>>
>>>
>>
>>
>> --
>> Regards,
>>
>> Inosh Goonewardena
>> Associate Technical Lead- WSO2 Inc.
>> Mobile: +94779966317
>>
>
>
>
>
>
>
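On the root cause: I haven't checked the storage handler source, but the
symptom ("No value specified for parameter 2" from DBOperation.isRowExisting)
is consistent with the property value being split on commas without trimming
whitespace, so the second field name never matches a column and its JDBC
parameter is left unbound. A rough sketch of that suspected behaviour (the
helper below is hypothetical, not the actual handler code):

```python
# Hypothetical illustration of the suspected parsing bug; the real logic
# lives in org.wso2.carbon.hadoop.hive.jdbc.storage.DBOperation.
row = {"payload_reportID": "r1", "payload_packageName": "pkg"}

def bind_primary_keys(prop_value, row):
    """Split hive.jdbc.primary.key.fields on commas without trimming,
    and bind only the fields whose names match a column exactly."""
    bound = []
    for field in prop_value.split(","):
        # Without .strip(), " payload_packageName" (leading space) is not
        # a key of `row`, so the second JDBC parameter never gets a value.
        if field in row:
            bound.append(row[field])
    return bound

# With a space after the comma, only the first parameter is bound:
print(bind_primary_keys("payload_reportID, payload_packageName", row))  # ['r1']
# Without spaces, both parameters are bound:
print(bind_primary_keys("payload_reportID,payload_packageName", row))   # ['r1', 'pkg']
```

If that is indeed what happens, a .strip() on each field name in the handler
would make both the spaced and unspaced forms work.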


