Sure. I will include the relevant tests for the summarizer in BAM.
jira: https://wso2.org/jira/browse/BAM-934

Thanks,
Reka
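
For reference, the parse failure quoted below ("mismatched input '$' expecting StringLiteral near '='") suggests the hiveconf substitution inside SERDEPROPERTIES reached the parser unexpanded: the grammar requires a string literal there, so a bare ${hiveconf:logs_column_family} fails whenever variable substitution does not run first. A hedged sketch of the commonly used form (untested against this BAM build) is to define the variable without embedded quotes and wrap the substitution in quotes:

```sql
-- Sketch only: define the variable without embedded quotes ...
set logs_column_family = log_1_AS_2012_10_14;

-- ... and quote the substitution so the parser always sees a StringLiteral,
-- whether or not substitution has already replaced the variable.
CREATE EXTERNAL TABLE IF NOT EXISTS LogStats (key STRING)
    STORED BY 'org.apache.hadoop.hive.cassandra.CassandraStorageHandler'
    WITH SERDEPROPERTIES (
        "cassandra.host" = "localhost",
        "cassandra.cf.name" = "${hiveconf:logs_column_family}"
    );
```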

On Tue, Oct 16, 2012 at 10:05 PM, Tharindu Mathew <thari...@wso2.com> wrote:

> Reka,
>
> Please include an integration test for BAM. This should have been detected
> much earlier.
>
> On Tue, Oct 16, 2012 at 9:32 AM, Reka Thirunavukkarasu <r...@wso2.com> wrote:
>
>> Thanks for looking into this issue.
>>
>> Reka
>>
>>
>> On Tue, Oct 16, 2012 at 9:35 PM, Kasun Weranga <kas...@wso2.com> wrote:
>>
>>> This issue has been fixed. Thanks Buddhika for providing the fix.
>>>
>>> On Tue, Oct 16, 2012 at 2:31 PM, Reka Thirunavukkarasu <r...@wso2.com> wrote:
>>>
>>>> Hi
>>>>
>>>> The following Hive query, which works fine in BAM 2.0.0, no longer works
>>>> in BAM 2.0.1. Could you please have a look at this?
>>>> jira: https://wso2.org/jira/browse/CARBON-13943
>>>>
>>>>     set logs_column_family = "log_1_AS_2012_10_14";
>>>>     set file_path =
>>>> /media/Entertainment/stratos/BAM_LOGS/archive-logs/tmpLogs/1/AS/2012_10_14;
>>>>     drop table LogStats;
>>>>     set mapred.output.compress=true;
>>>>     set hive.exec.compress.output=true;
>>>>     set
>>>> mapred.output.compression.codec=org.apache.hadoop.io.compress.GzipCodec;
>>>>     set io.compression.codecs=org.apache.hadoop.io.compress.GzipCodec;
>>>>
>>>>     CREATE EXTERNAL TABLE IF NOT EXISTS LogStats (key STRING,
>>>>         payload_tenantID STRING,payload_serverName STRING,
>>>>         payload_appName STRING,payload_message STRING,
>>>>         payload_stacktrace STRING,
>>>>         payload_logger STRING,
>>>>         payload_priority STRING,payload_logTime BIGINT)
>>>>         STORED BY
>>>> 'org.apache.hadoop.hive.cassandra.CassandraStorageHandler'
>>>>         WITH SERDEPROPERTIES ( "cassandra.host" = "localhost",
>>>>         "cassandra.port" ="9160","cassandra.ks.name" = "EVENT_KS",
>>>>         "cassandra.ks.username" = "admin","cassandra.ks.password"
>>>> ="admin",
>>>>         "cassandra.cf.name" = ${hiveconf:logs_column_family},
>>>>         "cassandra.columns.mapping" =
>>>>         ":key,payload_tenantID,
>>>>         payload_serverName,payload_appName,payload_message,
>>>>         payload_stacktrace,payload_logger,payload_priority,
>>>>         payload_logTime" );
>>>>     INSERT OVERWRITE LOCAL DIRECTORY 'file:///${hiveconf:file_path}'
>>>>         select
>>>>         concat('TID[',payload_tenantID, ']\t',
>>>>         'Server[',payload_serverName,']\t',
>>>>         'Application[',payload_appName,']\t',
>>>>         'Message[',payload_message,']\t',
>>>>         'Stacktrace ',payload_stacktrace,'\t',
>>>>         'Logger{',payload_logger,'}\t',
>>>>         'Priority[',payload_priority,']\t'),
>>>>         concat('LogTime[',
>>>>         (from_unixtime(cast(payload_logTime/1000 as BIGINT),'yyyy-MM-dd
>>>> HH:mm:ss.SSS' )),']\n') as LogTime from LogStats
>>>>         ORDER BY LogTime;
>>>>
>>>> The exception trace is as follows:
>>>>
>>>> Hive history
>>>> file=/tmp/reka/hive_job_log_reka_201210161416_1379462536.txt
>>>> Hive history file=/tmp/reka/hive_job_log_reka_201210161416_844839482.txt
>>>> [2012-10-16 14:16:03,180] ERROR {hive.ql.metadata.Hive} -
>>>> NoSuchObjectException(message:default.LogStats table not found)
>>>>     at
>>>> org.apache.hadoop.hive.metastore.MultitenantMetaStoreHandler$16.run(MultitenantMetaStoreHandler.java:1194)
>>>>     at
>>>> org.apache.hadoop.hive.metastore.MultitenantMetaStoreHandler$16.run(MultitenantMetaStoreHandler.java:1189)
>>>>     at
>>>> org.apache.hadoop.hive.metastore.MultitenantMetaStoreHandler.executeWithRetry(MultitenantMetaStoreHandler.java:324)
>>>>     at
>>>> org.apache.hadoop.hive.metastore.MultitenantMetaStoreHandler.get_table(MultitenantMetaStoreHandler.java:1189)
>>>>     at
>>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:735)
>>>>     at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:901)
>>>>     at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:843)
>>>>     at
>>>> org.apache.hadoop.hive.ql.exec.DDLTask.dropTable(DDLTask.java:3127)
>>>>     at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:250)
>>>>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:133)
>>>>     at
>>>> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
>>>>     at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1334)
>>>>     at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1125)
>>>>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:933)
>>>>     at
>>>> org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.execute(HiveServer.java:201)
>>>>     at
>>>> org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:187)
>>>>     at
>>>> org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.call(HiveExecutorServiceImpl.java:324)
>>>>     at
>>>> org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.call(HiveExecutorServiceImpl.java:225)
>>>>     at
>>>> java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>>>>     at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>>>>     at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>>>>     at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>>>>     at java.lang.Thread.run(Thread.java:662)
>>>>
>>>> OK
>>>> FAILED: Parse Error: line 1:536 mismatched input '$' expecting
>>>> StringLiteral near '=' in specifying key/value property
>>>>
>>>> [2012-10-16 14:16:03,186] ERROR {org.apache.hadoop.hive.ql.Driver} -
>>>> FAILED: Parse Error: line 1:536 mismatched input '$' expecting
>>>> StringLiteral near '=' in specifying key/value property
>>>>
>>>> org.apache.hadoop.hive.ql.parse.ParseException: line 1:536 mismatched
>>>> input '$' expecting StringLiteral near '=' in specifying key/value property
>>>>
>>>>     at
>>>> org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:438)
>>>>     at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:419)
>>>>     at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:339)
>>>>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:891)
>>>>     at
>>>> org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.execute(HiveServer.java:201)
>>>>     at
>>>> org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:187)
>>>>     at
>>>> org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.call(HiveExecutorServiceImpl.java:324)
>>>>     at
>>>> org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.call(HiveExecutorServiceImpl.java:225)
>>>>     at
>>>> java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>>>>     at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>>>>     at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>>>>     at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>>>>     at java.lang.Thread.run(Thread.java:662)
>>>>
>>>> [2012-10-16 14:16:03,186] ERROR
>>>> {org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl} -  Error
>>>> while executing Hive script.
>>>> Query returned non-zero code: 11, cause: FAILED: Parse Error: line
>>>> 1:536 mismatched input '$' expecting StringLiteral near '=' in specifying
>>>> key/value property
>>>>
>>>> java.sql.SQLException: Query returned non-zero code: 11, cause: FAILED:
>>>> Parse Error: line 1:536 mismatched input '$' expecting StringLiteral near
>>>> '=' in specifying key/value property
>>>>
>>>>     at
>>>> org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:189)
>>>>     at
>>>> org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.call(HiveExecutorServiceImpl.java:324)
>>>>     at
>>>> org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.call(HiveExecutorServiceImpl.java:225)
>>>>     at
>>>> java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>>>>     at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>>>>     at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>>>>     at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>>>>     at java.lang.Thread.run(Thread.java:662)
>>>> [2012-10-16 14:16:03,187] ERROR
>>>> {org.wso2.carbon.analytics.hive.task.HiveScriptExecutorTask} -  Error while
>>>> executing script : test
>>>> org.wso2.carbon.analytics.hive.exception.HiveExecutionException: Error
>>>> while executing Hive script.Query returned non-zero code: 11, cause:
>>>> FAILED: Parse Error: line 1:536 mismatched input '$' expecting
>>>> StringLiteral near '=' in specifying key/value property
>>>>
>>>>     at
>>>> org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl.execute(HiveExecutorServiceImpl.java:110)
>>>>     at
>>>> org.wso2.carbon.analytics.hive.task.HiveScriptExecutorTask.execute(HiveScriptExecutorTask.java:60)
>>>>     at
>>>> org.wso2.carbon.ntask.core.impl.TaskQuartzJobAdapter.execute(TaskQuartzJobAdapter.java:56)
>>>>     at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
>>>>     at
>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
>>>>     at
>>>> java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>>>>     at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>>>>     at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>>>>     at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>>>>     at java.lang.Thread.run(Thread.java:662)
>>>> [2012-10-16 14:16:03,203] ERROR {hive.ql.metadata.Hive} -
>>>> NoSuchObjectException(message:default.LogStats table not found)
>>>>     at
>>>> org.apache.hadoop.hive.metastore.MultitenantMetaStoreHandler$16.run(MultitenantMetaStoreHandler.java:1194)
>>>>     at
>>>> org.apache.hadoop.hive.metastore.MultitenantMetaStoreHandler$16.run(MultitenantMetaStoreHandler.java:1189)
>>>>     at
>>>> org.apache.hadoop.hive.metastore.MultitenantMetaStoreHandler.executeWithRetry(MultitenantMetaStoreHandler.java:324)
>>>>     at
>>>> org.apache.hadoop.hive.metastore.MultitenantMetaStoreHandler.get_table(MultitenantM
>>>>
>>>> Thanks,
>>>> Reka
>>>>
>>>
>>>
>>>
>>> --
>>> *Kasun Weranga*
>>> Software Engineer
>>> *WSO2, Inc.*
>>> lean.enterprise.middleware
>>> mobile : +94 772314602
>>> blog : http://kasunweranga.blogspot.com/
>>>
>>
>>
>> _______________________________________________
>> Dev mailing list
>> Dev@wso2.org
>> http://wso2.org/cgi-bin/mailman/listinfo/dev
>>
>>
>
>
> --
> Regards,
>
> Tharindu
>
> blog: http://mackiemathew.com/
> M: +94777759908
>
>
