Hi All,

Thanks for the prompt responses; we will do the needful.

Regards,
Malith

On Wed, Sep 21, 2016 at 2:54 PM, Rukshan Premathunga <ruks...@wso2.com>
wrote:

> Hi Malith,
>
> The CApp we ship with APIM Analytics will not work on a plain DAS because
> of the changes introduced in DAS 3.1.0. Because of that, you either need
> to use APIM Analytics or update the CApp with the above changes.
>
> Thanks and Regards.
>
> On Wed, Sep 21, 2016 at 2:48 PM, Niranda Perera <nira...@wso2.com> wrote:
>
>> Hi Malith,
>>
>> Yes, correct! You need to change the script.
>>
>> Additionally, there are some changes in the carbonJdbc connector as
>> well, so you might need to watch out for those! See the sketch below.
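>>
>> As a rough, untested sketch of the change: this is the failing table
>> definition from your log with only the option name changed per [2],
>> assuming incrementalParams takes the same "table, time-unit" value:
>>
>> CREATE TEMPORARY TABLE APIMGT_PERMINUTE_REQUEST_DATA USING CarbonAnalytics OPTIONS(
>>   tableName "ORG_WSO2_APIMGT_STATISTICS_PERMINUTEREQUEST",
>>   schema "year INT -i, month INT -i, day INT -i, hour INT -i, minute INT -i, consumerKey STRING, context STRING, api_version STRING, api STRING, version STRING, requestTime LONG, userId STRING, hostName STRING, apiPublisher STRING, total_request_count LONG, resourceTemplate STRING, method STRING, applicationName STRING, tenantDomain STRING, userAgent STRING, resourcePath STRING, request INT, applicationId STRING, tier STRING, throttledOut BOOLEAN, clientIp STRING, applicationOwner STRING, _timestamp LONG -i",
>>   primaryKeys "year, month, day, hour, minute, consumerKey, context, api_version, userId, hostName, apiPublisher, resourceTemplate, method, userAgent, clientIp",
>>   incrementalParams "APIMGT_PERMINUTE_REQUEST_DATA, HOUR",
>>   mergeSchema "false")
>>
>> The same rename would apply to every table definition in the CApp
>> scripts that sets this option.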
>>
>> Please check with the APIM and ESB teams whether we are doing a
>> feature release with the DAS 3.1.0 changes.
>>
>> cheers
>>
>> On Wed, Sep 21, 2016 at 5:11 AM, Malith Munasinghe <mali...@wso2.com>
>> wrote:
>>
>>> Hi All,
>>>
>>> While preparing a DAS 3.1.0 pack to run APIM Analytics, I added the
>>> features as in [1]
>>> <https://docs.wso2.com/display/AM200/Installing+WSO2+APIM+Analytics+Features>.
>>> After deploying the CApp for APIM Analytics, I ran into the error below.
>>> According to the error, *incrementalProcessing* is not a valid option,
>>> and according to [2]
>>> <https://docs.wso2.com/display/DAS310/Incremental+Processing> the correct
>>> name for this option is *incrementalParams*. To get DAS 3.1.0 to process
>>> APIM Analytics, do we have to update the scripts to use this option as well?
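>>>
>>> For quick reference, the option as the script currently defines it
>>> (taken from the failing query in the log below) is:
>>>
>>> incrementalProcessing "APIMGT_PERMINUTE_REQUEST_DATA, HOUR"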
>>>
>>>
>>> TID: [-1234] [] [2016-09-21 08:54:00,019] ERROR {org.wso2.carbon.analytics.spark.core.CarbonAnalyticsProcessorService} - Error while executing query:
>>> CREATE TEMPORARY TABLE APIMGT_PERMINUTE_REQUEST_DATA USING CarbonAnalytics OPTIONS(
>>>   tableName "ORG_WSO2_APIMGT_STATISTICS_PERMINUTEREQUEST",
>>>   schema "year INT -i, month INT -i, day INT -i, hour INT -i, minute INT -i, consumerKey STRING, context STRING, api_version STRING, api STRING, version STRING, requestTime LONG, userId STRING, hostName STRING, apiPublisher STRING, total_request_count LONG, resourceTemplate STRING, method STRING, applicationName STRING, tenantDomain STRING, userAgent STRING, resourcePath STRING, request INT, applicationId STRING, tier STRING, throttledOut BOOLEAN, clientIp STRING, applicationOwner STRING, _timestamp LONG -i",
>>>   primaryKeys "year, month, day, hour, minute, consumerKey, context, api_version, userId, hostName, apiPublisher, resourceTemplate, method, userAgent, clientIp",
>>>   incrementalProcessing "APIMGT_PERMINUTE_REQUEST_DATA, HOUR",
>>>   mergeSchema "false") {org.wso2.carbon.analytics.spark.core.CarbonAnalyticsProcessorService}
>>> org.wso2.carbon.analytics.spark.core.exception.AnalyticsExecutionException: Exception in executing query CREATE TEMPORARY TABLE APIMGT_PERMINUTE_REQUEST_DATA ... [same query as above]
>>> at org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor.executeQueryLocal(SparkAnalyticsExecutor.java:764)
>>> at org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor.executeQuery(SparkAnalyticsExecutor.java:721)
>>> at org.wso2.carbon.analytics.spark.core.CarbonAnalyticsProcessorService.executeQuery(CarbonAnalyticsProcessorService.java:201)
>>> at org.wso2.carbon.analytics.spark.core.CarbonAnalyticsProcessorService.executeScript(CarbonAnalyticsProcessorService.java:151)
>>> at org.wso2.carbon.analytics.spark.core.AnalyticsTask.execute(AnalyticsTask.java:60)
>>> at org.wso2.carbon.ntask.core.impl.TaskQuartzJobAdapter.execute(TaskQuartzJobAdapter.java:67)
>>> at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
>>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at java.lang.Thread.run(Thread.java:745)
>>> Caused by: java.lang.RuntimeException: Unknown options : incrementalprocessing
>>> at org.wso2.carbon.analytics.spark.core.sources.AnalyticsRelationProvider.checkParameters(AnalyticsRelationProvider.java:123)
>>> at org.wso2.carbon.analytics.spark.core.sources.AnalyticsRelationProvider.setParameters(AnalyticsRelationProvider.java:113)
>>> at org.wso2.carbon.analytics.spark.core.sources.AnalyticsRelationProvider.createRelation(AnalyticsRelationProvider.java:75)
>>> at org.wso2.carbon.analytics.spark.core.sources.AnalyticsRelationProvider.createRelation(AnalyticsRelationProvider.java:45)
>>> at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:158)
>>> at org.apache.spark.sql.execution.datasources.CreateTempTableUsing.run(ddl.scala:92)
>>> at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:58)
>>> at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:56)
>>> at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:70)
>>> at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
>>> at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
>>> at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
>>> at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
>>> at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:55)
>>> at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:55)
>>> at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:145)
>>> at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:130)
>>> at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:52)
>>> at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:817)
>>> at org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor.executeQueryLocal(SparkAnalyticsExecutor.java:760)
>>> ... 11 more
>>> TID: [-1234] [] [2016-09-21 08:54:00,020] ERROR {org.wso2.carbon.analytics.spark.core.AnalyticsTask} - Error while executing the scheduled task for the script: APIM_INCREMENTAL_PROCESSING_SCRIPT {org.wso2.carbon.analytics.spark.core.AnalyticsTask}
>>> [same AnalyticsExecutionException and stack trace as above]
>>>
>>>
>>> [1] https://docs.wso2.com/display/AM200/Installing+WSO2+APIM+Analytics+Features
>>> [2] https://docs.wso2.com/display/DAS310/Incremental+Processing
>>>
>>> Regards,
>>> Malith
>>> --
>>> Malith Munasinghe | Software Engineer
>>> M: +94 (71) 9401122
>>> E: mali...@wso2.com
>>> W: http://wso2.com
>>>
>>
>>
>>
>> --
>> *Niranda Perera*
>> Software Engineer, WSO2 Inc.
>> Mobile: +94-71-554-8430
>> Twitter: @n1r44 <https://twitter.com/N1R44>
>> https://pythagoreanscript.wordpress.com/
>>
>
>
>
> --
> Rukshan Chathuranga.
> Software Engineer.
> WSO2, Inc.
>



-- 
Malith Munasinghe | Software Engineer
M: +94 (71) 9401122
E: mali...@wso2.com
W: http://wso2.com
_______________________________________________
Dev mailing list
Dev@wso2.org
http://wso2.org/cgi-bin/mailman/listinfo/dev
