Hi Inshaf,

This happens when the JAVA_HOME environment variable has not been set to
point to your installed JDK/JRE location. Hive's MapReduce task launches the
child JVM as $JAVA_HOME/bin/java, which is why the command in your log shows
up as "null/bin/java" when JAVA_HOME is unset. It needs to be set explicitly
in your environment. Refer to [1].

[1]
https://docs.wso2.com/display/BAM241/FAQ#FAQ-Iseeanexceptionstating-java.io.IOException:Cannotrunprogram"null/bin/java"whenrunningBAM?Whatisgoingwrong?

On Fri, Nov 28, 2014 at 6:30 PM, Lakshman Udayakantha <lakshm...@wso2.com>
wrote:

> Hi Inshaf,
>
> Did you set JAVA_HOME on your machine by adding the following lines to
> .bashrc or .bash_profile?
>
> JAVA_HOME=/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home
>
> export JAVA_HOME
>
> Thanks
>
> On Fri, Nov 28, 2014 at 6:05 PM, Inshaf Mahath <ins...@wso2.com> wrote:
>
>> Hi Lakshman,
>>
>> I tried using both BAM 2.4.0 and 2.4.1, and I got the same error on both
>> occasions.
>>
>> On Fri, Nov 28, 2014 at 5:39 PM, Lakshman Udayakantha <lakshm...@wso2.com>
>> wrote:
>>
>>> Hi Inshaf,
>>>
>>> You are using BAM 2.4.1, but you are referring to the BAM 2.4.0
>>> documentation, which is outdated. Please try the correct documentation [1].
>>>
>>> [1] https://docs.wso2.com/display/BAM241/Analysing+HTTPD+Logs
>>>
>>> On Fri, Nov 28, 2014 at 5:30 PM, Inshaf Mahath <ins...@wso2.com> wrote:
>>>
>>>> Hi,
>>>>
>>>> I am trying the 'Analysing HTTPD Logs' sample [1] from the BAM
>>>> documentation. After publishing the events to the Thrift port, they were
>>>> successfully stored in the Cassandra database, which I can view using
>>>> Cassandra Explorer. But I am getting the following error after installing
>>>> the 'HTTPD Logs Analysis Toolbox' on the BAM console.
>>>>
>>>>
>>>> Hive history
>>>> file=/Users/inshaf/Documents/Assignments/wso2bam-2.4.1/tmp/hive/wso2-querylogs/hive_job_log_root_201411281123_1016944917.txt
>>>> 2014-11-28 11:23:01.995 java[6291:57e17] Unable to load realm info from
>>>> SCDynamicStore
>>>> OK
>>>> OK
>>>> Total MapReduce jobs = 1
>>>> Launching Job 1 out of 1
>>>> Number of reduce tasks not specified. Estimated from input data size: 1
>>>> In order to change the average load for a reducer (in bytes):
>>>>   set hive.exec.reducers.bytes.per.reducer=<number>
>>>> In order to limit the maximum number of reducers:
>>>>   set hive.exec.reducers.max=<number>
>>>> In order to set a constant number of reducers:
>>>>   set mapred.reduce.tasks=<number>
>>>> java.io.IOException: Cannot run program "null/bin/java" (in directory
>>>> "/Users/inshaf/server/wso2bam-2.4.1"): error=2, No such file or directory
>>>>     at
>>>> java.lang.ProcessBuilder.processException(ProcessBuilder.java:478)
>>>>     at java.lang.ProcessBuilder.start(ProcessBuilder.java:457)
>>>>     at java.lang.Runtime.exec(Runtime.java:593)
>>>>     at java.lang.Runtime.exec(Runtime.java:431)
>>>>     at
>>>> org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:317)
>>>>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:129)
>>>>     at
>>>> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:62)
>>>>     at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1351)
>>>>     at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1126)
>>>>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:934)
>>>>     at
>>>> org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.execute(HiveServer.java:201)
>>>>     at
>>>> org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:187)
>>>>     at
>>>> org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.executeHiveQuery(HiveExecutorServiceImpl.java:577)
>>>>     at
>>>> org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.call(HiveExecutorServiceImpl.java:287)
>>>>     at
>>>> org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.call(HiveExecutorServiceImpl.java:190)
>>>>     at
>>>> java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>>>>     at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>>>>     at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
>>>>     at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
>>>>     at java.lang.Thread.run(Thread.java:695)
>>>> Caused by: java.io.IOException: error=2, No such file or directory
>>>>     at java.lang.UNIXProcess.forkAndExec(Native Method)
>>>>     at java.lang.UNIXProcess.<init>(UNIXProcess.java:53)
>>>>     at java.lang.ProcessImpl.start(ProcessImpl.java:91)
>>>>     at java.lang.ProcessBuilder.start(ProcessBuilder.java:452)
>>>>     ... 18 more
>>>> [2014-11-28 11:23:03,050] ERROR
>>>> {org.apache.hadoop.hive.ql.exec.ExecDriver} -  Exception: Cannot run
>>>> program "null/bin/java" (in directory "
>>>> /Users/inshaf/server/wso2bam-2.4.1"): error=2, No such file or
>>>> directory
>>>> FAILED: Execution Error, return code 1 from
>>>> org.apache.hadoop.hive.ql.exec.MapRedTask
>>>> [2014-11-28 11:23:03,051] ERROR {org.apache.hadoop.hive.ql.Driver} -
>>>> FAILED: Execution Error, return code 1 from
>>>> org.apache.hadoop.hive.ql.exec.MapRedTask
>>>> [2014-11-28 11:23:03,057] ERROR
>>>> {org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl} -  Error
>>>> while executing Hive script.
>>>> Query returned non-zero code: 9, cause: FAILED: Execution Error, return
>>>> code 1 from org.apache.hadoop.hive.ql.exec.MapRedTask
>>>> java.sql.SQLException: Query returned non-zero code: 9, cause: FAILED:
>>>> Execution Error, return code 1 from
>>>> org.apache.hadoop.hive.ql.exec.MapRedTask
>>>>     at
>>>> org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:189)
>>>>     at
>>>> org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.executeHiveQuery(HiveExecutorServiceImpl.java:577)
>>>>     at
>>>> org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.call(HiveExecutorServiceImpl.java:287)
>>>>     at
>>>> org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.call(HiveExecutorServiceImpl.java:190)
>>>>     at
>>>> java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>>>>     at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>>>>     at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
>>>>     at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
>>>>     at java.lang.Thread.run(Thread.java:695)
>>>> [2014-11-28 11:23:03,058] ERROR
>>>> {org.wso2.carbon.analytics.hive.task.HiveScriptExecutorTask} -  Error while
>>>> executing script : phone_retail_store_script
>>>> org.wso2.carbon.analytics.hive.exception.HiveExecutionException: Error
>>>> while executing Hive script.Query returned non-zero code: 9, cause: FAILED:
>>>> Execution Error, return code 1 from
>>>> org.apache.hadoop.hive.ql.exec.MapRedTask
>>>>     at
>>>> org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl.execute(HiveExecutorServiceImpl.java:116)
>>>>     at
>>>> org.wso2.carbon.analytics.hive.task.HiveScriptExecutorTask.execute(HiveScriptExecutorTask.java:70)
>>>>     at
>>>> org.wso2.carbon.ntask.core.impl.TaskQuartzJobAdapter.execute(TaskQuartzJobAdapter.java:67)
>>>>     at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
>>>>     at
>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
>>>>     at
>>>> java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>>>>     at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>>>>     at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
>>>>     at
>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
>>>>     at java.lang.Thread.run(Thread.java:695)
>>>>
>>>> I tried executing httpd_logs_script manually from the Management Console
>>>> and got the following error:
>>>>
>>>> ERROR: Error while executing Hive script.Query returned non-zero
>>>> code: 9, cause: FAILED: Execution Error, return code 1 from
>>>> org.apache.hadoop.hive.ql.exec.MapRedTask
>>>>
>>>>
>>>> Does anyone have an idea what the reason for this error could be? Any
>>>> solution would be appreciated.
>>>>
>>>>
>>>> [1] https://docs.wso2.com/display/BAM240/Analysing+HTTPD+Logs
>>>>
>>>> Thanks and Best Regards,
>>>>
>>>> --
>>>> Inshaf Mahath
>>>> Associate Software Engineer
>>>> Mobile: +94775907181
>>>> WSO2 Inc.
>>>> Lean . Enterprise . Middleware
>>>>
>>>>
>>>> _______________________________________________
>>>> Dev mailing list
>>>> Dev@wso2.org
>>>> http://wso2.org/cgi-bin/mailman/listinfo/dev
>>>>
>>>>
>>>
>>>
>>> --
>>> Lakshman Udayakantha
>>> WSO2 Inc. www.wso2.com
>>> lean.enterprise.middleware
>>> Mobile: *0711241005*
>>>
>>>
>>
>>
>> --
>> Inshaf Mahath
>> Associate Software Engineer
>> Mobile: +94775907181
>> WSO2 Inc.
>> Lean . Enterprise . Middleware
>>
>>
>
>
> --
> Lakshman Udayakantha
> WSO2 Inc. www.wso2.com
> lean.enterprise.middleware
> Mobile: *0711241005*
>
>
> _______________________________________________
> Dev mailing list
> Dev@wso2.org
> http://wso2.org/cgi-bin/mailman/listinfo/dev
>
>


-- 
Thanks
Abimaran Kugathasan

Software Engineer | WSO2 Inc
Data & APIs Technologies Team
Mobile : +94 773922820

<http://stackoverflow.com/users/515034>
<http://lk.linkedin.com/in/abimaran>  <http://www.lkabimaran.blogspot.com/>
<https://github.com/abimaran>  <https://twitter.com/abimaran>
_______________________________________________
Dev mailing list
Dev@wso2.org
http://wso2.org/cgi-bin/mailman/listinfo/dev
