[
https://issues.apache.org/jira/browse/KNOX-180?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13794308#comment-13794308
]
Kevin Minder commented on KNOX-180:
-----------------------------------
Hi Amit,
I'm not certain what you are asking. If you could expand on the actual
use case you are targeting for Hive, Pig, Sqoop, etc., I can probably
provide better guidance. At the highest level, all I can say is that the
approach for Hive and Pig will be very similar, as they will both use
WebHCat. Sqoop is an entirely different matter.
Kevin.
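The parallel Kevin draws can be sketched concretely. Assuming the stock WebHCat (Templeton) REST endpoints (`/templeton/v1/hive` and `/templeton/v1/pig`, with the documented `execute` field), both submissions are plain form POSTs that differ only in the endpoint and the script payload; the host/port below is an assumption taken from the log later in this issue:

```python
from urllib.parse import urlencode

# Assumed direct WebHCat address (from the log below); a Knox-fronted
# deployment would use the gateway URL instead.
TEMPLETON = "http://localhost:50111/templeton/v1"

def webhcat_script_request(tool, user, script):
    """Build the URL and form body for a WebHCat 'hive' or 'pig' submission."""
    assert tool in ("hive", "pig")
    body = urlencode([("user.name", user), ("execute", script)])
    return f"{TEMPLETON}/{tool}", body

hive_url, hive_body = webhcat_script_request("hive", "mapred", "show tables;")
pig_url, pig_body = webhcat_script_request("pig", "mapred", "dump A;")
```

The symmetry is the point: a gateway that can route one of these POSTs can route the other with only an endpoint mapping, which is why the Hive and Pig stories through Knox end up nearly identical.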
--
> Server error while submitting mapreduce job with knox-0.2.0
> -----------------------------------------------------------
>
> Key: KNOX-180
> URL: https://issues.apache.org/jira/browse/KNOX-180
> Project: Apache Knox
> Issue Type: Bug
> Components: Server
> Environment: knox-0.2.0
> Reporter: Amit Kamble
>
> I am using HDP 1.3.2 and knox-0.2.0.
> I am trying to run the sample mapreduce job (wordcount) provided with the
> knox-0.2.0 distribution (i.e. hadoop-examples.jar) using Groovy, but
> submitting it gives an error:
> DEBUG hadoop.gateway: Dispatching request: DELETE
> http://localhost:50070/webhdfs/v1/tmp/test?user.name=mapred&recursive=true&op=DELETE
> Delete /tmp/test 200
> DEBUG hadoop.gateway: Dispatching request: PUT
> http://localhost:50070/webhdfs/v1/tmp/test?user.name=mapred&op=MKDIRS
> Create /tmp/test 200
> DEBUG hadoop.gateway: Dispatching request: PUT
> http://localhost:50070/webhdfs/v1/tmp/test/input/FILE?user.name=mapred&op=CREATE
> DEBUG hadoop.gateway: Dispatching request: PUT
> http://localhost:50070/webhdfs/v1/tmp/test/hadoop-examples.jar?user.name=mapred&op=CREATE
> DEBUG hadoop.gateway: Dispatching request: PUT
> http://localhost:50075/webhdfs/v1/tmp/test/hadoop-examples.jar?user.name=mapred&overwrite=false&op=CREATE
> DEBUG hadoop.gateway: Dispatching request: PUT
> http://localhost:50075/webhdfs/v1/tmp/test/input/FILE?user.name=mapred&overwrite=false&op=CREATE
> Put /tmp/test/hadoop-examples.jar 201
> Put /tmp/test/input/FILE 201
> DEBUG hadoop.gateway: Dispatching request: POST
> http://localhost:50111/templeton/v1/mapreduce/jar
> Caught: org.apache.hadoop.gateway.shell.HadoopException:
> org.apache.hadoop.gateway.shell.ErrorResponse: HTTP/1.1 500 Server Error
> org.apache.hadoop.gateway.shell.HadoopException:
> org.apache.hadoop.gateway.shell.ErrorResponse: HTTP/1.1 500 Server Error
> at
> org.apache.hadoop.gateway.shell.AbstractRequest.now(AbstractRequest.java:72)
> at org.apache.hadoop.gateway.shell.AbstractRequest$now.call(Unknown
> Source)
> at ExampleSubmitJob.run(ExampleSubmitJob.groovy:42)
> at org.apache.hadoop.gateway.shell.Shell.main(Shell.java:40)
> at
> org.apache.hadoop.gateway.launcher.Invoker.invokeMainMethod(Invoker.java:64)
> at org.apache.hadoop.gateway.launcher.Invoker.invoke(Invoker.java:37)
> at org.apache.hadoop.gateway.launcher.Command.run(Command.java:100)
> at org.apache.hadoop.gateway.launcher.Launcher.run(Launcher.java:70)
> at org.apache.hadoop.gateway.launcher.Launcher.main(Launcher.java:49)
> Caused by: org.apache.hadoop.gateway.shell.ErrorResponse: HTTP/1.1 500 Server
> Error
> at org.apache.hadoop.gateway.shell.Hadoop.executeNow(Hadoop.java:107)
> at
> org.apache.hadoop.gateway.shell.AbstractRequest.execute(AbstractRequest.java:47)
> at
> org.apache.hadoop.gateway.shell.job.Java$Request.access$100(Java.java:38)
> at
> org.apache.hadoop.gateway.shell.job.Java$Request$1.call(Java.java:82)
> at
> org.apache.hadoop.gateway.shell.job.Java$Request$1.call(Java.java:70)
> at
> org.apache.hadoop.gateway.shell.AbstractRequest.now(AbstractRequest.java:70)
> ... 8 more
> In the case of an Oozie workflow, it works properly and submits the
> workflow successfully. In short, it works fine with WebHDFS and Oozie but
> not with WebHCat/Templeton.
> Could you please suggest whether I missed any configuration setting?
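One way to narrow this down is to replay the failing dispatch directly against the WebHCat port, bypassing Knox, and see whether Templeton itself returns the 500. A minimal sketch of the request the gateway is dispatching, using the documented `mapreduce/jar` form fields; the paths come from the log above, while the `wordcount` class name and input/output arguments are assumptions for illustration:

```python
from urllib.parse import urlencode

def templeton_jar_request(base, user, jar, main_class, args):
    """Build the URL and form body for a WebHCat mapreduce/jar submission."""
    url = f"{base}/templeton/v1/mapreduce/jar"
    fields = [("user.name", user), ("jar", jar), ("class", main_class)]
    fields += [("arg", a) for a in args]  # one 'arg' field per program argument
    return url, urlencode(fields)

# Hypothetical replay of the failing POST from the log above.
url, body = templeton_jar_request(
    "http://localhost:50111", "mapred",
    "/tmp/test/hadoop-examples.jar", "wordcount",
    ["/tmp/test/input", "/tmp/test/output"])
```

If a direct POST of this body to port 50111 also returns 500, the problem lies in the Templeton configuration (e.g. `templeton.jar`, proxy-user settings) rather than in Knox's rewriting.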
--
This message was sent by Atlassian JIRA
(v6.1#6144)