First, thanks to everyone for their assistance and recommendations.
@Marcelo
I applied the patch that you recommended and am now able to get into the
shell. Thank you, it worked great once I realized that the pom was pointing
to the 1.3.0-SNAPSHOT parent, which I needed to bump down to 1.2.1.
@Zhan
Hi Todd,
Looks like the thrift server can connect to the metastore, but something is
wrong in the executors. You can try to get the log with
yarn logs -applicationId xxx
to check why it failed. If there is no log (the master or executor did not
start at all), you can go to the RM webpage, click the
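The yarn logs command Zhan mentions can be sketched as follows. This is a hedged example: the application ID is the one that appears later in this thread, so substitute your own (from `yarn application -list` or the RM web UI).

```shell
# Pull the aggregated container logs for a finished YARN application.
# The ID below is the example from this thread; replace it with yours.
APP_ID=application_1425078697953_0020
yarn logs -applicationId "$APP_ID" | less
```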
Hi Zhan,
I applied the patch you recommended,
https://github.com/apache/spark/pull/3409, and it now works. It was failing
with this:
Exception message:
/hadoop/yarn/local/usercache/root/appcache/application_1425078697953_0020/container_1425078697953_0020_01_02/launch_container.sh:
line 14:
Sorry. Misunderstanding. Looks like it already worked. If you still run into
the hdp.version problem, you can try it :)
Thanks.
Zhan Zhang
On Mar 6, 2015, at 11:40 AM, Zhan Zhang zzh...@hortonworks.com wrote:
You are using 1.2.1 right? If so, please add java-opts in conf directory and
give it a try.
[root@c6401 conf]# more java-opts
-Dhdp.version=2.2.2.0-2041
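For reference, the java-opts file shown above can be created in one step. A sketch: the hdp.version value 2.2.2.0-2041 is the HDP build used in this thread, so substitute your cluster's build.

```shell
# From the Spark conf directory on the node that launches the driver.
# 2.2.2.0-2041 is the HDP build from this thread; yours may differ.
echo '-Dhdp.version=2.2.2.0-2041' > java-opts
cat java-opts
```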
Thanks.
Zhan Zhang
On Mar 6, 2015, at 11:35 AM, Todd Nist
tsind...@gmail.com wrote:
Working great now, after applying that patch; thanks again.
On Fri, Mar 6, 2015 at 2:42 PM, Zhan Zhang zzh...@hortonworks.com wrote:
It seems from the excerpt below that your cluster is set up to use the
Yarn ATS, and the code is failing in that path. I think you'll need to
apply the following patch to your Spark sources if you want this to
work:
https://github.com/apache/spark/pull/3938
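One way to pull an open pull request such as the one above into a local Spark source tree is GitHub's pull/&lt;n&gt;/head refs. This is a sketch, assuming a git clone of apache/spark; the local branch name pr-3938 is arbitrary.

```shell
# Fetch the PR head into a local branch, then merge it onto your tree.
git fetch https://github.com/apache/spark.git pull/3938/head:pr-3938
git merge pr-3938    # or cherry-pick the PR's commits individually
```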
On Thu, Mar 5, 2015 at 10:04 AM, Todd
In addition, you may need the following patch, if it is not already in 1.2.1,
to solve a system-property issue when using HDP 2.2:
https://github.com/apache/spark/pull/3409
You can follow the following link to set hdp.version for the java options.
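An alternative to the conf/java-opts file mentioned in this thread is setting the same property through conf/spark-defaults.conf. This is a hedged sketch: both keys are standard Spark configuration properties, and the hdp.version value is the example used elsewhere in the thread.

```
# conf/spark-defaults.conf -- substitute your cluster's HDP build.
spark.driver.extraJavaOptions    -Dhdp.version=2.2.2.0-2041
spark.executor.extraJavaOptions  -Dhdp.version=2.2.2.0-2041
```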
I am running Spark on a HortonWorks HDP cluster. I have deployed the
prebuilt version, but it is only for Spark 1.2.0, not 1.2.1, and there are a
few fixes and features in 1.2.1 that I would like to leverage.
I just downloaded the spark-1.2.1 source and built it to support Hadoop 2.6
by doing the
Jackson 1.9.13? And codehaus.jackson.version? That's already set by the
hadoop-2.4 profile.
On Thu, Mar 5, 2015 at 6:13 PM, Ted Yu yuzhih...@gmail.com wrote:
Please add the following to build command:
-Djackson.version=1.9.3
Cheers
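Putting Ted's flag together with the profiles used elsewhere in this thread, a full build invocation might look like the sketch below. Verify the profile names against your Spark 1.2.1 pom before relying on it.

```shell
# Build Spark 1.2.1 against Hadoop 2.6 with the Jackson version pinned.
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 \
    -Phive -Phive-thriftserver \
    -Djackson.version=1.9.3 \
    -DskipTests clean package
```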
On Thu, Mar 5, 2015 at 10:04 AM, Todd Nist
That particular class you did find is under parquet/..., which means it was
shaded. Did you build your application against a hadoop2.6 dependency? The
Maven central repo only has 2.2, but HDP has its own repos.
@Victor,
I'm pretty sure I built it correctly; I specified -Dhadoop.version=2.6.0.
Am I missing something here? I followed the docs on this, but I'm open to
suggestions.
make-distribution.sh --name hadoop2.6 --tgz -Pyarn -Phadoop-2.4
-Dhadoop.version=2.6.0 -Phive -Phive-thriftserver -DskipTests
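If the build went through, one hedged sanity check is that the assembly jar name carries the Hadoop version you passed in. The dist/lib path below is where Spark 1.2.x's make-distribution.sh leaves the assembly jar; adjust it if your layout differs.

```shell
# The hadoop.version passed to the build should show up in the
# assembly jar's file name under the generated dist/ directory.
ls dist/lib/ | grep hadoop2.6
```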