Thanks Zhang, that solved the error. This is probably not documented
anywhere so I missed it.
Thanks again,
Udit
On Fri, Apr 17, 2015 at 3:24 PM, Zhan Zhang wrote:
> Besides the hdp.version in spark-defaults.conf, I think you probably
> forgot to put the file java-opts under $SPARK_HOME/conf
Besides the hdp.version in spark-defaults.conf, I think you probably forgot to
put the file java-opts under $SPARK_HOME/conf with the following contents.
[root@c6402 conf]# pwd
/usr/hdp/current/spark-client/conf
[root@c6402 conf]# ls
fairscheduler.xml.template java-opts log4j.properties.temp
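The actual file contents got cut off in the archive; based on the rest of this thread, java-opts presumably holds just the hdp.version define, e.g. (version string taken from later messages in this thread, adjust for your cluster):

```
-Dhdp.version=2.2.0.0-2041
```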
Hi,
This is the log trace:
https://gist.github.com/uditmehta27/511eac0b76e6d61f8b47
On the yarn RM UI, I see :
Error: Could not find or load main class
org.apache.spark.deploy.yarn.ExecutorLauncher
The command I run is: bin/spark-shell --master yarn-client
The spark defaults I use is:
spark.y
Hi Udit,
By the way, do you mind sharing the whole log trace?
Thanks.
Zhan Zhang
On Apr 17, 2015, at 2:26 PM, Udit Mehta wrote:
I am just trying to launch a spark shell and not do anything fancy. I got the
binary distribution from apache and put the spark assembl
You probably want to first try the basic configuration to see whether it works,
instead of setting SPARK_JAR to point to the hdfs location. This error is
caused by ExecutorLauncher not being found on the class path, and is not HDP
specific, I think.
Thanks.
Zhan Zhang
On Apr 17, 2015, at 2:26 PM, Udit
I am just trying to launch a spark shell and not do anything fancy. I got
the binary distribution from apache and put the spark assembly on hdfs. I
then specified the yarn.jars option in spark defaults to point to the
assembly in hdfs. I still got the same error so though I had to build it
for hdp.
Thanks. Would that distribution work for hdp 2.2?
On Fri, Apr 17, 2015 at 2:19 PM, Zhan Zhang wrote:
> You don’t need to put any yarn assembly in hdfs. The spark assembly jar
> will include everything. It looks like your package does not include yarn
> module, although I didn’t find anything wr
You don’t need to put any yarn assembly in hdfs. The spark assembly jar will
include everything. It looks like your package does not include yarn module,
although I didn’t find anything wrong in your mvn command. Can you check
whether the ExecutorLauncher class is in your jar file or not?
BTW:
I followed the steps described above and I still get this error:
Error: Could not find or load main class
org.apache.spark.deploy.yarn.ExecutorLauncher
I am trying to build spark 1.3 on hdp 2.2.
I built spark from source using:
build/mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive
-Phive-
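For reference, the truncated command above is most likely of this shape (the `-Phive-thriftserver` profile and `-DskipTests` are assumptions based on the standard Spark 1.3 build instructions, not on this message):

```
build/mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive \
  -Phive-thriftserver -DskipTests clean package
```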
Hi Folks,
Just to summarize how to run Spark on the HDP distribution.
1. The Spark version has to be 1.3.0 or above if you are using the upstream
distribution. This configuration is mainly for HDP rolling-upgrade purposes,
and the patch only went into Spark upstream from 1.3.0.
2. In $SPARK_HOME/conf/
The “best” solution to spark-shell’s problem is creating a file
$SPARK_HOME/conf/java-opts
with “-Dhdp.version=2.2.0.0-2014”
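The java-opts fix above can be sketched as a couple of shell commands (SPARK_HOME path and the version string are taken from this thread; both must match your own installation):

```shell
# Sketch only: adjust SPARK_HOME and the hdp.version value for your cluster.
SPARK_HOME="${SPARK_HOME:-/usr/hdp/current/spark-client}"

# java-opts is picked up by spark-shell's launcher; it just needs the
# hdp.version system property define.
echo "-Dhdp.version=2.2.0.0-2041" > "$SPARK_HOME/conf/java-opts"

cat "$SPARK_HOME/conf/java-opts"
```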
Cheers,
Doug
> On Mar 28, 2015, at 1:25 PM, Michael Stone wrote:
>
> I've also been having trouble running 1.3.0 on HDP. The
> spark.yarn.am.extraJavaOptions -Dhdp.ve
I've also been having trouble running 1.3.0 on HDP. The
spark.yarn.am.extraJavaOptions -Dhdp.version=2.2.0.0-2041
configuration directive seems to work with pyspark, but does not propagate
when using spark-shell. (That is, everything works fine with pyspark,
and spark-shell fails with the "bad substi
I found the problem.
In mapred-site.xml, mapreduce.application.classpath has references to
“${hdp.version}” which are not getting replaced
when launch_container.sh is created. The executor fails with a substitution
error at line 27 in launch_container.sh because bash
can’t deal with “${hdp.versio
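A quick way to confirm the unexpanded token (the config path is an assumption; HDP installs usually keep the client configs under /etc/hadoop/conf):

```shell
# If this prints lines containing the literal ${hdp.version}, that token is
# reaching launch_container.sh unexpanded and bash will fail on it.
grep 'hdp.version' /etc/hadoop/conf/mapred-site.xml
```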
Hi Doug,
I did try setting that config parameter to a larger number (several
minutes), but still wasn't able to retrieve additional context logs. Let us
know if you have any success with it.
Thanks,
Bharath
On Fri, Mar 20, 2015 at 3:21 AM, Doug Balog
wrote:
> I’m seeing the same problem.
> I’v
I’m seeing the same problem.
I’ve set logging to DEBUG, and I think some hints are in the “Yarn AM launch
context” that is printed out
before Yarn runs java.
My next step is to talk to the admins and get them to set
yarn.nodemanager.delete.debug-delay-sec
in the config, as recommended in
htt
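That setting goes in yarn-site.xml on the node managers; a sketch (the 600-second value is illustrative, not from this thread):

```xml
<!-- Keep finished containers' local dirs (including launch_container.sh)
     around for 10 minutes so they can be inspected after a failure. -->
<property>
  <name>yarn.nodemanager.delete.debug-delay-sec</name>
  <value>600</value>
</property>
```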
Thanks for clarifying Todd. This may then be an issue specific to the HDP
version we're using. Will continue to debug and post back if there's any
resolution.
On Thu, Mar 19, 2015 at 3:40 AM, Todd Nist wrote:
> Yes I believe you are correct.
>
> For the build you may need to specify the specific
Yes I believe you are correct.
For the build you may need to specify the specific HDP version of hadoop to
use with the -Dhadoop.version=. I went with the default 2.6.0, but
Horton may have a vendor-specific version that needs to go here. I know I
saw a similar post today where the solution
Hi Todd,
Yes, those entries were present in the conf under the same SPARK_HOME that
was used to run spark-submit. On a related note, I'm assuming that the
additional spark yarn options (like spark.yarn.jar) need to be set in the
same properties file that is passed to spark-submit. That apart, I as
Hi Bharath,
Do you have these entries in your $SPARK_HOME/conf/spark-defaults.conf file?
spark.driver.extraJavaOptions -Dhdp.version=2.2.0.0-2041
spark.yarn.am.extraJavaOptions -Dhdp.version=2.2.0.0-2041
On Tue, Mar 17, 2015 at 1:04 AM, Bharath Ravi Kumar
wrote:
> Still no luck running purp
Still no luck running purpose-built 1.3 against HDP 2.2 after following all
the instructions. Anyone else faced this issue?
On Mon, Mar 16, 2015 at 8:53 PM, Bharath Ravi Kumar
wrote:
> Hi Todd,
>
> Thanks for the help. I'll try again after building a distribution with the
> 1.3 sources. However,
Hi Todd,
Thanks for the help. I'll try again after building a distribution with the
1.3 sources. However, I wanted to confirm what I mentioned earlier: is it
sufficient to copy the distribution only to the client host from where
spark-submit is invoked (with spark.yarn.jar set), or is there a need
Hi Bharath,
I ran into the same issue a few days ago; here is a link to a post on
Horton's forum. http://hortonworks.com/community/forums/search/spark+1.2.1/
In case anyone else needs to perform this, these are the steps I took to get
it to work with Spark 1.2.1 as well as Spark 1.3.0-RC3:
1. Pul
Hi,
Trying to run spark ( 1.2.1 built for hdp 2.2) against a yarn cluster
results in the AM failing to start with following error on stderr:
Error: Could not find or load main class
org.apache.spark.deploy.yarn.ExecutorLauncher
An application id was assigned to the job, but there were no logs.
Not