Hi all,

Did you forget to restart the node managers after editing yarn-site.xml by
any chance?

-Andrew
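If a restart doesn't help, a quick sanity check is to parse the yarn-site.xml the NodeManager actually loads and confirm both spark_shuffle properties are present (the InvalidAuxServiceException means the NodeManager doesn't know the service, so it's either the config or a stale process). A minimal sketch, with an embedded sample standing in for your real config file; the helper name is my own:

```python
# Sanity-check a yarn-site.xml for the two properties the external shuffle
# service needs. The embedded XML is a stand-in for the real file (e.g.
# /etc/hadoop/conf/yarn-site.xml on many installs -- path is an assumption).
import xml.etree.ElementTree as ET

sample = """<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>spark_shuffle,mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.nodemanager.aux-services.spark_shuffle.class</name>
    <value>org.apache.spark.network.yarn.YarnShuffleService</value>
  </property>
</configuration>"""

def check_shuffle_config(xml_text):
    # Collect all <property> name/value pairs into a dict.
    props = {}
    for prop in ET.fromstring(xml_text).iter("property"):
        props[prop.findtext("name")] = prop.findtext("value")
    # spark_shuffle must be listed in aux-services AND its class must be set.
    services = props.get("yarn.nodemanager.aux-services", "").split(",")
    return ("spark_shuffle" in services and
            props.get("yarn.nodemanager.aux-services.spark_shuffle.class")
            == "org.apache.spark.network.yarn.YarnShuffleService")

print(check_shuffle_config(sample))  # True
```

If this passes against the file on each NodeManager host but the exception persists, the NodeManager process itself most likely hasn't been restarted since the edit.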

2015-07-17 8:32 GMT-07:00 Andrew Lee <alee...@hotmail.com>:

> I have encountered the same problem after following the document.
>
> Here's my spark-defaults.conf
>
> spark.shuffle.service.enabled true
> spark.dynamicAllocation.enabled      true
> spark.dynamicAllocation.executorIdleTimeout 60
> spark.dynamicAllocation.cachedExecutorIdleTimeout 120
> spark.dynamicAllocation.initialExecutors 2
> spark.dynamicAllocation.maxExecutors 8
> spark.dynamicAllocation.minExecutors 1
> spark.dynamicAllocation.schedulerBacklogTimeout 10
>
>
>
> and my yarn-site.xml is configured as follows.
>
>         <property>
>             <name>yarn.nodemanager.aux-services</name>
>             <value>spark_shuffle,mapreduce_shuffle</value>
>         </property>
> ...
>         <property>
>             <name>yarn.nodemanager.aux-services.spark_shuffle.class</name>
>             <value>org.apache.spark.network.yarn.YarnShuffleService</value>
>         </property>
>
>
> and deployed the two JARs to the NodeManager's classpath under
> /opt/hadoop/share/hadoop/mapreduce/. (I also checked the NodeManager log,
> and the JARs do appear in the classpath.) I noticed that the JAR location
> is not the same as the one in the 1.4 documentation; I found them under
> network/yarn/target and network/shuffle/target/ after building with
> "-Phadoop-2.4 -Psparkr -Pyarn -Phive -Phive-thriftserver" in Maven.
>
>
> spark-network-yarn_2.10-1.4.1.jar
>
> spark-network-shuffle_2.10-1.4.1.jar
>
>
> and I am still getting the following exception.
>
> Exception in thread "ContainerLauncher #0" java.lang.Error: 
> org.apache.spark.SparkException: Exception while starting container 
> container_1437141440985_0003_01_000002 on host 
> alee-ci-2058-slave-2.test.altiscale.com
>       at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1151)
>       at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>       at java.lang.Thread.run(Thread.java:744)
> Caused by: org.apache.spark.SparkException: Exception while starting 
> container container_1437141440985_0003_01_000002 on host 
> alee-ci-2058-slave-2.test.altiscale.com
>       at 
> org.apache.spark.deploy.yarn.ExecutorRunnable.startContainer(ExecutorRunnable.scala:116)
>       at 
> org.apache.spark.deploy.yarn.ExecutorRunnable.run(ExecutorRunnable.scala:67)
>       at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>       ... 2 more
> Caused by: org.apache.hadoop.yarn.exceptions.InvalidAuxServiceException: The 
> auxService:spark_shuffle does not exist
>       at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>       at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>       at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>       at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>       at 
> org.apache.hadoop.yarn.api.records.impl.pb.SerializedExceptionPBImpl.instantiateException(SerializedExceptionPBImpl.java:152)
>       at 
> org.apache.hadoop.yarn.api.records.impl.pb.SerializedExceptionPBImpl.deSerialize(SerializedExceptionPBImpl.java:106)
>
>
> Not sure what else I am missing here or doing wrong.
>
> Appreciate any insights or feedback, thanks.
>
>
> ------------------------------
> Date: Wed, 8 Jul 2015 09:25:39 +0800
> Subject: Re: The auxService:spark_shuffle does not exist
> From: zjf...@gmail.com
> To: rp...@njit.edu
> CC: user@spark.apache.org
>
>
> Did you enable dynamic resource allocation? You can refer to this page
> for how to configure the Spark shuffle service for YARN.
>
> https://spark.apache.org/docs/1.4.0/job-scheduling.html
>
>
> On Tue, Jul 7, 2015 at 10:55 PM, roy <rp...@njit.edu> wrote:
>
> We tried "--master yarn-client" with no different result.
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/The-auxService-spark-shuffle-does-not-exist-tp23662p23689.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
>
>
> --
> Best Regards
>
> Jeff Zhang
>
