[ https://issues.apache.org/jira/browse/SPARK-5034 ]
Sean Owen resolved SPARK-5034.
------------------------------
    Resolution: Cannot Reproduce

I don't know what to make of this without more info. I don't think it is a parsing or quoting issue, as it simply looks like the main class name is incorrect and partly overwritten by some host name.

> Spark on Yarn launch failure on HDInsight on Windows
> ----------------------------------------------------
>
>                 Key: SPARK-5034
>                 URL: https://issues.apache.org/jira/browse/SPARK-5034
>             Project: Spark
>          Issue Type: Bug
>          Components: Windows, YARN
>    Affects Versions: 1.1.0, 1.1.1, 1.2.0
>        Environment: Spark on Yarn within HDInsight on Windows Azure
>           Reporter: Rice
>
> Windows environment.
> I'm trying to run the JavaSparkPi example on YARN with master = yarn-client, but I have a problem.
> Submitting the application goes smoothly, and the first container, for the ApplicationMaster, works too.
> When the job starts and there are tasks to do, I get this warning on the console (I'm using Windows cmd, if that makes any difference):
>
>   WARN cluster.YarnClientClusterScheduler: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory
>
> When I check the logs for the ApplicationMaster container, it launches containers for the executors "properly", then continues with:
>
>   INFO YarnAllocationHandler: Completed container container_1409217202587_0003_01_000002 (state: COMPLETE, exit status: 1)
>   INFO YarnAllocationHandler: Container marked as failed: container_1409217202587_0003_01_000002
>
> and tries to re-launch them.
> The log of the failed container contains only this:
>
>   Error: Could not find or load main class pwd..sp...@gbv06758291.my.secret.address.net:63680.user.CoarseGrainedScheduler
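For reference, a submission along these lines reproduces the setup described above; the reporter did not include the exact command, so the jar path, Spark/Hadoop version, and slice count below are assumptions:

    %SPARK_HOME%\bin\spark-submit ^
      --class org.apache.spark.examples.JavaSparkPi ^
      --master yarn-client ^
      %SPARK_HOME%\lib\spark-examples-1.2.0-hadoop2.4.0.jar 10

In yarn-client mode the driver runs in the local cmd session while the executors run in YARN containers, which is why submission and the ApplicationMaster can succeed even though every executor container exits with status 1 and the job never accepts resources.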
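For comparison, the executor launch command that Spark 1.x asks YARN to write into the container launch script has roughly the following shape. This is a sketch only, shown wrapped here but emitted as a single line; values in angle brackets are placeholders, and the exact argument order varies slightly across 1.x versions:

    {{JAVA_HOME}}/bin/java -server -Xmx<heap>m
        org.apache.spark.executor.CoarseGrainedExecutorBackend
        akka.tcp://sparkDriver@<driver-host>:<driver-port>/user/CoarseGrainedScheduler
        <executor-id> <executor-host> <cores>
        1> <LOG_DIR>/stdout 2> <LOG_DIR>/stderr

The string in the reported error looks like fragments of such a line run together: "pwd" suggests the {{PWD}} classpath token, and "@gbv06758291...:63680.user.CoarseGrainedScheduler" matches the tail of the driver URL, so java appears to be treating part of the classpath and arguments as the main class name, consistent with the resolution comment above. Inspecting the generated launch_container.cmd under the NodeManager's local directories on the Windows node would show exactly where the command differs from this shape.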