Update to the thread.

Upon investigation, this is a bug on Windows: by default, Windows does not grant
the user read permission on jar files.
I have created a pull request for
SPARK-5914<https://issues.apache.org/jira/browse/SPARK-5914> to grant read
permission to the jar owner (the slave service account in this case). With this
fix, the slave will be able to run without admin permission.
FYI: the master & thrift server work fine with only user permission, so no issue
there.
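
For reference, a minimal Scala sketch of the kind of change involved (hypothetical
code, not the actual patch): grant the owner read permission on the copied jar in
addition to the existing a+x.

    import java.io.File

    // Hypothetical sketch of the fix: grant the owner (the slave service
    // account) read permission on the copied jar, since the Windows default
    // ACL withholds it from a non-admin account.
    def grantReadToOwner(targetFile: File): Unit = {
      // setReadable(readable = true, ownerOnly = true) returns false if the
      // permission could not be changed.
      if (!targetFile.setReadable(true, true)) {
        System.err.println(s"Could not grant read permission on $targetFile")
      }
    }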

From: Judy Nash [mailto:judyn...@exchange.microsoft.com]
Sent: Thursday, February 19, 2015 12:26 AM
To: Akhil Das; dev@spark.apache.org
Cc: u...@spark.apache.org
Subject: RE: spark slave cannot execute without admin permission on windows

+ dev mailing list

If this is supposed to work, is there a regression then?

The Spark core code sets the permissions on files copied to \work to a+x at line
442 of
Utils.scala<https://github.com/apache/spark/blob/b271c265b742fa6947522eda4592e9e6a7fd1f3a/core/src/main/scala/org/apache/spark/util/Utils.scala>.
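
For context, a rough sketch (not the exact Spark source) of that a+x step; note
that it grants execute but never read:

    import java.io.File

    // Sketch of the a+x logic referenced above: the fetched file is marked
    // executable for all users after the copy.
    val targetFile = new File("work/app-20150218/0/sparkpi.jar") // hypothetical path
    targetFile.setExecutable(true, false) // a+x: executable for everyone
    // Nothing here grants read permission, which is the gap on Windows when
    // the slave runs as a non-admin service account.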
The example jar I used had all permissions, including Read & Execute, prior to
spark-submit:
[screenshot: jar permissions before spark-submit]
However, after the jar is copied to the worker node’s \work folder, only limited
permissions remain, with no execute right:
[screenshot: jar permissions after copy to \work]

From: Akhil Das [mailto:ak...@sigmoidanalytics.com]
Sent: Wednesday, February 18, 2015 10:40 PM
To: Judy Nash
Cc: u...@spark.apache.org
Subject: Re: spark slave cannot execute without admin permission on windows

You don't need admin permission; just make sure all those jars have execute
permission (read/write access).
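
If it helps, a quick Scala sketch (an assumed helper, not Spark code) to verify
and best-effort fix the permissions on a jar:

    import java.io.File

    // Print and, where possible, repair read/execute permission on a jar.
    def checkJar(path: String): Unit = {
      val jar = new File(path)
      println(s"$path readable=${jar.canRead} executable=${jar.canExecute}")
      if (!jar.canRead) jar.setReadable(true, false)     // may fail without rights
      if (!jar.canExecute) jar.setExecutable(true, false)
    }

    checkJar("""C:\spark\examples\spark-examples.jar""") // hypothetical path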

Thanks
Best Regards

On Thu, Feb 19, 2015 at 11:30 AM, Judy Nash 
<judyn...@exchange.microsoft.com> wrote:
Hi,

Is it possible to configure Spark to run without admin permission on Windows?

My current setup runs master & slave successfully with admin permission.
However, if I downgrade the permission level from admin to user, SparkPi fails
with the following exception on the slave node:
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to
stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task
0.3 in stage 0.0 (TID 9, workernode0.jnashsparkcurr2.d10.internal.cloudapp.net):
java.lang.ClassNotFoundException: org.apache.spark.examples.SparkPi$$anonfun$1

        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:270)

Upon investigation, it appears that the SparkPi jar under
spark_home\worker\appname\*.jar does not have the execute permission set, causing
Spark to be unable to find the class.
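
To illustrate (a small repro sketch with a hypothetical path): when the JVM cannot
open the jar, URLClassLoader reports the missing class as ClassNotFoundException
rather than a permission error, matching the stack trace above.

    import java.io.File
    import java.net.{URL, URLClassLoader}

    // Repro sketch: try loading the class directly from the copied jar.
    val jar = new File("""C:\spark\work\app\sparkpi.jar""") // hypothetical path
    val loader = new URLClassLoader(Array[URL](jar.toURI.toURL))
    try {
      loader.loadClass("org.apache.spark.examples.SparkPi")
      println("class loaded OK")
    } catch {
      case e: ClassNotFoundException =>
        // An unreadable jar surfaces here, hiding the underlying ACL problem.
        println(s"load failed, check permissions on $jar: $e")
    }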

Advice would be very much appreciated.

Thanks,
Judy

