[ https://issues.apache.org/jira/browse/SPARK-9009?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

kumar ranganathan updated SPARK-9009:
-------------------------------------
    Description: 
I got a FileNotFoundException in the application master when running the SparkPi 
example on a Windows machine.

The problem is that the truststore file exists at C:\Spark\conf\spark.truststore, 
but the application master fails with the exception below:

{code}
15/07/13 09:38:50 ERROR yarn.ApplicationMaster: Uncaught exception: java.io.FileNotFoundException: C:\Spark\conf\spark.truststore (The system cannot find the path specified)
        at java.io.FileInputStream.open(Native Method)
        at java.io.FileInputStream.<init>(FileInputStream.java:146)
        at org.spark-project.guava.io.Files$FileByteSource.openStream(Files.java:124)
        at org.spark-project.guava.io.Files$FileByteSource.openStream(Files.java:114)
        at org.apache.spark.SecurityManager$$anonfun$4.apply(SecurityManager.scala:261)
        at org.apache.spark.SecurityManager$$anonfun$4.apply(SecurityManager.scala:254)
        at scala.Option.map(Option.scala:145)
        at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:254)
        at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:132)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:571)
        at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:66)
        at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:65)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
        at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:65)
        at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:569)
        at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
15/07/13 09:38:50 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 10, (reason: Uncaught exception: java.io.FileNotFoundException: C:\Spark\conf\spark.truststore (The system cannot find the path specified))
15/07/13 09:38:50 INFO util.Utils: Shutdown hook called
{code}

If I move the truststore file to a different drive 
(d:\spark_conf\spark.truststore), I get this exception instead:

{code}
java.io.FileNotFoundException: D:\Spark_conf\spark.truststore (The device is not ready)
{code}

The exception is thrown from SecurityManager.scala at the openStream() call 
shown below:

{code:title=SecurityManager.scala|borderStyle=solid}
val trustStoreManagers =
      for (trustStore <- fileServerSSLOptions.trustStore) yield {
        val input = Files.asByteSource(fileServerSSLOptions.trustStore.get).openStream()

        try {
{code}
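For what it's worth, here is a quick diagnostic I would run on the node where the ApplicationMaster executes (this is only a sketch, not Spark code; the path is the one from my configuration):

```scala
import java.io.File

// Diagnostic sketch only (not Spark source): check whether the configured
// truststore path actually resolves on the machine/container where the
// ApplicationMaster runs.
object TrustStoreCheck {
  def main(args: Array[String]): Unit = {
    val path = "C:\\Spark\\conf\\spark.truststore"
    val f = new File(path)
    if (f.exists()) println(s"Found truststore at ${f.getAbsolutePath}")
    else println(s"Truststore not found at: $path")
  }
}
```

If this reports the file as missing where the AM runs even though it exists on the client, that would suggest the client-local path is being used verbatim on the node.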

The same problem occurs for the keystore file when the truststore property is 
removed from spark-defaults.conf.
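For reference, the relevant section of spark-defaults.conf looks roughly like this (passwords masked; the keystore path is illustrative, the truststore path is the one from the error):

{code:title=spark-defaults.conf|borderStyle=solid}
spark.ssl.enabled              true
# keystore path is illustrative
spark.ssl.keyStore             C:\Spark\conf\spark.keystore
spark.ssl.keyStorePassword     *****
spark.ssl.trustStore           C:\Spark\conf\spark.truststore
spark.ssl.trustStorePassword   *****
{code}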

When encryption is disabled by setting spark.ssl.enabled to false, the job 
completes successfully.


> SPARK Encryption FileNotFoundException for truststore
> -----------------------------------------------------
>
>                 Key: SPARK-9009
>                 URL: https://issues.apache.org/jira/browse/SPARK-9009
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core, YARN
>    Affects Versions: 1.4.0
>            Reporter: kumar ranganathan
>



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
