[ 
https://issues.apache.org/jira/browse/SPARK-8596?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14609127#comment-14609127
 ] 

Vincent Warmerdam commented on SPARK-8596:
------------------------------------------

Mhm. I seem to have stumbled upon another issue when adding a new user for RStudio. 

Here is a link to the tutorial that I recently wrote (which I would like to push 
to the RStudio blog once this issue is fixed): 
https://gist.github.com/koaning/5a896eb5c773c24091c2

The odd thing is that the tutorial works fine if you skip the 
`/root/spark/bin/sparkR` command and move straight on to installing RStudio. If 
you do run the sparkR shell first, you get this error later, after RStudio has 
been provisioned: 

```
> sc <- 
> sparkR.init('spark://ec2-52-18-7-11.eu-west-1.compute.amazonaws.com:7077')
Launching java with spark-submit command /root/spark/bin/spark-submit  
sparkr-shell /tmp/RtmpxBIfkg/backend_port104b15f47402 
15/06/30 21:38:49 INFO spark.SparkContext: Running Spark version 1.4.0
15/06/30 21:38:49 INFO spark.SecurityManager: Changing view acls to: analyst
15/06/30 21:38:49 INFO spark.SecurityManager: Changing modify acls to: analyst
15/06/30 21:38:49 INFO spark.SecurityManager: SecurityManager: authentication 
disabled; ui acls disabled; users with view permissions: Set(analyst); users 
with modify permissions: Set(analyst)
15/06/30 21:38:49 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/06/30 21:38:49 INFO Remoting: Starting remoting
15/06/30 21:38:50 INFO Remoting: Remoting started; listening on addresses 
:[akka.tcp://sparkDriver@172.31.6.135:58940]
15/06/30 21:38:50 INFO util.Utils: Successfully started service 'sparkDriver' 
on port 58940.
15/06/30 21:38:50 INFO spark.SparkEnv: Registering MapOutputTracker
15/06/30 21:38:50 INFO spark.SparkEnv: Registering BlockManagerMaster
15/06/30 21:38:50 ERROR util.Utils: Failed to create local root dir in 
/mnt/spark. Ignoring this directory.
15/06/30 21:38:50 ERROR util.Utils: Failed to create local root dir in 
/mnt2/spark. Ignoring this directory.
15/06/30 21:38:50 ERROR storage.DiskBlockManager: Failed to create any local 
dir.
15/06/30 21:38:50 INFO util.Utils: Shutdown hook called
Error in readTypedObject(con, type) : 
  Unsupported type for deserialization 
```

I get the impression this error is caused by the fact that we create another 
user that does not have full root access and therefore cannot create a local 
dir under `/mnt/spark` or `/mnt2/spark`. What might be the best way of dealing 
with this? What assumptions does Spark make in terms of filesystem permissions? 
Can any user submit Spark jobs via the spark:// master URL, or does the user 
first need certain permissions on the filesystem? 
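One workaround that seems plausible (an untested sketch, not a confirmed fix: the `analyst` username and the `/mnt/spark` and `/mnt2/spark` paths are taken from the log output above, and `prepare_spark_dirs` is a hypothetical helper) would be to pre-create the scratch directories as root and hand ownership to the RStudio user before starting sparkR:

```shell
#!/bin/sh
# Hypothetical helper: create Spark's local scratch dirs and make them
# writable by a given non-root user. Run as root on the EC2 master.
prepare_spark_dirs() {
  user="$1"; shift
  for dir in "$@"; do
    mkdir -p "$dir"          # create the scratch dir if it is missing
    chown "$user" "$dir"     # give the non-root user ownership
    chmod 775 "$dir"         # keep it group-writable as well
  done
}

# On the cluster this would be invoked roughly as:
#   prepare_spark_dirs analyst /mnt/spark /mnt2/spark
```

If this is the right direction, it could presumably be folded into the same provisioning step that creates the RStudio user.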

> Install and configure RStudio server on Spark EC2
> -------------------------------------------------
>
>                 Key: SPARK-8596
>                 URL: https://issues.apache.org/jira/browse/SPARK-8596
>             Project: Spark
>          Issue Type: Improvement
>          Components: EC2, SparkR
>            Reporter: Shivaram Venkataraman
>
> This will make it convenient for R users to use SparkR from their browsers 



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
