[ https://issues.apache.org/jira/browse/SPARK-15221?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-15221.
-------------------------------
          Resolution: Not A Problem
    Target Version/s:   (was: 1.6.1)

Exactly, this is the problem:

Caused by: java.sql.SQLException: Directory /home/metastore_db cannot be created.

You probably don't have permission to create that dir, but it's also probably 
not where you meant it to be created. You'd have to determine why you're trying 
to write into /home, but that's not a Spark issue per se.
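
A rough workaround sketch (assuming the default embedded Derby metastore; the Spark install path and the /tmp location below are only illustrative): either launch spark-shell from a directory you can write to, or point Derby's working directory somewhere writable.

  # start the shell from a writable working directory, so metastore_db
  # and derby.log are created there instead of under /home
  cd ~ && /path/to/spark-1.6.1/bin/spark-shell

  # or relocate Derby's system directory explicitly (directory must exist)
  mkdir -p /tmp/spark-metastore
  /path/to/spark-1.6.1/bin/spark-shell \
    --conf spark.driver.extraJavaOptions=-Dderby.system.home=/tmp/spark-metastore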

Please read 
https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark first.

> error: not found: value sqlContext when starting Spark 1.6.1
> ------------------------------------------------------------
>
>                 Key: SPARK-15221
>                 URL: https://issues.apache.org/jira/browse/SPARK-15221
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.6.1
>            Environment: Ubuntu 14.04, 8 GB RAM, 1 Processor
>            Reporter: Vijay Parmar
>            Priority: Blocker
>              Labels: build, newbie
>
> When I start Spark (version 1.6.1), at the very end I am getting the 
> following error message:
> <console>:16: error: not found: value sqlContext
>          import sqlContext.implicits._
>                 ^
> <console>:16: error: not found: value sqlContext
>          import sqlContext.sql
> I have gone through some content on the web about editing the ~/.bashrc file 
> and adding "SPARK_LOCAL_IP=127.0.0.1" under the SPARK variables. 
> Also tried editing the /etc/hosts file with :-
>  $ sudo vi /etc/hosts
>  ...
>  127.0.0.1  <HOSTNAME>
>  ...
> but the issue still persists. Is it an issue with the build or something else?


