[jira] [Updated] (SPARK-15221) error: not found: value sqlContext when starting Spark 1.6.1

2016-05-09 Thread Vijay Parmar (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-15221?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vijay Parmar updated SPARK-15221:
-
Priority: Blocker  (was: Minor)

> error: not found: value sqlContext when starting Spark 1.6.1
> 
>
> Key: SPARK-15221
> URL: https://issues.apache.org/jira/browse/SPARK-15221
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core
>Affects Versions: 1.6.1
> Environment: Ubuntu 14.04, 8 GB RAM, 1 Processor
>Reporter: Vijay Parmar
>Priority: Blocker
>  Labels: build, newbie
>
> When I start Spark (version 1.6.1), at the very end I get the 
> following error message:
> <console>:16: error: not found: value sqlContext
>  import sqlContext.implicits._
> ^
> <console>:16: error: not found: value sqlContext
>  import sqlContext.sql
> I have gone through some content on the web about editing the ~/.bashrc file 
> and including "SPARK_LOCAL_IP=127.0.0.1" under the SPARK variables. 
> I also tried editing the /etc/hosts file with:
>  $ sudo vi /etc/hosts
>  ...
>  127.0.0.1  
>  ...
> but the issue still persists. Is it an issue with the build or something 
> else?
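In a healthy 1.6.x spark-shell the startup banner ends with `SQL context available as sqlContext.`; when that line is missing, the two imports above fail. A quick sanity-check sketch, assuming the shell still created `sc`:

```scala
// Sanity check in the failing shell: `sc` is usually still defined even when
// sqlContext is not. If these calls work, the SparkContext itself came up fine
// and only the SQLContext initialization failed.
sc.version                                  // Spark version string, e.g. "1.6.1"
sc.getConf.get("spark.master", "<unset>")   // master the shell connected to
```

If `sc` is also undefined, the problem is earlier in shell startup (and the exception printed above the banner is the place to look), not specific to sqlContext.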



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-15221) error: not found: value sqlContext when starting Spark 1.6.1

2016-05-09 Thread Vijay Parmar (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-15221?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vijay Parmar updated SPARK-15221:
-
Description: 
When I start Spark (version 1.6.1), at the very end I get the following 
error message:

<console>:16: error: not found: value sqlContext
 import sqlContext.implicits._
^
<console>:16: error: not found: value sqlContext
 import sqlContext.sql

I have gone through some content on the web about editing the ~/.bashrc file and 
including "SPARK_LOCAL_IP=127.0.0.1" under the SPARK variables. 

I also tried editing the /etc/hosts file with:

 $ sudo vi /etc/hosts
 ...
 127.0.0.1  
 ...

but the issue still persists. Is it an issue with the build or something else?



> error: not found: value sqlContext when starting Spark 1.6.1
> 
>
> Key: SPARK-15221
> URL: https://issues.apache.org/jira/browse/SPARK-15221
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core
>Affects Versions: 1.6.1
> Environment: Ubuntu 14.04, 8 GB RAM, 1 Processor
>Reporter: Vijay Parmar
>Priority: Minor
>  Labels: build, newbie
>
> When I start Spark (version 1.6.1), at the very end I get the 
> following error message:
> <console>:16: error: not found: value sqlContext
>  import sqlContext.implicits._
> ^
> <console>:16: error: not found: value sqlContext
>  import sqlContext.sql
> I have gone through some content on the web about editing the ~/.bashrc file 
> and including "SPARK_LOCAL_IP=127.0.0.1" under the SPARK variables. 
> I also tried editing the /etc/hosts file with:
>  $ sudo vi /etc/hosts
>  ...
>  127.0.0.1  
>  ...
> but the issue still persists. Is it an issue with the build or something 
> else?
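A common workaround for this symptom (a sketch, not confirmed as the root cause in this report) is to construct the SQLContext by hand inside the shell, assuming `sc` was created:

```scala
// Manual workaround sketch: build a plain SQLContext from the existing
// SparkContext and redo the imports the shell normally performs at startup.
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext.implicits._
import sqlContext.sql
```

If constructing the SQLContext also throws, the underlying cause (often a hostname that does not resolve, which the SPARK_LOCAL_IP and /etc/hosts edits above target) appears in that stack trace.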


