Re: How do I access the SPARK SQL

2014-04-24 Thread Andrew Or
Did you build it with SPARK_HIVE=true?


On Thu, Apr 24, 2014 at 7:00 AM, diplomatic Guru
diplomaticg...@gmail.com wrote:

 Hi Matei,

 I checked out the git repository and built it. However, I'm still getting
 the error below: it can't find the SQL packages. Please advise.

 package org.apache.spark.sql.api.java does not exist
 [ERROR]
 /home/VirtualBoxImages.com/Documents/projects/errCount/src/main/java/errorCount/TransDriverSQL.java:[49,8]
 cannot find symbol
 [ERROR] symbol  : class JavaSchemaRDD

 Kind regards,

 Raj.




Re: How do I access the SPARK SQL

2014-04-24 Thread Michael Armbrust
You shouldn't need to set SPARK_HIVE=true unless you want to use the
JavaHiveContext.  You should be able to access
org.apache.spark.sql.api.java.JavaSQLContext with the default build.

How are you building your application?

Michael
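For reference, a minimal sketch of what using JavaSQLContext from Java might look like, loosely following the "People" example mentioned later in the thread. This assumes the Spark 1.0-era Java SQL API under discussion; the Person bean, class name, and input path are illustrative assumptions, not code from this thread.

```java
// Sketch only: assumes the Spark 1.0-era Java SQL API discussed in this
// thread. The Person bean, app name, and "people.txt" path are made up.
import java.util.List;

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.sql.api.java.JavaSQLContext;
import org.apache.spark.sql.api.java.JavaSchemaRDD;
import org.apache.spark.sql.api.java.Row;

public class PeopleSQL {

    // JavaBean whose getters define the table schema.
    public static class Person implements java.io.Serializable {
        private String name;
        private int age;
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
        public int getAge() { return age; }
        public void setAge(int age) { this.age = age; }
    }

    public static void main(String[] args) {
        JavaSparkContext sc = new JavaSparkContext("local", "PeopleSQL");
        JavaSQLContext sqlCtx = new JavaSQLContext(sc);

        // Parse "name, age" lines into Person beans.
        JavaRDD<Person> people = sc.textFile("people.txt").map(
            new Function<String, Person>() {
                public Person call(String line) {
                    String[] parts = line.split(",");
                    Person p = new Person();
                    p.setName(parts[0]);
                    p.setAge(Integer.parseInt(parts[1].trim()));
                    return p;
                }
            });

        // Infer a schema from the bean class and register it as a table.
        JavaSchemaRDD schemaPeople = sqlCtx.applySchema(people, Person.class);
        schemaPeople.registerAsTable("people");

        // Run SQL; the result is another JavaSchemaRDD of Rows.
        JavaSchemaRDD teenagers =
            sqlCtx.sql("SELECT name FROM people WHERE age >= 13 AND age <= 19");
        List<Row> rows = teenagers.collect();
        for (Row row : rows) {
            System.out.println(row.getString(0));
        }
        sc.stop();
    }
}
```

Note that JavaSQLContext lives in spark-sql, so compiling this requires the spark-sql artifact on the classpath, not just spark-core.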





Re: How do I access the SPARK SQL

2014-04-24 Thread Aaron Davidson
Looks like you're depending on Spark 0.9.1, which doesn't have Spark SQL.
Assuming you've downloaded Spark, just run 'mvn install' to publish Spark
locally, and depend on Spark version 1.0.0-SNAPSHOT.
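Spelled out, the suggested workflow might look like the following sketch; the directory names are illustrative assumptions.

```shell
# Sketch of the suggested workflow; directory names are illustrative.
git clone https://github.com/apache/spark.git
cd spark

# Publish Spark (including spark-sql) to the local Maven repo (~/.m2):
mvn -DskipTests install
# ...or, using the sbt build mentioned earlier in the thread:
sbt/sbt publish-local

# Then build the application against version 1.0.0-SNAPSHOT:
cd ../your-app
mvn package
```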


On Thu, Apr 24, 2014 at 9:58 AM, diplomatic Guru
diplomaticg...@gmail.com wrote:

 It's a simple application based on the People example.

 I'm using Maven to build it, and the pom.xml is below. Perhaps I need to
 change the version?

 <project>
   <groupId>Uthay.Test.App</groupId>
   <artifactId>test-app</artifactId>
   <modelVersion>4.0.0</modelVersion>
   <name>TestApp</name>
   <packaging>jar</packaging>
   <version>1.0</version>

   <repositories>
     <repository>
       <id>Akka repository</id>
       <url>http://repo.akka.io/releases</url>
     </repository>
   </repositories>

   <dependencies>
     <dependency> <!-- Spark dependency -->
       <groupId>org.apache.spark</groupId>
       <artifactId>spark-core_2.10</artifactId>
       <version>0.9.1</version>
     </dependency>
   </dependencies>
 </project>




Re: How do I access the SPARK SQL

2014-04-24 Thread Michael Armbrust
Oh, and you'll also need to add a dependency on spark-sql_2.10.
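Putting this together with the earlier advice (publish Spark locally, then depend on 1.0.0-SNAPSHOT), the dependencies section of the pom.xml would presumably end up looking something like this sketch:

```xml
<!-- Sketch only: assumes a locally published 1.0.0-SNAPSHOT build of
     Spark, as suggested in this thread. -->
<dependencies>
  <dependency> <!-- Spark core -->
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.0.0-SNAPSHOT</version>
  </dependency>
  <dependency> <!-- Spark SQL: JavaSQLContext, JavaSchemaRDD -->
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.10</artifactId>
    <version>1.0.0-SNAPSHOT</version>
  </dependency>
</dependencies>
```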


On Thu, Apr 24, 2014 at 10:13 AM, Michael Armbrust
mich...@databricks.com wrote:

 Yeah, you'll need to run `sbt publish-local` to push the jars to your
 local maven repository (~/.m2) and then depend on version 1.0.0-SNAPSHOT.



Re: How do I access the SPARK SQL

2014-04-24 Thread diplomatic Guru
Many thanks for your prompt reply. I'll try your suggestions and will get
back to you.





Re: How do I access the SPARK SQL

2014-04-24 Thread diplomatic Guru
It worked!! Many thanks for your brilliant support.





Re: How do I access the SPARK SQL

2014-04-23 Thread Matei Zaharia
It’s currently in the master branch, on https://github.com/apache/spark. You 
can check that out from git, build it with sbt/sbt assembly, and then try it 
out. We’re also going to post some release candidates soon that will be 
pre-built.

Matei

On Apr 23, 2014, at 1:30 PM, diplomatic Guru diplomaticg...@gmail.com wrote:

 Hello Team,
 
 I'm new to Spark and just came across Spark SQL, which appears to be 
 interesting, but I'm not sure how I can get it.
 
 I know it's an alpha version, but I'm not sure if it's available to the community yet.
 
 Many thanks.
 
 Raj.