Many thanks for your prompt reply. I'll try your suggestions and will get
back to you.

On 24 April 2014 18:17, Michael Armbrust <mich...@databricks.com> wrote:

> Oh, and you'll also need to add a dependency on "spark-sql_2.10".
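>
> For example, a dependency block like this (a sketch; the 1.0.0-SNAPSHOT
> version assumes the locally published build described below):
>
>     <dependency>
>       <groupId>org.apache.spark</groupId>
>       <artifactId>spark-sql_2.10</artifactId>
>       <version>1.0.0-SNAPSHOT</version>
>     </dependency>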
>
>
> On Thu, Apr 24, 2014 at 10:13 AM, Michael Armbrust
> <mich...@databricks.com> wrote:
>
>> Yeah, you'll need to run `sbt publish-local` to push the jars to your
>> local maven repository (~/.m2) and then depend on version 1.0.0-SNAPSHOT.
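>>
>> For example, the Spark dependency in the pom below would then become
>> (a sketch of the change just described):
>>
>>     <dependency>
>>       <groupId>org.apache.spark</groupId>
>>       <artifactId>spark-core_2.10</artifactId>
>>       <version>1.0.0-SNAPSHOT</version>
>>     </dependency>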
>>
>>
>> On Thu, Apr 24, 2014 at 9:58 AM, diplomatic Guru <
>> diplomaticg...@gmail.com> wrote:
>>
>>> It's a simple application based on the "People" example.
>>>
>>> I'm using Maven for building, and below is the pom.xml. Perhaps I need
>>> to change the version?
>>>
>>> <project>
>>>   <modelVersion>4.0.0</modelVersion>
>>>   <groupId>Uthay.Test.App</groupId>
>>>   <artifactId>test-app</artifactId>
>>>   <name>TestApp</name>
>>>   <packaging>jar</packaging>
>>>   <version>1.0</version>
>>>
>>>   <repositories>
>>>     <repository>
>>>       <id>Akka repository</id>
>>>       <url>http://repo.akka.io/releases</url>
>>>     </repository>
>>>   </repositories>
>>>
>>>   <dependencies>
>>>     <dependency> <!-- Spark dependency -->
>>>       <groupId>org.apache.spark</groupId>
>>>       <artifactId>spark-core_2.10</artifactId>
>>>       <version>0.9.1</version>
>>>     </dependency>
>>>   </dependencies>
>>> </project>
>>>
>>>
>>>
>>> On 24 April 2014 17:47, Michael Armbrust <mich...@databricks.com> wrote:
>>>
>>>> You shouldn't need to set SPARK_HIVE=true unless you want to use the
>>>> JavaHiveContext.  You should be able to access
>>>> org.apache.spark.sql.api.java.JavaSQLContext with the default build.
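>>>>
>>>> For reference, a minimal sketch of what that looks like with the Java
>>>> API, based on the "People" example mentioned in this thread (the
>>>> Person bean and the query here are illustrative):
>>>>
>>>>     import java.util.Arrays;
>>>>     import org.apache.spark.api.java.JavaRDD;
>>>>     import org.apache.spark.api.java.JavaSparkContext;
>>>>     import org.apache.spark.sql.api.java.JavaSQLContext;
>>>>     import org.apache.spark.sql.api.java.JavaSchemaRDD;
>>>>
>>>>     public class TestApp {
>>>>       // A simple JavaBean, as in the "People" example.
>>>>       public static class Person implements java.io.Serializable {
>>>>         private String name;
>>>>         private int age;
>>>>         public String getName() { return name; }
>>>>         public void setName(String name) { this.name = name; }
>>>>         public int getAge() { return age; }
>>>>         public void setAge(int age) { this.age = age; }
>>>>       }
>>>>
>>>>       public static void main(String[] args) {
>>>>         JavaSparkContext sc = new JavaSparkContext("local", "TestApp");
>>>>         JavaSQLContext sqlCtx = new JavaSQLContext(sc);
>>>>
>>>>         Person p = new Person();
>>>>         p.setName("Michael");
>>>>         p.setAge(29);
>>>>         JavaRDD<Person> people = sc.parallelize(Arrays.asList(p));
>>>>
>>>>         // Infer the schema from the bean, register the table, query it.
>>>>         JavaSchemaRDD schemaPeople = sqlCtx.applySchema(people, Person.class);
>>>>         schemaPeople.registerAsTable("people");
>>>>         JavaSchemaRDD teenagers =
>>>>           sqlCtx.sql("SELECT name FROM people WHERE age >= 13 AND age <= 19");
>>>>         System.out.println(teenagers.collect());
>>>>       }
>>>>     }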
>>>>
>>>> How are you building your application?
>>>>
>>>> Michael
>>>>
>>>>
>>>> On Thu, Apr 24, 2014 at 9:17 AM, Andrew Or <and...@databricks.com> wrote:
>>>>
>>>>> Did you build it with SPARK_HIVE=true?
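>>>>> (i.e., presumably a build along the lines of `SPARK_HIVE=true
>>>>> sbt/sbt assembly`)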
>>>>>
>>>>>
>>>>> On Thu, Apr 24, 2014 at 7:00 AM, diplomatic Guru <
>>>>> diplomaticg...@gmail.com> wrote:
>>>>>
>>>>>> Hi Matei,
>>>>>>
>>>>>> I checked out the git repository and built it. However, I'm still
>>>>>> getting the error below; it can't find the SQL packages. Please advise.
>>>>>>
>>>>>> package org.apache.spark.sql.api.java does not exist
>>>>>> [ERROR]
>>>>>> /home/VirtualBoxImages.com/Documents/projects/errCount/src/main/java/errorCount/TransDriverSQL.java:[49,8]
>>>>>> cannot find symbol
>>>>>> [ERROR] symbol  : class JavaSchemaRDD
>>>>>>
>>>>>> Kind regards,
>>>>>>
>>>>>> Raj.
>>>>>>
>>>>>>
>>>>>>
>>>>>> On 23 April 2014 22:09, Matei Zaharia <matei.zaha...@gmail.com> wrote:
>>>>>>
>>>>>>> It’s currently in the master branch, on
>>>>>>> https://github.com/apache/spark. You can check that out from git,
>>>>>>> build it with sbt/sbt assembly, and then try it out. We’re also going to
>>>>>>> post some release candidates soon that will be pre-built.
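>>>>>>>
>>>>>>> For example, the steps just described, as shell commands:
>>>>>>>
>>>>>>>     git clone https://github.com/apache/spark.git
>>>>>>>     cd spark
>>>>>>>     sbt/sbt assembly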
>>>>>>>
>>>>>>> Matei
>>>>>>>
>>>>>>> On Apr 23, 2014, at 1:30 PM, diplomatic Guru <
>>>>>>> diplomaticg...@gmail.com> wrote:
>>>>>>>
>>>>>>> > Hello Team,
>>>>>>> >
>>>>>>> > I'm new to Spark and just came across Spark SQL, which looks
>>>>>>> interesting, but I'm not sure how I can get it.
>>>>>>> >
>>>>>>> > I know it's an alpha version, but I'm not sure whether it's
>>>>>>> available to the community yet.
>>>>>>> >
>>>>>>> > Many thanks.
>>>>>>> >
>>>>>>> > Raj.
>>>>>>>
