So from what I understand, those usually pull in the dependencies for a given
project? I'm able to run the Spark shell, so I'd assume I already have
everything I need. What am I missing from the big picture, and which
directory do I run Maven in?

Thanks,
        Jerry

On Tue, Aug 18, 2015 at 11:15 AM, Ted Yu <yuzhih...@gmail.com> wrote:

> Normally people set up a Maven project with the Spark dependencies, or
> use sbt.
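>
> For example, a minimal pom.xml for Spark 1.4.1 (a sketch, assuming the
> default Scala 2.10 build; Spark artifact ids carry the Scala version
> suffix) would declare:
>
>     <dependency>
>       <groupId>org.apache.spark</groupId>
>       <artifactId>spark-sql_2.10</artifactId>
>       <version>1.4.1</version>
>     </dependency>
>     <dependency>
>       <groupId>org.apache.spark</groupId>
>       <artifactId>spark-hive_2.10</artifactId>
>       <version>1.4.1</version>
>     </dependency>
>
> (spark-hive is only needed because you import org.apache.spark.sql.hive;
> plain DataFrame code needs only spark-sql.)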
>
> Can you go with either approach?
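>
> If you want to stay with plain javac for a quick check, make sure the
> compiler actually sees the assembly jar; passing it explicitly rules out
> an environment problem (a sketch, reusing the jar path from your
> CLASSPATH):
>
>     javac -cp /home/adminz/dev/spark/spark-1.4.1/lib/spark-assembly-1.4.1-hadoop2.6.0.jar Test.java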
>
> Cheers
>
> On Tue, Aug 18, 2015 at 10:28 AM, Jerry <jerry.c...@gmail.com> wrote:
>
>> Hello,
>>
>> So I set up Spark on my local machine to see if I can reproduce the
>> issue I'm having with DataFrames, but I'm running into compiler errors.
>>
>> Here's what I got:
>>
>> $ echo $CLASSPATH
>>
>> /usr/lib/jvm/java-6-oracle/lib:/home/adminz/dev/spark/spark-1.4.1/lib/spark-assembly-1.4.1-hadoop2.6.0.jar
>>
>>
>> $ javac Test.java
>> Test.java:1: package org.apache.spark.sql.api.java does not exist
>> import org.apache.spark.sql.api.java.*;
>> ^
>> Test.java:6: package org.apache.spark.sql does not exist
>> import org.apache.spark.sql.*;
>> ^
>> Test.java:7: package org.apache.spark.sql.hive does not exist
>> import org.apache.spark.sql.hive.*;
>> ....
>>
>>
>> Let me know what I'm doing wrong.
>>
>> Thanks,
>>         Jerry
>>
>
>