Normally people set up a Maven project with Spark dependencies, or use sbt.

Can you go with either approach?
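
For example, with sbt you'd declare the dependencies in build.sbt along
these lines (a rough sketch; the Scala version is an assumption and should
match the one your Spark build was compiled against):

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.4.1",
  "org.apache.spark" %% "spark-sql"  % "1.4.1",
  "org.apache.spark" %% "spark-hive" % "1.4.1"
)

With Maven the coordinates are the same (groupId org.apache.spark,
artifactIds spark-core_2.10, spark-sql_2.10, spark-hive_2.10), declared as
<dependency> entries in your pom.xml.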

Cheers

On Tue, Aug 18, 2015 at 10:28 AM, Jerry <jerry.c...@gmail.com> wrote:

> Hello,
>
> So I set up Spark to run on my local machine to see if I can reproduce
> the issue I'm having with data frames, but I'm running into trouble with
> the compiler.
>
> Here's what I got:
>
> $ echo $CLASSPATH
>
> /usr/lib/jvm/java-6-oracle/lib:/home/adminz/dev/spark/spark-1.4.1/lib/spark-assembly-1.4.1-hadoop2.6.0.jar
>
>
> $ javac Test.java
> Test.java:1: package org.apache.spark.sql.api.java does not exist
> import org.apache.spark.sql.api.java.*;
> ^
> Test.java:6: package org.apache.spark.sql does not exist
> import org.apache.spark.sql.*;
> ^
> Test.java:7: package org.apache.spark.sql.hive does not exist
> import org.apache.spark.sql.hive.*;
> ....
>
>
> Let me know what I'm doing wrong.
>
> Thanks,
>         Jerry
>
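
Jerry: if you just want a quick check without a build tool, passing the
assembly jar to javac explicitly should also work, e.g.:

javac -cp /home/adminz/dev/spark/spark-1.4.1/lib/spark-assembly-1.4.1-hadoop2.6.0.jar Test.java

Note that javac only sees the CLASSPATH environment variable if it has been
exported in your shell, so passing -cp explicitly is the more reliable route.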
