Re: Spark Import Issue

2013-12-08 Thread Andrew Ash
Also note that when you pass multiple jars to the JVM's -cp flag, the only
way to do it with a wildcard is to include an entire directory with "dir/*"
-- you can't use "dir/*.jar" or "dir/spark*.jar" or any other glob pattern.

http://stackoverflow.com/questions/219585/setting-multiple-jars-in-java-classpath
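A quick sketch of the distinction (the lib/ directory and jar names below are invented for illustration):

```shell
# Unquoted, the shell expands the glob before the JVM ever sees it; quoted,
# the literal string reaches the JVM, which expands only a bare trailing "*"
# (to every .jar file in that directory).
mkdir -p lib && touch lib/a.jar lib/b.jar

echo lib/*.jar     # shell glob: prints "lib/a.jar lib/b.jar"
echo "lib/*"       # prints the literal "lib/*", which the JVM would expand itself
echo "lib/*.jar"   # prints "lib/*.jar"; the JVM would read it as one literal file name
```

So java -cp "lib/*" Main picks up every jar in lib/, while java -cp "lib/*.jar" Main matches nothing.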


On Sun, Dec 8, 2013 at 12:25 AM, Matei Zaharia wrote:

> I’m not sure you can have a star inside that quoted classpath argument
> (the double quotes may cancel the *). Try using the JAR through its full
> name, or link to Spark through Maven (
> http://spark.incubator.apache.org/docs/latest/quick-start.html#a-standalone-app-in-java
> ).
>
> Matei
>
> On Dec 6, 2013, at 9:50 AM, Garrett Hamers  wrote:
>
> Hello,
>
> I am new to the Spark system, and I am trying to write a simple program to
> get myself familiar with how Spark works. I am currently having a problem
> importing the Spark package. I am getting the following compiler error:
> package org.apache.spark.api.java does not exist.
>
> I have spark-0.8.0-incubating installed. I ran the commands: sbt/sbt
> compile, sbt/sbt assembly, and sbt/sbt publish-local without any errors. My
> sql.java file is located in the spark-0.8.0-incubating root directory. I
> tried to compile the code using “javac sql.java” and “javac -cp
> "assembly/target/scala-2.9.3/spark-assembly_2.9.3-0.8.0-incubating*.jar"
> sql.java”.
>
> Here is the code for sql.java:
>
> package shark;
>
> import java.io.Serializable;
>
> import java.util.List;
>
> import java.io.*;
>
> import org.apache.spark.api.java.*; //Issue is here
>
> public class sql implements Serializable {
>
>   public static void main( String[] args) {
>
> System.out.println("Hello World");
>
>   }
>
> }
>
>
> What do I need to do for Java to import the Spark classes properly?
> Any advice would be greatly appreciated.
>
> Thank you,
> Garrett Hamers
>
>
>


Re: Spark Import Issue

2013-12-08 Thread Matei Zaharia
I’m not sure you can have a star inside that quoted classpath argument (the 
double quotes may cancel the *). Try using the JAR through its full name, or 
link to Spark through Maven 
(http://spark.incubator.apache.org/docs/latest/quick-start.html#a-standalone-app-in-java).
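Concretely, one way to use the JAR's full name — sketched below on the assumption that the build put exactly one assembly jar in assembly/target/scala-2.9.3/; the script looks the name up rather than hard-coding it, since it varies with the Spark/Hadoop build:

```shell
# Let the shell (not the JVM) expand the glob to find the assembly jar,
# then build a javac command that uses the full file name with no wildcard.
JAR=$(ls assembly/target/scala-2.9.3/spark-assembly*.jar 2>/dev/null | head -n 1)
echo "javac -cp \"$JAR\" sql.java"
```

The Maven route from the quick-start guide linked above sidesteps the classpath question entirely.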

Matei

On Dec 6, 2013, at 9:50 AM, Garrett Hamers  wrote:

> Hello,
> I am new to the Spark system, and I am trying to write a simple program to 
> get myself familiar with how Spark works. I am currently having a problem 
> importing the Spark package. I am getting the following compiler error: 
> package org.apache.spark.api.java does not exist. 
> I have spark-0.8.0-incubating installed. I ran the commands: sbt/sbt compile, 
> sbt/sbt assembly, and sbt/sbt publish-local without any errors. My sql.java 
> file is located in the spark-0.8.0-incubating root directory. I tried to 
> compile the code using “javac sql.java” and “javac -cp 
> "assembly/target/scala-2.9.3/spark-assembly_2.9.3-0.8.0-incubating*.jar" 
> sql.java”.
> 
> Here is the code for sql.java:
> package shark;
> import java.io.Serializable;
> import java.util.List;
> import java.io.*;
> import org.apache.spark.api.java.*; //Issue is here
> public class sql implements Serializable { 
>   public static void main( String[] args) {
> System.out.println("Hello World");
>   }
> }
> 
> What do I need to do for Java to import the Spark classes properly? Any 
> advice would be greatly appreciated.
> 
> Thank you,
> Garrett Hamers



Spark Import Issue

2013-12-06 Thread Garrett Hamers
Hello,

I am new to the Spark system, and I am trying to write a simple program to
get myself familiar with how Spark works. I am currently having a problem
importing the Spark package. I am getting the following compiler error:
package org.apache.spark.api.java does not exist.

I have spark-0.8.0-incubating installed. I ran the commands: sbt/sbt compile,
sbt/sbt assembly, and sbt/sbt publish-local without any errors. My sql.java
file is located in the spark-0.8.0-incubating root directory. I tried to
compile the code using “javac sql.java” and “javac -cp
"assembly/target/scala-2.9.3/spark-assembly_2.9.3-0.8.0-incubating*.jar"
sql.java”.

Here is the code for sql.java:

package shark;

import java.io.Serializable;

import java.util.List;

import java.io.*;

import org.apache.spark.api.java.*; //Issue is here

public class sql implements Serializable {

  public static void main( String[] args) {

System.out.println("Hello World");

  }

}


What do I need to do for Java to import the Spark classes properly?
Any advice would be greatly appreciated.

Thank you,
Garrett Hamers