Another option is to use "Scala IDE", which is built on top of Eclipse, instead
of plain Eclipse; the Scala compiler and library come bundled with it.
Yong

> From: so...@cloudera.com
> Date: Tue, 10 Mar 2015 18:40:44 +0000
> Subject: Re: Compilation error
> To: mohitanch...@gmail.com
> CC: t...@databricks.com; user@spark.apache.org
> 
> A couple points:
> 
> You've got mismatched versions here -- 1.2.0 vs 1.2.1. You should fix
> that, but it's not the cause of this error.
> 
> These are also supposed to be 'provided' scope dependencies in Maven.
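> 
> For example (1.2.1 here is just whichever single release you pick -- the
> point is that both artifacts use the same version), the block could look
> like this:
> 
> <dependencies>
>   <dependency>
>     <groupId>org.apache.spark</groupId>
>     <artifactId>spark-core_2.10</artifactId>
>     <version>1.2.1</version>
>     <scope>provided</scope>
>   </dependency>
>   <dependency>
>     <groupId>org.apache.spark</groupId>
>     <artifactId>spark-streaming_2.10</artifactId>
>     <version>1.2.1</version>
>     <scope>provided</scope>
>   </dependency>
> </dependencies>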
> 
> You should get the Scala deps transitively and be able to import scala.*
> classes. It would be a little more correct to depend directly on the
> scala-library artifact, but in practice it's easiest not to in simple
> use cases.
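> 
> If you did want to declare it explicitly, something like this would do it
> (2.10.4 is only a guess -- match whatever 2.10.x version the Spark
> artifacts pull in transitively):
> 
> <dependency>
>   <groupId>org.scala-lang</groupId>
>   <artifactId>scala-library</artifactId>
>   <version>2.10.4</version>
> </dependency>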
> 
> If you're still having trouble, look at the output of "mvn dependency:tree".
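> 
> That is, from the project root:
> 
>   mvn dependency:tree -Dincludes=org.scala-lang:scala-library
> 
> (the -Dincludes filter is optional, if I remember the plugin right; it just
> narrows the output so you can see which scala-library version, if any, ends
> up on your compile classpath)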
> 
> On Tue, Mar 10, 2015 at 6:32 PM, Mohit Anchlia <mohitanch...@gmail.com> wrote:
> > I am using Maven and my dependencies look like this, but this doesn't seem
> > to be working:
> >
> > <dependencies>
> >   <dependency>
> >     <groupId>org.apache.spark</groupId>
> >     <artifactId>spark-streaming_2.10</artifactId>
> >     <version>1.2.0</version>
> >   </dependency>
> >   <dependency>
> >     <groupId>org.apache.spark</groupId>
> >     <artifactId>spark-core_2.10</artifactId>
> >     <version>1.2.1</version>
> >   </dependency>
> > </dependencies>
> >
> >
> > On Tue, Mar 10, 2015 at 11:06 AM, Tathagata Das <t...@databricks.com> wrote:
> >>
> >> If you are using a build tool like SBT/Maven/Gradle/etc, it figures out all
> >> the transitive dependencies and includes them in the classpath. I haven't
> >> touched Eclipse in years, so I am not sure off the top of my head what's
> >> going on instead. If you only downloaded spark-streaming_2.10.jar, then that
> >> is indeed insufficient, and you have to download all the transitive
> >> dependencies as well. Maybe you should create a Maven project inside
> >> Eclipse?
> >>
> >> TD
> >>
> >> On Tue, Mar 10, 2015 at 11:00 AM, Mohit Anchlia <mohitanch...@gmail.com>
> >> wrote:
> >>>
> >>> How do I do that? I haven't used Scala before.
> >>>
> >>> Also, the linking page doesn't mention that:
> >>>
> >>>
> >>> http://spark.apache.org/docs/1.2.0/streaming-programming-guide.html#linking
> >>>
> >>> On Tue, Mar 10, 2015 at 10:57 AM, Sean Owen <so...@cloudera.com> wrote:
> >>>>
> >>>> It means you do not have Scala library classes in your project
> >>>> classpath.
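> >>>>
> >>>> Once the Scala library is on the classpath, a minimal check like this
> >>>> should compile (just a sketch with a made-up class name, to confirm the
> >>>> import resolves -- not your streaming code):
> >>>>
> >>>> import scala.Tuple2;
> >>>>
> >>>> public class Tuple2Check {
> >>>>   public static void main(String[] args) {
> >>>>     // Tuple2 lives in the scala-library jar, not in Spark itself
> >>>>     Tuple2<String, Integer> pair = new Tuple2<String, Integer>("word", 1);
> >>>>     System.out.println(pair._1() + " -> " + pair._2());
> >>>>   }
> >>>> }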
> >>>>
> >>>> On Tue, Mar 10, 2015 at 5:54 PM, Mohit Anchlia <mohitanch...@gmail.com>
> >>>> wrote:
> >>>> > I am trying out the streaming example as documented, and I am using
> >>>> > Spark 1.2.1 streaming from Maven for Java.
> >>>> >
> >>>> > When I add this code I get a compilation error, and Eclipse is not able
> >>>> > to recognize Tuple2. I also don't see any "import scala.Tuple2" class
> >>>> > available.
> >>>> >
> >>>> >
> >>>> >
> >>>> > http://spark.apache.org/docs/1.2.0/streaming-programming-guide.html#a-quick-example
> >>>> >
> >>>> >
> >>>> > private void map(JavaReceiverInputDStream<String> lines) {
> >>>> >   JavaDStream<String> words = lines.flatMap(
> >>>> >     new FlatMapFunction<String, String>() {
> >>>> >       @Override public Iterable<String> call(String x) {
> >>>> >         return Arrays.asList(x.split(" "));
> >>>> >       }
> >>>> >     });
> >>>> >
> >>>> >   // Count each word in each batch
> >>>> >   JavaPairDStream<String, Integer> pairs = words.map(
> >>>> >     new PairFunction<String, String, Integer>() {
> >>>> >       @Override public Tuple2<String, Integer> call(String s) throws Exception {
> >>>> >         return new Tuple2<String, Integer>(s, 1);
> >>>> >       }
> >>>> >     });
> >>>> > }
> >>>
> >>>
> >>
> >
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
> 
                                          
