Hi Tathagata,

When you say to mark spark-core and spark-streaming as dependencies, what
do you mean?
I have installed the pre-built Spark 1.4 for Hadoop 2.6 from the Spark
downloads page. In my Maven pom.xml, I am using version 1.4 as described.

Please let me know how I can fix this.

Thanks
Nipun

On Thu, Jun 18, 2015 at 4:22 PM, Tathagata Das <t...@databricks.com> wrote:

> I think you may be including a different version of Spark Streaming in
> your assembly. Please mark spark-core and spark-streaming as provided
> dependencies. Any installation of Spark will automatically provide Spark in
> the classpath, so you do not have to bundle it.
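> For example, the dependency entries in the pom.xml could look like this
> (a sketch; the artifact ids and version shown are the Spark 1.4 / Scala
> 2.10 defaults and may need adjusting for your build):
>
>     <dependency>
>       <groupId>org.apache.spark</groupId>
>       <artifactId>spark-core_2.10</artifactId>
>       <version>1.4.0</version>
>       <scope>provided</scope>
>     </dependency>
>     <dependency>
>       <groupId>org.apache.spark</groupId>
>       <artifactId>spark-streaming_2.10</artifactId>
>       <version>1.4.0</version>
>       <scope>provided</scope>
>     </dependency>
>
> With provided scope, these jars are available at compile time but are
> left out of your assembly, so the cluster's own Spark jars are used at
> run time.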
>
> On Thu, Jun 18, 2015 at 8:44 AM, Nipun Arora <nipunarora2...@gmail.com>
> wrote:
>
>> Hi,
>>
>> I have the following piece of code, where I am trying to transform a
>> Spark stream and add the min and max of each RDD to it. However, at run
>> time I get an error saying the max call does not exist (the code compiles
>> properly). I am using Spark 1.4.
>>
>> I have added the question to stackoverflow as well:
>> http://stackoverflow.com/questions/30902090/adding-max-and-min-in-spark-stream-in-java/30909796#30909796
>>
>> Any help is greatly appreciated :)
>>
>> Thanks
>> Nipun
>>
>> JavaPairDStream<Tuple2<Long, Integer>, Tuple3<Integer, Long, Long>>
>>         sortedtsStream = transformedMaxMintsStream.transformToPair(new Sort2());
>>
>> sortedtsStream.foreach(
>>         new Function<JavaPairRDD<Tuple2<Long, Integer>,
>>                 Tuple3<Integer, Long, Long>>, Void>() {
>>             @Override
>>             public Void call(JavaPairRDD<Tuple2<Long, Integer>,
>>                     Tuple3<Integer, Long, Long>> tuple2Tuple3JavaPairRDD) throws Exception {
>>                 List<Tuple2<Tuple2<Long, Integer>, Tuple3<Integer, Long, Long>>> templist =
>>                         tuple2Tuple3JavaPairRDD.collect();
>>                 for (Tuple2<Tuple2<Long, Integer>, Tuple3<Integer, Long, Long>> tuple : templist) {
>>                     Date date = new Date(tuple._1._1);
>>                     int pattern = tuple._1._2;
>>                     int count = tuple._2._1();
>>                     Date maxDate = new Date(tuple._2._2());
>>                     // note: this read tuple._2._2() again; the min is the
>>                     // third element of the Tuple3
>>                     Date minDate = new Date(tuple._2._3());
>>                     System.out.println("TimeSlot: " + date.toString()
>>                             + " Pattern: " + pattern + " Count: " + count
>>                             + " Max: " + maxDate.toString()
>>                             + " Min: " + minDate.toString());
>>                 }
>>                 return null;
>>             }
>>         }
>> );
>>
>> Error:
>>
>>
>> 15/06/18 11:05:06 INFO BlockManagerInfo: Added input-0-1434639906000 in memory on localhost:42829 (size: 464.0 KB, free: 264.9 MB)
>> 15/06/18 11:05:06 INFO BlockGenerator: Pushed block input-0-1434639906000
>> Exception in thread "JobGenerator" java.lang.NoSuchMethodError: org.apache.spark.api.java.JavaPairRDD.max(Ljava/util/Comparator;)Lscala/Tuple2;
>>         at org.necla.ngla.spark_streaming.MinMax.call(Type4ViolationChecker.java:346)
>>         at org.necla.ngla.spark_streaming.MinMax.call(Type4ViolationChecker.java:340)
>>         at org.apache.spark.streaming.api.java.JavaDStreamLike$class.scalaTransform$3(JavaDStreamLike.scala:360)
>>         at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$transformToPair$1.apply(JavaDStreamLike.scala:361)
>>         at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$transformToPair$1.apply(JavaDStreamLike.scala:361)
>>         at org.apache.spark.streaming.dstream.DStream$$anonfun$transform$1$$anonf
>>
>
