On Mon, Oct 26, 2015 at 9:01 AM, Bryan Jeffrey <bryan.jeff...@gmail.com>
wrote:
All,
I'm seeing the following error compiling Spark 1.4.1 w/ Scala 2.11 & Hive
support. Any ideas?
mvn -Dhadoop.version=2.6.1 -Dscala-2.11 -DskipTests -Pyarn -Phive
-Phive-thriftserver package
[INFO] Spark Project Parent POM .. SUCCESS [4.124s]
[INFO] Spark Laun
Scala 2.11 is supported in 1.5.1 release:
http://search.maven.org/#search%7Cga%7C1%7Ca%3A%22spark-parent_2.11%22
Can you upgrade?
Cheers
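A minimal sketch of that upgrade path, following the Spark 1.5.x build documentation (the `change-scala-version.sh` helper script ships in the 1.5.x source tree; the 1.5 docs also note that a few components had limited Scala 2.11 support, so check them for your profiles):

```shell
# Sketch: building Spark 1.5.1 against Scala 2.11, per the 1.5.x build docs.
# Assumes you are in the root of the spark-1.5.1 source tree.
./dev/change-scala-version.sh 2.11
mvn -Dhadoop.version=2.6.1 -Dscala-2.11 -DskipTests \
    -Pyarn -Phive -Phive-thriftserver clean package
```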
On Mon, Oct 26, 2015 at 6:01 AM, Bryan Jeffrey <bryan.jeff...@gmail.com>
wrote:
I assume you're trying to sum *by key* within each window. The _ + _
operation applies to integers, but here you're telling it to sum
(String, Int) pairs, for which + isn't defined. Use reduceByKeyAndWindow
instead.
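A minimal sketch of that suggestion, assuming an existing DStream[String] named `lines` (the name is illustrative, not from the thread), with the 10-second window and slide used elsewhere in this thread:

```scala
import org.apache.spark.streaming.Seconds
import org.apache.spark.streaming.StreamingContext._

// Sum counts per key over a 10-second window sliding every 10 seconds.
// The reduce function takes the Int values, not the (String, Int) pairs.
val counts = lines
  .flatMap(_.split(" "))
  .map(word => (word, 1))
  .reduceByKeyAndWindow((a: Int, b: Int) => a + b, Seconds(10), Seconds(10))
```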
On Sat, Jan 31, 2015 at 12:00 AM, Eduardo Costa Alfaia
<e.costaalf...@unibs.it> wrote:
Hi Guys,
Any idea how to solve this error?
[error]
/sata_disk/workspace/spark-1.1.1/examples/src/main/scala/org/apache/spark/examples/streaming/KafkaWordCount.scala:76:
missing parameter type for expanded function ((x$6, x$7) => x$6.$plus(x$7))
This is how I do it:
val tmp = test.map(x => (x, 1L)).reduceByWindow({ case ((word1, count1),
(word2, count2)) => (word1 + " " + word2, count1 + count2) }, Seconds(10),
Seconds(10))
In your case you are actually having a type mismatch:
[inline screenshot of the compiler error omitted]
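The mismatch can be reproduced with plain Scala collections, no Spark required (the data below is illustrative):

```scala
// (String, Long) pairs define no `+`, so `_ + _` cannot be expanded there;
// this is the same failure reported for KafkaWordCount.scala line 76.
val pairs = Seq(("spark", 1L), ("kafka", 2L), ("spark", 3L))

// Does not compile, mirroring the error in the thread:
// val bad = pairs.reduce(_ + _)

// Summing the Long values per key typechecks and works:
val counts = pairs.groupBy(_._1).map { case (k, vs) => (k, vs.map(_._2).sum) }
// counts == Map("spark" -> 4L, "kafka" -> 2L)
```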
Thanks
Best Regards