Re: A little Scala 2.12 help

2017-09-19 Thread Jacek Laskowski
Hi,

Nice catch, Sean! Learnt this today. They did say you could learn a lot
with Spark! :)

Pozdrawiam,
Jacek Laskowski

https://about.me/JacekLaskowski
Spark Structured Streaming (Apache Spark 2.2+)
https://bit.ly/spark-structured-streaming
Mastering Apache Spark 2 https://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski

On Tue, Sep 19, 2017 at 4:23 PM, Sean Owen wrote:

> I figured this out. It's another effect of a new behavior in 2.12:
> eta-expansion of zero-argument method values is deprecated. Imagine:
>
> def f(): String = "foo"
> def g(fn: () => String) = ???
>
> g(f) works in 2.11 without warning. It generates a warning in 2.12,
> because it wants you to explicitly make a function from the method
> reference: g(() => f). It may become an error in 2.13.
>
> But, this affects implicit resolution. Some of the implicits that power
> SparkContext.sequenceFile() need to change to be vals of type () =>
> WritableConverter[T], not methods that return WritableConverter[T].
>
> I'm working through this and other deprecated items in 2.12 and preparing
> more 2.11-compatible changes that allow these to work cleanly in 2.12.
>
> On Fri, Sep 15, 2017 at 11:21 AM Sean Owen wrote:
>
>> I'm working on updating to Scala 2.12 and have hit a compile error that
>> I'm struggling to design a fix for (one that doesn't modify the API
>> significantly). If you run "./dev/change-scala-version.sh 2.12" and
>> compile, you'll see errors like...
>>
>> [error] /Users/srowen/Documents/Cloudera/spark/core/src/test/scala/org/apache/spark/FileSuite.scala:100:
>> could not find implicit value for parameter kcf:
>> () => org.apache.spark.WritableConverter[org.apache.hadoop.io.IntWritable]
>> [error] Error occurred in an application involving default arguments.
>> [error] val output = sc.sequenceFile[IntWritable, Text](outputDir)
>>
>> Clearly implicit resolution changed a bit in 2.12. I don't recall seeing
>> this error before, so it might be related to 2.12.3 specifically, but I'm
>> not sure.
>>
>> As you can see, the implicits that have always existed, are imported, and
>> should apply here don't seem to be found.
>>
>> If anyone who's a Scala expert could glance at this, you might save me a
>> lot of puzzling.
>>
>


Re: A little Scala 2.12 help

2017-09-19 Thread Sean Owen
I figured this out. It's another effect of a new behavior in 2.12:
eta-expansion of zero-argument method values is deprecated. Imagine:

def f(): String = "foo"
def g(fn: () => String) = ???

g(f) works in 2.11 without warning. It generates a warning in 2.12, because
it wants you to explicitly make a function from the method reference:
g(() => f). It may become an error in 2.13.

But, this affects implicit resolution. Some of the implicits that power
SparkContext.sequenceFile() need to change to be vals of type () =>
WritableConverter[T], not methods that return WritableConverter[T].
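
To make that concrete, here's a rough, self-contained sketch of the shape
of the change (Conv stands in for WritableConverter; the names are
illustrative, not Spark's actual code):

object ConverterImplicits {
  class Conv[T]

  def needsConverter[T]()(implicit cf: () => Conv[T]): Conv[T] = cf()

  object Before {
    // Zero-argument implicit method. Scala 2.11 eta-expands it to satisfy
    // an implicit parameter of type () => Conv[Int]; 2.12 no longer does
    // that during implicit search, so resolution fails.
    implicit def intConverter(): Conv[Int] = new Conv[Int]
  }

  object After {
    // An implicit val that already has the function type the parameter
    // asks for, so it resolves under both 2.11 and 2.12.
    implicit val intConverterFn: () => Conv[Int] = () => new Conv[Int]
  }

  // import Before._; needsConverter[Int]()  // compiles on 2.11, not 2.12
  // import After._;  needsConverter[Int]()  // compiles on both
}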

I'm working through this and other deprecated items in 2.12 and preparing
more 2.11-compatible changes that allow these to work cleanly in 2.12.

On Fri, Sep 15, 2017 at 11:21 AM Sean Owen wrote:

> I'm working on updating to Scala 2.12 and have hit a compile error that
> I'm struggling to design a fix for (one that doesn't modify the API
> significantly). If you run "./dev/change-scala-version.sh 2.12" and
> compile, you'll see errors like...
>
> [error] /Users/srowen/Documents/Cloudera/spark/core/src/test/scala/org/apache/spark/FileSuite.scala:100:
> could not find implicit value for parameter kcf:
> () => org.apache.spark.WritableConverter[org.apache.hadoop.io.IntWritable]
> [error] Error occurred in an application involving default arguments.
> [error] val output = sc.sequenceFile[IntWritable, Text](outputDir)
>
> Clearly implicit resolution changed a bit in 2.12. I don't recall seeing
> this error before, so it might be related to 2.12.3 specifically, but I'm
> not sure.
>
> As you can see, the implicits that have always existed, are imported, and
> should apply here don't seem to be found.
>
> If anyone who's a Scala expert could glance at this, you might save me a
> lot of puzzling.
>


A little Scala 2.12 help

2017-09-15 Thread Sean Owen
I'm working on updating to Scala 2.12 and have hit a compile error that I'm
struggling to design a fix for (one that doesn't modify the API
significantly). If you run "./dev/change-scala-version.sh 2.12" and
compile, you'll see errors like...

[error] /Users/srowen/Documents/Cloudera/spark/core/src/test/scala/org/apache/spark/FileSuite.scala:100:
could not find implicit value for parameter kcf:
() => org.apache.spark.WritableConverter[org.apache.hadoop.io.IntWritable]
[error] Error occurred in an application involving default arguments.
[error] val output = sc.sequenceFile[IntWritable, Text](outputDir)

Clearly implicit resolution changed a bit in 2.12. I don't recall seeing
this error before, so it might be related to 2.12.3 specifically, but I'm
not sure.

As you can see, the implicits that have always existed, are imported, and
should apply here don't seem to be found.
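
For reference, here's a stripped-down sketch of the shape of the failing
call, with stand-in types (Conv stands in for WritableConverter; this is
not Spark's actual code):

object Repro {
  class Conv[T]

  // Mirrors the shape of sequenceFile: a default argument plus implicit
  // parameters of function type () => Conv[_].
  def sequenceFile[K, V](path: String, minPartitions: Int = 2)
      (implicit kcf: () => Conv[K], vcf: () => Conv[V]): Unit = ()

  // Mirrors the shape of the existing converter implicits: zero-argument
  // methods returning Conv[_].
  implicit def intConv(): Conv[Int] = new Conv[Int]
  implicit def textConv(): Conv[String] = new Conv[String]

  // Compiles on 2.11; on 2.12 it fails with an error like the one above
  // ("could not find implicit value for parameter kcf: () => Conv[Int]").
  sequenceFile[Int, String]("/tmp/out")
}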

If anyone who's a Scala expert could glance at this, you might save me a
lot of puzzling.