Github user michalsenkyr commented on the issue:

    https://github.com/apache/spark/pull/20505
  
    Yes, that is the idea. Frankly, I am not familiar enough with how the 
compiler resolves all the implicit parameters to say confidently what is going 
on. But here's my take:
    
    I did a little more research and found out that the "diverging implicit 
expansion" error means the compiler has detected a resolution path that could be 
potentially infinite. I think the compiler may be following my other implicit 
methods, trying to fit various collections into V in the hope of eventually 
satisfying the `<:<` condition, before finally giving up.
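    
    For illustration, here is a toy example (hypothetical, not from this PR or 
Spark) of the kind of self-referential implicit that triggers that error: each 
resolution step demands a strictly "bigger" instance, so the compiler detects 
divergence and aborts:
    ```
    trait Enc[A]
    object Enc {
      // To build Enc[A], the compiler first needs Enc[(A, A)],
      // which in turn needs Enc[((A, A), (A, A))], and so on forever.
      implicit def deeper[A](implicit inner: Enc[(A, A)]): Enc[A] =
        new Enc[A] {}
    }
    // implicitly[Enc[Int]]
    // error: diverging implicit expansion for type Enc[(Int, Int)]
    // starting with method deeper in object Enc
    ```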
    
    Just to be sure multi-level collections work with my change, I successfully 
tried this:
    ```
    scala> implicitly[Encoder[Map[Seq[Map[String, Seq[Long]]], 
List[Array[Map[String, Int]]]]]]
    res5: 
org.apache.spark.sql.Encoder[Map[Seq[Map[String,Seq[Long]]],List[Array[Map[String,Int]]]]]
 = class[value[0]: 
map<array<map<string,array<bigint>>>,array<array<map<string,int>>>>]
    ```
    
    One thing that doesn't make sense to me, however, is the output the 
compiler produces when implicit resolution logging is enabled via 
`-Xlog-implicits`:
    ```
    scala> implicitly[Encoder[Map[String, Any]]]
    <console>:25: newCheckedSetEncoder is not a valid implicit value for 
org.apache.spark.sql.Encoder[Map[String,Any]] because:
    hasMatchingSymbol reported error: polymorphic expression cannot be 
instantiated to expected type;
     found   : [T[_], E]org.apache.spark.sql.Encoder[T[E]]
     required: org.apache.spark.sql.Encoder[Map[String,Any]]
           implicitly[Encoder[Map[String, Any]]]
                     ^
    <console>:25: newCheckedSetEncoder is not a valid implicit value for 
org.apache.spark.sql.Encoder[String] because:
    hasMatchingSymbol reported error: polymorphic expression cannot be 
instantiated to expected type;
     found   : [T[_], E]org.apache.spark.sql.Encoder[T[E]]
     required: org.apache.spark.sql.Encoder[String]
           implicitly[Encoder[Map[String, Any]]]
                     ^
    <console>:25: newCheckedMapEncoder is not a valid implicit value for 
org.apache.spark.sql.Encoder[String] because:
    hasMatchingSymbol reported error: polymorphic expression cannot be 
instantiated to expected type;
     found   : [T[_, _], K, V]org.apache.spark.sql.Encoder[T[K,V]]
     required: org.apache.spark.sql.Encoder[String]
           implicitly[Encoder[Map[String, Any]]]
                     ^
    <console>:25: newCheckedSequenceEncoder is not a valid implicit value for 
org.apache.spark.sql.Encoder[String] because:
    hasMatchingSymbol reported error: polymorphic expression cannot be 
instantiated to expected type;
     found   : [T[_], E]org.apache.spark.sql.Encoder[T[E]]
     required: org.apache.spark.sql.Encoder[String]
           implicitly[Encoder[Map[String, Any]]]
                     ^
    <console>:25: materializing requested 
reflect.runtime.universe.type.TypeTag[A] using 
`package`.this.materializeTypeTag[A](scala.reflect.runtime.`package`.universe)
           implicitly[Encoder[Map[String, Any]]]
                     ^
    <console>:25: newCheckedMapEncoder is not a valid implicit value for 
org.apache.spark.sql.Encoder[E] because:
    hasMatchingSymbol reported error: diverging implicit expansion for type 
org.apache.spark.sql.Encoder[K]
    starting with method newStringEncoder in class SQLImplicits
           implicitly[Encoder[Map[String, Any]]]
                     ^
    <console>:25: newCheckedSetEncoder is not a valid implicit value for 
org.apache.spark.sql.Encoder[Any] because:
    hasMatchingSymbol reported error: ambiguous implicit values:
     both method newIntEncoder in class SQLImplicits of type => 
org.apache.spark.sql.Encoder[Int]
     and method newLongEncoder in class SQLImplicits of type => 
org.apache.spark.sql.Encoder[Long]
     match expected type org.apache.spark.sql.Encoder[E]
           implicitly[Encoder[Map[String, Any]]]
                     ^
    <console>:25: newCheckedMapEncoder is not a valid implicit value for 
org.apache.spark.sql.Encoder[Any] because:
    hasMatchingSymbol reported error: diverging implicit expansion for type 
org.apache.spark.sql.Encoder[K]
    starting with method newStringEncoder in class SQLImplicits
           implicitly[Encoder[Map[String, Any]]]
                     ^
    <console>:25: materializing requested 
reflect.runtime.universe.type.TypeTag[A] using 
`package`.this.materializeTypeTag[A](scala.reflect.runtime.`package`.universe)
           implicitly[Encoder[Map[String, Any]]]
                     ^
    <console>:25: newCheckedMapEncoder is not a valid implicit value for 
org.apache.spark.sql.Encoder[E] because:
    hasMatchingSymbol reported error: diverging implicit expansion for type 
org.apache.spark.sql.Encoder[K]
    starting with method newStringEncoder in class SQLImplicits
           implicitly[Encoder[Map[String, Any]]]
                     ^
    <console>:25: newCheckedSequenceEncoder is not a valid implicit value for 
org.apache.spark.sql.Encoder[Any] because:
    hasMatchingSymbol reported error: ambiguous implicit values:
     both method newIntEncoder in class SQLImplicits of type => 
org.apache.spark.sql.Encoder[Int]
     and method newLongEncoder in class SQLImplicits of type => 
org.apache.spark.sql.Encoder[Long]
     match expected type org.apache.spark.sql.Encoder[E]
           implicitly[Encoder[Map[String, Any]]]
                     ^
    <console>:25: materializing requested 
reflect.runtime.universe.type.TypeTag[Any] using 
`package`.this.materializeTypeTag[Any](scala.reflect.runtime.`package`.universe)
           implicitly[Encoder[Map[String, Any]]]
                     ^
    <console>:25: newProductEncoder is not a valid implicit value for 
org.apache.spark.sql.Encoder[Any] because:
    typing TypeApply reported errors for the implicit tree: type arguments 
[Any] do not conform to method newProductEncoder's type parameter bounds [T <: 
Product]
           implicitly[Encoder[Map[String, Any]]]
                     ^
    <console>:25: newCheckedMapEncoder is not a valid implicit value for 
org.apache.spark.sql.Encoder[Map[String,Any]] because:
    hasMatchingSymbol reported error: diverging implicit expansion for type 
org.apache.spark.sql.Encoder[K]
    starting with method newStringEncoder in class SQLImplicits
           implicitly[Encoder[Map[String, Any]]]
                     ^
    <console>:25: newCheckedSequenceEncoder is not a valid implicit value for 
org.apache.spark.sql.Encoder[Map[String,Any]] because:
    hasMatchingSymbol reported error: polymorphic expression cannot be 
instantiated to expected type;
     found   : [T[_], E]org.apache.spark.sql.Encoder[T[E]]
     required: org.apache.spark.sql.Encoder[Map[String,Any]]
           implicitly[Encoder[Map[String, Any]]]
                     ^
    <console>:25: materializing requested 
reflect.runtime.universe.type.TypeTag[Map[String,Any]] using 
`package`.this.materializeTypeTag[Map[String,Any]](scala.reflect.runtime.`package`.universe)
           implicitly[Encoder[Map[String, Any]]]
                     ^
    <console>:25: newProductEncoder is not a valid implicit value for 
org.apache.spark.sql.Encoder[Map[String,Any]] because:
    typing TypeApply reported errors for the implicit tree: type arguments 
[Map[String,Any]] do not conform to method newProductEncoder's type parameter 
bounds [T <: Product]
           implicitly[Encoder[Map[String, Any]]]
                     ^
    <console>:25: error: diverging implicit expansion for type 
org.apache.spark.sql.Encoder[Map[String,Any]]
    starting with method newStringEncoder in class SQLImplicits
           implicitly[Encoder[Map[String, Any]]]
    ```
    Note the "ambiguous implicit values" errors on the Long/Int encoders while 
looking for Set/Seq encoders. I really can't explain what led the compiler to 
consider those. I'll dig in some more and try to find out exactly what is going 
on there, but I suspect it may have something to do with covariance on the 
collections' type parameters.
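    
    To sketch the covariance suspicion with a toy example (again hypothetical, 
not Spark's actual `Encoder`, which is invariant): if the implicit's type 
constructor were covariant, instances for unrelated subtypes would all conform 
to the `Any` request and neither would be more specific, producing exactly this 
kind of ambiguity:
    ```
    trait Box[+A]  // covariant, so Box[Int] <: Box[Any]
    implicit val intBox: Box[Int] = new Box[Int] {}
    implicit val longBox: Box[Long] = new Box[Long] {}
    // implicitly[Box[Any]]
    // error: ambiguous implicit values:
    //  both value intBox of type Box[Int]
    //  and value longBox of type Box[Long]
    //  match expected type Box[Any]
    ```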

