Github user holdenk commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21416#discussion_r191491943
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/Column.scala ---
    @@ -786,6 +787,24 @@ class Column(val expr: Expression) extends Logging {
       @scala.annotation.varargs
       def isin(list: Any*): Column = withExpr { In(expr, list.map(lit(_).expr)) }
    --- End diff --
    
    I know this is not your change, but I think (both here and below) the 
docstring/scaladoc/javadoc should say something about the automagical type 
casting that's going on, because to me it's a little surprising that this will 
compare integers to strings and silently convert the types, even when none of 
the strings can be converted to integers. I'd also include that in the 
isInCollection docstring/scaladoc/javadoc below.
    
    I'd also point out that the result of the conversion needs to be of the 
same type, not a sequence of that type (although the error message we get is 
pretty clear, so your call).
    
    Just a suggestion for improvement.
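    
    To make the suggestion concrete, here is a rough sketch of the kind of 
note that could be added to the `isin` scaladoc (the wording below is mine, 
not from the PR, and the exact coercion behavior should be verified against 
Catalyst's type-coercion rules before documenting it):
    
```scala
/**
 * A boolean expression that is evaluated to true if the value of this
 * expression is contained by the evaluated values of the arguments.
 *
 * Note: the arguments are implicitly coerced to a common type with this
 * column before comparison (e.g. an integer column can be compared against
 * string arguments). This coercion happens silently, even when an argument
 * cannot meaningfully be converted to the column's type.
 */
@scala.annotation.varargs
def isin(list: Any*): Column = withExpr { In(expr, list.map(lit(_).expr)) }
```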


---
