GitHub user srowen commented on the issue:

    https://github.com/apache/spark/pull/22383
  
    Oh, hm:
    
    ```
    Serialization stack:
        - object not serializable (class: java.util.Optional, value: Optional[x])
        - field (class: scala.Tuple2, name: _2, type: class java.lang.Object)
        - object (class scala.Tuple2, (1,Optional[x]))
        - field (class: scala.Tuple2, name: _2, type: class java.lang.Object)
        - object (class scala.Tuple2, (1,(1,Optional[x])))
        - element of array (index: 0)
        - array (class [Lscala.Tuple2;, size 5)
    ```
    
    So `java.util.Optional` isn't `Serializable`. Well, that may scuttle this whole idea. I think we're going to find a number of instances where Spark or user apps need to `collect()` `Optional` objects.
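
    For reference, this reproduces outside Spark with plain Java serialization. A minimal sketch (the class name here is just illustrative):

    ```java
    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.ObjectOutputStream;
    import java.util.Optional;

    public class OptionalSerializationCheck {
        public static void main(String[] args) throws IOException {
            try (ObjectOutputStream out =
                     new ObjectOutputStream(new ByteArrayOutputStream())) {
                // java.util.Optional does not implement java.io.Serializable,
                // so this throws java.io.NotSerializableException -- the same
                // failure shown in the serialization stack above.
                out.writeObject(Optional.of("x"));
            }
        }
    }
    ```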
    
    Unless someone has a bright idea, I think we can't do this. It's the same reason I was unable to change Spark to use `java.util.function` interfaces -- lambdas aren't otherwise `Serializable` in Java!
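
    To illustrate that lambda point (again just a sketch, with illustrative names): a `java.util.function` lambda only serializes if its target type demands `Serializable`, e.g. via an intersection cast:

    ```java
    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.ObjectOutputStream;
    import java.io.Serializable;
    import java.util.function.Function;

    public class LambdaSerializationCheck {
        static void tryWrite(String label, Object o) {
            try (ObjectOutputStream out =
                     new ObjectOutputStream(new ByteArrayOutputStream())) {
                out.writeObject(o);
                System.out.println(label + ": ok");
            } catch (IOException e) {
                System.out.println(label + ": " + e);
            }
        }

        public static void main(String[] args) {
            // A plain java.util.function lambda is not Serializable:
            Function<Integer, Integer> plain = x -> x + 1;
            tryWrite("plain lambda", plain); // NotSerializableException

            // It only becomes Serializable when the target type asks for it,
            // here via an intersection cast:
            Function<Integer, Integer> ser =
                (Function<Integer, Integer> & Serializable) (x -> x + 1);
            tryWrite("cast lambda", ser); // ok
        }
    }
    ```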

