Github user mccheah commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21366#discussion_r194572858
  
    --- Diff: pom.xml ---
    @@ -760,6 +760,12 @@
             <version>1.10.19</version>
             <scope>test</scope>
           </dependency>
    +      <dependency>
    --- End diff ---
    
    > If I'm not mistaken, Java 9 has introduced the related interfaces, so rx-java
might not be needed in the future once Spark updates the supported Java version.
    
    Spark will need to support Java 8 for the foreseeable future, so we'll be
using a solution that is compatible with Java 8 for a while.
    
    > Some more Scala-centric implementations: Akka Streams, a Reactive Streams and
JDK 9+ java.util.concurrent.Flow-compliant implementation that does not depend on
rx-java. Also there is Monix, and RxScala seems outdated. A comparison between
Monix and others is here.
    
    Does Akka Streams without JDK 9 depend on ReactiveX? If so, then we have the
same dependency problem. As for Monix vs. Akka vs. RxJava, I think that's an
implementation detail. At the end of the day the main concern is the dependency
addition: either way we're weighing adding an external dependency against
implementing the semantics ourselves. If we think Monix or Akka is a more elegant
solution than RxJava, we can switch to that, but given how little of what we're
actually doing is in the functional programming style, I don't think the
difference between Java and Scala is significant in this particular instance.
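
    To make the comparison concrete, here is a purely hypothetical sketch of the
kind of publish/subscribe semantics at stake, written against the RxJava 1.x `rx`
API. This is not code from this PR; the `Snapshot` type and all names are made up
for illustration.

    ```scala
    import java.util.concurrent.TimeUnit

    import rx.functions.Action1
    import rx.subjects.PublishSubject

    // Made-up snapshot type purely for illustration.
    case class Snapshot(payload: String)

    object RxSnapshotSketch {
      // All snapshots flow through a single subject.
      val snapshots: PublishSubject[Snapshot] = PublishSubject.create[Snapshot]()

      // Wire up two subscribers that consume the same stream at different rates.
      def start(): Unit = {
        // Subscriber A reacts to every snapshot as soon as it is published.
        snapshots.subscribe(new Action1[Snapshot] {
          override def call(s: Snapshot): Unit = println(s"eager subscriber: $s")
        })
        // Subscriber B only sees the most recent snapshot once every 5 seconds.
        snapshots.sample(5, TimeUnit.SECONDS).subscribe(new Action1[Snapshot] {
          override def call(s: Snapshot): Unit = println(s"sampling subscriber: $s")
        })
      }

      // Producers push snapshots into the subject as they are observed.
      def publish(s: Snapshot): Unit = snapshots.onNext(s)
    }
    ```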
    
    > From the dependencies already brought in by Spark, is there a lib that could
provide similar functionality, if not based on Reactive stuff then in some other
way, with queues or something?
    
    I didn't see any, but I am open to being corrected here. Spark implements an
event queue, but it doesn't suffice for the paradigm of having multiple
subscribers process snapshots at different intervals.
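
    For contrast with the RxJava sketch above, a dependency-free version would
have to implement roughly the same thing by hand, along these lines. Again this is
a purely hypothetical sketch, not anything that exists in Spark; the names and the
`Snapshot` type are made up. Each subscriber gets its own buffer and drains it on
its own schedule.

    ```scala
    import java.util.concurrent.{ConcurrentLinkedQueue, CopyOnWriteArrayList, Executors, TimeUnit}

    import scala.collection.JavaConverters._

    // Made-up snapshot type purely for illustration.
    case class Snapshot(payload: String)

    object HandRolledSnapshotBus {
      private val scheduler = Executors.newScheduledThreadPool(4)
      // One queue per subscriber so a slow consumer never blocks the others.
      private val buffers = new CopyOnWriteArrayList[ConcurrentLinkedQueue[Snapshot]]()

      // Deliver a new snapshot to every subscriber's buffer.
      def publish(s: Snapshot): Unit = buffers.asScala.foreach(_.add(s))

      // Each subscriber drains its buffer on its own polling interval.
      def subscribe(intervalMillis: Long)(onBatch: Seq[Snapshot] => Unit): Unit = {
        val buffer = new ConcurrentLinkedQueue[Snapshot]()
        buffers.add(buffer)
        scheduler.scheduleWithFixedDelay(new Runnable {
          override def run(): Unit = {
            // Drain everything published since the last tick.
            val batch = Iterator.continually(buffer.poll()).takeWhile(_ != null).toList
            if (batch.nonEmpty) onBatch(batch)
          }
        }, 0L, intervalMillis, TimeUnit.MILLISECONDS)
      }
    }
    ```

    A subscriber registered with `subscribe(30000) { batch => ... }` would then see
snapshots roughly every 30 seconds, independently of faster subscribers. Either
way the amount of code involved is small, which is why the dependency addition is
the main concern.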

