We are happy to announce the availability of Spark 2.2.3!
Apache Spark 2.2.3 is a maintenance release, based on the branch-2.2
maintenance branch of Spark. We strongly recommend that all 2.2.x users
upgrade to this stable release.
To download Spark 2.2.3, head over to the download page:
http://spar
Basically, it zips two flowables using the supplied function, which takes
two parameters and returns one value (hence the name BiFunction).
Of course, you could avoid RxJava entirely by using a TimerTask:
import java.util.{Timer, TimerTask}
val a = Seq(1, 2, 3)
val b = a.zipWithIndex  // pairs each value with its index: (1,0), (2,1), (3,2)
b.foreach { case (v, i) =>
  new Timer().schedule(new TimerTask { override def run(): Unit = println(v) }, i * 1000L)
}
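To make the BiFunction idea concrete without pulling in RxJava, here is a minimal plain-Java sketch: the `zip` helper and the `ZipDemo` class are illustrative names of my own, not RxJava API, but the helper combines two lists pairwise with a java.util.function.BiFunction in the same way Flowable.zip combines two reactive streams.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.BiFunction;

public class ZipDemo {
    // Zip two lists with a BiFunction: takes two parameters, returns one,
    // mirroring what Flowable.zip does with two flowables.
    static <A, B, C> List<C> zip(List<A> xs, List<B> ys, BiFunction<A, B, C> f) {
        List<C> out = new ArrayList<>();
        int n = Math.min(xs.size(), ys.size()); // stop at the shorter input
        for (int i = 0; i < n; i++) {
            out.add(f.apply(xs.get(i), ys.get(i)));
        }
        return out;
    }

    public static void main(String[] args) {
        List<Integer> nums = List.of(1, 2, 3);
        List<String> names = List.of("a", "b", "c");
        // The lambda is the BiFunction: two inputs, one output.
        System.out.println(zip(nums, names, (num, name) -> name + num));
        // prints [a1, b2, c3]
    }
}
```

Like Flowable.zip, the result ends when the shorter of the two inputs ends.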
Hi Vladimir,
I've tried to do the same here when I attempted to write a Spark connector
for remote files.
From my point of view, there were a lot of changes in the V2 API => better
semantics, at least!
I understood that only continuous streaming uses DataSource V2 (not sure if
I'm correct). But for file
Hi,
I am trying to understand the state of DataSource V2, and I'm a bit lost.
On one hand, it is supposed to be a more flexible approach, as described for
example here:
https://www.slideshare.net/databricks/apache-spark-data-source-v2-with-wenchen-fan-and-gengliang-wang
On the other hand, it ap