Experimental in Spark really just means that we are not promising binary
compatibility for those functions in the 2.x release line. For Datasets in
particular, we want a few releases to make sure the APIs don't have any
major gaps before removing the experimental tag.
On Thu, Dec 15, 2016 at 1:17
Hi Team,
While going through the Dataset class for Spark 2.0, I noticed that both
overloaded map functions, with an encoder and without, are marked as
experimental.
Is there a reason for this, and are there issues developers should be aware
of when using these in production applications? Also, is there a