Like `RDD.map`, you can throw whatever exception you need; it will be
propagated to the driver side and will fail the Spark job.
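To illustrate the propagation behavior with something self-contained (this is only an analogy in plain Java streams, not Spark itself -- in Spark the exception is additionally serialized from the executor back to the driver), an exception thrown inside the mapping function surfaces at the point the pipeline is actually executed:

```java
import java.util.List;
import java.util.stream.Collectors;

public class ThrowInMap {
    public static void main(String[] args) {
        try {
            // As with RDD.map, the lambda itself just throws; the caller
            // that drives the computation is where the failure appears.
            List.of(1, 2, 3).stream()
                .map(x -> {
                    if (x == 2) {
                        throw new RuntimeException("bad record: " + x);
                    }
                    return x * 10;
                })
                .collect(Collectors.toList());
        } catch (RuntimeException e) {
            // In Spark, this is roughly the point where the job fails
            // on the driver with the exception from the task.
            System.out.println("caught: " + e.getMessage());
        }
    }
}
```

The same applies to a DataSourceV2 reader: throwing from the reader's methods fails the task, and Spark reports the failure (with the exception) back to the driver.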
On Mon, Apr 8, 2019 at 3:10 PM Andrew Melo wrote:
> Hello,
>
> I'm developing a (java) DataSourceV2 to read a columnar file format
> popular in a number of physical sciences (https://root.cern.ch/). (I
> also understand that the API isn't fixed and is subject to change.)
>
> My question is -- what is the expected way to transmit exceptions from
> the DataSource up