+1 to fail fast. Thanks for reporting this, Jungtaek!
On Mon, Oct 26, 2020 at 8:36 AM Jungtaek Lim wrote:
Yeah, I'm in favor of failing fast if things are not working as end users
intended. Spark should only fall back when it makes no difference except
performance (like whole-stage codegen). This fallback brings behavioral
differences, which should be considered a bug.
I agree. If the user configures an invalid catalog, it should fail and
propagate the exception. Running with a catalog other than the one the user
requested is incorrect.
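A minimal sketch of the fail-fast behavior being proposed here (hypothetical class and method names, not Spark's actual implementation): if reflective instantiation of the configured catalog class fails, the exception propagates to the caller instead of being logged and replaced by a silent fallback to the default session catalog:

```java
// Hypothetical sketch of fail-fast catalog loading; names are illustrative.
interface CatalogPlugin {
    String name();
}

class DefaultSessionCatalog implements CatalogPlugin {
    public String name() { return "spark_catalog (default)"; }
}

class CatalogLoader {
    // Fail-fast variant: any instantiation failure surfaces to the caller,
    // rather than `log.error(...); return new DefaultSessionCatalog();`.
    static CatalogPlugin load(String className) {
        try {
            return (CatalogPlugin) Class.forName(className)
                    .getDeclaredConstructor()
                    .newInstance();
        } catch (ReflectiveOperationException | ClassCastException e) {
            // Propagate with the original exception as the cause so the
            // user sees why their configured catalog could not be loaded.
            throw new IllegalArgumentException(
                "Cannot instantiate configured catalog: " + className, e);
        }
    }
}
```

With this shape, a misconfigured class name fails the query immediately with the underlying cause attached, rather than silently running against a catalog the user did not ask for.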
On Fri, Oct 23, 2020 at 5:24 AM Russell Spitzer wrote:
I was convinced that we should probably just fail, but if that is too much
of a change, then logging the exception is also acceptable.
On Thu, Oct 22, 2020, 10:32 PM Jungtaek Lim wrote:
Hi devs,
I got another report regarding configuring the v2 session catalog, where
Spark fails to instantiate the configured catalog. For now, it simply logs
an error message without the exception information and silently uses the
default session catalog.
https://github.com/apache/spark/blob/3819d396073
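For context, the v2 session catalog is swapped in by registering an implementation under the `spark_catalog` name; a sketch of the configuration in question (the implementation class name here is a placeholder, not a real class):

```
spark.sql.catalog.spark_catalog=com.example.CustomV2SessionCatalog
```

The report concerns what happens when the class named here cannot be instantiated.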