Re: [DISCUSS][SPIP] Standardize Spark Exception Messages

2020-10-27 Thread Xinyi Yu
Hi Chang,

It is a script that directly analyzes the source code, searching for raw "throw
new" exceptions. :) Hope that gives an intuitive overview of the current exceptions
in Spark.
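
Roughly, the scan works like the sketch below. This snippet is only an illustrative
sketch, not the actual script: the regex, the default directory, and the object name
are placeholders.

import java.nio.file.{Files, Path, Paths}

// Illustrative sketch only -- walks a source checkout and counts raw
// "throw new <Type>" sites per exception type, most frequent first.
object ThrowNewScanner {
  private val ThrowNew = """throw\s+new\s+([A-Za-z][\w.]*)""".r

  def main(args: Array[String]): Unit = {
    // Root of the source tree to scan; defaults to the current directory.
    val root = Paths.get(args.headOption.getOrElse("."))
    val types = scala.collection.mutable.ArrayBuffer.empty[String]
    val paths = Files.walk(root).iterator()
    while (paths.hasNext) {
      val p: Path = paths.next()
      if (p.toString.endsWith(".scala") || p.toString.endsWith(".java")) {
        val text = new String(Files.readAllBytes(p))
        ThrowNew.findAllMatchIn(text).foreach(m => types += m.group(1))
      }
    }
    // Count occurrences per exception type and print a simple frequency table.
    types.groupBy(identity).map { case (tpe, hits) => tpe -> hits.size }
      .toSeq.sortBy(-_._2)
      .foreach { case (tpe, n) => println(f"$n%6d  $tpe") }
  }
}
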
On Oct 27, 2020, 8:21 PM -0700, Chang Chen wrote:
> Hi Xinyi,
>
> Just curious, which tool did you use to generate this?
>
> > Xinyi Yu wrote on Monday, October 26, 2020 at 8:05 AM:
> > > Hi all,
> > >
> > > We would like to post a SPIP, Standardize Exception Messages in Spark. Here is
> > > the document link:
> > > https://docs.google.com/document/d/1XGj1o3xAFh8BA7RCn3DtwIPC6--hIFOaNUNSlpaOIZs/edit?usp=sharing
> > > 
> > >
> > > This SPIP aims to standardize the exception messages in Spark. It has three
> > > major focuses:
> > > 1. Group exception messages in dedicated files for easy maintenance and
> > > auditing (see the sketch after this list).
> > > 2. Establish an error message guideline for developers.
> > > 3. Improve error message quality.
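
For goal 1, a minimal sketch of the idea (all names below are hypothetical
placeholders, not the design in the SPIP document): every user-facing error message
for a component lives in one dedicated file, and call sites throw through it instead
of building messages inline.

class ExampleAnalysisException(message: String) extends Exception(message)

// Hypothetical sketch: one dedicated object per component owns the wording of
// its error messages, so they can be maintained and audited in a single place.
object ExampleCompilationErrors {

  def unresolvedColumnError(name: String, candidates: Seq[String]): ExampleAnalysisException =
    new ExampleAnalysisException(
      s"Column '$name' cannot be resolved. Did you mean one of: ${candidates.mkString(", ")}?")

  def tableAlreadyExistsError(table: String): ExampleAnalysisException =
    new ExampleAnalysisException(s"Table '$table' already exists.")
}

// A call site then throws via the shared file rather than a raw "throw new":
//   throw ExampleCompilationErrors.unresolvedColumnError("colA", Seq("colB", "colC"))
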
> > >
> > > Thanks for your time and patience. Looking forward to your feedback!
> > >
> > >
> > >


Re: [DISCUSS][SPIP] Standardize Spark Exception Messages

2020-10-27 Thread Chang Chen
Hi Xinyi,

Just curious, which tool did you use to generate this?



Xinyi Yu wrote on Monday, October 26, 2020 at 8:05 AM:

> Hi all,
>
> We would like to post a SPIP, Standardize Exception Messages in Spark. Here is
> the document link:
>
> https://docs.google.com/document/d/1XGj1o3xAFh8BA7RCn3DtwIPC6--hIFOaNUNSlpaOIZs/edit?usp=sharing
>
>
> This SPIP aims to standardize the exception messages in Spark. It has three
> major focuses:
> 1. Group exception messages in dedicated files for easy maintenance and
> auditing.
> 2. Establish an error message guideline for developers.
> 3. Improve error message quality.
>
> Thanks for your time and patience. Looking forward to your feedback!
>
>
>