On Mon, Jan 7, 2019 at 10:36 PM Wenchen Fan wrote:

I'm OK with it, i.e. fail the write if there are negative-scale decimals
(we need to document it though). We can improve it later in data source v2.

On Mon, Jan 7, 2019 at 10:09 PM Marco Gaido wrote:

> In general we can say that some datasources allow them, others fail. At
> the mom...

On Mon, Jan 7, 2019 at 15:03 Wenchen Fan wrote:

AFAIK the parquet spec says a decimal's scale can't be negative. If we want
to officially support negative-scale decimals, we should clearly define the
behavior when writing negative-scale decimals to parquet and other data
sources. The most straightforward way is to fail in this case, but maybe we
can do something better, like casting decimal(1, -20) to decimal(20, 0)
before writing.

On Mon, Jan 7, 2019 at 9:32 PM Marco Gaido wrote: ...
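[Editor's note: a minimal sketch of the cast-before-write workaround Wenchen
describes above, not code from the thread. The DataFrame `df`, its column
name `amount`, and the output path are assumptions for illustration;
decimal(21, 0) can hold any decimal(1, -20) value.]

import org.apache.spark.sql.functions.col
import org.apache.spark.sql.types.DecimalType

// Hypothetical df with a column `amount` of type decimal(1, -20):
// cast it to a non-negative scale so the data source can accept it.
val writable = df.withColumn("amount", col("amount").cast(DecimalType(21, 0)))
writable.write.parquet("/path/to/output")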
...(<https://stackoverflow.com/questions/35435691/bigdecimal-precision-and-scale>
looks pretty good), and the result type of decimal operations, and the
behavior when writing out decimals (e.g. we can cast decimal(1, -20) to
decimal(20, 0) before writing).

Another question is, shall we set a min scale? e.g. shall we allow
decimal(1, -1000)?

On Thu, Oct 25, 2018 at 9...
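[Editor's note: an illustration, not from the thread, of what a negative
scale means on java.math.BigDecimal, which backs Spark's Decimal: the value
is unscaledValue * 10^(-scale).]

val d = new java.math.BigDecimal(java.math.BigInteger.ONE, -20)
println(d)            // 1E+20
println(d.precision)  // 1   (digits in the unscaled value)
println(d.scale)      // -20 (the decimal(1, -20) discussed above)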
That is feasible; the main point is that negative scales were not really
meant to be there in the first place, so they are something we simply
forgot to forbid, and the DBs we are drawing our inspiration from for
decimals (mainly SQL Server) do not support them.
Honestly, my...
This is at analysis time.

On Tue, 18 Dec 2018, 17:32 Reynold Xin wrote:

> Is this an analysis time thing or a runtime thing?
Is this an analysis time thing or a runtime thing?

On Tue, Dec 18, 2018 at 7:45 AM Marco Gaido wrote:
Hi all,
as you may remember, there was a design doc to support operations involving
decimals with negative scales. After the discussion in the design doc, the
related PR is now blocked because for 3.0 we have another option we can
explore, i.e. forbidding negative scales. This is probably a...
Hi all,
a bit more than one month ago, I sent a proposal for properly handling
decimals with negative scales in our operations. This is a long-standing
problem in our codebase, as we derived our rules from Hive and SQL Server,
where negative scales are forbidden, while in Spark they are not.
The...
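[Editor's note: a hedged illustration of how such decimals arise. On Spark
2.x an exponent literal was parsed as a decimal (Spark 3.0 parses it as a
double unless spark.sql.legacy.exponentLiteralAsDecimal.enabled is set), so
a query like the following yields a negative-scale decimal; the exact
printed type may vary by version.]

spark.sql("SELECT 1E20 AS d").printSchema()
// d: decimal(1,-20)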
A DISCUSS thread is good to have...
From: Marco Gaido
Sent: Friday, September 21, 2018 3:31 AM
To: Wenchen Fan
Cc: dev
Subject: Re: SPIP: support decimals with negative scale in decimal operation
Hi Wenchen,
Thank you for the clarification. I agree that this is...
Hi all,
I am writing this e-mail in order to discuss the issue reported in
SPARK-25454; following Wenchen's suggestion, I have prepared a design doc
for it.
The problem we are facing here is that our rules for decimal operations
are taken from Hive and MS SQL Server, and they expli...
Hi Marco,
great work, I personally hope it gets included soon!
I just wanted to clarify one thing - Oracle and PostgreSQL do not have
infinite precision. The scale and precision of decimals are just
user-defined (explicitly or implicitly).
So, both of them follow the exact same rules you mentioned...
Sent: 21/12/2017 22:46
To: Marco Gaido
Cc: Reynold Xin; dev@spark.apache.org
Subject: Re: Decimals

Losing precision is not acceptable to financial customers. Thus, instead of
returning NULL, I saw that DB2 issues the following error message:

SQL0802N Arithmetic overflow or other arithmetic exception occurred.
SQLSTATE=22003
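[Editor's note: a sketch, not from the thread, contrasting Spark's
NULL-on-overflow behavior with DB2's error; the exact behavior depends on
the Spark version and, on 3.x, on spark.sql.ansi.enabled.]

// decimal(38,0) * decimal(38,0) is capped at decimal(38,0); 1E19 * 1E19
// needs 39 digits, so instead of an error like DB2's SQL0802N, Spark
// (without ANSI mode) yields NULL.
spark.sql(
  "SELECT CAST(1E19 AS DECIMAL(38,0)) * CAST(1E19 AS DECIMAL(38,0)) AS p"
).show()
// |   p|
// |null|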
Hello everybody,
I did some further research and now I am sharing my findings. I am sorry,
it is going to be quite a long e-mail, but I'd really appreciate some
feedback when you have time to read it.

Spark's current implementation of arithmetic operations on decimals was
"copied" from Hive. Thus, the initial goal of the implementation was to be
compliant with Hive, which itself aims to reproduce SQL Server behavior.
Therefore I compared these 3 DBs and...
Responses inline.

On Tue, Dec 12, 2017 at 2:54 AM, Marco Gaido wrote:
Hi all,
I saw in these weeks that there are a lot of problems related to decimal
values (SPARK-22036 and SPARK-22755, for instance). Some are related to
historical choices which I don't know about, so please excuse me if I am
saying dumb things:
- why are we interpreting literal constants in queries a...
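[Editor's note: the truncated question concerns how literal constants are
typed. As an illustration of the default behavior, exact non-integral
literals are interpreted as decimals sized to the literal itself.]

spark.sql("SELECT 1.0 AS a, 123.45 AS b").printSchema()
// a: decimal(2,1)
// b: decimal(5,2)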
> ...when I tried to multiply two decimals in a select
> (either in Scala or as SQL), and I'm assuming I must be missing the point.
>
> The issue is fairly easy to recreate with something like the following:
>
> val sqlContext = new org.apache.spark.sql.SQLContext(sc)
> impo...
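[Editor's note: the repro is cut off in the archive. Below is a hedged
reconstruction of the kind of code being described; the values and column
names are illustrative, and the null result is the old behavior that
SPARK-22036 later addressed.]

val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext.implicits._

// Scala BigDecimal columns are inferred as decimal(38,18); under the old
// rules the product type decimal(38,36) keeps only two integral digits,
// so even 10.0 * 10.0 overflows and comes back as null.
val df = Seq((BigDecimal("10.0"), BigDecimal("10.0"))).toDF("a", "b")
df.select(df("a") * df("b")).show()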