[ 
https://issues.apache.org/jira/browse/CASSANDRA-20723?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Nadav Har'El updated CASSANDRA-20723:
-------------------------------------
    Description: 
According to the Cassandra documentation, the CQL "decimal" type is based on 
Java's BigDecimal type. The documentation of that type explains that the 
so-called "scale" of this number (fractional digits minus exponent) can be very 
large, and in particular, for example, the number "1e2147483647" (the exponent 
is 2^31-1) should be a valid decimal.
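
To illustrate the range difference (a standalone Java sketch, not Cassandra code): BigDecimal stores an arbitrary-precision unscaled value plus a 32-bit scale, so an exponent of 2^31-1 round-trips fine, while a double overflows past roughly 1.8e308:

```java
import java.math.BigDecimal;

public class DecimalRange {
    public static void main(String[] args) {
        // BigDecimal = arbitrary-precision unscaled value + 32-bit scale,
        // so "1e2147483647" (exponent 2^31-1) parses without trouble.
        BigDecimal huge = new BigDecimal("1e2147483647");
        System.out.println(huge.scale());                 // prints -2147483647

        // A double, by contrast, overflows just past 1e308.
        System.out.println(Double.parseDouble("1e308"));  // prints 1.0E308
        System.out.println(Double.parseDouble("1e309"));  // prints Infinity
    }
}
```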

However, experimentally, if I create a table with a "decimal" clustering key and 
try to insert a value whose exponent is 309 or higher, I get an error:

{{CREATE TABLE tbl (p int, c decimal, PRIMARY KEY (p, c))}}

{{INSERT INTO tbl (p, c) VALUES (1, 1e309)}}

The error I get is:

{{cassandra.protocol.SyntaxException: <Error from server: code=2000 [Syntax 
error in CQL query] message="Failed parsing statement: [INSERT INTO tbl (p, c) 
VALUES (1, 1e309)] reason: NumberFormatException Character I is neither a 
decimal digit number, decimal point, nor "e" notation exponential mark.">}}

This error message is confusing and misleading - which "Character I" does it 
refer to? But beyond the poor error message, it is also wrong for this to be an 
error at all, because, as explained above, 1e309 should be a valid decimal 
value - even 1e2147483647 is.

Since 1e308 is accepted but 1e309 is not, I suspect the bug is that the decimal 
literal is first parsed as a double-precision number and then converted to a 
decimal. This is wrong: it means that with inline literals (without prepared 
statements) we cannot actually use the full range of the decimal type. We 
cannot use the full range of the exponent, and probably also cannot use 
arbitrary precision (though I didn't test the latter).
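
The suspected round-trip can be sketched in plain Java (this is an assumption about the parser's behavior, not verified against Cassandra's source), and it reproduces both the failure and the odd message: 1e309 parsed as a double becomes Infinity, and feeding the resulting string back into BigDecimal fails on the 'I' of "Infinity":

```java
import java.math.BigDecimal;

public class DoubleRoundTrip {
    public static void main(String[] args) {
        // Step 1: 1e309 exceeds a double's range and becomes Infinity.
        double d = Double.parseDouble("1e309");
        System.out.println(d);  // prints Infinity

        // Step 2: re-parsing the string "Infinity" as a BigDecimal throws a
        // NumberFormatException complaining about the character 'I' -- which
        // would explain the otherwise baffling "Character I" in the error.
        try {
            new BigDecimal(String.valueOf(d));
        } catch (NumberFormatException e) {
            System.out.println(e.getMessage());
        }
    }
}
```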

Is there a different way to specify a decimal literal? The documentation 
doesn't seem to address this question. I tried whether a string like "1e309" 
might work to initialize a decimal, but it doesn't.

This seems to be a *regression in Cassandra 4 and 5:* I tested Cassandra 
3.11.17, and it doesn't have this bug: it is able to parse 1e309 and even 
1e2147483647 just fine and assign them to the decimal clustering key in my 
test. But the bug does exist in Cassandra 4.1.6 and 5.0.0, which I tested.

  was:
According to the Cassandra documentation, the CQL "decimal" type is based on 
Java's BigDecimal type. The documentation of that type explains that the 
so-called "scale" of this number (fractional digits minus exponent) can be very 
large, and in particular, for example, the number "1e2147483647" (the exponent 
is 2^31-1) should be a valid decimal.

However, experimentally, if I create a table with a "decimal" clustering key and 
try to insert a value whose exponent is 309 or higher, I get an error:

{{CREATE TABLE tbl (p int, c decimal, PRIMARY KEY (p, c))}}

{{INSERT INTO tbl (p, c) VALUES (1, 1e309)}}

The error I get is:

{{cassandra.protocol.SyntaxException: <Error from server: code=2000 [Syntax 
error in CQL query] message="Failed parsing statement: [INSERT INTO tbl (p, c) 
VALUES (1, 1e309)] reason: NumberFormatException Character I is neither a 
decimal digit number, decimal point, nor "e" notation exponential mark.">}}

This error message is confusing and misleading - which "Character I" does it 
refer to? But beyond the poor error message, it is also wrong for this to be an 
error at all, because, as explained above, 1e309 should be a valid decimal 
value - even 1e2147483647 is.

Since 1e308 is accepted but 1e309 is not, I suspect the bug is that the decimal 
literal is first parsed as a double-precision number and then converted to a 
decimal. This is wrong: it means that with inline literals (without prepared 
statements) we cannot actually use the full range of the decimal type. We 
cannot use the full range of the exponent, and probably also cannot use 
arbitrary precision (though I didn't test the latter).

Is there a different way to specify a decimal literal? The documentation 
doesn't seem to address this question. I tried whether a string like "1e309" 
might work to initialize a decimal, but it doesn't.

This seems to be a *regression in Cassandra 4:* I tested Cassandra 3.11.17, and 
it doesn't have this bug: it is able to parse 1e309 and even 1e2147483647 just 
fine and assign them to the decimal clustering key in my test. But the bug does 
exist in Cassandra 4.1.6 and 5.0.0, which I tested.


> decimal literal parsed as a double, with misleading error message
> -----------------------------------------------------------------
>
>                 Key: CASSANDRA-20723
>                 URL: https://issues.apache.org/jira/browse/CASSANDRA-20723
>             Project: Apache Cassandra
>          Issue Type: Bug
>            Reporter: Nadav Har'El
>            Priority: Normal
>



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
