[ https://issues.apache.org/jira/browse/SPARK-22036?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16291805#comment-16291805 ]
Anvesh R commented on SPARK-22036:
----------------------------------

+1. Issue reproduced on spark-2.2.0.

Data at s3 location s3://bucket/spark-sql-jira/:

100|99999

drop table if exists test;
CREATE EXTERNAL TABLE `test` (
  a decimal(38,10),
  b decimal(38,10)
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
STORED AS TEXTFILE
LOCATION 's3://bucket/spark-sql-jira/';

spark-sql> select a, (a*b*0.98765432100) from test;
100	9876444.4445679
Time taken: 11.033 seconds, Fetched 1 row(s)

spark-sql> select a, (a*b*0.987654321000) from test;
100	NULL
Time taken: 0.523 seconds, Fetched 1 row(s)

Changing the columns' scale from decimal(38,10) to decimal(38,9) also works around the problem, but we would lose precision. The precision/scale arithmetic that appears to cause this is sketched after the quoted issue below.

> BigDecimal multiplication sometimes returns null
> ------------------------------------------------
>
>                 Key: SPARK-22036
>                 URL: https://issues.apache.org/jira/browse/SPARK-22036
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.2.0
>            Reporter: Olivier Blanvillain
>
> The multiplication of two BigDecimal numbers sometimes returns null. Here is a minimal reproduction:
> {code:java}
> object Main extends App {
>   import org.apache.spark.{SparkConf, SparkContext}
>   import org.apache.spark.sql.SparkSession
>
>   val conf = new SparkConf().setMaster("local[*]").setAppName("REPL").set("spark.ui.enabled", "false")
>   val spark = SparkSession.builder().config(conf).appName("REPL").getOrCreate()
>   import spark.implicits._
>   implicit val sqlContext = spark.sqlContext
>
>   case class X2(a: BigDecimal, b: BigDecimal)
>   val ds = sqlContext.createDataset(List(X2(BigDecimal(-0.1267333984375), BigDecimal(-1000.1))))
>   val result = ds.select(ds("a") * ds("b")).collect.head
>   println(result) // prints [null]
> }
> {code}
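As a rough illustration of why the extra digit of scale tips the result into null, here is a small, self-contained sketch (not Spark source) of the decimal typing rule Spark 2.2 appears to apply to multiplication: result precision = p1 + p2 + 1 and result scale = s1 + s2, both capped at 38. It assumes that Scala BigDecimal columns default to decimal(38,18); the object and helper names are illustrative only.

{code:java}
// Sketch only: mimics the assumed Spark 2.2 rule for decimal multiplication,
//   result = decimal(min(p1 + p2 + 1, 38), min(s1 + s2, 38))
// and checks whether the exact product fits into that type. If it does not,
// Spark's cast overflows and the value comes back as null.
object DecimalOverflowSketch extends App {
  val MaxPrecision = 38

  // Result type of decimal(p1,s1) * decimal(p2,s2) under the assumed rule.
  def multiplyResultType(p1: Int, s1: Int, p2: Int, s2: Int): (Int, Int) =
    (math.min(p1 + p2 + 1, MaxPrecision), math.min(s1 + s2, MaxPrecision))

  // Does `value` fit into decimal(precision, scale)?
  def fits(value: BigDecimal, precision: Int, scale: Int): Boolean =
    value.setScale(scale, BigDecimal.RoundingMode.HALF_UP).precision <= precision

  // Scala BigDecimal columns are assumed to be typed decimal(38,18) by Spark.
  val (p, s) = multiplyResultType(38, 18, 38, 18) // (38, 36)
  val product = BigDecimal(-0.1267333984375) * BigDecimal(-1000.1) // 126.746071777...
  println(s"result type: decimal($p,$s)")
  println(s"fits: ${fits(product, p, s)}") // false: 3 integer + 36 fraction digits = 39 > 38

  // The SQL repro follows the same arithmetic: decimal(38,10) * decimal(38,10) -> decimal(38,20);
  // multiplying by 0.98765432100 (assumed decimal(11,11)) gives decimal(38,31), and
  // 9876444.444... needs 7 + 31 = 38 digits, which just fits. The literal 0.987654321000
  // (assumed decimal(12,12)) pushes the scale to 32, needing 39 digits, so the result is null.
}
{code}

Under this arithmetic, rounding to a smaller scale (as the decimal(38,9) workaround effectively does) keeps the total digit count within the 38-digit cap, which is why it avoids the null at the cost of precision.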