[ https://issues.apache.org/jira/browse/SPARK-22036?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16169074#comment-16169074 ]
Marco Gaido commented on SPARK-22036:
-------------------------------------

Maybe the "bad" part is that by default Spark creates the columns as {{Decimal(38, 18)}}. A multiplication of two such columns produces a {{Decimal(38, 36)}}, which keeps only 38 - 36 = 2 digits for the integer part, so any product of 100 or more overflows and Spark returns null. That is the root cause of your operation's result. If you cast the two columns to a narrower type before the multiplication, e.g. {{ds("a").cast(DecimalType(20, 14))}}, the problem goes away. For now, you have to tell Spark explicitly which precision and scale to use (a sketch of this workaround follows the quoted issue below).

> BigDecimal multiplication sometimes returns null
> ------------------------------------------------
>
>                 Key: SPARK-22036
>                 URL: https://issues.apache.org/jira/browse/SPARK-22036
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.2.0
>            Reporter: Olivier Blanvillain
>
> The multiplication of two BigDecimal numbers sometimes returns null. We discovered this issue while doing property-based testing for the frameless project. Here is a minimal reproduction:
> {code:java}
> object Main extends App {
>   import org.apache.spark.SparkConf
>   import org.apache.spark.sql.SparkSession
>
>   val conf = new SparkConf()
>     .setMaster("local[*]")
>     .setAppName("REPL")
>     .set("spark.ui.enabled", "false")
>   val spark = SparkSession.builder().config(conf).appName("REPL").getOrCreate()
>   import spark.implicits._
>   implicit val sqlContext = spark.sqlContext
>
>   case class X2(a: BigDecimal, b: BigDecimal)
>
>   val ds = sqlContext.createDataset(List(X2(BigDecimal(-0.1267333984375), BigDecimal(-1000.1))))
>   val result = ds.select(ds("a") * ds("b")).collect.head
>   println(result) // [null]
> }
> {code}
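For reference, here is a minimal self-contained sketch of the suggested workaround. The object name and app name are illustrative, and {{DecimalType(20, 14)}} is just the example precision/scale from the comment above; adapt it to the range of your data:

{code:java}
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.types.DecimalType

object CastBeforeMultiply extends App {
  val conf = new SparkConf()
    .setMaster("local[*]")
    .setAppName("decimal-cast-demo")
    .set("spark.ui.enabled", "false")
  val spark = SparkSession.builder().config(conf).getOrCreate()
  import spark.implicits._

  case class X2(a: BigDecimal, b: BigDecimal)
  val ds = spark.createDataset(List(X2(BigDecimal(-0.1267333984375), BigDecimal(-1000.1))))

  // The inferred schema types both columns as Decimal(38, 18), so their product
  // would be Decimal(38, 36), leaving only 2 digits for the integer part.
  // Casting both operands to a narrower type first gives the product headroom.
  val result = ds
    .select((ds("a").cast(DecimalType(20, 14)) * ds("b").cast(DecimalType(20, 14))).as("product"))
    .collect()
    .head
  println(result) // prints the (non-null) product instead of [null]
}
{code}

With both operands at {{DecimalType(20, 14)}}, Spark's decimal precision rules type the product as {{Decimal(38, 28)}}, leaving 10 digits for the integer part, which is plenty for this example.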