Part of it can be found at:
https://github.com/apache/spark/pull/3429/files#diff-f88c3e731fcb17b1323b778807c35b38R34
 
Sorry, it's a yet-to-be-reviewed PR, but it should still be informative.

Cheng Hao

-----Original Message-----
From: Alessandro Baretta [mailto:alexbare...@gmail.com] 
Sent: Friday, December 12, 2014 6:37 AM
To: Michael Armbrust; dev@spark.apache.org
Subject: Where are the docs for the SparkSQL DataTypes?

Michael & other Spark SQL junkies,

As I read through the Spark API docs, in particular those for the 
org.apache.spark.sql package, I can't seem to find details about the Scala 
classes representing the various SparkSQL DataTypes, for instance DecimalType. 
I find DataType classes in org.apache.spark.sql.api.java, but they don't seem 
to match the similarly named Scala classes. For instance, DecimalType is 
documented as having a nullary constructor, but if I try to construct an 
instance of org.apache.spark.sql.DecimalType without any parameters, the 
compiler complains about the lack of a precisionInfo field, which I have 
discovered can be passed in as None. Where is all this stuff documented?
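
For reference, here is a minimal sketch of what seems to compile against the current Scala API, based only on what I found above: DecimalType wants a precisionInfo argument, and passing None appears to give the unbounded variant. The PrecisionInfo(precision, scale) case class in the second line is my guess at how a fixed-precision type is built and may not match this version exactly:

    import org.apache.spark.sql._

    // Unbounded decimal: the precisionInfo field can be passed as None.
    val unbounded: DataType = DecimalType(None)

    // Fixed precision/scale (hypothetical -- assumes a PrecisionInfo(precision, scale)
    // case class is available alongside DecimalType in this version):
    val fixed: DataType = DecimalType(Some(PrecisionInfo(precision = 10, scale = 2)))
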

Alex
