Thank you for opening the issue ticket.
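For reference, below is a minimal sketch of the kind of schema code that compiled against the Spark 1.1.0 Java API but no longer compiles on 1.2.0, since the static DataType.DecimalType field is gone from org.apache.spark.sql.api.java.DataType. The class and column names are only illustrative, not taken from my actual application.

import java.util.Arrays;

import org.apache.spark.sql.api.java.DataType;
import org.apache.spark.sql.api.java.StructField;
import org.apache.spark.sql.api.java.StructType;

public class DecimalSchemaExample {
    // Builds a two-column schema with the Spark 1.1.0 Java API.
    public static StructType priceSchema() {
        return DataType.createStructType(Arrays.asList(
                DataType.createStructField("id", DataType.StringType, false),
                // Compiles on 1.1.0; breaks on 1.2.0 because the static
                // DecimalType field was removed when arbitrary-precision
                // decimal support was added (tracked as SPARK-5054).
                DataType.createStructField("price", DataType.DecimalType, true)));
    }
}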


On 2015/01/02 at 15:45, Akhil Das <ak...@sigmoidanalytics.com> wrote:

> Yep, Opened SPARK-5054
> 
> Thanks
> Best Regards
> 
>> On Tue, Dec 30, 2014 at 5:52 AM, Michael Armbrust <mich...@databricks.com> 
>> wrote:
>> Yeah, this looks like a regression in the API due to the addition of 
>> arbitrary decimal support.  Can you open a JIRA?
>> 
>>> On Sun, Dec 28, 2014 at 12:23 AM, Akhil Das <ak...@sigmoidanalytics.com> 
>>> wrote:
>>> Hi Zigen,
>>> 
>>> Looks like they missed it.
>>> 
>>> Thanks
>>> Best Regards
>>> 
>>>> On Sat, Dec 27, 2014 at 12:43 PM, Zigen Zigen <dbviewer.zi...@gmail.com> 
>>>> wrote:
>>>> Hello, I am zigen.
>>>> 
>>>> I am currently using Spark SQL 1.1.0 and want to upgrade to Spark SQL 1.2.0,
>>>> but my Spark application no longer compiles.
>>>> 
>>>> Spark 1.1.0 had DataType.DecimalType, but Spark 1.2.0 does not.
>>>> 
>>>> Why?
>>>> 
>>>> 
>>>> 
>>>> JavaDoc (Spark 1.1.0)
>>>> 
>>>> http://people.apache.org/~pwendell/spark-1.1.0-rc1-docs/api/java/org/apache/spark/sql/api/java/DataType.html
>>>> 
>>>> 
>>>> 
>>>> JavaDoc (Spark 1.2.0)
>>>> 
>>>> http://people.apache.org/~pwendell/spark-1.2.0-rc1-docs/api/java/org/apache/spark/sql/api/java/DataType.html
>>>> 
>>>> 
>>>> 
>>>> Programming guide (Spark 1.2.0)
>>>> 
>>>> https://spark.apache.org/docs/latest/sql-programming-guide.html#spark-sql-datatype-reference
> 
