Ah yes, I found it in the manual too. Thanks for the help anyway!

Since BigDecimal is essentially a wrapper around BigInt, let's also convert
BigInt to Decimal.
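
In the meantime, here's a rough sketch of the workaround Yin suggests, assuming
the Spark 1.1-style SchemaRDD API (the RawRecord/SqlRecord case classes and
field names are just made up for illustration): map the BigInt field to a
BigDecimal before registering the table, so schema inference picks DecimalType
instead of hitting the MatchError.

import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

// Hypothetical record types for illustration only.
case class RawRecord(id: String, amount: BigInt)
case class SqlRecord(id: String, amount: BigDecimal)

object BigIntToDecimalWorkaround {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext("local", "bigint-to-decimal")
    val sqlContext = new SQLContext(sc)
    import sqlContext.createSchemaRDD  // implicit RDD[Product] -> SchemaRDD

    val raw = sc.parallelize(Seq(RawRecord("a", BigInt(42))))

    // Convert BigInt to BigDecimal so reflection maps the column to DecimalType
    // instead of throwing scala.MatchError: scala.BigInt.
    val converted = raw.map(r => SqlRecord(r.id, BigDecimal(r.amount)))

    converted.registerTempTable("records")
    sqlContext.sql("SELECT id, amount FROM records").collect().foreach(println)
    sc.stop()
  }
}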

I created a ticket. https://issues.apache.org/jira/browse/SPARK-4549

Jianshi

On Fri, Nov 21, 2014 at 11:30 PM, Yin Huai <huaiyin....@gmail.com> wrote:

> Hello Jianshi,
>
> The reason for that error is that we do not have a Spark SQL data type for
> Scala BigInt. You can use Decimal for your case.
>
> Thanks,
>
> Yin
>
> On Fri, Nov 21, 2014 at 5:11 AM, Jianshi Huang <jianshi.hu...@gmail.com>
> wrote:
>
>> Hi,
>>
>> I got an error during rdd.registerTempTable(...) saying scala.MatchError:
>> scala.BigInt
>>
>> Looks like BigInt cannot be used in SchemaRDD, is that correct?
>>
>> So what would you recommend to deal with it?
>>
>> Thanks,
>> --
>> Jianshi Huang
>>
>> LinkedIn: jianshi
>> Twitter: @jshuang
>> Github & Blog: http://huangjs.github.com/
>>
>
>


-- 
Jianshi Huang

LinkedIn: jianshi
Twitter: @jshuang
Github & Blog: http://huangjs.github.com/
