Exie,

Reported your issue: https://issues.apache.org/jira/browse/SPARK-9302

SparkR already supports the long (bigint) type in serde. This issue is about 
supporting complex Scala types (such as the array<struct<...>> column in your 
schema) in serde.
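
In the meantime, a possible workaround sketch (untested; column names taken 
from the schema in your message below): the bigint columns themselves should 
deserialize fine, so selecting around the complex extras column lets head() 
and collect() return an ordinary R data.frame.

# Workaround sketch: keep only the simple columns; the
# extras:array<struct<...>> column is what currently fails in serde.
simpleDf <- select(mydf, "localEventDtTm", "event", "timestamp",
                   "version", "title")
head(simpleDf)     # first rows as a plain R data.frame
collect(simpleDf)  # pulls the full result into R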

-----Original Message-----
From: Exie [mailto:tfind...@prodevelop.com.au] 
Sent: Friday, July 24, 2015 10:26 AM
To: user@spark.apache.org
Subject: SparkR Supported Types - Please add "bigint"

Hi Folks,

Using Spark to read in JSON files and detect the schema, it gives me a 
dataframe with a "bigint" field. R then fails to import the dataframe as it 
can't convert the type.

> head(mydf)
Error in as.data.frame.default(x[[i]], optional = TRUE) : 
  cannot coerce class "jobj" to a data.frame
>
> show(mydf)
DataFrame[localEventDtTm:timestamp, asset:string, assetCategory:string, 
assetType:string, event:string, 
extras:array<struct<name:string,value:string>>, ipAddress:string, 
memberId:string, system:string, timestamp:bigint, title:string, 
trackingId:string, version:bigint]
>

I believe this is related to:
https://issues.apache.org/jira/browse/SPARK-8840

A sample record in raw JSON looks like this:
{"version": 1,"event": "view","timestamp": 1427846422377,"system":
"DCDS","asset": "6404476","assetType": "myType","assetCategory":
"myCategory","extras": [{"name": "videoSource","value": "mySource"},{"name":
"playerType","value": "Article"},{"name": "duration","value":
"202088"}],"trackingId": "155629a0-d802-11e4-13ee-6884e43d6000","ipAddress":
"165.69.2.4","title": "myTitle"}

Can someone turn this into a feature request or something similar for 1.5.0?



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/SparkR-Supported-Types-Please-add-bigint-tp23975.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.



---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
