From: Exie [mailto:tfind...@prodevelop.com.au]
Sent: Friday, July 24, 2015 1:35 PM
To: user@spark.apache.org
Subject: Re: SparkR Supported Types - Please add bigint
Interestingly, after more digging, df.printSchema() in raw Spark shows the
columns as long, not bigint.

root
 |-- localEventDtTm: timestamp (nullable = true)
representations.
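[Editorial aside, not part of the original thread: the "long" vs. "bigint" discrepancy is naming, not a type mismatch. Spark's 64-bit signed integer type is LongType; printSchema() prints it as "long", while its SQL/DDL name is "bigint". A minimal sketch of that aliasing, with a hand-written table for illustration (not queried from Spark's API):]

```python
# Illustrative alias table: the name printSchema() shows -> the SQL DDL name.
# Hand-written for illustration; both names refer to the same Spark type.
SPARK_TYPE_ALIASES = {
    "long": "bigint",        # 64-bit signed integer (Scala Long / LongType)
    "integer": "int",        # 32-bit signed integer (IntegerType)
    "string": "string",      # StringType
    "timestamp": "timestamp" # TimestampType
}

def sql_name(printed_name: str) -> str:
    """Map a printSchema() type name to the equivalent SQL type name."""
    return SPARK_TYPE_ALIASES[printed_name]

print(sql_name("long"))  # bigint -- same type, two names
```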
-Original Message-
Sent: 10:26 AM
To: user@spark.apache.org
Subject: SparkR Supported Types - Please add bigint
Hi Folks,
Using Spark to read in JSON files and detect the schema, it gives me a
dataframe with a bigint field. R then fails to import the dataframe as it
can't convert the type.
head(mydf)
Error
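[Editorial aside, not part of the original thread: the likely reason the conversion fails is a range mismatch. Spark's bigint is a 64-bit signed integer (Scala Long), while base R integers are 32-bit, so values beyond ±2^31-1 have no exact R integer representation; a common workaround was to cast the column to double or string before collecting. A small Python sketch of the range arithmetic, with an illustrative sample value:]

```python
# Why a 64-bit "bigint" cannot round-trip through a 32-bit integer type.
# Spark's bigint corresponds to a signed 64-bit integer (Scala Long);
# R's base integer type is signed 32-bit.

INT32_MAX = 2**31 - 1   # largest value a 32-bit signed int can hold
INT64_MAX = 2**63 - 1   # largest value Spark's long/bigint can hold

sample_bigint = 10_000_000_000  # illustrative 64-bit value, e.g. an epoch in ms

fits_in_int32 = -2**31 <= sample_bigint <= INT32_MAX
fits_in_int64 = -2**63 <= sample_bigint <= INT64_MAX

print(fits_in_int32)  # False: would overflow R's 32-bit integer type
print(fits_in_int64)  # True: fine as a Spark long/bigint

# A double (R's numeric) represents integers exactly only up to 2**53,
# which is why casting bigint to double is a safe workaround for
# moderately sized values:
exact_as_double = float(sample_bigint) == sample_bigint
print(exact_as_double)  # True: 10_000_000_000 is well below 2**53
```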
.n3.nabble.com/SparkR-Supported-Types-Please-add-bigint-tp23975p23978.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org