Hi, Michael.
I used HiveContext to create a table with a field of type Array. However, in
the hql results, this field was returned as type ArrayBuffer which is mutable.
Would it make more sense to be an Array?
The Spark version in my test is 1.0.2. I haven't tested it on SQLContext or newer versions.
Arrays in the JVM are also mutable. However, you should not be relying on
the exact type here. The only promise is that you will get back something
of type Seq[_].
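To illustrate the point, here is a minimal sketch (not tied to any particular Spark API): the concrete value you get back for an Array-typed column may happen to be an ArrayBuffer, but the only contract is Seq[_], so code should program against Seq and convert if an Array is needed.

```scala
import scala.collection.mutable.ArrayBuffer

object SeqContractExample {
  def main(args: Array[String]): Unit = {
    // Stand-in for a field returned from a SQL row: concretely an
    // ArrayBuffer, but only promised to be a Seq.
    val fromSql: Seq[Int] = ArrayBuffer(1, 2, 3)

    // Don't cast to the concrete type; convert via the Seq interface.
    val asArray: Array[Int] = fromSql.toArray

    println(asArray.mkString(","))
  }
}
```

Relying on `Seq` operations (or an explicit `.toArray` / `.toList` conversion) keeps the code correct even if a future version returns a different `Seq` implementation.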
On Wed, Aug 27, 2014 at 4:27 PM, Du Li l...@yahoo-inc.com wrote:
From: Michael Armbrust mich...@databricks.com
Date: Wednesday, August 27, 2014 at 5:21 PM
To: Du Li l...@yahoo-inc.com
Cc: user@spark.apache.org user@spark.apache.org
Subject: Re: SparkSQL returns ArrayBuffer for fields of type Array