ork for my purpose. It actually
does nothing.
Kazuaki Ishizaki
From: Jacek Laskowski <ja...@japila.pl>
To: Kazuaki Ishizaki/Japan/IBM@IBMJP
Cc: user <user@spark.apache.org>
Date: 2016/08/15 04:56
Subject: Re: Change nullable property in Dataset schema
On Wed,
From: <ko...@tresata.com>
To: Kazuaki Ishizaki/Japan/IBM@IBMJP
Cc: "user@spark.apache.org" <user@spark.apache.org>
Date: 2016/08/16 04:35
Subject: Re: Change nullable property in Dataset schema
Why do you want the array to have nullable = false? What is the benefit?
On Wed, Aug 3, 2016 at 10:45 AM, Kazuaki Ishizaki wrote:
> Dear all,
> Would it be possible to let me know how to change nullable property in
> Dataset?
>
> When I looked for how to change nullable
On Wed, Aug 10, 2016 at 12:04 AM, Kazuaki Ishizaki wrote:
> import testImplicits._
> test("test") {
> val ds1 = sparkContext.parallelize(Seq(Array(1, 1), Array(2, 2), Array(3, 3)), 1).toDS
You should just use Seq(...).toDS.
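The suggestion above can be sketched as follows. This is a minimal sketch, assuming a SparkSession named `spark` is in scope (in Spark's own test suites, `testImplicits._` plays the role of `spark.implicits._`):

```scala
import spark.implicits._  // assumption: a SparkSession named `spark` exists

// Build the Dataset directly from a local Seq; there is no need to go
// through sparkContext.parallelize first.
val ds1 = Seq(Array(1, 1), Array(2, 2), Array(3, 3)).toDS()

// Inspect the derived schema; the element type comes from the predefined
// Encoder for Array[Int], which is where the nullable settings originate.
ds1.printSchema()
```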
> val ds2 = ds1.map(e => e)
Why are you
After some investigation, I was able to change the nullable property of a Dataset[Array[Int]] in the following way. Is this the right way?
(1) Apply https://github.com/apache/spark/pull/13873
(2) Use two Encoders. One is RowEncoder. The other is the predefined ExpressionEncoder.
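The two-encoder approach described in step (2) can be sketched as follows. This is only a sketch, assuming a SparkSession named `spark`; the idea is that RowEncoder lets you supply an explicit schema with nullable = false, instead of the nullable-by-default schema that the predefined ExpressionEncoder for Array[Int] derives:

```scala
import org.apache.spark.sql.Row
import org.apache.spark.sql.catalyst.encoders.RowEncoder
import org.apache.spark.sql.types._

import spark.implicits._  // assumption: a SparkSession named `spark` exists

// Build the Dataset with the predefined ExpressionEncoder for Array[Int];
// its column and element are nullable by default.
val ds1 = Seq(Array(1, 1), Array(2, 2), Array(3, 3)).toDS()

// Declare the schema we actually want: a non-nullable array column whose
// elements cannot be null.
val schema = StructType(Seq(
  StructField("value", ArrayType(IntegerType, containsNull = false),
    nullable = false)))

// A RowEncoder built from that schema; passing it explicitly to map
// imposes the non-nullable schema on the result.
val enc = RowEncoder(schema)
val ds2 = ds1.map(a => Row(a))(enc)

ds2.printSchema()  // should now report nullable = false for the column
```

Note that RowEncoder lives in the catalyst package and is not a stable public API, so this relies on internals that may change between Spark versions.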
class Test extends QueryTest