Re: Serializing DataSets

2016-01-19 Thread Simon Hafner
An occasional type error when the cast fails for whatever reason.
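To make that concrete, here is a sketch of the failure mode I mean (the `Person` case class and the path are illustrative; in Spark 1.6, `.as[T]` throws an `AnalysisException` at analysis time when a column of `T` can't be resolved against the file's schema):

```scala
import org.apache.spark.sql.{AnalysisException, Dataset, SQLContext}

// Illustrative case class; the real schema comes from whatever wrote the file.
case class Person(name: String, age: Long)

// Sketch: read Parquet and cast to a typed Dataset, catching the
// analysis error that surfaces when the file's schema doesn't line
// up with the case class (e.g. a missing or retyped column).
def loadPeople(sqlContext: SQLContext, path: String): Option[Dataset[Person]] = {
  import sqlContext.implicits._ // brings the Encoder[Person] into scope
  try {
    Some(sqlContext.read.parquet(path).as[Person])
  } catch {
    case e: AnalysisException =>
      // e.g. "cannot resolve 'age' given input columns: [name, years]"
      None
  }
}
```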

2016-01-19 1:22 GMT+08:00 Michael Armbrust :
> What error?

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Serializing DataSets

2016-01-18 Thread Michael Armbrust
What error?

On Mon, Jan 18, 2016 at 9:01 AM, Simon Hafner  wrote:

> And for deserializing,
> `sqlContext.read.parquet("path/to/parquet").as[T]` and catch the
> error?


Re: Serializing DataSets

2016-01-18 Thread Simon Hafner
And for deserializing,
`sqlContext.read.parquet("path/to/parquet").as[T]` and catch the
error?

2016-01-14 3:43 GMT+08:00 Michael Armbrust :
> Yeah, that's the best way for now (note the conversion is purely logical, so
> there is no cost of calling toDF()). We'll likely be combining the classes
> in Spark 2.0 to remove this awkwardness.
>
> On Tue, Jan 12, 2016 at 11:20 PM, Simon Hafner 
> wrote:
>>
>> What's the proper way to write DataSets to disk? Convert them to a
>> DataFrame and use the writers there?
>>
>> -
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
>>
>




Re: Serializing DataSets

2016-01-13 Thread Michael Armbrust
Yeah, that's the best way for now (note the conversion is purely logical, so
there is no cost of calling toDF()). We'll likely be combining the classes
in Spark 2.0 to remove this awkwardness.
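A quick sketch of that round trip on Spark 1.6 (the `Person` case class and the path are illustrative):

```scala
import org.apache.spark.sql.{Dataset, SQLContext}

// Illustrative case class standing in for your record type.
case class Person(name: String, age: Long)

// Sketch for Spark 1.6: write a Dataset through the DataFrame writers
// (toDF() is a purely logical conversion, so nothing is copied), then
// read the Parquet back as a typed Dataset.
def roundTrip(sqlContext: SQLContext, path: String): Dataset[Person] = {
  import sqlContext.implicits._ // Encoder[Person] for createDataset / as[Person]
  val ds = sqlContext.createDataset(Seq(Person("Ada", 36)))
  ds.toDF().write.parquet(path)            // write via the DataFrame writer API
  sqlContext.read.parquet(path).as[Person] // read back as Dataset[Person]
}
```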

On Tue, Jan 12, 2016 at 11:20 PM, Simon Hafner 
wrote:

> What's the proper way to write DataSets to disk? Convert them to a
> DataFrame and use the writers there?


Serializing DataSets

2016-01-12 Thread Simon Hafner
What's the proper way to write DataSets to disk? Convert them to a
DataFrame and use the writers there?
