Please see the last line of convertToCatalyst(a: Any):
case other => other
FYI
On Mon, Feb 15, 2016 at 12:09 AM, Fabian Böhnlein <
fabian.boehnl...@gmail.com> wrote:
> Interesting, thanks.
>
> The (only) publicly accessible method seems to be *convertToCatalyst*:
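For what it's worth, a minimal sketch of what that fall-through case means in
practice: types like Integer, Short and Long aren't really missing, they just
pass through unchanged, since the JVM primitives already are the Catalyst
representation (assuming convertToCatalyst is still public in your Spark
version):

    import org.apache.spark.sql.catalyst.CatalystTypeConverters

    // Types with a dedicated converter are rewritten into their
    // internal Catalyst representation...
    CatalystTypeConverters.convertToCatalyst("hello") // becomes a UTF8String
    // ...while primitives hit `case other => other` and come back as-is.
    CatalystTypeConverters.convertToCatalyst(42)      // stays an Int
    CatalystTypeConverters.convertToCatalyst(42L)     // stays a Long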
Interesting, thanks.
The (only) publicly accessible method seems to be /convertToCatalyst/:
https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/CatalystTypeConverters.scala#L425
It seems to be missing some types like Integer, Short, Long... I'll give it a
try.

I had the same issue. I resolved it in Java, but I am pretty sure it would work
with Scala too. It's kind of a gross hack, but here is what I did. Say I had a
table in MySQL with 1000 columns: I issued a JDBC query to extract the schema
of the table. I stored that schema and wrote
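A rough Scala sketch of that approach (the connection string and table name
here are hypothetical, and the JDBC-to-Spark type mapping is deliberately
minimal; a real version would need a case for every type the table uses):

    import java.sql.{DriverManager, Types}
    import org.apache.spark.sql.types._

    // Hypothetical connection details and table name.
    val conn = DriverManager.getConnection("jdbc:mysql://host/db", "user", "pass")
    // A query that returns no rows still carries the table's metadata.
    val rs = conn.createStatement().executeQuery("SELECT * FROM my_table WHERE 1=0")
    val md = rs.getMetaData

    // Map each JDBC column type to a Spark DataType (only a few cases shown).
    val fields = (1 to md.getColumnCount).map { i =>
      val dt = md.getColumnType(i) match {
        case Types.INTEGER              => IntegerType
        case Types.BIGINT               => LongType
        case Types.DOUBLE               => DoubleType
        case Types.VARCHAR | Types.CHAR => StringType
        case _                          => StringType // crude fallback
      }
      StructField(md.getColumnName(i), dt, md.isNullable(i) != 0)
    }
    val schema = StructType(fields)
    conn.close()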
Hi all,
is there a way to create a Spark SQL Row schema based on Scala data
types without creating a manual mapping?
The only example I can find which doesn't require a spark.sql.types.DataType
already as input is the one below, but it requires defining the types as
Strings:
* val struct = (new
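Presumably the example meant here is the StructType builder from the Spark SQL
docs, where the column type is given as a plain string rather than a DataType
instance; a minimal sketch with made-up column names:

    import org.apache.spark.sql.types.StructType

    // The second argument is the type *name* as a String, not a DataType.
    val struct = (new StructType)
      .add("a", "int")
      .add("b", "string")
      .add("c", "double")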
CatatlystTypeConverters.scala has all types of utility methods to convert
from Scala to Row and vice versa.
On Fri, Feb 12, 2016 at 12:21 AM, Rishabh Wadhawan
wrote:
> I had the same issue. I resolved it in Java, but I am pretty sure it would
> work with Scala too. It's
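For completeness, a short sketch of the round trip those utilities give you;
note that going back from the internal representation needs the value's
DataType (assuming convertToScala is also accessible in your Spark version):

    import org.apache.spark.sql.catalyst.CatalystTypeConverters
    import org.apache.spark.sql.types.StringType

    // Scala -> Catalyst: a String becomes Spark's internal UTF8String.
    val catalystValue = CatalystTypeConverters.convertToCatalyst("hello")
    // Catalyst -> Scala: the reverse direction needs the DataType.
    val scalaValue = CatalystTypeConverters.convertToScala(catalystValue, StringType)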
Right, thanks Ted.
On Fri, Feb 12, 2016 at 10:21 AM, Ted Yu wrote:
> Minor correction: the class is CatalystTypeConverters.scala
>
> On Thu, Feb 11, 2016 at 8:46 PM, Yogesh Mahajan
> wrote:
>
>> CatatlystTypeConverters.scala has all types of utility

Minor correction: the class is CatalystTypeConverters.scala
On Thu, Feb 11, 2016 at 8:46 PM, Yogesh Mahajan
wrote:
> CatatlystTypeConverters.scala has all types of utility methods to convert
> from Scala to Row and vice versa.
>
>
> On Fri, Feb 12, 2016 at 12:21 AM,