fixed in the next releases…
From: Campagnola, Francesco
Sent: martedì 6 settembre 2016 09:46
To: 'Jeff Zhang'
Cc: user@spark.apache.org
Subject: RE: Spark 2.0.0 Thrift Server problem with Hive metastore
I mean I have installed Spark 2.0 in the same environment where the Spark 1.6
thrift server was running.
The returned error is:
org.apache.spark.sql.catalyst.expressions.GenericInternalRow cannot be cast to
org.apache.spark.sql.catalyst.expressions.UnsafeRow
From: Jeff Zhang [mailto:zjf...@gmail.com]
Sent: martedì 6 settembre 2016 02:50
To: Campagnola, Francesco
Cc: user@spark.apache.org
Subject: Re: Spark 2.0.0 Thrift Server problem with Hive metastore
How do you upgrade to spark 2.0 ?
On Mon, Sep 5, 2016 at 11:25 PM, Campagnola, Francesco <francesco.campagn...@anritsu.com> wrote:
Hi,
in an already working Spark and Hive environment with Spark 1.6 and Hive 1.2.1,
with the Hive metastore configured on a Postgres DB, I have upgraded Spark to
2.0.0.
I have started the thrift server on YARN, then tried to execute the following
command from the beeline CLI or a JDBC client:
SHOW
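For context, the sequence being described can be sketched roughly as follows (the host, port, and YARN settings are placeholders and depend on the actual cluster setup):

```shell
# Start the Spark Thrift JDBC/ODBC server on YARN
# (script shipped with the Spark distribution)
$SPARK_HOME/sbin/start-thriftserver.sh \
  --master yarn \
  --hiveconf hive.server2.thrift.port=10000

# Connect with the beeline CLI to the Thrift server
# (<host> is the node where the server was started)
beeline -u jdbc:hive2://<host>:10000

# ...then issue the SHOW command from the beeline prompt.
```

With this setup, beeline speaks the HiveServer2 Thrift protocol to Spark, which in turn reads table definitions from the Hive metastore (here backed by Postgres).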