Steven,
Your summary is mostly correct, but there are a couple of points I want to emphasize.
Not every cluster has the Hive service enabled, so the YARN client shouldn't try to get the Hive delegation token just because security mode is enabled.
The YARN client code can check if the
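The kind of guard being described might be sketched as follows. This is a hypothetical illustration, not the actual Spark YARN client code; the object and method names are invented, and the only real configuration key used is `hive.metastore.uris`:

```scala
// Hypothetical sketch (not Spark's real YARN client logic): only fetch a
// Hive delegation token when security is on AND a Hive metastore is configured.
object HiveTokenGuard {
  // Clusters without the Hive service typically have no hive.metastore.uris set.
  def hiveServiceConfigured(conf: Map[String, String]): Boolean =
    conf.get("hive.metastore.uris").exists(_.trim.nonEmpty)

  def shouldFetchHiveToken(securityEnabled: Boolean,
                           conf: Map[String, String]): Boolean =
    securityEnabled && hiveServiceConfigured(conf)
}
```

With a guard like this, a secure cluster that simply lacks the Hive service would skip the token fetch instead of failing on it.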
A similar issue occurs when interacting with Hive secured by Sentry:
https://issues.apache.org/jira/browse/SPARK-9042
By changing how the HiveContext instance is created, this issue might also be resolved.
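One reading of "changing how the HiveContext instance is created" is deferring its construction so the metastore is never contacted unless Hive is actually used. The sketch below demonstrates only the lazy-initialization idea, using a stand-in class rather than a real HiveContext (no Spark dependency; all names are illustrative):

```scala
// Illustrative only: a `lazy val` is not evaluated until first access, so an
// expensive (or failing, e.g. Sentry-secured) Hive connection is skipped
// entirely when the application never touches it.
object LazyHiveDemo {
  var connections = 0                          // counts metastore "connections"
  class FakeHiveContext { connections += 1 }   // constructor simulates connecting
  lazy val hive: FakeHiveContext = new FakeHiveContext
}
```

Until some code path actually reads `LazyHiveDemo.hive`, no connection is attempted, which is the behavior one would want on clusters where Hive is absent or locked down.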
On Thu, Oct 22, 2015 at 11:33 AM Steve Loughran wrote:
> On 22 Oct
Doug,
We are not trying to compile against a different version of Hive. The 1.2.1.spark hive-exec is specified in the Spark 1.5.2 POM file. We are moving from Spark 1.3.1 to 1.5.1 and are simply trying to supply the needed dependency. The rest of the application (besides Spark) simply uses Hive 0.13.1.
Thanks Steve,
I liked the slides on Kerberos. I have enough scars from Kerberos from trying to integrate it with Pig, MapReduce, Hive JDBC, HCatalog, Spark, etc. I am still having trouble making impersonation work for HCatalog. I might send you an offline email to ask for some pointers.
All,
Just to see if this happens to others as well: this is tested against Spark 1.5.1 (branch 1.5, labeled 1.5.2-SNAPSHOT, with a commit on Tue Oct 6, 84f510c4fa06e43bd35e2dc8e1008d0590cbe266).
Spark deployment mode: Spark-Cluster
Notice that if we enable Kerberos mode,
Doug,
Thanks for responding.
>> I think Spark just needs to be compiled against 1.2.1
Can you elaborate on this, or which specific command are you referring to?
In our build.scala, I was including the following:
"org.spark-project.hive" % "hive-exec" % "1.2.1.spark" intransitive()
I am not
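For context, the dependency line quoted above might appear in an sbt build definition roughly like this. Everything except the group/artifact/version string and the `intransitive()` call (which are taken verbatim from the message) is an illustrative assumption:

```scala
// Sketch of a build.scala / build.sbt fragment; intransitive() keeps
// hive-exec's conflicting transitive dependencies off the classpath.
libraryDependencies ++= Seq(
  "org.spark-project.hive" % "hive-exec" % "1.2.1.spark" intransitive()
)
```

Using `intransitive()` here means any Hive dependencies the application actually needs must be declared explicitly, which is presumably why the rest of the application pins its own Hive 0.13.1 artifacts.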