Did you check that the security extensions are installed (JCE)?
KhajaAsmath Mohammed wrote on Wed, 22 Nov 2017 at 19:36:
This is what we are on.
On Wed, Nov 22, 2017 at 12:33 PM, KhajaAsmath Mohammed <
mdkhajaasm...@gmail.com> wrote:
We use the Oracle JDK; we are on Unix.
On Wed, Nov 22, 2017 at 12:31 PM, Georg Heiler
wrote:
Do you use the Oracle JDK or OpenJDK? We recently had an issue with OpenJDK: formerly, the Java security extensions were installed by default; that is no longer the case on CentOS 7.3.
Are these installed?
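For reference, one generic way to check (not something spelled out in this thread) is to ask the JVM for the maximum allowed AES key length; `jrunscript` ships with older Oracle/OpenJDK releases, so its availability depends on your JDK version:

```shell
# Prints 2147483647 when the unlimited-strength JCE policy is active;
# a capped value such as 128 means the restricted default policy is in use.
jrunscript -e 'print(javax.crypto.Cipher.getMaxAllowedKeyLength("AES"))'
```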
KhajaAsmath Mohammed wrote on Wed, 22 Nov 2017 at 19:29:
I passed a keytab, and renewal is enabled by a script that runs every eight hours; the user's ticket gets renewed each time it runs.
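The renewal script itself is not shown in the thread; a minimal sketch of such an eight-hourly job might look like this (the principal, keytab path, and cron schedule are assumptions):

```shell
# crontab entry (runs every 8 hours):
#   0 */8 * * * /usr/local/bin/renew_tgt.sh
# renew_tgt.sh: try to renew the existing TGT; fall back to a fresh kinit
# from the keytab if the ticket is no longer renewable.
kinit -R || kinit -kt /etc/security/keytabs/app.keytab app@EXAMPLE.COM
```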
On Wed, Nov 22, 2017 at 12:27 PM, Georg Heiler
wrote:
Did you pass a keytab? Is renewal enabled in your KDC?
KhajaAsmath Mohammed wrote on Wed, 22 Nov 2017 at 19:25:
Hi,
I have written a Spark Streaming job and it runs successfully for more than 36 hours. After around 36 hours, the job fails with a Kerberos issue. Any suggestions on how to resolve it?
org.apache.spark.SparkException: Task failed while writing rows.
    at ...
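As background for the exchange above: a failure at roughly the 36-hour mark often lines up with a ticket's maximum renewable lifetime, at which point an external renewal script can no longer help. One commonly used alternative (not confirmed as the fix in this thread) is to hand spark-submit the keytab directly in YARN mode so Spark re-logs-in on its own; the principal, keytab path, and jar name below are placeholders:

```shell
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --principal app@EXAMPLE.COM \
  --keytab /etc/security/keytabs/app.keytab \
  your-streaming-job.jar
```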
On 19 Oct 2016, at 00:18, Michael Segel wrote:
(Sorry, sent reply via the wrong account.)
Steve,
Kinda hijacking the thread, but I promise it's still on topic to the OP's issue. ;-)
Usually you will end up having a local Kerberos set up per cluster, so your machine accounts (hive, yarn, hbase, etc.) are going to be local to the cluster.
So you
On 17 Oct 2016, at 22:11, Michael Segel wrote:
@Steve, you are going to have to explain what you mean by ‘turn Kerberos on’.
Taken one way, it could mean making cluster B secure and running Kerberos, and then you'd have to create
On 13 Oct 2016, at 10:50, dbolshak wrote:
Hello community,
We have a challenge and no ideas how to solve it.
The problem: say we have the following environment:
1. `cluster A`, the cluster does not use kerberos and we use it as a source of data; important thing is - we don't manage this cluster.
2. `cluster B`, small cluster where our ...
3. We cannot turn off kerberos on `cluster C`.
4. We can turn on/off kerberos on `cluster B`; currently it's turned off.
5. Spark app is built on top of RDD and does not depend on spark-sql.

Does anybody know how to write data using the RDD API to a remote cluster which is running with Kerberos?

--
//with Best Regards
--Denis Bolshakov
e-mail: bolshakov.de...@gmail.com

--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/spark-with-kerberos-tp27894
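For illustration only, one direction sometimes tried for a setup like this: authenticate against cluster C's realm on the submitting host and tell Spark which remote namenode it needs tokens for. The hostnames, realm, keytab path, and jar name here are invented, and whether this works from a non-kerberized cluster B is exactly the open question of the thread:

```shell
# Obtain a TGT for cluster C's realm, then declare the remote namenode so
# Spark requests HDFS delegation tokens for it at submit time.
kinit -kt /etc/security/keytabs/app.keytab app@C.REALM
spark-submit \
  --conf spark.yarn.access.namenodes=hdfs://namenode-c.example.com:8020 \
  your-app.jar
```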
Turns out it is a Spark issue:
https://issues.apache.org/jira/browse/SPARK-13478
--
Ruslan Dautkhanov
On Mon, Jan 18, 2016 at 4:25 PM, Ruslan Dautkhanov
wrote:
I took the liberty of creating a JIRA: https://github.com/cloudera/livy/issues/36
Feel free to close it if it doesn't belong to the Livy project.
I really don't know if this is a Spark or a Livy/Sentry problem.
Any ideas for possible workarounds?
Thank you.
--
Ruslan Dautkhanov
Hi Guys,
Any help regarding this issue?
On Wed, Jan 13, 2016 at 6:39 PM, Vinay Kashyap wrote:
Hi Romain,
Thank you for your response.
Adding Kerberos support might be as simple as https://issues.cloudera.org/browse/LIVY-44, i.e. adding Livy --principal and --keytab parameters to be passed to spark-submit.
As a workaround I just did kinit (using Hue's keytab) and then launched the Livy Server.
Getting the following error stack:

The Spark session could not be created in the cluster:
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:160)
    at ...
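A sketch of the kinit-then-start workaround described above; the keytab path and principal are assumptions, not details taken from the thread:

```shell
# Log in as Hue's service principal, then start Livy so it inherits
# the Kerberos credentials from the ticket cache.
kinit -kt /var/lib/hue/hue.keytab hue/$(hostname -f)@EXAMPLE.COM
livy-server start
```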
Hi all,
I am using *Spark 1.5.1 in YARN cluster mode on CDH 5.5.*
I am trying to create an RDD by reading an HBase table with Kerberos enabled.
I am able to launch the Spark job to read the HBase table, but I notice that the executors launched for the job cannot proceed due to an issue with
To: Eric Walk
Cc: user@spark.apache.org; Bill Busch
Subject: Re: Spark + HBase + Kerberos
Are the HBase config / keytab files deployed on the executor machines?
Consider adding -Dsun.security.krb5.debug=true for debugging purposes.
Cheers
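One way to wire the suggested debug flag through to both the driver and the executors (generic spark-submit usage, not a command given in the thread; the jar name is a placeholder); the krb5 debug output then appears in the driver and executor stderr logs:

```shell
spark-submit \
  --conf "spark.driver.extraJavaOptions=-Dsun.security.krb5.debug=true" \
  --conf "spark.executor.extraJavaOptions=-Dsun.security.krb5.debug=true" \
  your-app.jar
```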
On Wed, Mar 18, 2015 at 11:39 AM, Eric Walk eric.w...@perficient.com
wrote:
Having an issue connecting to HBase from a Spark container in a Secure
Hope someone helps me. Thanks.
On Wed, Feb 4, 2015 at 6:14 PM, Jander g jande...@gmail.com wrote:
We have a standalone Spark cluster for a Kerberos test.
But when reading from HDFS, I get the error: "Can't get Master Kerberos principal for use as renewer."
So does standalone Spark support Kerberos? Can anyone confirm, or is there something I missed?
Thanks in advance.
--
Thanks,
Jander
Which version of Spark and Hadoop are you using? Could you please provide
the full stack trace of the exception?
On Tue, Oct 28, 2014 at 5:48 AM, Du Li l...@yahoo-inc.com.invalid wrote:
To: Cheng Lian <lian.cs@gmail.com>
Cc: user@spark.apache.org
Subject: Re: [SPARK SQL] kerberos error when creating database from beeline/ThriftServer2

If I put all the jar files from my local
Hi,
I was trying to set up Spark SQL on a private cluster. I configured a hive-site.xml under spark/conf that uses a local metastore, with the warehouse and default FS name set to HDFS on one of my corporate clusters. Then I started the Spark master, worker, and thrift server. However, when creating a