Hi,

We didn't fully test it in 1.1, as we were waiting for dynamic loading of
JAAS in the new Kafka consumer/producer processors. We expected that loading
a JAAS configuration when starting NiFi might have an impact on all Kerberos
processes.
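
For what it's worth, here is a minimal sketch (plain Java; the login context
names are only examples, not NiFi's actual entries) of why a JAAS file passed
at JVM startup via -Djava.security.auth.login.config is visible to every
login in the process:

    import javax.security.auth.login.AppConfigurationEntry;
    import javax.security.auth.login.Configuration;

    public class JaasScopeCheck {
        public static void main(String[] args) {
            // The login Configuration is resolved once per JVM, typically from
            // -Djava.security.auth.login.config=/path/to/jaas.conf at startup.
            Configuration config = Configuration.getConfiguration();

            // Example context names only: any component performing a JAAS login
            // (Kafka's "KafkaClient", a Kerberos login used for HDFS, ...) is
            // looked up in this same process-wide Configuration.
            for (String context : new String[] {"KafkaClient", "HdfsKerberos"}) {
                AppConfigurationEntry[] entries = config.getAppConfigurationEntry(context);
                System.out.println(context + " -> "
                        + (entries == null ? "not defined" : entries.length + " login module(s)"));
            }
        }
    }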

Thanks,

Arnaud

On Wed, May 17, 2017 at 2:31 PM, Bryan Bende <bbe...@gmail.com> wrote:

> Hello,
>
> Thanks for reporting this, we definitely want to figure out what is
> going on here.
>
> Was this flow working fine on Apache NiFi 1.1.x (or some earlier
> version) and then stopped working after upgrading to 1.2.0?
>
> Thanks,
>
> Bryan
>
>
> On Wed, May 17, 2017 at 4:49 AM, Arnaud G <greatpat...@gmail.com> wrote:
> > Hi!
> >
> > We are currently facing an issue and have not yet found a solution.
> >
> > We are running 1.2 and are facing an interaction between the Kafka 0.10.2
> > and PutHDFS processors.
> >
> > We are connecting to an external Kafka server that requires SASL/PLAIN
> > authentication. To enable this, we are using the new Kafka processor with a
> > JAAS configuration in the processor to authenticate, and this is working
> > fine.
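> >
> > For illustration, here is a minimal sketch with the plain Kafka 0.10.2
> > client (not the NiFi processor; broker, topic and credentials below are
> > placeholders) of this kind of per-client SASL/PLAIN setup supplied via the
> > client's sasl.jaas.config property:
> >
> >     import java.util.Properties;
> >     import org.apache.kafka.clients.producer.KafkaProducer;
> >     import org.apache.kafka.clients.producer.ProducerRecord;
> >     import org.apache.kafka.common.serialization.StringSerializer;
> >
> >     public class SaslPlainProducerSketch {
> >         public static void main(String[] args) {
> >             Properties props = new Properties();
> >             props.put("bootstrap.servers", "kafka.example.com:9093"); // placeholder
> >             props.put("key.serializer", StringSerializer.class.getName());
> >             props.put("value.serializer", StringSerializer.class.getName());
> >
> >             // SASL/PLAIN, with the JAAS entry supplied per client instead of
> >             // through a JVM-wide -Djava.security.auth.login.config file.
> >             props.put("security.protocol", "SASL_SSL");
> >             props.put("sasl.mechanism", "PLAIN");
> >             props.put("sasl.jaas.config",
> >                     "org.apache.kafka.common.security.plain.PlainLoginModule required "
> >                             + "username=\"someUser\" password=\"somePassword\";");
> >
> >             try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
> >                 producer.send(new ProducerRecord<>("some-topic", "key", "value"));
> >             }
> >         }
> >     }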
> >
> > However, on an unrelated flow the PutHDFS processor stopped working, as
> > it is now using the JAAS configuration and ignoring the Kerberos
> > configuration of the processor. (The setups are completely unrelated.)
> >
> > We tried a lot of configurations but have not yet found a way to allow
> > NiFi to authenticate to both Kafka 0.10.2 and HDFS at the same time; it's
> > either one or the other.
> >
> > Are we missing something? Is there a way to ensure that both processors
> > keep their own configurations?
> >
> > Thanks!
> >
> >
> > For reference, the PutHDFS failure log:
> >
> > 2017-05-16 15:27:01,199 ERROR [StandardProcessScheduler Thread-8] o.apache.nifi.processors.hadoop.PutHDFS PutHDFS[id=b6464c58-fee9-341c-8428-932218f7d239] PutHDFS[id=b6464c58-fee9-341c-8428-932218f7d239] failed to invoke @OnScheduled method due to java.lang.RuntimeException: Failed while executing one of processor's OnScheduled task.; processor will not be scheduled to run for 30 seconds: java.lang.RuntimeException: Failed while executing one of processor's OnScheduled task.
> >
> > java.lang.RuntimeException: Failed while executing one of processor's OnScheduled task.
> >         at org.apache.nifi.controller.StandardProcessorNode.invokeTaskAsCancelableFuture(StandardProcessorNode.java:1480)
> >         at org.apache.nifi.controller.StandardProcessorNode.access$000(StandardProcessorNode.java:100)
> >         at org.apache.nifi.controller.StandardProcessorNode$1.run(StandardProcessorNode.java:1301)
> >         at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> >         at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> >         at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
> >         at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
> >         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> >         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> >         at java.lang.Thread.run(Thread.java:745)
> > Caused by: java.util.concurrent.ExecutionException: java.lang.reflect.InvocationTargetException
> >         at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> >         at java.util.concurrent.FutureTask.get(FutureTask.java:206)
> >         at org.apache.nifi.controller.StandardProcessorNode.invokeTaskAsCancelableFuture(StandardProcessorNode.java:1463)
> >         ... 9 common frames omitted
> > Caused by: java.lang.reflect.InvocationTargetException: null
> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >         at java.lang.reflect.Method.invoke(Method.java:498)
> >         at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:137)
> >         at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:125)
> >         at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:70)
> >         at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotation(ReflectionUtils.java:47)
> >         at org.apache.nifi.controller.StandardProcessorNode$1$1.call(StandardProcessorNode.java:1305)
> >         at org.apache.nifi.controller.StandardProcessorNode$1$1.call(StandardProcessorNode.java:1301)
> >         ... 6 common frames omitted
> > Caused by: java.lang.NullPointerException: null
> >         at org.apache.kafka.common.security.plain.PlainSaslServer$PlainSaslServerFactory.getMechanismNames(PlainSaslServer.java:162)
> >         at org.apache.hadoop.security.SaslRpcServer$FastSaslServerFactory.<init>(SaslRpcServer.java:380)
> >         at org.apache.hadoop.security.SaslRpcServer.init(SaslRpcServer.java:184)
> >         at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:577)
> >         at org.apache.hadoop.hdfs.NameNodeProxies.createNNProxyWithClientProtocol(NameNodeProxies.java:418)
> >         at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:314)
> >         at org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider.getProxy(ConfiguredFailoverProxyProvider.java:124)
> >         at org.apache.hadoop.io.retry.RetryInvocationHandler.<init>(RetryInvocationHandler.java:73)
> >         at org.apache.hadoop.io.retry.RetryInvocationHandler.<init>(RetryInvocationHandler.java:64)
> >         at org.apache.hadoop.io.retry.RetryProxy.create(RetryProxy.java:59)
> >         at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:181)
> >         at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:678)
> >         at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:619)
> >         at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:149)
> >         at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2669)
> >         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
> >         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:172)
> >         at org.apache.nifi.processors.hadoop.AbstractHadoopProcessor$1.run(AbstractHadoopProcessor.java:304)
> >         at org.apache.nifi.processors.hadoop.AbstractHadoopProcessor$1.run(AbstractHadoopProcessor.java:301)
> >         at java.security.AccessController.doPrivileged(Native Method)
> >         at javax.security.auth.Subject.doAs(Subject.java:422)
> >         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
> >         at org.apache.nifi.processors.hadoop.AbstractHadoopProcessor.getFileSystemAsUser(AbstractHadoopProcessor.java:301)
> >         at org.apache.nifi.processors.hadoop.AbstractHadoopProcessor.resetHDFSResources(AbstractHadoopProcessor.java:268)
> >         at org.apache.nifi.processors.hadoop.AbstractHadoopProcessor.abstractOnScheduled(AbstractHadoopProcessor.java:200)
> >         at org.apache.nifi.processors.hadoop.PutHDFS.onScheduled(PutHDFS.java:191)
> >         ... 16 common frames omitted
> >
> >
>
