[ https://issues.apache.org/jira/browse/NIFI-2828?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15546250#comment-15546250 ]

ASF GitHub Bot commented on NIFI-2828:
--------------------------------------

Github user bbende commented on the issue:

    https://github.com/apache/nifi/pull/1075
  
    Tried this out against an HDP 2.4 sandbox and got an exception that boiled down to:
    
    ```
    Caused by: java.io.IOException: No FileSystem for scheme: hdfs
        at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2584) ~[hadoop-common-2.6.2.jar:na]
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2591) ~[hadoop-common-2.6.2.jar:na]
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91) ~[hadoop-common-2.6.2.jar:na]
        at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2630) ~[hadoop-common-2.6.2.jar:na]
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2612) ~[hadoop-common-2.6.2.jar:na]
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370) ~[hadoop-common-2.6.2.jar:na]
        at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296) ~[hadoop-common-2.6.2.jar:na]
        at org.apache.hadoop.hive.ql.io.orc.OrcRecordUpdater.<init>(OrcRecordUpdater.java:221) ~[hive-exec-1.2.1.jar:1.2.1]
        at org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat.getRecordUpdater(OrcOutputFormat.java:292) ~[hive-exec-1.2.1.jar:1.2.1]
        at org.apache.hive.hcatalog.streaming.AbstractRecordWriter.createRecordUpdater(AbstractRecordWriter.java:141) ~[hive-hcatalog-streaming-1.2.1.jar:1.2.1]
        at org.apache.hive.hcatalog.streaming.AbstractRecordWriter.newBatch(AbstractRecordWriter.java:121) ~[hive-hcatalog-streaming-1.2.1.jar:1.2.1]
        ... 10 common frames omitted
    ```
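    
    For context on the failure: `FileSystem.getFileSystemClass()` resolves the `hdfs` scheme either from implementations registered by hadoop-hdfs (via `META-INF/services`) or from the `fs.hdfs.impl` configuration key, so "No FileSystem for scheme: hdfs" usually means hadoop-hdfs isn't visible to the classloader doing the lookup. A minimal standalone sketch of the usual check/workaround (the NameNode address is a placeholder, and setting `fs.hdfs.impl` only helps when the jar is actually on the classpath):
    
    ```java
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    
    import java.io.IOException;
    import java.net.URI;
    
    public class HdfsSchemeCheck {
        public static void main(String[] args) throws IOException {
            Configuration conf = new Configuration();
            // Explicitly map the "hdfs" scheme to its implementation class.
            // This works around unmerged META-INF/services entries; it does not
            // help if hadoop-hdfs-*.jar is missing from the classpath entirely.
            conf.set("fs.hdfs.impl", "org.apache.hadoop.hdfs.DistributedFileSystem");
            // "sandbox:8020" is a placeholder NameNode address for illustration.
            FileSystem fs = FileSystem.get(URI.create("hdfs://sandbox:8020"), conf);
            System.out.println(fs.exists(new Path("/tmp")));
        }
    }
    ```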
    Comparing the JARs that get included with the Hadoop Libraries NAR vs. the Hive NAR, there appear to be some differences.
    
    Hadoop Libraries NAR:
    hadoop-annotations-2.6.2.jar
    hadoop-auth-2.6.2.jar
    hadoop-client-2.6.2.jar
    hadoop-common-2.6.2.jar
    hadoop-hdfs-2.6.2.jar
    hadoop-mapreduce-client-app-2.6.2.jar
    hadoop-mapreduce-client-common-2.6.2.jar
    hadoop-mapreduce-client-core-2.6.2.jar
    hadoop-mapreduce-client-jobclient-2.6.2.jar
    hadoop-mapreduce-client-shuffle-2.6.2.jar
    hadoop-yarn-api-2.6.2.jar
    hadoop-yarn-client-2.6.2.jar
    hadoop-yarn-common-2.6.2.jar
    hadoop-yarn-server-common-2.6.2.jar
    
    Hive NAR:
    hadoop-annotations-2.6.2.jar
    hadoop-auth-2.6.2.jar
    hadoop-common-2.6.2.jar
    hadoop-mapreduce-client-core-2.6.2.jar
    hadoop-yarn-api-2.6.2.jar
    hadoop-yarn-common-2.6.2.jar
    hadoop-yarn-server-applicationhistoryservice-2.6.0.jar
    hadoop-yarn-server-common-2.6.0.jar
    hadoop-yarn-server-resourcemanager-2.6.0.jar
    hadoop-yarn-server-web-proxy-2.6.0.jar
    
    I think the Hive NAR at least needs the hadoop-hdfs jar, but I'm not sure what else.
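    
    A quick way to confirm what the Hive NAR's classloader can actually see (an illustrative probe, not NiFi code; the class name comes from the stack trace above) would be something like:
    
    ```java
    // Illustrative probe: checks whether hadoop-hdfs is visible to this
    // classloader and prints which jar supplies DistributedFileSystem.
    public class HdfsJarProbe {
        public static void main(String[] args) {
            try {
                Class<?> dfs = Class.forName("org.apache.hadoop.hdfs.DistributedFileSystem");
                // e.g. prints .../hadoop-hdfs-2.6.2.jar if the jar made it into the NAR
                System.out.println(dfs.getProtectionDomain().getCodeSource().getLocation());
            } catch (ClassNotFoundException e) {
                System.out.println("hadoop-hdfs is not on this classpath");
            }
        }
    }
    ```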


> SelectHiveQL and PutHiveQL fail with NoClassDefFoundError when using HTTP transport
> ------------------------------------------------------------------------------------
>
>                 Key: NIFI-2828
>                 URL: https://issues.apache.org/jira/browse/NIFI-2828
>             Project: Apache NiFi
>          Issue Type: Bug
>            Reporter: Joey Frazee
>            Assignee: Matt Burgess
>             Fix For: 1.1.0
>
>
> SelectHiveQL and PutHiveQL don't currently work with HTTP transport. There 
> appears to be a class loader problem resulting in 
> java.lang.NoClassDefFoundError: Could not initialize class 
> org.apache.http.conn.ssl.SSLConnectionSocketFactory.
> This looks like a conflict with the Apache HttpClient (httpclient) version pulled in by 
> hadoop-common. Removing the hadoop-libraries .nar dependency and using provided 
> scope for hadoop-common appears to fix the issue, but I haven't done any 
> rigorous testing, so I'm not sure whether there are other consequences.
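
A note on the "Could not initialize class" form of the error: it means an earlier static-initializer failure was swallowed, and later lookups only report the class as unusable. A hedged diagnostic sketch (class names taken from the error above; everything else is illustrative and not part of NiFi) that surfaces the original failure and shows which jars supply the httpclient and httpcore classes, to spot a version mismatch:

```java
// Hedged diagnostic sketch: forces class initialization once to surface the
// real ExceptionInInitializerError behind "Could not initialize class", and
// prints where the httpclient/httpcore classes are loaded from.
public class HttpClientConflictProbe {
    public static void main(String[] args) {
        printOrigin("org.apache.http.conn.ssl.SSLConnectionSocketFactory"); // httpclient
        printOrigin("org.apache.http.HttpVersion");                         // httpcore
    }

    private static void printOrigin(String name) {
        try {
            Class<?> c = Class.forName(name); // triggers the static initializer
            System.out.println(name + " loaded from "
                    + c.getProtectionDomain().getCodeSource().getLocation());
        } catch (Throwable t) {
            // This is the root cause that later shows up only as
            // "Could not initialize class".
            System.out.println(name + " failed to initialize:");
            t.printStackTrace();
        }
    }
}
```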


