cdmikechen commented on issue #774: Matching question of the version in Spark and Hive2
URL: https://github.com/apache/incubator-hudi/issues/774#issuecomment-516722052
 
 
   Another thing: I found that Hive2 uses Log4j 2 while Spark uses Log4j 1. If I submit a Spark task like
   ```bash
   spark-submit --class xxxxxx.Main --jars xxx.jar,hoodie-spark-bundle-0.4.8-SNAPSHOT.jar xxx/sparkserver.jar
   ```
   it reports this error:
   ```log
   ERROR StatusLogger Unrecognized format specifier [d]
   ERROR StatusLogger Unrecognized conversion specifier [d] starting at position 16 in conversion pattern.
   ERROR StatusLogger Unrecognized format specifier [thread]
   ERROR StatusLogger Unrecognized conversion specifier [thread] starting at position 25 in conversion pattern.
   ERROR StatusLogger Unrecognized format specifier [level]
   ERROR StatusLogger Unrecognized conversion specifier [level] starting at position 35 in conversion pattern.
   ERROR StatusLogger Unrecognized format specifier [logger]
   ERROR StatusLogger Unrecognized conversion specifier [logger] starting at position 47 in conversion pattern.
   ERROR StatusLogger Unrecognized format specifier [msg]
   ERROR StatusLogger Unrecognized conversion specifier [msg] starting at position 54 in conversion pattern.
   ERROR StatusLogger Unrecognized format specifier [n]
   ERROR StatusLogger Unrecognized conversion specifier [n] starting at position 56 in conversion pattern.
   Exception in thread "main" java.lang.AbstractMethodError: org.apache.logging.log4j.core.config.ConfigurationFactory.getConfiguration(Lorg/apache/logging/log4j/core/config/ConfigurationSource;)Lorg/apache/logging/log4j/core/config/Configuration;
           at org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:509)
           at org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:449)
   ...
   ...
   ```
   To fix this, I need to exclude some Hive dependencies in the POM, like:
   ```xml
       <dependency>
         <groupId>${hive.groupid}</groupId>
         <artifactId>hive-jdbc</artifactId>
         <version>${hive.version}</version>
         <exclusions>
           <exclusion>
             <groupId>org.eclipse.jetty.aggregate</groupId>
             <artifactId>jetty-all</artifactId>
           </exclusion>
           <exclusion>
             <groupId>org.apache.logging.log4j</groupId>
             <artifactId>log4j-1.2-api</artifactId>
           </exclusion>
           <exclusion>
             <groupId>org.apache.logging.log4j</groupId>
             <artifactId>log4j-web</artifactId>
           </exclusion>
           <exclusion>
             <groupId>org.apache.logging.log4j</groupId>
             <artifactId>log4j-slf4j-impl</artifactId>
           </exclusion>
           <exclusion>
             <groupId>${hive.groupid}</groupId>
             <artifactId>hive-exec</artifactId>
           </exclusion>
         </exclusions>
       </dependency>
   ```
   After that, Spark works.
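   To double-check which logging implementation is actually winning on the driver classpath (before and after the exclusions above), it can help to print where each logging class is loaded from. This is a standalone diagnostic sketch; `WhichJar` is a hypothetical helper class, not part of Hudi, Spark, or Hive:

   ```java
   // WhichJar.java - report the jar (CodeSource) each probe class is loaded from,
   // to spot Log4j 1.x vs Log4j 2 classpath conflicts.
   public class WhichJar {

       // Build a one-line-per-class report of where each probe class comes from.
       static String report(String... probes) {
           StringBuilder sb = new StringBuilder();
           for (String name : probes) {
               try {
                   Class<?> c = Class.forName(name);
                   java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
                   // Bootstrap classes (e.g. java.lang.*) have a null CodeSource.
                   sb.append(name).append(" -> ")
                     .append(src == null ? "(bootstrap)" : src.getLocation())
                     .append('\n');
               } catch (ClassNotFoundException e) {
                   sb.append(name).append(" -> not on classpath\n");
               }
           }
           return sb.toString();
       }

       public static void main(String[] args) {
           System.out.print(report(
               "org.apache.log4j.Logger",                                  // Log4j 1.x (Spark)
               "org.apache.logging.log4j.core.config.ConfigurationFactory" // Log4j 2 core (Hive2)
           ));
       }
   }
   ```

   Running this with `spark-submit` (same `--jars` as the failing job) shows exactly which bundle each logging class resolves to; if the Log4j 2 `ConfigurationFactory` still resolves after the POM exclusions, some other dependency is pulling it in.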
