[ https://issues.apache.org/jira/browse/HIVE-487?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Todd Lipcon updated HIVE-487:
-----------------------------

    Attachment: hive-487-runtime.patch

Attaching a new patch that makes the shim selection happen at runtime. Here's 
the general idea:

- the shims/build.xml now uses Ivy to download tarballs for Hadoop 17, 18, 19, 
and 20. It builds each of the shim sources (from src/0.XX/), which have now 
been renamed so that each class name is unique (e.g. Hadoop20Shims.class).
- The results of all of these builds end up in a single hive_shims.jar
- Instead of being classes with all-static methods, the shim classes are now 
non-static and are instantiated through a ShimLoader class in a new 
shims/src/common/ directory.
- ShimLoader uses o.a.h.util.VersionInfo to determine the Hadoop version on 
the classpath, and reflection to instantiate the proper shims for that 
version (see the sketch just below this list).
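
Roughly, the runtime selection looks something like the sketch below. The 
nested HadoopShims placeholder, the exact shim class names, and the 
version-to-class lookup here are illustrative rather than the exact code in 
the patch:

package org.apache.hadoop.hive.shims;

import java.util.HashMap;
import java.util.Map;

import org.apache.hadoop.util.VersionInfo;

public abstract class ShimLoader {

  // Placeholder for the HadoopShims interface in shims/src/common/; the real
  // interface declares the version-specific operations the rest of Hive calls.
  public interface HadoopShims {
  }

  // Illustrative mapping from Hadoop major version to shim class name.
  private static final Map<String, String> SHIM_CLASSES =
      new HashMap<String, String>();
  static {
    SHIM_CLASSES.put("0.17", "org.apache.hadoop.hive.shims.Hadoop17Shims");
    SHIM_CLASSES.put("0.18", "org.apache.hadoop.hive.shims.Hadoop18Shims");
    SHIM_CLASSES.put("0.19", "org.apache.hadoop.hive.shims.Hadoop19Shims");
    SHIM_CLASSES.put("0.20", "org.apache.hadoop.hive.shims.Hadoop20Shims");
  }

  private static HadoopShims hadoopShims;

  public static synchronized HadoopShims getHadoopShims() {
    if (hadoopShims == null) {
      // VersionInfo reports the version of the Hadoop jars on the classpath,
      // e.g. "0.20.0"; strip it down to the major version used as the key.
      String full = VersionInfo.getVersion();
      String[] parts = full.split("\\.");
      String major = parts[0] + "." + parts[1];

      String className = SHIM_CLASSES.get(major);
      if (className == null) {
        throw new RuntimeException("Unsupported Hadoop version: " + full);
      }
      try {
        // Reflection keeps hive_shims.jar loadable against any of the
        // supported Hadoop versions: only the matching class is instantiated.
        hadoopShims = (HadoopShims) Class.forName(className).newInstance();
      } catch (Exception e) {
        throw new RuntimeException(
            "Could not instantiate shims for Hadoop " + full, e);
      }
    }
    return hadoopShims;
  }
}

Callers would then go through something like ShimLoader.getHadoopShims() 
instead of hitting version-specific Hadoop APIs directly, which is what lets 
a single hive_shims.jar work against 17 through 20.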

I've tested this against pseudo-distributed 18 and 20 clusters and it seemed 
to work. Unit tests also appear to work, though I haven't had a chance to let 
them run all the way through. I have not tested HWI at all yet.

Still TODO:
- I may have broken Eclipse integration somewhat. I'm hoping someone who uses 
Eclipse can fix up whatever project configuration is needed there.
- I would appreciate a review of the javadocs for the HadoopShims interface. I 
don't know the specifics of some of the 17 behavior, so my docs are lame and 
vague.
- I think build.xml needs to be modified a bit more so that the output 
directory/tarball no longer includes ${hadoop.version} in it. Additionally, 
there are one or two Ant conditionals based on the Hadoop version - I haven't 
had a chance to investigate them, but they should probably be removed.
- I think we should have a policy that hadoop.version defaults to the most 
recent Apache Hadoop release - right now it defaults to 0.19.
- To compile the shims we're downloading the entire release tarballs from the 
Apache mirrors. It would be nicer if we could just download the specific jars 
we need to compile against, but that might be a pipe dream.

> Hive does not compile with Hadoop 0.20.0
> ----------------------------------------
>
>                 Key: HIVE-487
>                 URL: https://issues.apache.org/jira/browse/HIVE-487
>             Project: Hadoop Hive
>          Issue Type: Bug
>    Affects Versions: 0.3.0
>            Reporter: Aaron Kimball
>            Assignee: Justin Lynn
>            Priority: Blocker
>             Fix For: 0.4.0
>
>         Attachments: dynamic-proxy.tar.gz, HIVE-487-2.patch, 
> hive-487-jetty-2.diff, hive-487-jetty.patch, hive-487-runtime.patch, 
> hive-487-with-cli-changes.2.patch, hive-487-with-cli-changes.3.patch, 
> hive-487-with-cli-changes.patch, hive-487.3.patch, hive-487.4.patch, 
> HIVE-487.patch, hive-487.txt, hive-487.txt, jetty-patch.patch, 
> junit-patch1.html, patch-487.txt
>
>
> Attempting to compile Hive with Hadoop 0.20.0 fails:
> aa...@jargon:~/src/ext/svn/hive-0.3.0$ ant -Dhadoop.version=0.20.0 package
> (several lines elided)
> compile:
>      [echo] Compiling: hive
>     [javac] Compiling 261 source files to 
> /home/aaron/src/ext/svn/hive-0.3.0/build/ql/classes
>     [javac] 
> /home/aaron/src/ext/svn/hive-0.3.0/build/ql/java/org/apache/hadoop/hive/ql/exec/ExecDriver.java:94:
>  cannot find symbol
>     [javac] symbol  : method getCommandLineConfig()
>     [javac] location: class org.apache.hadoop.mapred.JobClient
>     [javac]       Configuration commandConf = 
> JobClient.getCommandLineConfig();
>     [javac]                                            ^
>     [javac] 
> /home/aaron/src/ext/svn/hive-0.3.0/build/ql/java/org/apache/hadoop/hive/ql/io/HiveInputFormat.java:241:
>  cannot find symbol
>     [javac] symbol  : method validateInput(org.apache.hadoop.mapred.JobConf)
>     [javac] location: interface org.apache.hadoop.mapred.InputFormat
>     [javac]       inputFormat.validateInput(newjob);
>     [javac]                  ^
>     [javac] Note: Some input files use or override a deprecated API.
>     [javac] Note: Recompile with -Xlint:deprecation for details.
>     [javac] Note: Some input files use unchecked or unsafe operations.
>     [javac] Note: Recompile with -Xlint:unchecked for details.
>     [javac] 2 errors
> BUILD FAILED
> /home/aaron/src/ext/svn/hive-0.3.0/build.xml:145: The following error 
> occurred while executing this line:
> /home/aaron/src/ext/svn/hive-0.3.0/ql/build.xml:135: Compile failed; see the 
> compiler error output for details.
