Hi Nitin,

I have executed a few test cases, and here are my observations.

a) I am not using any utilities to upgrade to 0.11; I am just executing the
same HQL in 0.11 that works in 0.8.1.6.

b) My join involves a view that uses a UDAF
(https://github.com/scribd/hive-udaf-maxrow).
     When I try to join this view (with the UDAF) to another table, I get
the errors below:


java.lang.InstantiationException:
org.apache.hadoop.hive.ql.parse.ASTNodeOrigin
Continuing ...
java.lang.RuntimeException: failed to evaluate: <unbound>=Class.new();
Continuing ...
java.lang.InstantiationException:
org.apache.hadoop.hive.ql.parse.ASTNodeOrigin
Continuing ...
java.lang.RuntimeException: failed to evaluate: <unbound>=Class.new();
Continuing ...
java.lang.InstantiationException:
org.apache.hadoop.hive.ql.parse.ASTNodeOrigin
Continuing ...
java.lang.RuntimeException: failed to evaluate: <unbound>=Class.new();
Continuing ...
java.lang.InstantiationException:
org.apache.hadoop.hive.ql.parse.ASTNodeOrigin
Continuing ...
java.lang.RuntimeException: failed to evaluate: <unbound>=Class.new();
----------------------------------------------------------------------------------------

My query looks like:

select v.* from view1 v join table1 t  on t.col1=v.col1

The same query works in 0.8.1.6 without any issues.
The query also works in 0.11 if I remove the UDAF from the view.
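
For context, the view itself looks roughly like the sketch below (a
simplified example only; the table, columns, and the maxrow alias are
placeholders, assuming the UDAF from the linked repo is registered as
maxrow):

-- simplified sketch; real table/column names differ
create view view1 as
select col1, maxrow(updated_at, col2, col3) as latest
from base_table
group by col1;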

Do I need to rebuild the UDAF separately for 0.11? (My guess at the steps
is sketched below.)
In general, I expect HQL that works in 0.8.1.6 to work in 0.11 without
any code changes. Please correct me if my assumption is incorrect.
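
If a rebuild is needed, I assume it would look something like this (the
jar path and function class below are placeholders; I have not confirmed
the exact class name in the repo):

-- rebuild the UDAF jar against the Hive 0.11 libraries, then register it:
add jar /path/to/hive-udaf-maxrow.jar;
create temporary function maxrow as 'com.scribd.hive.udaf.GenericUDAFMaxRow';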

Thanks,
Pandeeswaran



On Wed, Aug 7, 2013 at 9:00 PM, Nitin Pawar <nitinpawar...@gmail.com> wrote:

> Will it be possible for you to share your query? And if you are using any
> custom UDF, the Java code for it as well?
>
> How are you upgrading from hive-0.8 to hive-0.11?
>
> AWS announced EMR support for Hive 0.11 four days ago. Can
> you check whether you need to change something on the EMR side?
>
>
> On Wed, Aug 7, 2013 at 8:28 PM, pandees waran <pande...@gmail.com> wrote:
>
>> Hi Nitin,
>>
>> Nope! It ended with the error messages below:
>>
>> Examining task ID: task_201308070831_0010_m_000052 (and more) from job
>> job_201308070831_0010
>> Exception in thread "Thread-98" java.lang.ClassFormatError: Absent
>> Code attribute in method that is not native or abstract in class file
>> javax/servlet/http/HttpServlet
>>         at java.lang.ClassLoader.defineClass1(Native Method)
>>         at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>         at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>         at
>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>         at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>         at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>         at java.lang.ClassLoader.defineClass1(Native Method)
>>         at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>         at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>         at
>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>         at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>         at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>         at
>> org.apache.hadoop.hive.shims.Hadoop20SShims.getTaskAttemptLogUrl(Hadoop20SShims.java:49)
>>         at
>> org.apache.hadoop.hive.ql.exec.JobDebugger$TaskInfoGrabber.getTaskInfos(JobDebugger.java:190)
>>         at
>> org.apache.hadoop.hive.ql.exec.JobDebugger$TaskInfoGrabber.run(JobDebugger.java:146)
>>         at java.lang.Thread.run(Thread.java:662)
>> Counters:
>> FAILED: Execution Error, return code 2 from
>> org.apache.hadoop.hive.ql.exec.MapRedTask
>> MapReduce Jobs Launched:
>> Job 0: Map: 67  Reduce: 1   Cumulative CPU: 4172.93 sec   HDFS Read:
>> 43334 HDFS Write: 12982162918 SUCCESS
>> Job 1: Map: 51   HDFS Read: 0 HDFS Write: 0 FAIL
>> Total MapReduce CPU Time Spent: 0 days 1 hours 9 minutes 32 seconds 930
>> msec
>>
>>
>> But the same query works fine in Hive 0.8.1.6 without any issues.
>> I am working on the 0.11 upgrade and facing this issue.
>>
>> Thanks,
>> Pandeeswaran
>>
>> On 8/7/13, Nitin Pawar <nitinpawar...@gmail.com> wrote:
>> > Before applying the patch,
>> >
>> > can you confirm that the map join query worked fine and gave the results you
>> > wanted?
>> >
>> >
>> > On Wed, Aug 7, 2013 at 6:46 PM, Sathya Narayanan K <ksat...@live.com>
>> > wrote:
>> >
>> >> Hi,
>> >>
>> >> I am also facing the same issue. Could anyone please suggest whether we
>> >> can apply any patch?
>> >>
>> >> Thanks,
>> >> Sathya Narayanan
>> >>
>> >> From: pandees waran [mailto:pande...@gmail.com]
>> >> Sent: Wednesday, August 07, 2013 6:39 PM
>> >> To: user@hive.apache.org
>> >> Subject: Join issue in 0.11
>> >>
>> >>
>> >> Hi,
>> >>
>> >> I am facing the same issue as mentioned in the below JIRA:
>> >>
>> >> https://issues.apache.org/jira/browse/HIVE-3872
>> >>
>> >> I am using Amazon EMR with Hive 0.11.
>> >>
>> >> Do I need to apply any patch on top of 0.11 to fix this NPE issue?
>> >>
>> >>
>> >> --
>> >>
>> >> Thanks,
>> >> Pandeeswaran
>> >>
>> >
>> >
>> >
>> > --
>> > Nitin Pawar
>> >
>>
>>
>> --
>> Thanks,
>> Pandeeswaran
>>
>
>
>
> --
> Nitin Pawar
>



-- 
Thanks,
Pandeeswaran
