Also, it seems like the UDF is being run on the client machine (I am using
beeline); no map reduce job gets spawned. I removed the LIMIT clause, since
that solved the issue for someone else on the mailing list, but still no
luck. I also looked at the MapredContext class's needConfigure method, and
it seems like our UDF should satisfy the condition check there, so the
configure method should be called. What are the top things I could check
to figure out why no job is being submitted and why the UDF is being run
on the local machine itself?
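
For context, here is a minimal sketch of the pattern described in the
quoted message below; the class name and the fallback value are made up
for illustration, and "xy.abc.something" stands in for our real custom
attributes:

import org.apache.hadoop.hive.ql.exec.MapredContext;
import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDF;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;

public class ConfiguredUDF extends GenericUDF {

  private String customValue;

  @Override
  public void configure(MapredContext ctx) {
    // Expected to be called by the runtime before evaluate(); the JobConf
    // carries the job configuration, including our custom xy.abc.* values.
    if (ctx != null && ctx.getJobConf() != null) {
      customValue = ctx.getJobConf().get("xy.abc.something");
    }
  }

  @Override
  public ObjectInspector initialize(ObjectInspector[] arguments)
      throws UDFArgumentException {
    return PrimitiveObjectInspectorFactory.javaStringObjectInspector;
  }

  @Override
  public Object evaluate(DeferredObject[] arguments) throws HiveException {
    // On Hive 0.14 we only ever see the fallback value here, because
    // configure() is never invoked.
    return customValue != null ? customValue : "not-configured";
  }

  @Override
  public String getDisplayString(String[] children) {
    return "configured_udf()";
  }
}

As far as I understand, configure() is only invoked when the UDF runs
inside an actual map or reduce task, so if the statement is executed as a
local fetch task the method would never run. That seems consistent with
what we are seeing, but I may be missing something.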

On Tue, Aug 25, 2015 at 11:32 AM, Rahul Sharma <kippy....@gmail.com> wrote:

> Hi Guys,
>
> We have a UDF which extends GenericUDF and does some configuration within
> the public void configure(MapredContext ctx) method.
>
> In the configure method, MapredContext gives access to the Hive
> configuration via the JobConf, which contains custom attributes of the
> form xy.abc.something. Reading these values is required for the semantics
> of the UDF.
>
> Everything works fine up to Hive 0.13; however, with Hive 0.14 (or 1.0)
> the configure method of the UDF is never called by the runtime, and hence
> the UDF cannot configure itself dynamically.
>
> Is this the intended behavior? If so, what is the new way to read the
> configuration of the map reduce job within the UDF?
>
> I would be grateful for any help.
>
