Hi,

thanks for your reply. I read about the expected behavior on the front-end,
and I am getting the NPE on the back-end: the Mappers log the exception
during execution.

I am currently digging through the debug messages. What should I look out
for? There are a bunch of

[main] DEBUG org.apache.hadoop.conf.Configuration  - java.io.IOException:
config()

log messages, but I recall those being "normal", though I don't remember
why.
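If the NPE really comes from the null JobConf, a defensive null check in
exec() would at least pinpoint it. A minimal sketch of that guard pattern
(the class and method names here are placeholders, and Configuration is
stubbed with java.util.Properties so the sketch compiles without
Pig/Hadoop on the classpath):

```java
import java.util.Properties;

public class JobConfGuard {

    // Stand-in for UDFContext.getUDFContext().getJobConf(), which is
    // reported to return null on the front-end (planning phase).
    static Properties getJobConf(boolean backEnd) {
        return backEnd ? new Properties() : null;
    }

    // Guard against the null JobConf instead of dereferencing it blindly.
    static String describe(boolean backEnd) {
        Properties jobConf = getJobConf(backEnd);
        if (jobConf == null) {
            // Front-end call: jobConf is not yet populated, avoid the NPE.
            return "jobConf not available (front-end?)";
        }
        return "jobConf present: " + jobConf;
    }

    public static void main(String[] args) {
        System.out.println(describe(false));
        System.out.println(describe(true));
    }
}
```

In a real UDF the same check would go at the top of exec(), logging (or
returning null) on the front-end path instead of crashing the plan.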

Regards


2013/10/30 Cheolsoo Park <piaozhe...@gmail.com>

> Hi,
>
> Are you getting NPE on the front-end or the back-end? Sounds like jobConf
> is not added to UDFContext, which is expected on the front-end. Please see
> the comments in getJobConf() and addJobConf() in the source code:
>
>
> https://svn.apache.org/repos/asf/pig/trunk/src/org/apache/pig/impl/util/UDFContext.java
>
> Thanks,
> Cheolsoo
>
>
> On Wed, Oct 30, 2013 at 9:57 AM, Henning Kropp <henning.kr...@gmail.com
> >wrote:
>
> > Hi,
> >
> > I am stuck. In my Java UDF extending EvalFunc, the following code throws
> > an NPE in exec() when executed in -x mapred mode:
> >
> > Configuration jobConf = UDFContext.getUDFContext().getJobConf();
> > System.err.println(jobConf.toString());
> >
> > I did not find any useful information as to why my JobConf is always
> > null. All I found is that this is the right way to get the JobConf in a
> > UDF, and that the behavior differs when running locally (per a JIRA
> > issue).
> >
> > Any ideas? I am running it on a very old Hadoop version, 0.20.2. Are
> > there any known issues? I use Pig 0.11.1.
> >
> > Many thanks in advance
> >
> > PS: Just found someone with the same issue
> > http://stackoverflow.com/questions/18795008/accessing-hdfs-from-pig-udf
> >
>
