Oh, I see what my problem is... my PigPen
<http://wiki.apache.org/pig/PigPen> isn't configured correctly, so it
isn't highlighting any errors...

Is there any work toward something like the C language's '#include' in Pig?
My large Pig script is actually developed as several smaller Pig files.
Individually those files do not run, because each depends on the output of
the previous scripts, but logically they are separate because each step does
something different.

If '#include' were supported, I could edit the original sources and debug in
PigPen, instead of manually concatenating the files and editing them outside
the repository.
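In the meantime, a tiny preprocessor can stand in for '#include'. Below is a
minimal sketch in Python, assuming a hypothetical '%include other.pig'
directive in the source files (the directive is my own convention, not a Pig
feature); it splices the smaller files into one flat script that can be fed
to Pig or pasted into PigPen:

```python
import re
from pathlib import Path

# Hypothetical directive, e.g.:  %include cleanup.pig
INCLUDE_RE = re.compile(r"^\s*%include\s+(\S+)\s*$")

def expand_includes(path, seen=None):
    """Recursively splice '%include file.pig' lines into one flat script."""
    path = Path(path)
    seen = set() if seen is None else seen
    if path in seen:                      # guard against include cycles
        return ""
    seen.add(path)
    out = []
    for line in path.read_text().splitlines():
        m = INCLUDE_RE.match(line)
        if m:
            # Includes are resolved relative to the including file.
            out.append(expand_includes(path.parent / m.group(1), seen))
        else:
            out.append(line)
    return "\n".join(out)
```

The expanded text can then be written to a scratch file and run with pig,
so the individual sources stay small and stay under version control.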




On Wed, Mar 10, 2010 at 9:34 AM, hc busy <hc.b...@gmail.com> wrote:

>
> Okay, just a quick update: I eventually found the actual Java error in the
> Hadoop logs, but it was equally confusing. It complains about accessing the
> 4th element of a tuple that has only one item. But it still doesn't say
> which line of Pig Latin introduced the error.
>
> I commented out portions of my large Pig script until I found the offending
> line... I wish there were an easier way to debug this...
>
>
> On Mon, Mar 8, 2010 at 5:25 PM, hc busy <hc.b...@gmail.com> wrote:
>
>>
>> Guys, I just ran into a weird exception 500 lines into writing a Pig
>> script... The error is attached below. Does anybody have any idea how to
>> debug this? I don't even know which step of my 500-line Pig script caused
>> the error.
>>
>> Any suggestions on how to track down the offending operation?
>>
>> Thanks in advance!
>> Pig Stack Trace
>> ---------------
>> ERROR 6017: Execution failed, while processing
>> hdfs://tasktracker:44445/tmp/temp1581022765/tmp939224290,
>> hdfs://tasktracker:44445/tmp/temp1581022765/tmp-1028111033,
>> hdfs://tasktracker:44445/tmp/temp1581022765/tmp-198156265,
>> hdfs://tasktracker:44445/tmp/temp1581022765/tmp-72050900,
>> hdfs://tasktracker:44445/tmp/temp1581022765/tmp-141993299,
>> hdfs://tasktracker:44445/tmp/temp1581022765/tmp2135611534,
>> hdfs://tasktracker:44445/tmp/temp1581022765/tmp-2093411384,
>> hdfs://tasktracker:44445/tmp/temp1581022765/tmp250626628,
>> hdfs://tasktracker:44445/tmp/temp1581022765/tmp2100381358,
>> hdfs://tasktracker:44445/tmp/temp1581022765/tmp167762091
>>
>> org.apache.pig.backend.executionengine.ExecException: ERROR 6017:
>> Execution failed, while processing
>> hdfs://tasktracker:44445/tmp/temp1581022765/tmp939224290,
>> hdfs://tasktracker:44445/tmp/temp1581022765/tmp-1028111033,
>> hdfs://tasktracker:44445/tmp/temp1581022765/tmp-198156265,
>> hdfs://tasktracker:44445/tmp/temp1581022765/tmp-72050900,
>> hdfs://tasktracker:44445/tmp/temp1581022765/tmp-141993299,
>> hdfs://tasktracker:44445/tmp/temp1581022765/tmp2135611534,
>> hdfs://tasktracker:44445/tmp/temp1581022765/tmp-2093411384,
>> hdfs://tasktracker:44445/tmp/temp1581022765/tmp250626628,
>> hdfs://tasktracker:44445/tmp/temp1581022765/tmp2100381358,
>> hdfs://tasktracker:44445/tmp/temp1581022765/tmp167762091
>>         at
>> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.launchPig(MapReduceLauncher.java:181)
>>         at
>> org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.execute(HExecutionEngine.java:265)
>>         at
>> org.apache.pig.PigServer.executeCompiledLogicalPlan(PigServer.java:777)
>>         at org.apache.pig.PigServer.execute(PigServer.java:770)
>>         at org.apache.pig.PigServer.access$100(PigServer.java:89)
>>         at org.apache.pig.PigServer$Graph.execute(PigServer.java:947)
>>         at org.apache.pig.PigServer.executeBatch(PigServer.java:249)
>>         at
>> org.apache.pig.tools.grunt.GruntParser.executeBatch(GruntParser.java:115)
>>         at
>> org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:172)
>>         at
>> org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:144)
>>         at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:89)
>>         at org.apache.pig.Main.main(Main.java:320)
>>
>> ================================================================================
>>
>
>
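The comment-out hunt described above can also be mechanized as a binary
search over the script's statements. Here is a minimal sketch, assuming a
`fails(script_text)` callback that runs the candidate script and reports
whether it errors (the callback is a stand-in here, not a real Pig
invocation), and assuming every prefix of the script is runnable on its own:

```python
def first_failing_line(lines, fails):
    """Return the 1-based index of the first line whose prefix fails.

    `lines` is the script split into statements, one per entry;
    `fails(text)` runs the candidate script and returns True on error.
    Assumes the failure is deterministic and present in the full script.
    """
    lo, hi = 1, len(lines)     # invariant: the first `hi` lines fail
    while lo < hi:
        mid = (lo + hi) // 2
        if fails("\n".join(lines[:mid])):
            hi = mid           # failure already in this shorter prefix
        else:
            lo = mid + 1       # failure introduced by a later line
    return lo
```

With roughly log2(500), i.e. about 9, runs this narrows a 500-line script
down to the single offending statement, instead of commenting sections out
by hand.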
