Re: HDFS "file" missing a part-file

2012-10-02 Thread Robert Molina
What I guess might be happening is that your data contains text that Pig is not fully parsing, because the data contains characters that Pig uses as delimiters (i.e. commas and curly brackets). Thus, you can probably take a look at the data and see if you can find any of the characters
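One quick way to check for such rows is a small scan over the raw dump. This is a minimal Python sketch, not anything from the thread: the tab-separated layout and the exact delimiter set are assumptions.

```python
# Characters that Pig's text loader treats as structural when parsing
# complex types: tuple parens, bag/map braces, and field-separating commas.
SUSPECT = set("(){}[],")

def suspect_fields(line):
    """Return (index, value) pairs for fields containing delimiter chars,
    assuming a tab-separated record as written by PigStorage's default."""
    return [(i, f) for i, f in enumerate(line.rstrip("\n").split("\t"))
            if any(c in SUSPECT for c in f)]

# Example: the second field smuggles in an unbalanced curly bracket.
print(suspect_fields("alice\tsaid {hello\t42"))
# → [(1, 'said {hello')]
```

Running this over the stored file (e.g. via `hadoop fs -cat ... | python check.py`) would surface any records whose free-text fields could confuse the parser.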

Re: HDFS "file" missing a part-file

2012-10-02 Thread Björn-Elmar Macek
Hi again, I executed a slightly different script that included some more operations. The logs look similar, but this time I have two attempt files for the same job package: (1) _temporary/_attempt_201210021204_0001_r_01_0/part-r-1 (2) _temporary/_attempt_201210021204_0001_r_01

Re: HDFS "file" missing a part-file

2012-10-01 Thread Björn-Elmar Macek
The script I now want to execute looks like this: x = load 'tag_count_ts_pro_userpair' as (group:tuple(),cnt:int,times:bag{t:tuple(c:chararray)}); y = foreach x generate *, moins.daysFromStart('2011-06-01 00:00:00', times); store y into 'test_daysFromStart'; The problem is that I do not h
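For reference, under PigStorage's default tab delimiter a record matching that declared schema would have to serialize roughly like this (a hedged illustration; the field values are invented, only the bracket structure follows the schema):

```text
(u1,u2)	3	{(2011-06-01 00:10:00),(2011-06-01 00:20:00)}
```

A stray `(`, `)`, `{`, `}`, or comma inside the chararray values would break this shape and make the load fail or silently misparse.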

Re: HDFS "file" missing a part-file

2012-10-01 Thread Robert Molina
It seems that the previous Pig script may not have generated the output data or written it correctly to HDFS. Can you provide the Pig script you are trying to run? Also, for the original script that ran and generated the file, can you verify whether that job had any failed tasks? On Mon, Oct 1, 2012 at 10:

Re: HDFS "file" missing a part-file

2012-10-01 Thread Björn-Elmar Macek
Hi Robert, the exception I see in the output of the Grunt shell and in the Pig log respectively is: Backend error message - java.util.EmptyStackException at java.util.Stack.peek(Stack.java:102) at org.apache.pig.builtin.Utf8StorageConverter.consumeTuple(U
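The stack trace points at Utf8StorageConverter's stack-based parsing of the nested `( )` / `{ }` text while reading the complex fields: an unmatched closer pops an already-empty stack, which is exactly how `java.util.Stack.peek()` ends up throwing EmptyStackException. A toy Python sketch of that failure mode (this parser is an illustration, not Pig's actual code):

```python
def consume_nested(text):
    """Toy bracket-matching reader in the spirit of a stack-based
    tuple/bag parser: every closer pops the stack, so a stray ')' or '}'
    underflows it, analogous to EmptyStackException in the real converter."""
    stack = []
    closes = {")": "(", "}": "{"}
    for c in text:
        if c in "({":
            stack.append(c)
        elif c in closes:
            if not stack or stack.pop() != closes[c]:
                raise IndexError("stack underflow or mismatch at %r" % c)
    return not stack  # True only if all brackets balanced

print(consume_nested("{(a),(b)}"))   # a well-formed bag of tuples
# → True
# consume_nested("{(a),(b)})")      # stray ')': raises IndexError
```

This is why a single record whose text fields contain an unbalanced bracket can kill the whole reduce attempt even though `dump` on simpler projections still works.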

Re: HDFS "file" missing a part-file

2012-10-01 Thread Robert Molina
Hi Bjorn, can you post the exception you are getting during the map phase? On Mon, Oct 1, 2012 at 9:11 AM, Björn-Elmar Macek wrote: > Hi, > > I am kind of unsure where to post this problem, but I think it is more > related to Hadoop than to Pig. > > By successfully executing a Pig script I crea

HDFS "file" missing a part-file

2012-10-01 Thread Björn-Elmar Macek
Hi, I am kind of unsure where to post this problem, but I think it is more related to Hadoop than to Pig. By successfully executing a Pig script I created a new file in my HDFS. Sadly though, I cannot use it for further processing except for "dump"ing and viewing the data: every data-manipul