Hi Jagat Singh,
I have permission to write files there.
My script contains loops, which means several jobs will be created.
It is weird that the error happens irregularly: sometimes the first job
fails, and sometimes a job fails only after several successful jobs.
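For reference, a loop in an embedded (Jython) Pig script launches one MapReduce job per iteration, which is why failures can show up at different points in the run. A minimal sketch of that pattern, run as `pig script.py` — the paths and parameter names here are hypothetical, not taken from the thread:

```
#!/usr/bin/python
# Embedded Pig (Jython): compile the Pig Latin once,
# bind different parameters on each loop iteration.
from org.apache.pig.scripting import Pig

P = Pig.compile("""
A = LOAD '$INPUT' USING AvroStorage();
STORE A INTO '$OUTPUT';
""")

for i in range(3):
    # Each runSingle() submits a separate job to the cluster.
    result = P.bind({'INPUT':  '/data/in/part-%d' % i,
                     'OUTPUT': '/data/out/iter-%d' % i}).runSingle()
    if not result.isSuccessful():
        raise RuntimeError('iteration %d failed' % i)
```

Since each iteration is an independent job, an intermittent problem (for example, temp-folder permissions on one node) can hit any iteration, which would match the irregular failures described above.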
2012/11/23 Jagat Singh
The first check I would do is the permissions on this temp folder.
On Sat, Nov 24, 2012 at 2:19 PM, Jieru Shi wrote:
> Hi
> I'm using embedded Pig to implement a graph algorithm.
> It works fine in local mode, but when I run it on the Hadoop
> cluster,
> some error message always pops up.
Hi,
I don't have a system to test on right now, but I have been passing it
as a parameter with -p and it works.
Change the load line to accept a parameter, like avro = LOAD '$INPUT' USING
AvroStorage();
then run: bin/pig -p INPUT="/data/2012/trace_ejb3/2012-01-0[12].avro"
I think if you don't give double quotes, the shell may expand the glob
before Pig sees it.
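Putting the suggestion together as one small sketch (the script file name and output path below are hypothetical, only the LOAD line and -p flag come from the thread):

```
-- trace.pig: $INPUT is substituted at launch time
avro = LOAD '$INPUT' USING AvroStorage();
STORE avro INTO '/tmp/trace_out';
```

Invoked as:

```
bin/pig -p INPUT="/data/2012/trace_ejb3/2012-01-0[12].avro" trace.pig
```

The quotes keep the shell from expanding the bracket pattern locally; Pig passes the glob through so it is matched against HDFS paths, where [12] selects the 2012-01-01.avro and 2012-01-02.avro files listed below.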
Hello,
I have the following files on HDFS:
-rw-r--r-- 3 hdfs supergroup 22989179 2012-11-22 11:17
/data/2012/trace_ejb3/2012-01-01.avro
-rw-r--r-- 3 hdfs supergroup 240551819 2012-11-22 14:27
/data/2012/trace_ejb3/2012-01-02.avro
-rw-r--r-- 3 hdfs supergroup 324464635 2012-11-22 18:2