It seems that when I run in cluster (mapreduce) mode, my job hangs.

a = LOAD '/user/robert/nyse.csv' USING PigStorage(',') AS
    (exchange:chararray, stock_symbol:chararray, date:chararray,
     stock_price_open:double, stock_price_high:double, stock_price_low:double,
     stock_price_close:double, stock_volume:int, stock_price_adj_close:double);
DUMP a;

When I dump a, it just hangs, i.e.:

2014-08-25 15:10:14,262 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 1 map-reduce job(s) waiting for submission.
2014-08-25 15:10:14,762 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 0% complete
2014-08-25 15:10:15,267 [Thread-4] INFO org.apache.hadoop.mapreduce.lib.input.FileInputFormat - Total input paths to process : 1
2014-08-25 15:10:15,267 [Thread-4] INFO org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths to process : 1
...
2014-08-25 15:10:15,295 [Thread-4] INFO org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths (combined) to process : 1
2014-08-25 15:10:17,307 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - HadoopJobId: job_201408251438_0006
2014-08-25 15:10:17,307 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - More information at: http://hdfs:50030/jobdetails.jsp?jobid=job_201408251438_0006

The above URL does not work, and looking in the logs directory, all I see are
XML-related logs.

This is a tiny file, about 186 KB.

If I run the same script in local mode (pig -x local), it runs fine.
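For reference, here is how I'm invoking Pig in each mode (the script file name nyse.pig is just a placeholder for the statements above):

```shell
# Local mode: runs against the local filesystem with no cluster -- this works.
pig -x local nyse.pig

# MapReduce (cluster) mode: submits a job to the JobTracker -- this hangs at 0% complete.
pig -x mapreduce nyse.pig
```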

robert@dime910 ~ $ hadoop dfs -ls
-rw-r--r--   3 robert supergroup     186505 2014-08-25 13:48 /user/robert/nyse.csv

I have a fully functioning system, but something is wrong and I'm not sure
where to look. The logs seem like the right place, but which ones?

Any ideas or direction would be greatly appreciated.
Thanks
Bob
