Hi Dave,

The same thing happened to me: even though we are not supposed to set the
environment variables for Pig, it needs them. So go to your sh file and edit
it with whatever your values are:

#!/bin/sh
export PIG_PATH=$HOME/bin/pig-0.7.0
export PIG_CLASSPATH=$PIG_PATH/pig-0.7.0-core.jar:$HOME/bin/hadoop-0.20.2/conf
export PIG_HADOOP_VERSION=0.20.2

I found these recommendations on the web, but they referenced old versions; I
tried them anyway and it worked (:
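For what it's worth, the setting Pig actually reads out of that Hadoop conf
directory is fs.default.name in core-site.xml; that is what makes the startup
log say hdfs://... instead of file:///. A minimal sketch (the hostname and
port below are placeholders, substitute your namenode's):

```xml
<configuration>
  <!-- Tells Pig/Hadoop clients where the HDFS namenode lives.
       Replace namenode-host:9000 with your cluster's values. -->
  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode-host:9000</value>
  </property>
</configuration>
```

If this file is on PIG_CLASSPATH, launching pig -x mapreduce should report
"Connecting to hadoop file system at: hdfs://namenode-host:9000".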

2010/7/1 Jeff Zhang <zjf...@gmail.com>

> Try to put the core-site.xml , hdfs-site.xml, mapred-site.xml under conf
> folder
>
>
>
> On Thu, Jul 1, 2010 at 1:58 PM, Dave Viner <davevi...@pobox.com> wrote:
>
> > Whenever I start up pig from the commandline, I see the same message from
> > both -x local and -x mapreduce:
> >
> > [main] INFO
>  org.apache.pig.backend.hadoop.executionengine.HExecutionEngine
> > - Connecting to hadoop file system at: file:///
> >
> > Somehow, that feels like it's not connecting to my running Hadoop
> cluster.
> >  Is there some way to verify that Pig is talking to my Hadoop cluster?
>  Is
> > there some additional setting that I need to use in order to "point" Pig
> to
> > my Hadoop cluster?
> >
> > Thanks
> > Dave Viner
> >
>
>
>
> --
> Best Regards
>
> Jeff Zhang
>
