You don't have to do that. You just need to copy the hdfs-site.xml,
mapred-site.xml and yarn-site.xml from the cluster configuration and put them
on your Eclipse classpath.
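
For reference, here is a minimal sketch of what the Eclipse-side code could
then look like, assuming those XML files are on the classpath; the class name
and the load/store paths are only placeholders:

    import java.io.IOException;
    import org.apache.pig.ExecType;
    import org.apache.pig.PigServer;

    public class PigFromEclipse {
        public static void main(String[] args) throws IOException {
            // With hdfs-site.xml, mapred-site.xml and yarn-site.xml on the
            // classpath, PigServer picks up fs.defaultFS and the YARN
            // ResourceManager addresses itself, so no Properties are set here.
            PigServer pig = new PigServer(ExecType.MAPREDUCE);

            // Placeholder paths, just to show that the script now runs
            // against the cluster rather than a local JVM.
            pig.registerQuery("A = LOAD '/tmp/input' AS (line:chararray);");
            pig.store("A", "/tmp/output");

            pig.shutdown();
        }
    }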

On Thu, Dec 18, 2014 at 6:09 PM, 李运田 <[email protected]> wrote:
>
> Hi all,
> I want to use Pig in Eclipse. My Hadoop (YARN) cluster and Eclipse are on
> the same Linux cluster. My Pig configuration in Eclipse is:
>
>     Properties props = new Properties();
>     props.setProperty("fs.defaultFS", "hdfs://10.210.90.*:8020");
>     props.setProperty("hadoop.job.user", "hadoop");
>     props.setProperty("mapreduce.framework.name", "yarn");
>     props.setProperty("yarn.resourcemanager.hostname", "10.210.90.*");
>     props.setProperty("yarn.resourcemanager.admin.address", "10.210.90.*:8141");
>     props.setProperty("yarn.resourcemanager.address", "10.210.90.*:8050");
>     props.setProperty("yarn.resourcemanager.resource-tracker.address", "10.210.90.*:8025");
>     props.setProperty("yarn.resourcemanager.scheduler.address", "10.210.90.*:8030");
>
>
> But it does not connect. I don't know how to configure Pig in
> Eclipse.
> Can you help me, please?
