Okay, I started to read the source of Pig and hadoop-common.

The problem was not in pig.jar; it was in hadoop-common.

To get my code to work, the conf directory did not need to be on the classpath but on the boot classpath: -Xbootclasspath/a:<dir>

You can also append it from inside your code with something like
System.setProperty("sun.boot.class.path", System.getProperty("sun.boot.class.path") + File.pathSeparator + <dir>)

After doing that, my code worked.
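
For reference, 'java -jar' ignores the -cp option (the classpath then comes only from the jar and its manifest), and changing the java.class.path property at runtime does not change what the already-created system class loader sees, which is why the attempts quoted below never helped. Here is a minimal sketch of the boot-classpath append described above; the conf directory path and the class name are placeholders of mine, not anything Hadoop defines:

    import java.io.File;

    public class BootClasspathFix {
        public static void main(String[] args) {
            // Hypothetical location of the Hadoop conf directory.
            String hadoopConfDir = "/etc/hadoop/conf";

            // Append the conf dir to the boot class path property, mirroring
            // what launching with -Xbootclasspath/a:/etc/hadoop/conf does.
            String bootPath = System.getProperty("sun.boot.class.path");
            System.setProperty("sun.boot.class.path",
                    bootPath + File.pathSeparator + hadoopConfDir);

            // ... then kick off the embedded Pig/Hadoop job as before ...
        }
    }

Equivalently, skip the code and just launch with
java -Xbootclasspath/a:$HADOOPCONFDIR -jar executable.jar -x mapreduce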


On Fri, Apr 29, 2011 at 12:35 PM, Andrew Wells <agwells0...@gmail.com> wrote:

> I currently have an executable JAR made by Eclipse.
>
> I am able to tell this program to run its embedded program in either
> local or mapreduce mode.
>
> I can run my jobs in local mode with no problems and get the desired results.
>
> However, when I try to run it in mapreduce mode, I have issues: it states it
> cannot find the hadoop-core.xml or site-core.xml configuration files on the
> classpath.
>
>
> Here are the steps I have taken to debug:
>
> I checked 'System.getProperty("java.class.path")'; it returns executable.jar.
>
>
> So I execute
> 'java -cp $HADOOPCONFDIR -jar executable.jar -x mapreduce'
>
> Again, 'System.getProperty("java.class.path")' returns executable.jar.
>
> Fed up, I implemented my own way of creatively appending to the class path:
>
> 'java -jar executable.jar -x mapreduce -cp $HADOOPCONFDIR'
>
> Now 'System.getProperty("java.class.path")' returns
> executable.jar:$HADOOPCONFDIR,
>
> but I still get the same error:
> cannot find the hadoop-core.xml or site-core.xml.
>
>
> Save this computer's life,
>
> AGWELLS
>
