Hi,

As far as I understand Hadoop (judging by its scripts like start-all.sh and my own experience), it does not care about your other environment variables as long as you provide the right values in its environment file, i.e. conf/hadoop-env.sh.

Set something like the following to get around your problem:

# The java implementation to use.  Required.
 export JAVA_HOME="/cygdrive/c/Program Files/Java/jre1.5.0_11"
#/usr/lib/j2sdk1.5-sun
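
Note the quotes around the path: "Program Files" contains a space, which is usually what triggers the "no such file or directory" error. The path must also be in POSIX form, which is what `cygpath -u` produces from a Windows path. As a rough sketch of what that conversion does (assuming a simple drive-letter path; the real cygpath handles many more cases):

```shell
#!/bin/sh
# Illustrative only: roughly what `cygpath -u` does to a plain
# C:\...-style Windows path (drive letter -> /cygdrive/<letter>,
# backslashes -> forward slashes).
win_to_posix() {
    drive=$(printf '%s' "$1" | cut -c1 | tr 'A-Z' 'a-z')
    rest=$(printf '%s' "$1" | cut -c3- | tr '\\' '/')
    printf '/cygdrive/%s%s\n' "$drive" "$rest"
}

win_to_posix 'C:\Program Files\Java\jre1.6.0_06'
# prints: /cygdrive/c/Program Files/Java/jre1.6.0_06
```

On an actual Cygwin install you would simply run `cygpath -u 'C:\Program Files\Java\jre1.6.0_06'` and paste the result (quoted) into hadoop-env.sh.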

Regards,
-Vikas.


On Sat, May 24, 2008 at 6:11 AM, vatsan <[EMAIL PROTECTED]> wrote:

>
> I have installed Hadoop on Cygwin; I am running Windows XP.
>
> My Java directory is C:\Program Files\Java\jre1.6.0_06
>
> I am not able to run hadoop as it complains of "no such file or directory
> error".
>
> I did some searching and found out someone had proposed a solution of doing
>
> SET JAVA_HOME=C:\Program Files\Java\jre1.6.0_06
>
> in the Cygwin.bat file,
>
> but that doesn't work for me.
>
> Neither does using the absolute path name "\cygwin\c\Program Files\Java" OR
> using  \cygwin\c\"Program Files"\Java
>
> Can someone guide me here?
>
> (I understand that the problem is because of the path convention conflicts
> in windows and Cygwin, I found some stuff on fixes for the path issues that
> spoke of using cygpath.exe as a fix ... for example while running a java
> program on cygwin, but could not find anything that addressed my problem.)
>
> --
> View this message in context:
> http://www.nabble.com/JAVA_HOME-Cygwin-problem-%28solution-doesn%27t-work%29-tp17443172p17443172.html
> Sent from the Hadoop core-user mailing list archive at Nabble.com.
>
>
