Re: libhdfs working for test program when run from ant but failing when run individually

2008-03-28 Thread Raghavendra K
Hi,
The following "segmentation fault" still exists.
I re wrote my application to use ant, but when I integrate it with libhdfs
it fails saying "segmentation fault" and "exiting with 139".
Please do help, as I have already spent a lot of time on re writing my
application to use hadoop and this one fails for no reason.
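
For what it's worth, exit status 139 is 128 + SIGSEGV, i.e. the process is
being killed by a segmentation fault. A generic way to see where it dies
(assuming the same environment test-libhdfs.sh sets up) is to run the binary
under gdb, for example:

    gdb ./hdfs_test
    (gdb) run
    (gdb) bt    # print the backtrace once the segfault is hit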

On Wed, Mar 19, 2008 at 12:52 PM, Raghavendra K <[EMAIL PROTECTED]>
wrote:

> I am passing the following arguments
>
> OS_NAME=Linux
> OS_ARCH=i386
> LIBHDFS_BUILD_DIR=/garl/garl-alpha1/home1/raghu/Desktop/hadoop-0.15.3/build/libhdfs
> JAVA_HOME=/garl/garl-alpha1/home1/raghu/Desktop/jdk1.5.0_14
> PLATFORM=linux
> SHLIB_VERSION=1
>
> I have commented out the line
> #PLATFORM = $(shell echo $$OS_NAME | tr [A-Z] [a-z])
> and am passing PLATFORM=linux instead, because that line was not taking
> effect when I ran make test on its own.
> I have also changed the rule to
>
> $(HDFS_TEST): hdfs_test.c
> 	$(CC) $(CPPFLAGS) $< -L$(LIBHDFS_BUILD_DIR) -l$(LIB_NAME) $(LDFLAGS) -o $@
>
> (I added $(LDFLAGS) because, when run, it complained that libjvm.so could
> not be found.)
>
> Where am I going wrong? Kindly let me know if I have to provide any other
> information.
>
>
> On Tue, Mar 18, 2008 at 11:41 PM, Arun C Murthy <[EMAIL PROTECTED]>
> wrote:
>
> >
> > On Mar 14, 2008, at 11:48 PM, Raghavendra K wrote:
> >
> > > Hi,
> > >   My apologies for bugging the forum again and again.
> > > I am able to get the sample program for libhdfs working. I followed
> > > these steps:
> > >
> > > ---> compiled using ant
> > > ---> modified test-libhdfs.sh to include CLASSPATH, HADOOP_HOME,
> > > HADOOP_CONF_DIR, HADOOP_LOG_DIR, and LIBHDFS_BUILD_DIR (since I ran
> > > test-libhdfs.sh individually and didn't invoke it from ant)
> > > ---> the program ran successfully and was able to write, read, and so on
> > >
> > > Now I copied the same program to a different directory, used the same
> > > Makefile (the one used by ant), and modified the variables accordingly.
> > > make test compiled successfully, but when I used the same test-libhdfs.sh
> > > to invoke hdfs_test, it failed with a segmentation fault.
> > > I don't know where it is going wrong.
> > > Can't libhdfs be compiled without using ant? I want to test it and
> > > integrate libhdfs with my program.
> > > Please do reply and help me out, as this is driving me crazy.
> >
> > I can only assume there is something wrong with the values you are
> > passing for the requisite environment variables (OS_{NAME|ARCH},
> > SHLIB_VERSION, LIBHDFS_VERSION, HADOOP_{HOME|CONF_DIR|LOG_DIR}), since
> > it works when you run 'make test'.
> >
> > Sorry it isn't of much help... could you share the values you are
> > using for these?
> >
> > Arun
> >
> >
> > > Thanks in advance.
> > >
> > > --
> > > Regards,
> > > Raghavendra K
> >
> >
>
>
> --
> Regards,
> Raghavendra K




-- 
Regards,
Raghavendra K


Re: libhdfs working for test program when run from ant but failing when run individually

2008-03-19 Thread Raghavendra K
I am passing the following arguments

OS_NAME=Linux
OS_ARCH=i386
LIBHDFS_BUILD_DIR=/garl/garl-alpha1/home1/raghu/Desktop/hadoop-0.15.3/build/libhdfs
JAVA_HOME=/garl/garl-alpha1/home1/raghu/Desktop/jdk1.5.0_14
PLATFORM=linux
SHLIB_VERSION=1

I have commented out the line
#PLATFORM = $(shell echo $$OS_NAME | tr [A-Z] [a-z])
and am passing PLATFORM=linux instead, because that line was not taking effect
when I ran make test on its own.
I have also changed the rule to

$(HDFS_TEST): hdfs_test.c
	$(CC) $(CPPFLAGS) $< -L$(LIBHDFS_BUILD_DIR) -l$(LIB_NAME) $(LDFLAGS) -o $@

(I added $(LDFLAGS) because, when run, it complained that libjvm.so could not
be found.)
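
Putting those values together, the standalone build amounts to roughly the
following (just a sketch of the command line; LDFLAGS is left to whatever the
Makefile defines):

    # Rough sketch: pass the same variables listed above on the make command
    # line. LDFLAGS itself comes from the Makefile; presumably it needs to
    # point -L at the JRE's server/ directory (where libjvm.so lives) and
    # add -ljvm.
    make test \
        OS_NAME=Linux \
        OS_ARCH=i386 \
        PLATFORM=linux \
        SHLIB_VERSION=1 \
        JAVA_HOME=/garl/garl-alpha1/home1/raghu/Desktop/jdk1.5.0_14 \
        LIBHDFS_BUILD_DIR=/garl/garl-alpha1/home1/raghu/Desktop/hadoop-0.15.3/build/libhdfs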

Where am I going wrong? Kindly let me know if I have to provide any other
information.

On Tue, Mar 18, 2008 at 11:41 PM, Arun C Murthy <[EMAIL PROTECTED]> wrote:

>
> On Mar 14, 2008, at 11:48 PM, Raghavendra K wrote:
>
> > Hi,
> >   My apologies for bugging the forum again and again.
> > I am able to get the sample program for libhdfs working. I followed
> > these steps:
> >
> > ---> compiled using ant
> > ---> modified test-libhdfs.sh to include CLASSPATH, HADOOP_HOME,
> > HADOOP_CONF_DIR, HADOOP_LOG_DIR, and LIBHDFS_BUILD_DIR (since I ran
> > test-libhdfs.sh individually and didn't invoke it from ant)
> > ---> the program ran successfully and was able to write, read, and so on
> >
> > Now I copied the same program to a different directory, used the same
> > Makefile (the one used by ant), and modified the variables accordingly.
> > make test compiled successfully, but when I used the same test-libhdfs.sh
> > to invoke hdfs_test, it failed with a segmentation fault.
> > I don't know where it is going wrong.
> > Can't libhdfs be compiled without using ant? I want to test it and
> > integrate libhdfs with my program.
> > Please do reply and help me out, as this is driving me crazy.
>
> I can only assume there is something wrong with the values you are
> passing for the requisite environment variables (OS_{NAME|ARCH},
> SHLIB_VERSION, LIBHDFS_VERSION, HADOOP_{HOME|CONF_DIR|LOG_DIR}), since
> it works when you run 'make test'.
>
> Sorry it isn't of much help... could you share the values you are
> using for these?
>
> Arun
>
>
> > Thanks in advance.
> >
> > --
> > Regards,
> > Raghavendra K
>
>


-- 
Regards,
Raghavendra K


Re: libhdfs working for test program when run from ant but failing when run individually

2008-03-18 Thread Arun C Murthy


On Mar 14, 2008, at 11:48 PM, Raghavendra K wrote:


Hi,
  My apologies for bugging the forum again and again.
I am able to get the sample program for libhdfs working. I followed these
steps:

---> compiled using ant
---> modified test-libhdfs.sh to include CLASSPATH, HADOOP_HOME,
HADOOP_CONF_DIR, HADOOP_LOG_DIR, and LIBHDFS_BUILD_DIR (since I ran
test-libhdfs.sh individually and didn't invoke it from ant)
---> the program ran successfully and was able to write, read, and so on

Now I copied the same program to a different directory, used the same
Makefile (the one used by ant), and modified the variables accordingly.
make test compiled successfully, but when I used the same test-libhdfs.sh
to invoke hdfs_test, it failed with a segmentation fault.
I don't know where it is going wrong.
Can't libhdfs be compiled without using ant? I want to test it and
integrate libhdfs with my program.
Please do reply and help me out, as this is driving me crazy.


I can only assume there is something wrong with the values you are
passing for the requisite environment variables (OS_{NAME|ARCH},
SHLIB_VERSION, LIBHDFS_VERSION, HADOOP_{HOME|CONF_DIR|LOG_DIR}), since
it works when you run 'make test'.
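
For example, a quick sanity check along these lines (just a generic sketch,
assuming the environment that test-libhdfs.sh sets up) can confirm that the
binary resolves its shared libraries and that those variables hold what you
expect:

    # "not found" entries here usually mean LD_LIBRARY_PATH is missing the
    # directory holding libhdfs.so or libjvm.so.
    ldd ./hdfs_test

    # Inspect the variables the wrapper script is actually exporting.
    env | egrep 'OS_NAME|OS_ARCH|SHLIB|LIBHDFS|HADOOP_|CLASSPATH|LD_LIBRARY_PATH'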


Sorry it isn't of much help... could you share the values you are  
using for these?


Arun



Thanks in advance.

--
Regards,
Raghavendra K




libhdfs working for test program when run from ant but failing when run individually

2008-03-14 Thread Raghavendra K
Hi,
  My apologies for bugging the forum again and again.
I am able to get the sample program for libhdfs working. I followed these
steps:

---> compiled using ant
---> modified test-libhdfs.sh to include CLASSPATH, HADOOP_HOME,
HADOOP_CONF_DIR, HADOOP_LOG_DIR, and LIBHDFS_BUILD_DIR, roughly as sketched
below (since I ran test-libhdfs.sh individually and didn't invoke it from ant)
---> the program ran successfully and was able to write, read, and so on
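
For reference, a rough sketch of the kind of environment test-libhdfs.sh has
to provide before invoking hdfs_test (paths and jar globs are placeholders for
my setup, and the LD_LIBRARY_PATH line is an assumption about what the dynamic
loader needs, not the script's exact contents):

    # Placeholders only: point these at the real install locations.
    export HADOOP_HOME=/path/to/hadoop-0.15.3
    export JAVA_HOME=/path/to/jdk1.5.0_14
    export HADOOP_CONF_DIR=$HADOOP_HOME/conf
    export HADOOP_LOG_DIR=$HADOOP_HOME/logs
    export LIBHDFS_BUILD_DIR=$HADOOP_HOME/build/libhdfs

    # libhdfs starts a JVM through JNI, so the Hadoop jars and the conf
    # directory have to be on the CLASSPATH.
    CLASSPATH=$HADOOP_CONF_DIR
    for jar in $HADOOP_HOME/*.jar $HADOOP_HOME/lib/*.jar; do
        CLASSPATH=$CLASSPATH:$jar
    done
    export CLASSPATH

    # The dynamic loader must find both libhdfs.so and libjvm.so at run time.
    export LD_LIBRARY_PATH=$LIBHDFS_BUILD_DIR:$JAVA_HOME/jre/lib/i386/server

    ./hdfs_test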

Now I copied the same program to a different directory, used the same
Makefile (the one used by ant), and modified the variables accordingly.
make test compiled successfully, but when I used the same test-libhdfs.sh
to invoke hdfs_test, it failed with a segmentation fault.
I don't know where it is going wrong.
Can't libhdfs be compiled without using ant? I want to test it and integrate
libhdfs with my program.
Please do reply and help me out, as this is driving me crazy.
Thanks in advance.

-- 
Regards,
Raghavendra K