Hi,
The following "segmentation fault" still exists.
I rewrote my application to use ant, but when I integrate it with libhdfs
it fails with "segmentation fault" and exits with code 139.
Please do help, as I have already spent a lot of time rewriting my
application to use hadoop and on this one issue.
I am passing the following arguments:
OS_NAME=Linux
OS_ARCH=i386
LIBHDFS_BUILD_DIR=/garl/garl-alpha1/home1/raghu/Desktop/hadoop-0.15.3/build/libhdfs
JAVA_HOME=/garl/garl-alpha1/home1/raghu/Desktop/jdk1.5.0_14
PLATFORM=linux
SHLIB_VERSION=1
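For what it's worth, here is one way those variables would be handed to the libhdfs build. This is only a sketch based on the values in the message; the actual make invocation and Makefile layout in Hadoop 0.15.3 may differ:

```shell
# Variables as passed to the libhdfs Makefile, with the build-directory
# path joined back onto one line (values copied from the message above).
OS_NAME=Linux
OS_ARCH=i386
LIBHDFS_BUILD_DIR=/garl/garl-alpha1/home1/raghu/Desktop/hadoop-0.15.3/build/libhdfs
JAVA_HOME=/garl/garl-alpha1/home1/raghu/Desktop/jdk1.5.0_14
PLATFORM=linux
SHLIB_VERSION=1
# With PLATFORM supplied explicitly like this, the Makefile's own
# "PLATFORM = $(shell ...)" auto-detection line can stay commented out:
# make OS_NAME="$OS_NAME" OS_ARCH="$OS_ARCH" PLATFORM="$PLATFORM" \
#      LIBHDFS_BUILD_DIR="$LIBHDFS_BUILD_DIR" JAVA_HOME="$JAVA_HOME" \
#      SHLIB_VERSION="$SHLIB_VERSION"
```

Passing variables on the make command line overrides any assignments inside the Makefile, which is why commenting out the PLATFORM detection line is then harmless.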
I have commented out the line
#PLATFORM = $(shell echo $$
On Mar 14, 2008, at 11:48 PM, Raghavendra K wrote:
Hi,
My apologies for bugging the forum again and again.
I am able to get the sample program for libhdfs working. I followed these
steps.
---> compiled using ant
---> modified the test-libhdfs.sh to include CLASSPATH, HADOOP_HOME,
HADOOP_CONF_DIR, HADOOP_LOG_DIR, LIBHDFS_BUILD_DIR (since I ran
te
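A sketch of the environment setup those steps describe. The variable names are the ones listed in the message; every path below is a placeholder for illustration only, and the actual locations depend on the installation:

```shell
# Hypothetical environment for test-libhdfs.sh (paths are placeholders).
export HADOOP_HOME=/path/to/hadoop-0.15.3
export HADOOP_CONF_DIR="$HADOOP_HOME/conf"
export HADOOP_LOG_DIR="$HADOOP_HOME/logs"
export LIBHDFS_BUILD_DIR="$HADOOP_HOME/build/libhdfs"
# CLASSPATH must cover the Hadoop jars and the conf dir; if the embedded
# JVM cannot find its classes, libhdfs programs commonly die with a
# segmentation fault (exit code 139) rather than a clean Java error.
CLASSPATH="$HADOOP_CONF_DIR"
for jar in "$HADOOP_HOME"/hadoop-*.jar "$HADOOP_HOME"/lib/*.jar; do
  CLASSPATH="$CLASSPATH:$jar"
done
export CLASSPATH
```

If the CLASSPATH is incomplete, hdfsConnect() fails inside the JVM startup, which is one plausible source of the crash described above.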