I notice that for fs.default.name, you have in core-site.xml:
hdfs://127.0.0.1:8020

That is the loopback address of your local machine, so HDFS is expected to
be there. Where are you expecting the file to be created, and where is it
actually being created?
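
One quick way to rule out a configuration pick-up problem is to bypass
"default" and point libhdfs at the NameNode explicitly. A minimal sketch,
assuming the NameNode really is listening on 127.0.0.1:8020 as in your
core-site.xml (the host, port, and user below are just the values from your
mail):

#include <stdio.h>
#include <stdlib.h>
#include "hdfs.h"

int main(void) {
        /* Connect straight to the NameNode instead of letting libhdfs
         * resolve "default" from whatever config is on the CLASSPATH. */
        hdfsFS fs = hdfsConnectAsUser("127.0.0.1", 8020, "root");
        if (fs == NULL) {
                fprintf(stderr, "Could not reach hdfs://127.0.0.1:8020\n");
                exit(-1);
        }
        fprintf(stderr, "Connected to HDFS at 127.0.0.1:8020\n");
        hdfsDisconnect(fs);
        return 0;
}

If this connects but your program still writes to the local disk, then the
core-site.xml on your CLASSPATH is most likely not the one being read at
run time.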

Regards,
Shahab


On Fri, Jun 7, 2013 at 10:49 PM, Venkivolu, Dayakar Reddy <
dayakarreddy.venkiv...@ca.com> wrote:

> Hi,
>
> I have created a sample program to write contents into the HDFS file
> system. The file gets created successfully, but unfortunately it is being
> created on the local file system instead of HDFS.
>
> Here is the source code of the sample program:
>
> #include <fcntl.h>
> #include <stdio.h>
> #include <stdlib.h>
> #include <string.h>
> #include "hdfs.h"
>
> int main(int argc, char **argv) {
>         const char *writePath = "/user/testuser/test1.txt";
>         const char *tuser = "root";
>
>         /* "default" tells libhdfs to use fs.default.name from the
>          * configuration files found on the CLASSPATH. */
>         hdfsFS fs = hdfsConnectAsUser("default", 0, tuser);
>         if (fs == NULL) {
>                 fprintf(stderr, "Oops! Failed to connect to hdfs!\n");
>                 exit(-1);
>         }
>
>         hdfsFile writeFile = hdfsOpenFile(fs, writePath,
>                                           O_WRONLY | O_CREAT, 0, 0, 0);
>         if (!writeFile) {
>                 fprintf(stderr, "Failed to open %s for writing!\n", writePath);
>                 exit(-1);
>         }
>         fprintf(stderr, "Opened %s for writing successfully...\n", writePath);
>
>         const char *buffer = "Hello, World!";
>         tSize num_written_bytes = hdfsWrite(fs, writeFile, (void *)buffer,
>                                             strlen(buffer) + 1);
>         fprintf(stderr, "Wrote %d bytes\n", num_written_bytes);
>
>         /* Flush the buffered write so it reaches the file system before
>          * reporting success. */
>         if (hdfsFlush(fs, writeFile)) {
>                 fprintf(stderr, "Failed to flush %s\n", writePath);
>                 exit(-1);
>         }
>         fprintf(stderr, "Flushed %s successfully!\n", writePath);
>
>         hdfsCloseFile(fs, writeFile);
>         hdfsDisconnect(fs);
>         return 0;
> }
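>
> A quick way to confirm which file system the handle actually points at is
> to print the working directory right after connecting: for a real HDFS
> connection it will normally be an hdfs:// URI under /user/<name>, while a
> fallback to the local file system shows a plain local (file:) path. A small
> sketch against the fs handle above (the buffer name and size are arbitrary):
>
>         char cwd[1024];
>         if (hdfsGetWorkingDirectory(fs, cwd, sizeof(cwd)) != NULL) {
>                 fprintf(stderr, "Working directory: %s\n", cwd);
>         }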
>
> CLASSPATH:
>
> /usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/avro-1.7.3.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:
>
>
> /usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:
>
>
> /usr/lib/hadoop/lib/commons-collections-3.2.1.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:
>
>
> /usr/lib/hadoop/lib/commons-el-1.0.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/commons-io-2.1.jar:/usr/lib/hadoop/lib/commons-lang-2.5.jar:
>
>
> /usr/lib/hadoop/lib/commons-logging-1.1.1.jar:/usr/lib/hadoop/lib/commons-math-2.1.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:
>
>
> /usr/lib/hadoop/lib/hue-plugins-2.2.0-cdh4.2.0.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.8.8.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.8.8.jar:
>
>
> /usr/lib/hadoop/lib/jackson-mapper-asl-1.8.8.jar:/usr/lib/hadoop/lib/jackson-xc-1.8.8.jar:/usr/lib/hadoop/lib/jackson-xc-1.8.8.jar:
>
>
> /usr/lib/hadoop/lib/jasper-compiler-5.5.23.jar:/usr/lib/hadoop/lib/jasper-runtime-5.5.23.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:
>
>
> /usr/lib/hadoop/lib/jersey-core-1.8.jar:/usr/lib/hadoop/lib/jersey-json-1.8.jar:/usr/lib/hadoop/lib/jersey-server-1.8.jar:/usr/lib/hadoop/lib/jets3t-0.6.1.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:
>
>         /usr/lib/hadoop/lib/jetty-6.1.26.cloudera.2.jar:
> /usr/lib/hadoop/lib/jetty-util-6.1.26.cloudera.2.jar:/usr/lib/hadoop/lib/jline-0.9.94.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:
>
>
> /usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/jsr305-1.3.9.jar:/usr/lib/hadoop/lib/junit-4.8.2.jar:/usr/lib/hadoop/lib/kfs-0.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:
>
>
> /usr/lib/hadoop/lib/mockito-all-1.8.5.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/protobuf-java-2.4.0a.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:
>
>
> /usr/lib/hadoop/lib/slf4j-api-1.6.1.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.6.1.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/stax-api-1.0.1.jar:
>
>
> /usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/zookeeper-3.4.5-cdh4.2.0.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-auth-2.0.0-cdh4.2.0.jar:
>
>
> /usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-common-2.0.0-cdh4.2.0.jar:/usr/lib/hadoop/hadoop-common-2.0.0-cdh4.2.0-tests.jar:/usr/lib/hadoop/hadoop-common.jar:
>
>
> /usr/lib/hadoop/etc/hadoop/yarn-site.xml:/usr/lib/hadoop/etc/hadoop/core-site.xml:/usr/lib/hadoop/etc/hadoop/hadoop-metrics.properties:/usr/lib/hadoop/etc/hadoop/hdfs-site.xml:
>
>         /usr/lib/hadoop/etc/hadoop/mapred-site.xml
>
> Please find attached the hdfs_site.xml and core-site.xml:
> hdfs_site.xml <http://hadoop.6.n7.nabble.com/file/n69039/hdfs_site.xml>
> core-site.xml <http://hadoop.6.n7.nabble.com/file/n69039/core-site.xml>
>
> Regards,
>
> Dayakar
>
