Thanks Shekhar,

The problem was not in how I built the jar.  It was in fact in the execution.

I was running the command

*hadoop -jar* <jar filename> <qualified class name> input output

The problem was with -jar.  It should be

*hadoop jar* <jar filename> <qualified class name> input output
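
So, for example, with the wc.jar from earlier in the thread, the working
invocation looks like this (the driver class name below is just a
placeholder, not my actual class):

*hadoop jar* wc.jar com.example.WordCount input output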


Thanks for the help once again

regards
ashish


On Tue, Jul 23, 2013 at 10:31 AM, Shekhar Sharma <shekhar2...@gmail.com> wrote:

> hadoop jar wc.jar <fully qualified driver name> inputdata outputdestination
>
>
> Regards,
> Som Shekhar Sharma
> +91-8197243810
>
>
> On Tue, Jul 23, 2013 at 10:58 PM, Ashish Umrani 
> <ashish.umr...@gmail.com> wrote:
>
>> Jitendra, Som,
>>
>> Thanks.  The issue was not having any file there.  It's working fine now.
>>
>> I am able to do -ls and could also do -mkdir and -put.
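>>
>> For the archives, the sequence that works for me now is something like
>> this (exact paths from memory, and sample.txt is just a placeholder):
>>
>>   hadoop fs -mkdir /user/hduser/input
>>   hadoop fs -put sample.txt /user/hduser/input
>>   hadoop fs -ls /user/hduser/input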
>>
>> Now it is time to run the jar, and apparently I am getting
>>
>> no main manifest attribute, in wc.jar
>>
>>
>> But I believe it's because the Maven pom file does not have the main
>> class entry.
>>
>> I will go ahead and change the pom file and build it again; please let
>> me know if you guys think of some other reason.
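>>
>> In case it helps anyone else searching the archives, I believe the pom
>> needs a maven-jar-plugin entry along these lines (com.example.WordCount
>> is a placeholder for the real driver class):
>>
>>   <plugin>
>>     <groupId>org.apache.maven.plugins</groupId>
>>     <artifactId>maven-jar-plugin</artifactId>
>>     <configuration>
>>       <archive>
>>         <manifest>
>>           <mainClass>com.example.WordCount</mainClass>
>>         </manifest>
>>       </archive>
>>     </configuration>
>>   </plugin>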
>>
>> Once again this user group rocks.  I have never seen this quick a
>> response.
>>
>> Regards
>> ashish
>>
>>
>> On Tue, Jul 23, 2013 at 10:21 AM, Jitendra Yadav <
>> jeetuyadav200...@gmail.com> wrote:
>>
>>> Try..
>>>
>>> *hadoop fs -ls /*
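>>>
>>> (With no path argument, -ls looks in /user/<your user>, which may not
>>> exist yet on a fresh install. If listing / works, creating the home
>>> directory with something like *hadoop fs -mkdir /user/hduser* should
>>> make the plain -ls work too.)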
>>>
>>> Thanks
>>>
>>>
>>> On Tue, Jul 23, 2013 at 10:27 PM, Ashish Umrani <ashish.umr...@gmail.com> wrote:
>>>
>>>> Thanks Jitendra, Bejoy and Yexi,
>>>>
>>>> I got past that.  And now the ls command says it cannot access the
>>>> directory.  I am sure this is a permissions issue.  I am just wondering
>>>> which directory I am missing permissions on.
>>>>
>>>> Any pointers?
>>>>
>>>> And once again, thanks a lot
>>>>
>>>> Regards
>>>> ashish
>>>>
>>>>  *hduser@ashish-HP-Pavilion-dv6-Notebook-PC:/usr/local/hadoop/conf$
>>>> hadoop fs -ls*
>>>> *Warning: $HADOOP_HOME is deprecated.*
>>>>
>>>> *ls: Cannot access .: No such file or directory.*
>>>>
>>>>
>>>>
>>>> On Tue, Jul 23, 2013 at 9:42 AM, Jitendra Yadav <
>>>> jeetuyadav200...@gmail.com> wrote:
>>>>
>>>>> Hi Ashish,
>>>>>
>>>>> Please check <property></property>  in hdfs-site.xml.
>>>>>
>>>>> It is missing.
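>>>>>
>>>>> The dfs.replication entry needs to be wrapped, something like:
>>>>>
>>>>>   <configuration>
>>>>>     <property>
>>>>>       <name>dfs.replication</name>
>>>>>       <value>1</value>
>>>>>       <description>Default block replication.</description>
>>>>>     </property>
>>>>>   </configuration>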
>>>>>
>>>>> Thanks.
>>>>> On Tue, Jul 23, 2013 at 9:58 PM, Ashish Umrani <
>>>>> ashish.umr...@gmail.com> wrote:
>>>>>
>>>>>> Hey, thanks for the response.  I have changed 4 files during installation:
>>>>>>
>>>>>> core-site.xml
>>>>>> mapred-site.xml
>>>>>> hdfs-site.xml   and
>>>>>> hadoop-env.sh
>>>>>>
>>>>>>
>>>>>> I could not find any issues, except that all params in hadoop-env.sh
>>>>>> are commented out.  Only JAVA_HOME is uncommented.
>>>>>>
>>>>>> If you have a quick minute, can you please browse through these files
>>>>>> in the email and let me know where the issue could be.
>>>>>>
>>>>>> Regards
>>>>>> ashish
>>>>>>
>>>>>>
>>>>>>
>>>>>> I am listing those files below.
>>>>>>  *core-site.xml*
>>>>>>  <?xml version="1.0"?>
>>>>>> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
>>>>>>
>>>>>> <!-- Put site-specific property overrides in this file. -->
>>>>>>
>>>>>> <configuration>
>>>>>>   <property>
>>>>>>     <name>hadoop.tmp.dir</name>
>>>>>>     <value>/app/hadoop/tmp</value>
>>>>>>     <description>A base for other temporary directories.</description>
>>>>>>   </property>
>>>>>>
>>>>>>   <property>
>>>>>>     <name>fs.default.name</name>
>>>>>>     <value>hdfs://localhost:54310</value>
>>>>>>     <description>The name of the default file system.  A URI whose
>>>>>>     scheme and authority determine the FileSystem implementation.  The
>>>>>>     uri's scheme determines the config property (fs.SCHEME.impl) naming
>>>>>>     the FileSystem implementation class.  The uri's authority is used to
>>>>>>     determine the host, port, etc. for a filesystem.</description>
>>>>>>   </property>
>>>>>> </configuration>
>>>>>>
>>>>>>
>>>>>>
>>>>>> *mapred-site.xml*
>>>>>>  <?xml version="1.0"?>
>>>>>> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
>>>>>>
>>>>>> <!-- Put site-specific property overrides in this file. -->
>>>>>>
>>>>>> <configuration>
>>>>>>   <property>
>>>>>>     <name>mapred.job.tracker</name>
>>>>>>     <value>localhost:54311</value>
>>>>>>     <description>The host and port that the MapReduce job tracker runs
>>>>>>     at.  If "local", then jobs are run in-process as a single map
>>>>>>     and reduce task.
>>>>>>     </description>
>>>>>>   </property>
>>>>>> </configuration>
>>>>>>
>>>>>>
>>>>>>
>>>>>> *hdfs-site.xml*
>>>>>>  <?xml version="1.0"?>
>>>>>> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
>>>>>>
>>>>>> <!-- Put site-specific property overrides in this file. -->
>>>>>>
>>>>>> <configuration>
>>>>>>   <name>dfs.replication</name>
>>>>>>   <value>1</value>
>>>>>>   <description>Default block replication.
>>>>>>     The actual number of replications can be specified when the file is
>>>>>>     created.  The default is used if replication is not specified in
>>>>>>     create time.
>>>>>>   </description>
>>>>>> </configuration>
>>>>>>
>>>>>>
>>>>>>
>>>>>> *hadoop-env.sh*
>>>>>>  # Set Hadoop-specific environment variables here.
>>>>>>
>>>>>> # The only required environment variable is JAVA_HOME.  All others are
>>>>>> # optional.  When running a distributed configuration it is best to
>>>>>> # set JAVA_HOME in this file, so that it is correctly defined on
>>>>>> # remote nodes.
>>>>>>
>>>>>> # The java implementation to use.  Required.
>>>>>> export JAVA_HOME=/usr/lib/jvm/jdk1.7.0_25
>>>>>>
>>>>>> # Extra Java CLASSPATH elements.  Optional.
>>>>>> # export HADOOP_CLASSPATH=
>>>>>>
>>>>>>
>>>>>> All other params in hadoop-env.sh are commented out.
>>>>>>
>>>>>> On Tue, Jul 23, 2013 at 8:38 AM, Jitendra Yadav <
>>>>>> jeetuyadav200...@gmail.com> wrote:
>>>>>>
>>>>>>> Hi,
>>>>>>>
>>>>>>> You might have missed some configuration (XML tags ), Please check
>>>>>>> all the Conf files.
>>>>>>>
>>>>>>> Thanks
>>>>>>> On Tue, Jul 23, 2013 at 6:25 PM, Ashish Umrani <
>>>>>>> ashish.umr...@gmail.com> wrote:
>>>>>>>
>>>>>>>> Hi There,
>>>>>>>>
>>>>>>>> First of all, sorry if I am asking a stupid question.  Being new to
>>>>>>>> the Hadoop environment, I am finding it a bit difficult to figure out
>>>>>>>> why it's failing.
>>>>>>>>
>>>>>>>> I have installed hadoop 1.2, based on instructions given in the
>>>>>>>> following link:
>>>>>>>>
>>>>>>>> http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
>>>>>>>>
>>>>>>>> All went well; I could run start-all.sh, and the jps command does
>>>>>>>> show all 5 processes (NameNode, DataNode, SecondaryNameNode,
>>>>>>>> JobTracker, TaskTracker) to be present.
>>>>>>>>
>>>>>>>> However when I try to do
>>>>>>>>
>>>>>>>> hadoop fs -ls
>>>>>>>>
>>>>>>>> I get the following error
>>>>>>>>
>>>>>>>>  hduser@ashish-HP-Pavilion-dv6-Notebook-PC:/usr/local/hadoop/conf$
>>>>>>>> hadoop fs -ls
>>>>>>>> Warning: $HADOOP_HOME is deprecated.
>>>>>>>>
>>>>>>>> 13/07/23 05:55:06 WARN conf.Configuration: bad conf file: element
>>>>>>>> not <property>
>>>>>>>> 13/07/23 05:55:06 WARN conf.Configuration: bad conf file: element
>>>>>>>> not <property>
>>>>>>>> 13/07/23 05:55:06 WARN conf.Configuration: bad conf file: element
>>>>>>>> not <property>
>>>>>>>> 13/07/23 05:55:06 WARN conf.Configuration: bad conf file: element
>>>>>>>> not <property>
>>>>>>>> 13/07/23 05:55:06 WARN conf.Configuration: bad conf file: element
>>>>>>>> not <property>
>>>>>>>> 13/07/23 05:55:06 WARN conf.Configuration: bad conf file: element
>>>>>>>> not <property>
>>>>>>>> ls: Cannot access .: No such file or directory.
>>>>>>>> hduser@ashish-HP-Pavilion-dv6-Notebook-PC:/usr/local/hadoop/conf$
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> Can someone help me figure out what the issue in my installation is?
>>>>>>>>
>>>>>>>>
>>>>>>>> Regards
>>>>>>>> ashish
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>
