--
Thanks
Anand
Awesome one Tariq!!
On Fri, Jan 18, 2013 at 6:39 AM, Mohammad Tariq wrote:
> You are right Michael, as always :)
>
> Warm Regards,
> Tariq
> https://mtariq.jux.com/
> cloudfront.blogspot.com
>
>
> On Fri, Jan 18, 2013 at 6:33 AM, Michael Segel
> wrote:
>
>> I'm thinking 'Downfall'
>>
>> But I
+1 to the way Jon elaborated it.
On Fri, Dec 21, 2012 at 6:36 AM, Todd Lipcon wrote:
> Hi Jon,
>
> FYI, this issue in the fair scheduler was fixed by
> https://issues.apache.org/jira/browse/MAPREDUCE-2905 for 1.1.0.
> Though it is present again in MR2:
> https://issues.apache.org/jira/browse/MAPRE
, anand sharma wrote:
> Thanks Harsh, it worked, and I will try Maven now.
>
>
> On Sat, Dec 15, 2012 at 11:56 PM, Harsh J wrote:
>
>> If you are compiling in the old-world way (javac!), it would be simpler
>> to use the whole classpath, given the modularity of the jars,
>
> On Sat, Dec 15, 2012 at 9:11 PM, anand sharma
> wrote:
> > Hi, please can someone let me know what the core jar files and
> > dependencies are that we need to attach to the classpath for a job to
> > compile successfully from a Java source.
> >
> > say..
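Harsh's "whole classpath" advice above can be sketched concretely. This is a minimal, illustrative compile sequence, assuming a machine where the `hadoop classpath` helper is on the PATH; the glob fallback and the `WordCount.java` file name are assumptions for illustration, not from the thread.

```shell
# Sketch: compile a single-file MapReduce job the "old-world" way against
# the full Hadoop classpath. `hadoop classpath` prints the core jars and
# their dependencies; the glob fallback is illustrative only, for machines
# without a Hadoop client installed.
CP="$(hadoop classpath 2>/dev/null || echo '/usr/lib/hadoop/*:/usr/lib/hadoop/lib/*')"
compile_cmd="javac -classpath $CP -d wordcount_classes WordCount.java"
echo "$compile_cmd"
```

From there, `jar -cvf job.jar -C wordcount_classes .` and `hadoop jar job.jar ...` would run it, without hand-listing individual jars.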
Thanks for the generous reply, both of you. Yes, it was a typo and my
mistake; I got it working with "hdfs namenode -format". Thanks to Andy.
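The working fix mentioned here, spelled out as a sequence: a sketch assuming the CDH4 packaged layout (service and user names vary across versions and distros). The commands are held in variables so the sketch stays runnable on a machine without Hadoop installed.

```shell
# Format the namenode as the hdfs user, then start the packaged service.
# "hadoop-hdfs-namenode" is the CDH4 service name; older 0.20/CDH3 installs
# used "hadoop-0.20-namenode" instead.
format_cmd="sudo -u hdfs hdfs namenode -format"
start_cmd="sudo service hadoop-hdfs-namenode start"
printf '%s\n%s\n' "$format_cmd" "$start_cmd"
```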
On Sat, Dec 15, 2012 at 2:12 AM, Andy Isaacson wrote:
> On Fri, Dec 14, 2012 at 7:47 AM, anand sharma
> wrote:
> > Hi i am follo
cool!
On Wed, Dec 12, 2012 at 5:40 PM, Mohammad Tariq wrote:
> Hello group,
>
> I have created a Google+ community with those folks who are
> comparatively new to Hadoop in mind.
>
> Although everybody, including me, knows that the official mailing lists
> are always the best place
Hi Bharath, Apache Flume is there, and if you want to take a look, Scribe
from Facebook is also available; there are other log aggregation tools too.
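To make the Flume suggestion concrete, here is a minimal sketch of a Flume NG agent definition that tails a log file into HDFS. The agent and component names (`a1`, `r1`, `c1`, `k1`), the log path, and the HDFS URL are illustrative assumptions, not from the thread.

```shell
# Write a minimal single-agent Flume configuration: an exec source tailing
# a file, a memory channel, and an HDFS sink.
cat > flume-agent.conf <<'EOF'
a1.sources  = r1
a1.channels = c1
a1.sinks    = k1
a1.sources.r1.type = exec
a1.sources.r1.command = tail -F /var/log/app.log
a1.sources.r1.channels = c1
a1.channels.c1.type = memory
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = hdfs://namenode:8020/flume/logs
a1.sinks.k1.channel = c1
EOF
echo "wrote flume-agent.conf"
```

An agent like this would be launched with `flume-ng agent --conf-file flume-agent.conf --name a1`.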
On Tue, Oct 30, 2012 at 7:18 AM, bharath vissapragada <
bharathvissapragada1...@gmail.com> wrote:
> Hi list,
>
> Are the any tools for parsing and
Hi, i have successfully installed CDH4 YARN and Pig in Pseudo distributed
mode on CentOS 6.2 following could era's tutorial but in
pig installation section there is a snap that i am not getting how to go
about, so i exported pig conf then next statement is a bit ambiguous as i
am new to Linux.
exp
"sudo service hadoop-0.20-namenode start" to
> > start it in the background. This will fix it up for you.
> >
> > 2. Your format was aborted because in 0.20.x/1.x, the input required at
> > the prompt was case-sensitive, while from 2.x onwards the input is
> > case-insensitive.
> > So
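The case-sensitivity difference described above can be illustrated with a small sketch (this is not Hadoop source, just a model of the behavior): 1.x effectively required an exact uppercase answer at the re-format prompt, while 2.x matches the answer case-insensitively.

```shell
# Model of the re-format confirmation prompt across versions.
answer="y"
case "$answer" in
  Y)    v1_result="proceed" ;;
  *)    v1_result="format aborted" ;;
esac
case "$answer" in
  [Yy]) v2_result="proceed" ;;
  *)    v2_result="format aborted" ;;
esac
echo "1.x: $v1_result; 2.x: $v2_result"
# prints: 1.x: format aborted; 2.x: proceed
```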
And here are the permissions for the file which is causing the problem:
[root@localhost hive]# ls -l
/var/lib/hadoop-0.20/cache/hadoop/dfs/name/in_use.lock
-rwxrwxrwx. 1 hdfs hdfs 0 Aug 10 21:23
/var/lib/hadoop-0.20/cache/hadoop/dfs/name/in_use.lock
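A hypothetical helper illustrating the ownership point behind this `ls -l` output: the name directory and its `in_use.lock` belong to `hdfs:hdfs`, so starting the namenode from another account (hive or root in this thread) commonly fails. The variable names and the decision logic here are illustrative, not from the thread; the command is built as a string so the sketch runs anywhere.

```shell
# Compare the lock file's owner with the current user and pick the
# appropriate way to launch the namenode.
lock_owner="hdfs"                 # from the ls -l output above
current_user="$(id -un)"
if [ "$lock_owner" = "$current_user" ]; then
  start_as="hadoop namenode"
else
  start_as="sudo -u $lock_owner hadoop namenode"
fi
echo "$start_as"
```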
On Thu, Aug 9, 2012 at 3:46 PM, anand sharma wrote
another user..?
>
>
> Try formatting and starting from same user console.
>
>
> *From:* anand sharma [mailto:anand2sha...@gmail.com]
> *Sent:* Friday, August 10, 2012 9:37 AM
> *To:* user@hadoop.apache.org
> *Subject:* Re: namenode instantiation
Yes Owen, I did.
On Thu, Aug 9, 2012 at 6:23 PM, Owen Duan wrote:
> have you tried hadoop namenode -format?
>
> 2012/8/9 anand sharma
>
>> Yea Tariq! It's a fresh installation; I am doing it for the first time.
>> Hope someone will know the error code and the reason o
Regards,
> Abhishek
>
>
> Sent from my iPhone
>
> On Aug 9, 2012, at 8:41 AM, anand sharma wrote:
>
> Yea Tariq! It's a fresh installation; I am doing it for the first time.
> Hope someone will know the error code and the reason of the error.
>
> On Thu, Aug 9, 2012
CDH3.
> >>
> >>
> >>
> >> On Thu, Aug 9, 2012 at 6:21 PM, Mohammad Tariq
> wrote:
> >>>
> >>> Hello Anand,
> >>>
> >>> Is there any specific reason behind not using ssh??
> >>>
> >>> Re
namenode -format
>
> then try to start namenode :)
>
>
> On Thu, Aug 9, 2012 at 3:51 PM, Mohammad Tariq wrote:
>
>> Hello Anand,
>>
>> Is there any specific reason behind not using ssh??
>>
>> Regards,
>> Mohammad Tariq
>>
>>
Hi, I am just learning Hadoop and I am setting up the development
environment with CDH3 in pseudo-distributed mode, without any ssh
configuration, on CentOS 6.2. I can run the sample programs as usual,
but when I try to run the namenode, this is the error it logs...
[hive@localhost ~]$ hadoop namenode
12/