Re: Eclipse plugin

2011-06-26 Thread Jack Ye
I found that Hadoop 0.20.2 and Eclipse Helios do work together correctly,
though it is really hard to get them set up properly.

clakshminarasu clakshminar...@gmail.com wrote:


Hi, 

I am new to Hadoop programming. I tried downloading the VMware appliance
and the Eclipse/Hadoop plugin, and I got the same error while configuring
the Eclipse plugin for Hadoop DFS.
I started by configuring the VMware server setup and entered the IP address
that appeared in the appliance, but I was not able to see the hadoop.job.ugi
parameter in the Advanced tab.

I tried setting each parameter that requires a path (e.g. local dir, system
dir, etc.) to the absolute path. In my case the temp folder was
/tmp/hadoop-hadoop-user ("hadoop" appearing twice is not a typo; that is how
it actually appears), which then splits into two subdirectories, dfs and
mapred.

Even then I was not able to see the VMware DFS. When I open the folder, I
get "Error: null" inside.
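
As a next step on my side, I plan to try a minimal connectivity check from
plain Java, outside of Eclipse, roughly like the sketch below (the NameNode
address and port are placeholders for whatever the appliance reports, not
values I have confirmed):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class DfsCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Placeholder: use the IP/port that the VMware appliance actually reports
        conf.set("fs.default.name", "hdfs://192.168.1.100:9000");

        // If this lists the root directory, the NameNode is reachable and the
        // problem is more likely in the Eclipse plugin configuration.
        FileSystem fs = FileSystem.get(conf);
        for (FileStatus status : fs.listStatus(new Path("/"))) {
            System.out.println(status.getPath());
        }
        fs.close();
    }
}

If that works but the plugin still shows "Error: null", the issue is likely
on the plugin side rather than with the appliance itself.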

Can anyone guide me in setting this up?

Thanks in advance.
Lakshminarasu Chenduri
+91 90499.18088
Pune, India


Re: Hadoop Eclipse plugin 0.20.203.0 doesn't work

2011-06-22 Thread Jack Ye
Can anyone help me?

叶达峰 (Jack Ye) kobe082...@qq.com wrote:

Hi,
  
 I am new to Hadoop. Today I spent the whole night trying to set up a
 development environment for Hadoop. I ran into several problems; the first
 was that Eclipse could not load the plugin. I switched to another version
 and that problem was resolved.
  
 But now I have a more difficult problem. I am trying to set up a Map/Reduce
 Location; if everything were fine, it would connect to the server and the
 DFS Location would list the whole file system. Sadly, it doesn't work.
 I have checked the configuration several times and it should be correct.
  
 Here is the message I get:
 Error: failure to login
 An internal error occurred during: "Map/Reduce location status updater".
 org/codehaus/jackson/map/JsonMappingException
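
The last line of that error is a Java class name,
org.codehaus.jackson.map.JsonMappingException, which normally ships in the
jackson-mapper-asl jar. A quick way to see whether that class is visible on a
given classpath is a tiny check like the sketch below (only the class name
comes from the error above; the rest is my own scaffolding):

public class JacksonCheck {
    public static void main(String[] args) {
        try {
            // Class name taken from the error message quoted above
            Class.forName("org.codehaus.jackson.map.JsonMappingException");
            System.out.println("Jackson mapper classes are on the classpath");
        } catch (ClassNotFoundException e) {
            System.out.println("Jackson mapper classes are missing: " + e);
        }
    }
}

Running it with the same jars the plugin bundles would show whether the class
is missing there, which would explain the failure to load the location status
updater.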

Re: Hadoop eclipse plugin stopped working after replacing hadoop-0.20.2 jar files with hadoop-0.20-append jar files

2011-06-22 Thread Jack Ye
Do you use Hadoop 0.20.203.0?
I also have a problem with this plugin.

Yaozhen Pan itzhak@gmail.com wrote:

Hi,

I am using Eclipse Helios Service Release 2.
I encountered a similar problem (the Map/Reduce perspective failed to load)
when upgrading the eclipse plugin from 0.20.2 to the 0.20.3-append version.

I compared the source code of the eclipse plugin and found only a few
differences. I tried reverting the differences one by one to see whether it
would work.
What surprised me was that when I only reverted the jar name from
hadoop-0.20.3-eclipse-plugin.jar back to hadoop-0.20.2-eclipse-plugin.jar, it
worked in Eclipse.

Yaozhen


On Thu, Jun 23, 2011 at 1:22 AM, praveenesh kumar praveen...@gmail.com wrote:

 I am doing that, but it's not working. If I replace the hadoop-core jar
 inside the hadoop-plugin.jar, I am not able to see the Map/Reduce
 perspective at all. Any help, guys?

 Thanks,
 Praveenesh

 On Wed, Jun 22, 2011 at 12:34 PM, Devaraj K devara...@huawei.com wrote:

  Every time Hadoop builds, it also builds the hadoop eclipse plug-in
  using the latest hadoop-core jar. In your case the eclipse plug-in contains
  one version of the jar and the cluster is running with another version.
  That's why it is giving the version mismatch error.
 
 
 
  Just replace the hadoop-core jar in your eclipse plug-in with whatever jar
  the hadoop cluster is using, and check again.
 
 
 
  Devaraj K
 
   _
 
  From: praveenesh kumar [mailto:praveen...@gmail.com]
  Sent: Wednesday, June 22, 2011 12:07 PM
  To: common-user@hadoop.apache.org; devara...@huawei.com
  Subject: Re: Hadoop eclipse plugin stopped working after replacing
  hadoop-0.20.2 jar files with hadoop-0.20-append jar files
 
 
 
   I followed Michael Noll's tutorial for building the hadoop-0.20-append jars:
 
 
 
  http://www.michael-noll.com/blog/2011/04/14/building-an-hadoop-0-20-x-version-for-hbase-0-90-2/
 
   After following the article, we get 5 jar files with which we need to
   replace the hadoop-0.20.2 jar files.
   There is no jar file for the hadoop-eclipse plugin that I can see in my
   repository if I follow that tutorial.
 
   Also, there is no info on JIRA MAPREDUCE-1280 about whether the
   hadoop-plugin I am using is compatible with hadoop-0.20-append.

   Has anyone else faced this kind of issue?
 
  Thanks,
  Praveenesh
 
 
 
   On Wed, Jun 22, 2011 at 11:48 AM, Devaraj K devara...@huawei.com wrote:
 
   The hadoop eclipse plugin also uses the hadoop-core.jar file to communicate
   with the hadoop cluster. For this it needs the same version of
   hadoop-core.jar on the client as on the server (the hadoop cluster).

   Update the hadoop eclipse plugin in your Eclipse to the one provided with
   the hadoop-0.20-append release; it will work fine.
 
 
  Devaraj K
 
   -----Original Message-----
  From: praveenesh kumar [mailto:praveen...@gmail.com]
  Sent: Wednesday, June 22, 2011 11:25 AM
  To: common-user@hadoop.apache.org
  Subject: Hadoop eclipse plugin stopped working after replacing
  hadoop-0.20.2
  jar files with hadoop-0.20-append jar files
 
 
  Guys,
   I was using the hadoop eclipse plugin on a hadoop 0.20.2 cluster, and it
   was working fine for me.
   I was using Eclipse SDK Helios 3.6.2 with the plugin
   hadoop-eclipse-plugin-0.20.3-SNAPSHOT.jar downloaded from JIRA
   MAPREDUCE-1280.
 
   Now, for the HBase installation, I had to use the hadoop-0.20-append
   compiled jars, so I replaced the old jar files with the new 0.20-append
   compiled jar files.
   But after replacing them, my hadoop eclipse plugin is no longer working
   for me.
   Whenever I try to connect to my hadoop master node from it and look at the
   DFS locations, it gives me the following error:

   Error: Protocol org.apache.hadoop.hdfs.protocol.ClientProtocol version
   mismatch (client = 41, server = 43)
 
   However, the hadoop cluster works fine if I go directly to the hadoop
   namenode and use the hadoop commands: I can add files to HDFS and run jobs
   from there, and the HDFS and Map/Reduce web consoles are also working fine.
   I am just not able to use my previous hadoop eclipse plugin.

   Any suggestions or help with this issue?
 
  Thanks,
  Praveenesh
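
Regarding the "version mismatch (client = 41, server = 43)" error quoted
above: a quick way to see what a particular hadoop-core jar carries is to
print its build info and the DFS client protocol number, roughly like the
sketch below (VersionInfo and ClientProtocol are standard classes in the 0.20
line; the wrapper class itself is just illustration):

import org.apache.hadoop.hdfs.protocol.ClientProtocol;
import org.apache.hadoop.util.VersionInfo;

public class JarVersionCheck {
    public static void main(String[] args) {
        // Build version of whichever hadoop-core jar is on the classpath
        System.out.println("Hadoop build: " + VersionInfo.getBuildVersion());
        // RPC protocol version the DFS client in that jar announces
        System.out.println("ClientProtocol versionID: " + ClientProtocol.versionID);
    }
}

Running it once against the jar bundled inside the Eclipse plugin and once
against the jar the cluster uses should show whether the two protocol numbers
line up, which is what Devaraj's suggestion above is getting at.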
 
 
 
 



Re: Hadoop eclipse plugin stopped working after replacing hadoop-0.20.2 jar files with hadoop-0.20-append jar files

2011-06-22 Thread Jack Ye
I used 0.20.203.0 and I can't access the DFS locations.
The error is:
Error: failure to login
An internal error occurred during: "Map/Reduce location status updater".
org/codehaus/jackson/map/JsonMappingException

Yaozhen Pan itzhak@gmail.com wrote:

Hi,

Our hadoop version was built on 0.20-append with a few patches.
However, I didn't see big differences in the eclipse-plugin.

Yaozhen

On Thu, Jun 23, 2011 at 11:29 AM, 叶达峰 (Jack Ye) kobe082...@qq.com wrote:

 Do you use Hadoop 0.20.203.0?
 I also have a problem with this plugin.
