Caution using Hadoop 0.21

2010-11-13 Thread Steve Lewis
Our group made a very poorly considered decision to build our cluster using
Hadoop 0.21.
We discovered that a number of programs written and running properly under
0.20.2 did not work under 0.21.

The first issue is that Mapper.Context and Reducer.Context, along with many of
their superclasses, were converted from concrete classes to interfaces. In 15
years of programming Java I have never seen so major a change to well-known
public classes; it is guaranteed to break any code which subclasses these
objects.

While making these classes interfaces is the better design, the manner of the
change, and the fact that it is so poorly documented, shows extraordinarily
poor judgement on the part of the Hadoop developers.
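
As a minimal sketch of the failure mode (the helper class and its name below
are mine, purely illustrative; only the Hadoop type names are real): code
compiled against 0.20.2, where TaskInputOutputContext and its relatives are
concrete classes, can fail at runtime on 0.21 with IncompatibleClassChangeError
until it is recompiled against the 0.21 jars.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.TaskInputOutputContext;

// Hypothetical 0.20-era helper; nothing exotic is needed to hit the problem.
public class ContextHelper {
    public static Configuration confOf(TaskInputOutputContext<?, ?, ?, ?> context) {
        // Compiled against 0.20.2 this call site assumes a concrete class.
        // Run unchanged on 0.21, where TaskInputOutputContext is an interface,
        // the JVM rejects the stale call site with
        // java.lang.IncompatibleClassChangeError
        // ("Found interface ... but class was expected").
        return context.getConfiguration();
    }
}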

http://lordjoesoftware.blogspot.com/

-- 
Steven M. Lewis PhD
4221 105th Ave Ne
Kirkland, WA 98033
206-384-1340 (cell)
Institute for Systems Biology
Seattle WA


Re: 0.21 found interface but class was expected

2010-11-13 Thread Steve Lewis
I have a long rant at http://lordjoesoftware.blogspot.com/ on this, but
the moral is that there seems to have been a deliberate decision that 0.20
code may not be compatible with 0.21.
I have NEVER seen a major library so directly abandon backward compatibility.
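
One quick way to rule out the mixed-jars situation described in the quoted
message below is a throwaway check along these lines (the class below and its
name are mine, not from any Hadoop release; only the two Hadoop class names are
real). It reports which jar each contested class is loaded from and whether it
is currently a class or an interface:

import java.security.CodeSource;

public class WhichHadoop {
    public static void main(String[] args) throws ClassNotFoundException {
        String[] names = {
                "org.apache.hadoop.mapreduce.JobContext",
                "org.apache.hadoop.mapreduce.TaskInputOutputContext"
        };
        for (String name : names) {
            Class<?> c = Class.forName(name);
            CodeSource src = c.getProtectionDomain().getCodeSource();
            // On a clean 0.21 classpath both names report "interface"; seeing
            // "class", or a 0.20.2 jar path, means old jars shadow the new ones.
            System.out.println(name
                    + " -> " + (c.isInterface() ? "interface" : "class")
                    + " from " + (src == null ? "unknown" : src.getLocation()));
        }
    }
}

Run it with the same classpath the job itself uses; otherwise the result proves
nothing.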


On Fri, Nov 12, 2010 at 8:04 AM, Sebastian Schoenherr 
sebastian.schoenh...@student.uibk.ac.at wrote:

 Hi Steve,
 we had a similar problem. We compiled our code with version 0.21 but
 included the wrong jars in the classpath (version 0.20.2;
 NInputFormat.java). It seems that Hadoop changed this class to an interface,
 so maybe you have a similar problem.
 Hope this helps.
 Sebastian


 Quoting Steve Lewis lordjoe2...@gmail.com:


  Cassandra sees this error with 0.21 of Hadoop:

 Exception in thread "main" java.lang.IncompatibleClassChangeError: Found
 interface org.apache.hadoop.mapreduce.JobContext, but class was expected

 I see something similar:
 Error: Found interface org.apache.hadoop.mapreduce.TaskInputOutputContext,
 but class was expected

 I find this especially puzzling,
 since org.apache.hadoop.mapreduce.TaskInputOutputContext IS a class, not an
 interface.

 Does anyone have bright ideas???

 --
 Steven M. Lewis PhD
 4221 105th Ave Ne
 Kirkland, WA 98033
 206-384-1340 (cell)
 Institute for Systems Biology
 Seattle WA







-- 
Steven M. Lewis PhD
4221 105th Ave Ne
Kirkland, WA 98033
206-384-1340 (cell)
Institute for Systems Biology
Seattle WA


Re: 0.21 found interface but class was expected

2010-11-13 Thread Konstantin Boudnik
As much as I love ranting, I can't help but wonder: were there any promises
to make 0.21+ backward compatible with 0.20?

Just curious.

On Sat, Nov 13, 2010 at 02:50PM, Steve Lewis wrote:
 I have a long rant at http://lordjoesoftware.blogspot.com/ on this but
 the moral is that there seems to have been a deliberate decision that  0,20
 code will may not be comparable with -
 I have NEVER seen a major library so directly abandon backward compatability
 
 
 On Fri, Nov 12, 2010 at 8:04 AM, Sebastian Schoenherr 
 sebastian.schoenh...@student.uibk.ac.at wrote:
 
  Hi Steve,
  we had a similar problem. We've compiled our code with version 0.21 but
  included the wrong jars into the classpath. (version 0.20.2;
  NInputFormat.java). It seems that Hadoop changed this class to an interface,
  maybe you've a simliar problem.
  Hope this helps.
  Sebastian
 
 
  Zitat von Steve Lewis lordjoe2...@gmail.com:
 
 
   Cassandra sees this error with 0.21 of hadoop
 
  Exception in thread main java.lang.IncompatibleClassChangeError: Found
  interface org.apache.hadoop.mapreduce.JobContext, but class was expected
 
  I see something similar
  Error: Found interface org.apache.hadoop.mapreduce.TaskInputOutputContext,
  but class was expected
 
  I find this especially puzzling
  since org.apache.hadoop.mapreduce.TaskInputOutputContext IS a class not an
  interface
 
  Does anyone have bright ideas???
 
  --
  Steven M. Lewis PhD
  4221 105th Ave Ne
  Kirkland, WA 98033
  206-384-1340 (cell)
  Institute for Systems Biology
  Seattle WA
 
 
 
 
 
 
 
 -- 
 Steven M. Lewis PhD
 4221 105th Ave Ne
 Kirkland, WA 98033
 206-384-1340 (cell)
 Institute for Systems Biology
 Seattle WA




Re: 0.21 found interface but class was expected

2010-11-13 Thread Steve Lewis
Java libraries are VERY reluctant to change major classes in a way that
breaks backward compatibility.
NOTE that while the 0.18 packages are deprecated, they are separate from
the 0.20 packages, allowing
0.18 code to run on 0.20 systems. This is true of virtually all Java
libraries.

On Sat, Nov 13, 2010 at 3:08 PM, Konstantin Boudnik c...@apache.org wrote:

 As much as I love ranting I can't help but wonder if there were any
 promises
 to make 0.21+ be backward compatible with 0.20 ?

 Just curious?

 On Sat, Nov 13, 2010 at 02:50PM, Steve Lewis wrote:
  I have a long rant at http://lordjoesoftware.blogspot.com/ on this but
  the moral is that there seems to have been a deliberate decision that
  0,20
  code will may not be comparable with -
  I have NEVER seen a major library so directly abandon backward
 compatability
 
 
  On Fri, Nov 12, 2010 at 8:04 AM, Sebastian Schoenherr 
  sebastian.schoenh...@student.uibk.ac.at wrote:
 
   Hi Steve,
   we had a similar problem. We've compiled our code with version 0.21 but
   included the wrong jars into the classpath. (version 0.20.2;
   NInputFormat.java). It seems that Hadoop changed this class to an
 interface,
   maybe you've a simliar problem.
   Hope this helps.
   Sebastian
  
  
   Zitat von Steve Lewis lordjoe2...@gmail.com:
  
  
Cassandra sees this error with 0.21 of hadoop
  
   Exception in thread main java.lang.IncompatibleClassChangeError:
 Found
   interface org.apache.hadoop.mapreduce.JobContext, but class was
 expected
  
   I see something similar
   Error: Found interface
 org.apache.hadoop.mapreduce.TaskInputOutputContext,
   but class was expected
  
   I find this especially puzzling
   since org.apache.hadoop.mapreduce.TaskInputOutputContext IS a class
 not an
   interface
  
   Does anyone have bright ideas???
  
   --
   Steven M. Lewis PhD
   4221 105th Ave Ne
   Kirkland, WA 98033
   206-384-1340 (cell)
   Institute for Systems Biology
   Seattle WA
  
  
  
  
  
 
 
  --
  Steven M. Lewis PhD
  4221 105th Ave Ne
  Kirkland, WA 98033
  206-384-1340 (cell)
  Institute for Systems Biology
  Seattle WA





-- 
Steven M. Lewis PhD
4221 105th Ave Ne
Kirkland, WA 98033
206-384-1340 (cell)
Institute for Systems Biology
Seattle WA


Re: 0.21 found interface but class was expected

2010-11-13 Thread Konstantin Boudnik
It doesn't answer my question. I guess I will have to look for the answer
somewhere else.

On Sat, Nov 13, 2010 at 03:22PM, Steve Lewis wrote:
 Java libraries are VERY reluctant to change major classes in a way that
 breaks backward compatability -
 NOTE that while the 0.18 packages are  deprecated, they are separate from
 the 0.20 packages allowing
 0.18 code to run on 0.20 systems - this is true of virtually all Java
 libraries
 
 On Sat, Nov 13, 2010 at 3:08 PM, Konstantin Boudnik c...@apache.org wrote:
 
  As much as I love ranting I can't help but wonder if there were any
  promises
  to make 0.21+ be backward compatible with 0.20 ?
 
  Just curious?
 
  On Sat, Nov 13, 2010 at 02:50PM, Steve Lewis wrote:
   I have a long rant at http://lordjoesoftware.blogspot.com/ on this but
   the moral is that there seems to have been a deliberate decision that
   0,20
   code will may not be comparable with -
   I have NEVER seen a major library so directly abandon backward
  compatability
  
  
   On Fri, Nov 12, 2010 at 8:04 AM, Sebastian Schoenherr 
   sebastian.schoenh...@student.uibk.ac.at wrote:
  
Hi Steve,
we had a similar problem. We've compiled our code with version 0.21 but
included the wrong jars into the classpath. (version 0.20.2;
NInputFormat.java). It seems that Hadoop changed this class to an
  interface,
maybe you've a simliar problem.
Hope this helps.
Sebastian
   
   
Zitat von Steve Lewis lordjoe2...@gmail.com:
   
   
 Cassandra sees this error with 0.21 of hadoop
   
Exception in thread main java.lang.IncompatibleClassChangeError:
  Found
interface org.apache.hadoop.mapreduce.JobContext, but class was
  expected
   
I see something similar
Error: Found interface
  org.apache.hadoop.mapreduce.TaskInputOutputContext,
but class was expected
   
I find this especially puzzling
since org.apache.hadoop.mapreduce.TaskInputOutputContext IS a class
  not an
interface
   
Does anyone have bright ideas???
   
--
Steven M. Lewis PhD
4221 105th Ave Ne
Kirkland, WA 98033
206-384-1340 (cell)
Institute for Systems Biology
Seattle WA
   
   
   
   
   
  
  
   --
   Steven M. Lewis PhD
   4221 105th Ave Ne
   Kirkland, WA 98033
   206-384-1340 (cell)
   Institute for Systems Biology
   Seattle WA
 
 
 
 
 
 -- 
 Steven M. Lewis PhD
 4221 105th Ave Ne
 Kirkland, WA 98033
 206-384-1340 (cell)
 Institute for Systems Biology
 Seattle WA




Re: Hadoop installation on Windows

2010-11-13 Thread Christopher Worley
Thanks for the advice, guys.  I found this tutorial that covers
installation of 0.21.0:
http://alans.se/blog/2010/hadoop-hbase-cygwin-windows-7-x64/

The author suggests adding CLASSPATH=`cygpath -wp $CLASSPATH`
to bin/hadoop-config.sh, just as you suggested. I made that change
and then checked the version. Here's what I got:


$ bin/hadoop version
cygwin warning:
  MS-DOS style path detected: C:\cygwin\usr\local\hadoop-0.21.0\/build/native
  Preferred POSIX equivalent is: /usr/local/hadoop-0.21.0/build/native
  CYGWIN environment variable option nodosfilewarning turns off this warning.
  Consult the user's guide for more details about POSIX paths:
http://cygwin.com/cygwin-ug-net/using.html#using-pathnames
Hadoop 0.21.0
Subversion https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.21 -r 985326
Compiled by tomwhite on Tue Aug 17 01:02:28 EDT 2010
From source with checksum a1aeb15b4854808d152989ba76f90fac


It gives a warning on the path format, but it appears to work--it
gives the correct Hadoop version.

When I try to do the next step, formatting the namenode, I get the
following exceptions:
http://pastebin.com/YhZ8JZjG

When I try to run bin/start-dfs.sh or bin/start-mapred.sh I get
"Hadoop common not found."

I appreciate any help.

-Chris




On Fri, Nov 12, 2010 at 3:29 PM, Vijay tec...@gmail.com wrote:
 It seems like on Windows, Java doesn't work well with cygwin-style paths on
 the classpath (/cygdrive/d/). The PlatformName error at the beginning is due to
 that. It is coming from the bin/hadoop-config.sh script, which is using
 cygwin-style paths for the jar files.

 The bin/hadoop script works correctly since it has this extra command for
 cygwin:
 CLASSPATH=`cygpath -p -w $CLASSPATH`

 If the above line is added to bin/hadoop-config.sh, the platform error
 goes away.

 In the end I'm getting an error like "Bad connection to FS. command
 aborted." when I try to use the fs command. I'm still figuring out why that
 is.

 On Fri, Nov 12, 2010 at 12:09 PM, Jamie Cockrill
 jamie.cockr...@gmail.comwrote:

 Can't spot anything obvious that would prevent it working... can you
 paste in the other exceptions? Or put them in pastebin
 (http://pastebin.com/) and link to them here.

 Also, if you're just trying it out to see what it can do, perhaps
 using something like Cloudera's training VM might be worthwhile:

 http://www.cloudera.com/downloads/virtual-machine/

 Start it up with VMWare player and you have yourself a little test-bed
 to work with.

 Ta

 Jamie

 On 12 November 2010 20:03, Jamie Cockrill jamie.cockr...@gmail.com
 wrote:
  Very quick spot:
 
  From the docs:
 http://hadoop.apache.org/common/docs/current/commands_manual.html#version
  --
  version
 
  Prints the version.
 
  Usage: hadoop version
  --
 
  you don't need the '--' on the front of the version argument. Will
  have a look at the other thing.
 
  Ta
 
  Jamie
 
 
  On 11 November 2010 17:36, Christopher Worley
  christoph.wor...@gmail.com wrote:
  Hello,
 
  I'm trying to get Hadoop working on Cygwin/Windows XP in
  Pseudo-Distributed Mode.  I downloaded version 0.21.0 and unpacked it
  to a cygwin directory.  I'm following the quickstart directions here:
  http://hadoop.apache.org/common/docs/r0.20.2/quickstart.html
 
 
  I get the following error when I try to check the Hadoop version:
 
 
  $ bin/hadoop --version
  Exception in thread "main" java.lang.NoClassDefFoundError:
  org/apache/hadoop/util/PlatformName
  Caused by: java.lang.ClassNotFoundException:
 org.apache.hadoop.util.PlatformName
 
         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
         at java.security.AccessController.doPrivileged(Native Method)
         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
         at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
         at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
  Could not find the main class: org.apache.hadoop.util.PlatformName. Program will exit.
  Unrecognized option: --version
  Could not create the Java virtual machine.
 
 
  I get similar exceptions when I run start-all.sh, but I am still
  able to browse the web interface for NameNode and JobTracker.
 
  I'm sure it has something to do with an installation step that I'm
  missing, some step that Windows users have to do that Linux users do
  not.  I did web searches for this problem, but I only found it
  occurring when people were building Hadoop.  I'm not building.
 
  I appreciate any help anyone can give.
 
  Thanks,
  Chris
 
 




Re: 0.21 found interface but class was expected

2010-11-13 Thread Lance Norskog
It is considered good manners :)

Seriously, if you want to attract a community, you have an obligation
to tell them when you're going to jerk the rug out from under their
feet.

On Sat, Nov 13, 2010 at 3:27 PM, Konstantin Boudnik c...@apache.org wrote:
 It doesn't answer my question. I guess I will have to look for the answer 
 somewhere else

 On Sat, Nov 13, 2010 at 03:22PM, Steve Lewis wrote:
 Java libraries are VERY reluctant to change major classes in a way that
 breaks backward compatability -
 NOTE that while the 0.18 packages are  deprecated, they are separate from
 the 0.20 packages allowing
 0.18 code to run on 0.20 systems - this is true of virtually all Java
 libraries

 On Sat, Nov 13, 2010 at 3:08 PM, Konstantin Boudnik c...@apache.org wrote:

  As much as I love ranting I can't help but wonder if there were any
  promises
  to make 0.21+ be backward compatible with 0.20 ?
 
  Just curious?
 
  On Sat, Nov 13, 2010 at 02:50PM, Steve Lewis wrote:
   I have a long rant at http://lordjoesoftware.blogspot.com/ on this but
   the moral is that there seems to have been a deliberate decision that
   0,20
   code will may not be comparable with -
   I have NEVER seen a major library so directly abandon backward
  compatability
  
  
   On Fri, Nov 12, 2010 at 8:04 AM, Sebastian Schoenherr 
   sebastian.schoenh...@student.uibk.ac.at wrote:
  
Hi Steve,
we had a similar problem. We've compiled our code with version 0.21 but
included the wrong jars into the classpath. (version 0.20.2;
NInputFormat.java). It seems that Hadoop changed this class to an
  interface,
maybe you've a simliar problem.
Hope this helps.
Sebastian
   
   
Zitat von Steve Lewis lordjoe2...@gmail.com:
   
   
 Cassandra sees this error with 0.21 of hadoop
   
Exception in thread main java.lang.IncompatibleClassChangeError:
  Found
interface org.apache.hadoop.mapreduce.JobContext, but class was
  expected
   
I see something similar
Error: Found interface
  org.apache.hadoop.mapreduce.TaskInputOutputContext,
but class was expected
   
I find this especially puzzling
since org.apache.hadoop.mapreduce.TaskInputOutputContext IS a class
  not an
interface
   
Does anyone have bright ideas???
   
--
Steven M. Lewis PhD
4221 105th Ave Ne
Kirkland, WA 98033
206-384-1340 (cell)
Institute for Systems Biology
Seattle WA
   
   
   
   
   
  
  
   --
   Steven M. Lewis PhD
   4221 105th Ave Ne
   Kirkland, WA 98033
   206-384-1340 (cell)
   Institute for Systems Biology
   Seattle WA
 
 
 


 --
 Steven M. Lewis PhD
 4221 105th Ave Ne
 Kirkland, WA 98033
 206-384-1340 (cell)
 Institute for Systems Biology
 Seattle WA






-- 
Lance Norskog
goks...@gmail.com


Re: 0.21 found interface but class was expected

2010-11-13 Thread Todd Lipcon
We do have policies against breaking APIs between consecutive major versions,
except for very rare exceptions (e.g. UnixUserGroupInformation went away when
security was added).

We do *not* have any current policy that existing code can work against
different major versions without a recompile in between. Switching an
implementation class to an interface is a case where a simple recompile of
the dependent app should be sufficient to avoid issues. For whatever reason,
the JVM bytecode for invoking an interface method (invokeinterface) is
different from that for invoking a virtual method in a class (invokevirtual).
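
To make the recompile point concrete, a hedged illustration (the class below
and the paraphrased javap output in its comments are mine; only the Hadoop
class and method names are real):

import org.apache.hadoop.mapreduce.JobContext;

public class ReduceCheck {
    // Compiled against 0.20.2, where JobContext is a class, javap -c shows the
    // call below emitted roughly as
    //     invokevirtual   org/apache/hadoop/mapreduce/JobContext.getNumReduceTasks
    // Compiled against 0.21.0, where JobContext is an interface, the same
    // source line becomes
    //     invokeinterface org/apache/hadoop/mapreduce/JobContext.getNumReduceTasks
    // A 0.21 runtime accepts only the second form, hence the
    // IncompatibleClassChangeError until the dependent jar is rebuilt.
    public static boolean hasReducers(JobContext job) {
        return job.getNumReduceTasks() > 0;
    }
}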

-Todd

On Sat, Nov 13, 2010 at 5:28 PM, Lance Norskog goks...@gmail.com wrote:

 It is considered good manners :)

 Seriously, if you want to attract a community you have an obligation
 to tell them when you're going to jerk the rug out from under their
 feet.

 On Sat, Nov 13, 2010 at 3:27 PM, Konstantin Boudnik c...@apache.org
 wrote:
  It doesn't answer my question. I guess I will have to look for the answer
 somewhere else
 
  On Sat, Nov 13, 2010 at 03:22PM, Steve Lewis wrote:
  Java libraries are VERY reluctant to change major classes in a way that
  breaks backward compatability -
  NOTE that while the 0.18 packages are  deprecated, they are separate
 from
  the 0.20 packages allowing
  0.18 code to run on 0.20 systems - this is true of virtually all Java
  libraries
 
  On Sat, Nov 13, 2010 at 3:08 PM, Konstantin Boudnik c...@apache.org
 wrote:
 
   As much as I love ranting I can't help but wonder if there were any
   promises
   to make 0.21+ be backward compatible with 0.20 ?
  
   Just curious?
  
   On Sat, Nov 13, 2010 at 02:50PM, Steve Lewis wrote:
I have a long rant at http://lordjoesoftware.blogspot.com/ on this
 but
the moral is that there seems to have been a deliberate decision
 that
0,20
code will may not be comparable with -
I have NEVER seen a major library so directly abandon backward
   compatability
   
   
On Fri, Nov 12, 2010 at 8:04 AM, Sebastian Schoenherr 
sebastian.schoenh...@student.uibk.ac.at wrote:
   
 Hi Steve,
 we had a similar problem. We've compiled our code with version
 0.21 but
 included the wrong jars into the classpath. (version 0.20.2;
 NInputFormat.java). It seems that Hadoop changed this class to an
   interface,
 maybe you've a simliar problem.
 Hope this helps.
 Sebastian


 Zitat von Steve Lewis lordjoe2...@gmail.com:


  Cassandra sees this error with 0.21 of hadoop

 Exception in thread main
 java.lang.IncompatibleClassChangeError:
   Found
 interface org.apache.hadoop.mapreduce.JobContext, but class was
   expected

 I see something similar
 Error: Found interface
   org.apache.hadoop.mapreduce.TaskInputOutputContext,
 but class was expected

 I find this especially puzzling
 since org.apache.hadoop.mapreduce.TaskInputOutputContext IS a
 class
   not an
 interface

 Does anyone have bright ideas???

 --
 Steven M. Lewis PhD
 4221 105th Ave Ne
 Kirkland, WA 98033
 206-384-1340 (cell)
 Institute for Systems Biology
 Seattle WA





   
   
--
Steven M. Lewis PhD
4221 105th Ave Ne
Kirkland, WA 98033
206-384-1340 (cell)
Institute for Systems Biology
Seattle WA
  
  
  
 
 
  --
  Steven M. Lewis PhD
  4221 105th Ave Ne
  Kirkland, WA 98033
  206-384-1340 (cell)
  Institute for Systems Biology
  Seattle WA
 
 
 



 --
 Lance Norskog
 goks...@gmail.com




-- 
Todd Lipcon
Software Engineer, Cloudera


Re: 0.21 found interface but class was expected

2010-11-13 Thread Edward Capriolo
On Sat, Nov 13, 2010 at 9:50 PM, Todd Lipcon t...@cloudera.com wrote:
 We do have policies against breaking APIs between consecutive major versions
 except for very rare exceptions (eg UnixUserGroupInformation went away when
 security was added).

 We do *not* have any current policies that existing code can work against
 different major versions without a recompile in between. Switching an
 implementation class to an interface is a case where a simple recompile of
 the dependent app should be sufficient to avoid issues. For whatever reason,
 the JVM bytecode for invoking an interface method (invokeinterface) is
 different than invoking a virtual method in a class (invokevirtual).

 -Todd

 On Sat, Nov 13, 2010 at 5:28 PM, Lance Norskog goks...@gmail.com wrote:

 It is considered good manners :)

 Seriously, if you want to attract a community you have an obligation
 to tell them when you're going to jerk the rug out from under their
 feet.

 On Sat, Nov 13, 2010 at 3:27 PM, Konstantin Boudnik c...@apache.org
 wrote:
  It doesn't answer my question. I guess I will have to look for the answer
 somewhere else
 
  On Sat, Nov 13, 2010 at 03:22PM, Steve Lewis wrote:
  Java libraries are VERY reluctant to change major classes in a way that
  breaks backward compatability -
  NOTE that while the 0.18 packages are  deprecated, they are separate
 from
  the 0.20 packages allowing
  0.18 code to run on 0.20 systems - this is true of virtually all Java
  libraries
 
  On Sat, Nov 13, 2010 at 3:08 PM, Konstantin Boudnik c...@apache.org
 wrote:
 
   As much as I love ranting I can't help but wonder if there were any
   promises
   to make 0.21+ be backward compatible with 0.20 ?
  
   Just curious?
  
   On Sat, Nov 13, 2010 at 02:50PM, Steve Lewis wrote:
I have a long rant at http://lordjoesoftware.blogspot.com/ on this
 but
the moral is that there seems to have been a deliberate decision
 that
    0,20
code will may not be comparable with -
I have NEVER seen a major library so directly abandon backward
   compatability
   
   
On Fri, Nov 12, 2010 at 8:04 AM, Sebastian Schoenherr 
sebastian.schoenh...@student.uibk.ac.at wrote:
   
 Hi Steve,
 we had a similar problem. We've compiled our code with version
 0.21 but
 included the wrong jars into the classpath. (version 0.20.2;
 NInputFormat.java). It seems that Hadoop changed this class to an
   interface,
 maybe you've a simliar problem.
 Hope this helps.
 Sebastian


 Zitat von Steve Lewis lordjoe2...@gmail.com:


  Cassandra sees this error with 0.21 of hadoop

 Exception in thread main
 java.lang.IncompatibleClassChangeError:
   Found
 interface org.apache.hadoop.mapreduce.JobContext, but class was
   expected

 I see something similar
 Error: Found interface
   org.apache.hadoop.mapreduce.TaskInputOutputContext,
 but class was expected

 I find this especially puzzling
 since org.apache.hadoop.mapreduce.TaskInputOutputContext IS a
 class
   not an
 interface

 Does anyone have bright ideas???

 --
 Steven M. Lewis PhD
 4221 105th Ave Ne
 Kirkland, WA 98033
 206-384-1340 (cell)
 Institute for Systems Biology
 Seattle WA





   
   
--
Steven M. Lewis PhD
4221 105th Ave Ne
Kirkland, WA 98033
206-384-1340 (cell)
Institute for Systems Biology
Seattle WA
  
  
  
 
 
  --
  Steven M. Lewis PhD
  4221 105th Ave Ne
  Kirkland, WA 98033
  206-384-1340 (cell)
  Institute for Systems Biology
  Seattle WA
 
 
 



 --
 Lance Norskog
 goks...@gmail.com




 --
 Todd Lipcon
 Software Engineer, Cloudera


Cloudera and Yahoo have backported the interesting security
components and some enhancements into their 0.20-based distributions.
Out of the box, I know Hive does not (yet) work with 0.21. I imagine
other tools have minor problems as well. I have not read anything
about the big players looking to move to 0.21. These factors and
others make it unclear what the adoption of 0.21.X will be. The way I
look at it, I am not very compelled to go from 0.20 to 0.21 when everything
interesting is backported and the only thing I lose seems to be
compatibility.


Re: Caution using Hadoop 0.21

2010-11-13 Thread Edward Capriolo
On Sat, Nov 13, 2010 at 4:33 PM, Shi Yu sh...@uchicago.edu wrote:
 I agree with Steve. That's why I am still using 0.19.2 in my production.

 Shi

 On 2010-11-13 12:36, Steve Lewis wrote:

 Our group made a very poorly considered decision to build out cluster
 using
 Hadoop 0.21
 We discovered that a number of programs written and running properly under
 0.20.2 did not work
 under 0.21

 The first issue is that Mapper.Context and Reducer.Context and many of
 their
 superclasses were
 converted from concrete classes to interfaces. This change, and I have
 never
 in 15 years of programming Java seen so major
 a change to well known public classes is guaranteed to break any code
 which
 subclasses these objects.

 While it is a far better decision to make these classes interface, the
 manner of the change and the fact that it is poorly
 documented shows extraordinary poor judgement on the part of the Hadoop
 developers

 http://lordjoesoftware.blogspot.com/






At times we have been frustrated by rapidly changing APIs:

# 23 August, 2010: release 0.21.0 available
# 26 February, 2010: release 0.20.2 available
# 14 September, 2009: release 0.20.1 available
# 23 July, 2009: release 0.19.2 available
# 22 April, 2009: release 0.20.0 available

By the standard major/minor/revision scheme, going from 0.20.X to 0.21.X is a
minor release. However, since Hadoop has never had a major release, you might
consider 0.20 to 0.21 a major release.

In any case, are you saying that in 15 years of coding you have never
seen an API change between minor releases? I think that is quite
common. It was also more than a year between 0.20.X and 0.21.X. Again,
it is common to expect changes in that time frame.