[
https://issues.apache.org/jira/browse/HADOOP-4605?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12646601#action_12646601
]
Doug Cutting commented on HADOOP-4605:
--------------------------------------
> you aren't going to be able to run them against a different hadoop release
> than in the classpath [ ... ]
For API back-compatibility it is sufficient to run things in a single JVM. A
separate JVM is probably the best way to check inter-version RPC compatibility,
but that's a step further than I am proposing here. For this Jira I'm simply
proposing that we automate API back-compatibility testing.
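As a rough sketch of the single-JVM idea, old test classes could be invoked reflectively so that they exercise whatever API version is on the current classpath. Everything here is illustrative, not Hadoop code: `OldReleaseTest` is a stand-in for a test class that, in practice, would be loaded from a prior release's test jar.

```java
import java.lang.reflect.Method;

// Stand-in for a test class taken from a prior release's test jar.
class OldReleaseTest {
    public void testJoin() {
        String joined = "/user" + "/" + "doug";
        if (!joined.equals("/user/doug")) throw new AssertionError(joined);
    }
}

public class BackCompatRunner {
    // Invoke every public no-arg method whose name starts with "test",
    // against the API currently on the classpath; a throw means an
    // incompatibility (or a plain test failure). Returns the pass count.
    static int run(Class<?> testClass) throws Exception {
        int passed = 0;
        Object instance = testClass.getDeclaredConstructor().newInstance();
        for (Method m : testClass.getMethods()) {
            if (m.getName().startsWith("test") && m.getParameterCount() == 0) {
                m.invoke(instance);   // throws if the old test fails
                passed++;
            }
        }
        return passed;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(run(OldReleaseTest.class) + " old test(s) passed");
    }
}
```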
We currently promise only that if your code compiles without deprecation
warnings against the current version, it will still run when you
upgrade to the next version. We do not yet promise inter-compatibility of
mismatched client and server versions, but rather require that Hadoop's client
and server code be upgraded in lockstep. There are ongoing discussions about
what promises we can make when client and server versions do not match, and,
once we agree on this, we should then develop a means to test it. But, for
now, that's out of scope.
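The compile-time promise above might look like the following sketch; the class and method names are made up for illustration, not actual Hadoop API:

```java
// A method deprecated in the current release keeps working, so code that
// compiled without deprecation warnings against it still runs after an
// upgrade; only callers still on the deprecated path need to migrate.
class FileUtil {
    /** Old entry point: deprecated in the current release, removed later. */
    @Deprecated
    static String fullPath(String dir, String name) {
        return join(dir, name);   // delegates, so old callers still work
    }

    /** Replacement API introduced in the current release. */
    static String join(String dir, String name) {
        return dir + "/" + name;
    }
}

public class CompatPromise {
    public static void main(String[] args) {
        // A caller compiled against the old release keeps running...
        System.out.println(FileUtil.fullPath("/user", "doug")); // prints /user/doug
        // ...while new code uses the replacement.
        System.out.println(FileUtil.join("/user", "doug"));     // prints /user/doug
    }
}
```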
> should run old version of unit tests to check back-compatibility
> ----------------------------------------------------------------
>
> Key: HADOOP-4605
> URL: https://issues.apache.org/jira/browse/HADOOP-4605
> Project: Hadoop Core
> Issue Type: Improvement
> Components: test
> Reporter: Doug Cutting
>
> We should test back-compatibility by running unit tests from a prior release.