[ https://issues.apache.org/jira/browse/SPARK-5670?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-5670.
------------------------------
    Resolution: Not a Problem

This is another question that should be asked at user@, please.

The artifacts published to Maven can only be compiled against one version of 
anything. You could publish a bunch of different artifacts with different 
{{classifier}}s, but here the idea is that it doesn't matter: you are always 
compiling against these artifacts as an API, and never relying on them for 
their transitive Hadoop dependency. You mark these dependencies as "provided" 
in your app, and when it executes on a cluster, it picks up the Hadoop 
dependencies that are correct for that cluster.
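
For example, a minimal POM sketch (the coordinates here assume a Spark 1.2.0 
/ Scala 2.10 build; adjust to your own versions):

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.2.0</version>
      <!-- provided: compile against the Spark API, but let the
           cluster supply Spark and its Hadoop dependency at runtime -->
      <scope>provided</scope>
    </dependency>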

Your error suggests that you have actually bundled old Hadoop code into your 
application. Don't do that; use provided scope.
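
A quick way to check is to list the contents of your built jar (the jar name 
here is hypothetical):

    jar tf myapp.jar | grep 'org/apache/hadoop'

If that prints Hadoop classes, your build is bundling Hadoop; mark those 
dependencies as provided (or exclude them) and rebuild.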

> Spark artifacts compiled with Hadoop 1.x
> ----------------------------------------
>
>                 Key: SPARK-5670
>                 URL: https://issues.apache.org/jira/browse/SPARK-5670
>             Project: Spark
>          Issue Type: Bug
>          Components: Java API
>    Affects Versions: 1.2.0
>         Environment: Spark 1.2
>            Reporter: DeepakVohra
>
> Why are the Spark artifacts available from Maven compiled with Hadoop 1.x, 
> while Spark binaries for Hadoop 1.x are not available? CDH is also not 
> available for Hadoop 1.x.
> Using Hadoop 2.0.0 or Hadoop 2.3 with the Spark artifacts generates errors 
> such as the following:
> Server IPC version 7 cannot communicate with client version 4
> Server IPC version 9 cannot communicate with client version 4


