[ https://issues.apache.org/jira/browse/SPARK-26512?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16734072#comment-16734072 ]

Saisai Shao commented on SPARK-26512:
-------------------------------------

This seems like a Netty version problem; netty-3.9.9.Final.jar is unrelated. I 
was wondering whether putting the Spark classpath in front of the Hadoop 
classpath might make this work. There is such a configuration for the 
driver/executor, but I'm not sure whether there is a similar one for the AM only.
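
For example, a rough sketch of what I mean, assuming the driver/executor settings 
in question are spark.driver.extraClassPath / spark.executor.extraClassPath 
(documented to prepend entries to the respective classpaths); the jars path below 
is only a placeholder and this is untested against this issue:

    import org.apache.spark.SparkConf

    // Sketch: prepend Spark's own jars directory so that its Netty classes are
    // picked up before the ones coming in from the Hadoop/YARN classpath.
    // The Windows path below is a placeholder, not a verified location.
    val conf = new SparkConf()
      .setAppName("netty-classpath-ordering-test")
      .set("spark.driver.extraClassPath", "C:\\spark-2.4.0\\jars\\*")
      .set("spark.executor.extraClassPath", "C:\\spark-2.4.0\\jars\\*")

The same properties can be passed on the spark-submit command line via --conf; 
whether this actually resolves the conflict reported here still needs to be tried.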

> Spark 2.4.0 is not working with Hadoop 2.8.3 in windows 10
> ----------------------------------------------------------
>
>                 Key: SPARK-26512
>                 URL: https://issues.apache.org/jira/browse/SPARK-26512
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core, Spark Shell, YARN
>    Affects Versions: 2.4.0
>         Environment: operating system : Windows 10
> Spark Version : 2.4.0
> Hadoop Version : 2.8.3
>            Reporter: Anubhav Jain
>            Priority: Minor
>              Labels: windows
>         Attachments: log.png
>
>
> I have installed Hadoop version 2.8.3 in my Windows 10 environment and it is 
> working fine. Now when I try to install Apache Spark (version 2.4.0) with YARN 
> as the cluster manager, it is not working. When I try to submit a Spark job 
> using spark-submit for testing, it appears under the ACCEPTED tab in the YARN 
> UI and then fails.



