[ https://issues.apache.org/jira/browse/SPARK-38218?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17493029#comment-17493029 ]

Mehul Batra edited comment on SPARK-38218 at 2/16/22, 6:57 AM:
---------------------------------------------------------------

Hi [~hyukjin.kwon], if I download the tgz from the Spark downloads page, will it contain Hadoop 3.3.1? That section still shows Hadoop 3.2; I am attaching a screenshot of the same.
!image-2022-02-16-12-26-32-871.png|width=736,height=121!
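Independent of what the download page says, the Hadoop version actually bundled in a binary distribution can be read off the `hadoop-client-*` jar names under `jars/`. A minimal sketch, assuming an extracted distribution at `$SPARK_HOME` (the directory here is simulated with placeholder jar names purely for illustration):

```shell
# Simulate an extracted Spark distribution (hypothetical jar names for illustration).
SPARK_HOME=$(mktemp -d)
mkdir -p "$SPARK_HOME/jars"
touch "$SPARK_HOME/jars/hadoop-client-api-3.3.1.jar" \
      "$SPARK_HOME/jars/hadoop-client-runtime-3.3.1.jar"

# Extract the bundled Hadoop version from the hadoop-client-api jar name.
hadoop_version=$(ls "$SPARK_HOME/jars" | sed -n 's/^hadoop-client-api-\(.*\)\.jar$/\1/p')
echo "Bundled Hadoop: $hadoop_version"
```

On a real download, pointing `SPARK_HOME` at the extracted tarball settles whether the archive ships Hadoop 3.2 or 3.3 regardless of the file name.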


was (Author: me_bat):
If I download the tgz from the Spark downloads page, will it contain Hadoop 3.3.1? That section still shows Hadoop 3.2; I am attaching a screenshot of the same.
!image-2022-02-16-12-26-32-871.png|width=736,height=121!

> Looks like the wrong package is available on the Spark downloads page: the name reads pre-built for Hadoop 3.3, but the tgz file is marked as Hadoop 3.2
> -----------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-38218
>                 URL: https://issues.apache.org/jira/browse/SPARK-38218
>             Project: Spark
>          Issue Type: Bug
>          Components: Documentation
>    Affects Versions: 3.2.1
>            Reporter: Mehul Batra
>            Priority: Major
>         Attachments: Screenshot_20220214-013156.jpg, 
> image-2022-02-16-12-26-32-871.png
>
>
> !https://files.slack.com/files-pri/T4S1WH2J3-F032FA551U7/screenshot_20220214-013156.jpg!
> Does the tgz actually contain Hadoop 3.3 and the name is simply wrong, or is it really Hadoop 3.2?
> If it is Hadoop 3.3, does that Hadoop version come with S3 magic committer support?
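On the magic committer question: with a Hadoop 3.x S3A client, Spark can be pointed at the magic committer through configuration. A hedged sketch of the relevant `spark-defaults.conf` keys, based on Spark's cloud-integration documentation (the commit-protocol classes live in the optional `spark-hadoop-cloud` module, which must be on the classpath; treat the exact class names as assumptions to verify against your Spark version's docs):

```
# Select the S3A magic committer (requires a Hadoop 3.x S3A client).
spark.hadoop.fs.s3a.committer.name                 magic
spark.hadoop.fs.s3a.committer.magic.enabled        true

# Route Spark SQL output commits through the Hadoop PathOutputCommitter
# machinery (classes from the spark-hadoop-cloud module).
spark.sql.sources.commitProtocolClass              org.apache.spark.internal.io.cloud.PathOutputCommitProtocol
spark.sql.parquet.output.committer.class           org.apache.spark.internal.io.cloud.BinaryParquetOutputCommitter
```

Whether the committer works end to end depends on the bundled Hadoop version, which is the crux of this issue.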



--
This message was sent by Atlassian Jira
(v8.20.1#820001)
