raulcd opened a new issue, #36199:
URL: https://github.com/apache/arrow/issues/36199

   ### Describe the enhancement requested
   
   It seems we are using some old, end-of-life Spark versions in our nightly CI integration builds with Spark.
   As discussed in this PR comment thread:
   https://github.com/apache/arrow/pull/36061#discussion_r1230266720
   
   The following comments were added:
   
   Do we still need Spark 3.1.2?
   
   I think that Spark 3.1 reached EOL:
   
   https://spark.apache.org/versioning-policy.html
   
   >    Feature release branches will, generally, be maintained with bug fix 
releases for a period of 18 months.
   
   And 3.1.1 was released on 2021-03-02:
   
   https://spark.apache.org/news/index.html
   
   >    [Spark 3.1.1 
released](https://spark.apache.org/news/spark-3-1-1-released.html)
   >    March 2, 2021
   
   3.2 may also have reached (or be close to) EOL, since 3.2.0 was released on 2021-10-13:
   
   >    [Spark 3.2.0 
released](https://spark.apache.org/news/spark-3-2-0-released.html)
   >    October 13, 2021
   
   
   ### Component(s)
   
   Continuous Integration, Python

