[ https://issues.apache.org/jira/browse/SPARK-33605?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun resolved SPARK-33605.
-----------------------------------
    Resolution: Won't Do

I made a PR and had a discussion, but this issue is being closed based on the 
following comment:
> note that the gcs connector (at least the builds off their master) are java 
> 11 only; not sure where that stands w.r.t. older releases

The Apache Spark community has no plan to drop Java 8 support yet.

> Add GCS FS/connector config (dependencies?) akin to S3
> ------------------------------------------------------
>
>                 Key: SPARK-33605
>                 URL: https://issues.apache.org/jira/browse/SPARK-33605
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark, Spark Core
>    Affects Versions: 3.0.1
>            Reporter: Rafal Wojdyla
>            Priority: Major
>
> Spark comes with some S3 batteries included, which makes it easier to use 
> with S3. For GCS to work, users are required to manually configure the jars. 
> This is especially problematic for Python users, who may not be accustomed to 
> Java dependencies. Here is an example workaround for PySpark: 
> [pyspark_gcs|https://github.com/ravwojdyla/pyspark_gcs]. If we included the 
> [GCS connector|https://cloud.google.com/dataproc/docs/concepts/connectors/cloud-storage], 
> it would make things easier for GCS users.
> Please let me know what you think.
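
For context, the manual setup the reporter describes typically looks something 
like the sketch below. This is illustrative only: the connector jar name, its 
path, and the service account key path are placeholders, not values taken from 
this issue.

    from pyspark.sql import SparkSession

    # Sketch of manually wiring the GCS connector into a PySpark session.
    # Jar and key file paths below are hypothetical placeholders.
    spark = (
        SparkSession.builder
        .appName("gcs-example")
        # Shaded GCS connector jar, downloaded separately by the user.
        .config("spark.jars", "/path/to/gcs-connector-hadoop3-shaded.jar")
        # Register the GCS Hadoop FileSystem implementations.
        .config("spark.hadoop.fs.gs.impl",
                "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem")
        .config("spark.hadoop.fs.AbstractFileSystem.gs.impl",
                "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFS")
        # Authenticate with a service account JSON key file.
        .config("spark.hadoop.google.cloud.auth.service.account.enable", "true")
        .config("spark.hadoop.google.cloud.auth.service.account.json.keyfile",
                "/path/to/key.json")
        .getOrCreate()
    )

    # With the connector configured, gs:// paths can be read directly.
    df = spark.read.text("gs://some-bucket/some-file.txt")

Shipping this configuration (or the connector itself) with Spark would remove 
these manual steps, which is what the issue requests.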


