[ https://issues.apache.org/jira/browse/SPARK-28092?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16866977#comment-16866977 ]

Steve Loughran commented on SPARK-28092:
----------------------------------------

You're going to have to see if you can replicate this on the real ASF artifacts; 
otherwise, push it to your supplier of built Spark artifacts.

FWIW, problems with ":" in filenames and paths are known: see HADOOP-14829 and 
HADOOP-14217, for example. HDFS has never allowed ":" in names, so nobody even 
noticed that FileSystem.globStatus doesn't work with them.

While nobody actually wants to block a fix, it's not something which, AFAIK, 
anyone is actively working on. Patches welcome, *with tests*.
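For illustration, here is a rough Python sketch (not the actual Hadoop code, and simplified from it) of the scheme-detection rule in org.apache.hadoop.fs.Path: a ":" that appears before the first "/" is treated as a URI scheme separator. This is why a bare filename like myfile_2019:04:05.csv, encountered during glob expansion, gets rejected with an error along the lines of "Relative path in absolute URI", while a fully qualified s3a:// URI parses fine:

```python
def parse_hadoop_path(path_string):
    """Simplified sketch of Hadoop Path's scheme detection.

    A ':' occurring before the first '/' marks the end of a scheme;
    the remainder must then be an absolute path, mirroring the
    java.net.URI "Relative path in absolute URI" failure mode.
    """
    colon = path_string.find(":")
    slash = path_string.find("/")
    if colon != -1 and (slash == -1 or colon < slash):
        scheme, rest = path_string[:colon], path_string[colon + 1:]
        if not rest.startswith("/"):
            # What happens to a bare filename containing ':'
            raise ValueError("Relative path in absolute URI: " + path_string)
        return scheme, rest
    # No scheme detected; the whole string is a path
    return None, path_string


# A fully qualified URI parses: the first '/' comes right after "s3a:"
print(parse_hadoop_path("s3a://bucket/prefix/file.csv"))
# A colon in a bare filename is misread as a scheme separator
try:
    parse_hadoop_path("myfile_2019:04:05.csv")
except ValueError as e:
    print(e)
```

The actual parser handles authorities, Windows drive letters, and relative-path resolution, but the colon-before-slash rule above is the part that bites here.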

> Spark cannot load files with COLON(:) char if not specified full path
> ---------------------------------------------------------------------
>
>                 Key: SPARK-28092
>                 URL: https://issues.apache.org/jira/browse/SPARK-28092
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.4.3
>         Environment: Cloudera 6.2
> Spark latest parcel (I think 2.4.3)
>            Reporter: Ladislav Jech
>            Priority: Major
>
> Scenario:
> I have CSV files in S3 bucket like this:
> s3a://bucket/prefix/myfile_2019:04:05.csv
> s3a://bucket/prefix/myfile_2019:04:06.csv
> Now when I try to load files with something like:
> df = spark.read.load("s3://bucket/prefix/*", format="csv", sep=":", 
> inferSchema="true", header="true")
>  
> It fails with an error about the URI (sorry, I don't have the exact exception 
> here). But when I list all the files from S3 myself and provide the paths as an 
> array:
> df = 
> spark.read.load(path=["s3://bucket/prefix/myfile_2019:04:05.csv","s3://bucket/prefix/myfile_2019:04:05.csv"],
>  format="csv", sep=":", inferSchema="true", header="true")
>  
> it works. Per my observations, the cause is the COLON character in the file 
> names.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
