[ https://issues.apache.org/jira/browse/SPARK-20528?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16199786#comment-16199786 ]

Hyukjin Kwon commented on SPARK-20528:
--------------------------------------

I know the Scala one reads it as a stream ({{PortableDataStream}}) while Python reads 
it as bytes (namely {{str}} in Python 2 and {{bytes}} in Python 3), but I think 
the actual code can be quite similar, as below:

Scala:

{code}
val binaryFilesRDD = sc.binaryFiles("README.md").map { x => (x._1, x._2.toArray) }
spark.createDataFrame(binaryFilesRDD)
{code}

Python:

{code}
binaryFilesRDD = sc.binaryFiles("README.md").map(lambda x: (x[0], bytearray(x[1])))
spark.createDataFrame(binaryFilesRDD).collect()
{code}
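
If column names matter, both languages can also end up with the same schema by passing explicit names to {{createDataFrame}}. A minimal sketch in Python ({{path}} and {{content}} are just illustrative names, not part of any existing API):

{code}
# Read each file as (path, contents) and name the columns explicitly.
binaryFilesRDD = sc.binaryFiles("README.md").map(lambda x: (x[0], bytearray(x[1])))
binaryFilesDF = spark.createDataFrame(binaryFilesRDD, ["path", "content"])

binaryFilesDF.printSchema()
# root
#  |-- path: string (nullable = true)
#  |-- content: binary (nullable = true)
{code}

On the Scala side, {{.toDF("path", "content")}} (with {{spark.implicits._}} imported) gives the same column names, so the only per-language difference left is converting the {{PortableDataStream}} to bytes.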

> Add BinaryFileReader and Writer for DataFrames
> ----------------------------------------------
>
>                 Key: SPARK-20528
>                 URL: https://issues.apache.org/jira/browse/SPARK-20528
>             Project: Spark
>          Issue Type: New Feature
>          Components: SQL
>    Affects Versions: 2.2.0
>            Reporter: Joseph K. Bradley
>
> It would be very useful to have a binary data reader/writer for DataFrames, 
> presumably called via {{spark.read.binaryFiles}}, etc.
> Currently, going through RDDs is annoying since it requires different code 
> paths for Scala vs Python:
> Scala:
> {code}
> val binaryFilesRDD = sc.binaryFiles("mypath")
> val binaryFilesDF = spark.createDataFrame(binaryFilesRDD)
> {code}
> Python:
> {code}
> binaryFilesRDD = sc.binaryFiles("mypath")
> binaryFilesRDD_recast = binaryFilesRDD.map(lambda x: (x[0], bytearray(x[1])))
> binaryFilesDF = spark.createDataFrame(binaryFilesRDD_recast)
> {code}
> This is because Scala and Python {{sc.binaryFiles}} return different types, 
> which makes sense in RDD land but not DataFrame land.
> My motivation here is working with images in Spark.


