[ https://issues.apache.org/jira/browse/SPARK-2421?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14057820#comment-14057820 ]

Bertrand Dechoux commented on SPARK-2421:
-----------------------------------------

Actually, Hadoop doesn't even require keys to be Writable; only the default
configuration makes it seem so. In truth, Hadoop has a pluggable serialization
mechanism:
http://hadoop.apache.org/docs/r2.3.0/api/org/apache/hadoop/io/serializer/package-summary.html

Ideally, Spark should be just as flexible.
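
To make that concrete, here is a minimal sketch of the mechanism in Scala,
using only classes from the package linked above (SerializationFactory,
WritableSerialization, JavaSerialization). The config value shown is an
illustration of a typical setup, not something Spark does today:

{code:scala}
import java.io.{ByteArrayInputStream, ByteArrayOutputStream}

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.io.Text
import org.apache.hadoop.io.serializer.SerializationFactory

object PluggableSerializationSketch {
  def main(args: Array[String]): Unit = {
    val conf = new Configuration()
    // "io.serializations" lists the pluggable Serialization implementations;
    // WritableSerialization is merely the first entry of the default value.
    conf.setStrings("io.serializations",
      "org.apache.hadoop.io.serializer.WritableSerialization",
      "org.apache.hadoop.io.serializer.JavaSerialization")

    val factory = new SerializationFactory(conf)

    // Round-trip a Text key through whichever Serialization accepts it.
    val serializer = factory.getSerializer(classOf[Text])
    val bytes = new ByteArrayOutputStream()
    serializer.open(bytes)
    serializer.serialize(new Text("hello"))
    serializer.close()

    val deserializer = factory.getDeserializer(classOf[Text])
    deserializer.open(new ByteArrayInputStream(bytes.toByteArray))
    println(deserializer.deserialize(null)) // prints "hello"
  }
}
{code}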

> Spark should treat writable as serializable for keys
> ----------------------------------------------------
>
>                 Key: SPARK-2421
>                 URL: https://issues.apache.org/jira/browse/SPARK-2421
>             Project: Spark
>          Issue Type: Improvement
>          Components: Input/Output, Java API
>    Affects Versions: 1.0.0
>            Reporter: Xuefu Zhang
>
> It seems that Spark requires keys to be serializable (i.e. the class must 
> implement the Serializable interface). In the Hadoop world, the Writable 
> interface is used for the same purpose. A lot of existing classes, while 
> Writable, are not considered by Spark to be serializable. It would be nice if 
> Spark could treat Writable as serializable and automatically serialize and 
> de-serialize these classes using the Writable interface.
> This was identified in HIVE-7279, but its benefits are global.
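
(For illustration only: one way to get that behavior is a Serializable
envelope that delegates the payload to the Writable's own encoding. The
wrapper below is a sketch, not existing Spark API, and it assumes the wrapped
Writable has a no-arg constructor, as Writables conventionally do.)

{code:scala}
import java.io.{ObjectInputStream, ObjectOutputStream}

import org.apache.hadoop.io.Writable

// Hypothetical wrapper: lets any Writable pass through Java serialization
// by delegating the payload to the Writable's own write/readFields format.
class SerializableWritableSketch[T <: Writable](@transient var value: T)
  extends Serializable {

  private def writeObject(out: ObjectOutputStream): Unit = {
    out.writeObject(value.getClass)  // remember the concrete Writable class
    value.write(out)                 // ObjectOutputStream is a DataOutput
  }

  private def readObject(in: ObjectInputStream): Unit = {
    val cls = in.readObject().asInstanceOf[Class[T]]
    value = cls.getDeclaredConstructor().newInstance()  // needs a no-arg ctor
    value.readFields(in)             // ObjectInputStream is a DataInput
  }
}
{code}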



--
This message was sent by Atlassian JIRA
(v6.2#6252)
