[ 
https://issues.apache.org/jira/browse/SPARK-12080?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sean Owen updated SPARK-12080:
------------------------------
    Affects Version/s:     (was: 1.6.1)
                       1.5.2

> Kryo - Support multiple user registrators
> -----------------------------------------
>
>                 Key: SPARK-12080
>                 URL: https://issues.apache.org/jira/browse/SPARK-12080
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.5.2
>            Reporter: Rotem
>            Priority: Minor
>              Labels: kryo, registrator, serializers
>   Original Estimate: 72h
>  Remaining Estimate: 72h
>
> Background: when users need a custom serializer for their registered 
> classes, they supply a Kryo user registrator through the 
> spark.kryo.registrator configuration parameter.
> Problem: if the Spark user is itself an infrastructure layer, it may 
> receive multiple such registrators from its own clients but has no way to 
> register more than one.
> Important note: the single registrator currently supported cannot access 
> any state or configuration, because it is instantiated by reflection 
> through its no-arg constructor; reading SparkEnv from user code is not 
> acceptable.
> Workaround:
> as a user, create a single wrapper registrator whose implementation scans 
> the classpath for the other registrator classes.
> Caveat: this is inefficient and overly complicated.
> Suggested solution - support multiple registrators while staying backward 
> compatible.
> Option 1:
> enhance the value of spark.kryo.registrator to accept a comma-separated 
> list of class names. This is backward compatible and adds no new 
> parameters.
> Option 2:
> more explicitly, add a new spark.kryo.registrators parameter while keeping 
> the code that handles the old one.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
