[
https://issues.apache.org/jira/browse/HADOOP-1986?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#action_12531916
]
Owen O'Malley commented on HADOOP-1986:
---------------------------------------
Actually, I'd probably set it up so that you could configure the list of
Serializers with something like:
{code}
<property>
  <name>hadoop.serializers</name>
  <value>org.apache.hadoop.io.WritableSerializer,org.apache.hadoop.io.ThriftSerializer</value>
</property>
{code}
and each serializer could also declare a target class:
{code}
public interface Serializer<T> {
  void serialize(T t, OutputStream out) throws IOException;
  void deserialize(T t, InputStream in) throws IOException;
  // Get the base class that this serializer will work on
  Class<T> getTargetClass();
}
{code}
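To make the proposal concrete, here is a minimal sketch of what an implementation of that interface could look like. It assumes a mutable target type (StringBuilder) so the object can be filled in place, matching the deserialize(T, InputStream) signature; StringBuilderSerializer and the length-prefixed encoding are illustrative choices, not part of any existing Hadoop class.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

// The interface as proposed above.
interface Serializer<T> {
    void serialize(T t, OutputStream out) throws IOException;
    void deserialize(T t, InputStream in) throws IOException;
    Class<T> getTargetClass();
}

// Hypothetical implementation: StringBuilder is mutable, so deserialize can
// reuse the caller's object instead of allocating a new one (the same pattern
// Writable uses today).
public class StringBuilderSerializer implements Serializer<StringBuilder> {

    public void serialize(StringBuilder t, OutputStream out) throws IOException {
        byte[] bytes = t.toString().getBytes(StandardCharsets.UTF_8);
        DataOutputStream dout = new DataOutputStream(out);
        dout.writeInt(bytes.length);   // length-prefixed encoding
        dout.write(bytes);
        dout.flush();
    }

    public void deserialize(StringBuilder t, InputStream in) throws IOException {
        DataInputStream din = new DataInputStream(in);
        byte[] bytes = new byte[din.readInt()];
        din.readFully(bytes);
        t.setLength(0);                // clear, then refill the caller's object
        t.append(new String(bytes, StandardCharsets.UTF_8));
    }

    public Class<StringBuilder> getTargetClass() {
        return StringBuilder.class;
    }

    public static void main(String[] args) throws IOException {
        StringBuilderSerializer s = new StringBuilderSerializer();
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        s.serialize(new StringBuilder("hello"), buf);
        StringBuilder out = new StringBuilder();
        s.deserialize(out, new ByteArrayInputStream(buf.toByteArray()));
        System.out.println(out);   // prints "hello"
    }
}
```

The getTargetClass() method is what would let the framework walk the configured hadoop.serializers list and pick the first serializer whose target class is assignable from the key or value class in use.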
> Add support for a general serialization mechanism for Map Reduce
> ----------------------------------------------------------------
>
> Key: HADOOP-1986
> URL: https://issues.apache.org/jira/browse/HADOOP-1986
> Project: Hadoop
> Issue Type: New Feature
> Components: mapred
> Reporter: Tom White
> Fix For: 0.16.0
>
>
> Currently Map Reduce programs have to use WritableComparable-Writable
> key-value pairs. While it's possible to write Writable wrappers for other
> serialization frameworks (such as Thrift), this is not very convenient: it
> would be nicer to be able to use arbitrary types directly, without explicit
> wrapping and unwrapping.