[
https://issues.apache.org/jira/browse/THRIFT-765?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Bryan Duxbury updated THRIFT-765:
---------------------------------
Attachment: thrift-765-redux.patch
I had a look at Hadoop's RecordIO implementation, and this did the trick. After
adapting it to Thrift, I found that it was pretty slow, so I monkeyed around a
bit until I'd basically regained all of the encoding performance, though not
very much of the decoding performance. It's still faster, but I just can't seem
to shake the StringBuilder overhead even when doing nothing but ASCII strings.
All the tests pass. What do you guys think?
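Roughly, the decode loop in question looks like the sketch below (illustrative only, not the actual patch code; the method name decodeUtf8 is just for the example, and it assumes well-formed one- to three-byte input). The per-character StringBuilder.append is the overhead I mean, even on the pure-ASCII path:

    // Rough sketch of a readUTF/RecordIO-style manual decoder (not the patch code;
    // assumes well-formed one- to three-byte modified UTF-8 input, no validation).
    public static String decodeUtf8(byte[] buf, int offset, int length) {
      StringBuilder sb = new StringBuilder(length);
      int i = offset;
      int end = offset + length;
      while (i < end) {
        int b = buf[i] & 0xFF;
        if (b < 0x80) {
          // ASCII fast path: still one append call per character
          sb.append((char) b);
          i += 1;
        } else if ((b & 0xE0) == 0xC0) {
          // Two-byte sequence: 110xxxxx 10xxxxxx
          sb.append((char) (((b & 0x1F) << 6) | (buf[i + 1] & 0x3F)));
          i += 2;
        } else {
          // Three-byte sequence: 1110xxxx 10xxxxxx 10xxxxxx
          sb.append((char) (((b & 0x0F) << 12)
              | ((buf[i + 1] & 0x3F) << 6)
              | (buf[i + 2] & 0x3F)));
          i += 3;
        }
      }
      return sb.toString();
    }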
> Improved string encoding and decoding performance
> -------------------------------------------------
>
> Key: THRIFT-765
> URL: https://issues.apache.org/jira/browse/THRIFT-765
> Project: Thrift
> Issue Type: Improvement
> Components: Library (Java)
> Affects Versions: 0.2
> Reporter: Bryan Duxbury
> Assignee: Bryan Duxbury
> Fix For: 0.3
>
> Attachments: thrift-765-redux.patch, thrift-765.patch
>
>
> One of the most consistently time-consuming spots in Thrift serialization and
> deserialization is string encoding. For some inscrutable reason,
> String.getBytes("UTF-8") is slow.
> However, it has recently been brought to my attention that DataOutputStream's
> writeUTF method (and DataInputStream's readUTF) has a faster, hand-rolled
> implementation of UTF-8 encoding and decoding. We should use this style of
> encoding and decoding.
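
For reference, the writeUTF-style encode loop looks roughly like the sketch below (again illustrative only, not the patch; the method name encodeUtf8 is invented). The point is that it writes UTF-8 bytes straight out of the char array instead of going through String.getBytes("UTF-8") and the charset machinery behind it:

    // Rough sketch of a writeUTF-style manual encoder (not the patch code; like
    // writeUTF it emits modified UTF-8, so supplementary characters come out as
    // two three-byte surrogate encodings rather than one four-byte sequence).
    public static byte[] encodeUtf8(String s) {
      int strLen = s.length();
      byte[] buf = new byte[strLen * 3];  // worst case: three bytes per char
      int pos = 0;
      for (int i = 0; i < strLen; i++) {
        char c = s.charAt(i);
        if (c >= 0x0001 && c <= 0x007F) {
          // ASCII: one byte, no table lookups or charset objects
          buf[pos++] = (byte) c;
        } else if (c > 0x07FF) {
          // Three-byte sequence: 1110xxxx 10xxxxxx 10xxxxxx
          buf[pos++] = (byte) (0xE0 | ((c >> 12) & 0x0F));
          buf[pos++] = (byte) (0x80 | ((c >> 6) & 0x3F));
          buf[pos++] = (byte) (0x80 | (c & 0x3F));
        } else {
          // Two-byte sequence: 110xxxxx 10xxxxxx (includes U+0000, as writeUTF does)
          buf[pos++] = (byte) (0xC0 | ((c >> 6) & 0x1F));
          buf[pos++] = (byte) (0x80 | (c & 0x3F));
        }
      }
      byte[] out = new byte[pos];
      System.arraycopy(buf, 0, out, 0, pos);
      return out;
    }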
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.