[ https://issues.apache.org/jira/browse/HADOOP-4065?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12633822#action_12633822 ]
Doug Cutting commented on HADOOP-4065:
--------------------------------------

> my intention was to put this in contrib/serialization, but if there is
> objection, i can change the patch to contrib/hive.

+1 I'd rather not have contrib/serialization just become a grab-bag of io-related stuff. If this is needed by Hive only, then it belongs in contrib/hive. If we decide (subsequently, perhaps) that it has wide utility as a generic API for access to files in a variety of formats for a variety of applications, then perhaps it could be moved to mapred. But that doesn't yet sound like the consensus, so contrib/hive is probably best for now.

> support for reading binary data from flat files
> -----------------------------------------------
>
>                 Key: HADOOP-4065
>                 URL: https://issues.apache.org/jira/browse/HADOOP-4065
>             Project: Hadoop Core
>          Issue Type: Bug
>          Components: contrib/serialization, mapred
>            Reporter: Joydeep Sen Sarma
>         Attachments: FlatFileReader.java, HADOOP-4065.0.txt, HADOOP-4065.1.txt, HADOOP-4065.1.txt, ThriftFlatFile.java
>
>
> like textinputformat - looking for a concrete implementation to read binary records from a flat file (that may be compressed).
> it's assumed that hadoop can't split such a file, so the inputformat can set splittable to false.
> tricky aspects are:
> - how to know what class the file contains (it has to be in a configuration somewhere).
> - how to determine EOF (it would be nice if hadoop could determine EOF itself rather than have the deserializer throw an exception, which is hard to distinguish from an exception due to corruption). this is easy for non-compressed streams; for compressed streams, DecompressorStream has a useful-looking getAvailable() call, except that class is marked package private.

--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
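For context, here is a minimal sketch of the kind of non-splittable flat-file InputFormat the issue description asks for. This is not the patch attached to the issue: the class names, the use of the old org.apache.hadoop.mapred API, and the assumption that each binary record is preceded by a 4-byte length are all illustrative choices, not taken from HADOOP-4065 itself. It only shows the two points discussed above, i.e. turning off splitting and treating EOF on the length read as a clean end of stream.

    // Hypothetical sketch, not the attached patch. Assumes length-prefixed
    // binary records; the real patch frames records via a Deserializer instead.
    import java.io.DataInputStream;
    import java.io.EOFException;
    import java.io.IOException;

    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.BytesWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.compress.CompressionCodec;
    import org.apache.hadoop.io.compress.CompressionCodecFactory;
    import org.apache.hadoop.mapred.FileInputFormat;
    import org.apache.hadoop.mapred.FileSplit;
    import org.apache.hadoop.mapred.InputSplit;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.RecordReader;
    import org.apache.hadoop.mapred.Reporter;

    public class FlatFileInputFormat extends FileInputFormat<LongWritable, BytesWritable> {

      // Possibly-compressed flat files cannot be split, so never split at all.
      @Override
      protected boolean isSplitable(FileSystem fs, Path file) {
        return false;
      }

      @Override
      public RecordReader<LongWritable, BytesWritable> getRecordReader(
          InputSplit split, JobConf job, Reporter reporter) throws IOException {
        return new FlatFileRecordReader((FileSplit) split, job);
      }

      static class FlatFileRecordReader implements RecordReader<LongWritable, BytesWritable> {
        private final DataInputStream in;
        private long recordNumber = 0;

        FlatFileRecordReader(FileSplit split, JobConf job) throws IOException {
          Path path = split.getPath();
          FileSystem fs = path.getFileSystem(job);
          CompressionCodec codec = new CompressionCodecFactory(job).getCodec(path);
          if (codec != null) {
            // Decompressed stream: EOF is only visible as an EOFException on read.
            in = new DataInputStream(codec.createInputStream(fs.open(path)));
          } else {
            in = new DataInputStream(fs.open(path));
          }
        }

        public boolean next(LongWritable key, BytesWritable value) throws IOException {
          int length;
          try {
            length = in.readInt();      // 4-byte record length (assumed framing)
          } catch (EOFException eof) {
            return false;               // clean end of stream between records
          }
          byte[] bytes = new byte[length];
          in.readFully(bytes);          // EOF here means a truncated or corrupt record
          value.set(bytes, 0, length);
          key.set(recordNumber++);
          return true;
        }

        public LongWritable createKey() { return new LongWritable(); }
        public BytesWritable createValue() { return new BytesWritable(); }
        public long getPos() { return recordNumber; }
        // Progress through a compressed stream is hard to estimate without
        // counting raw bytes, so this sketch simply reports none.
        public float getProgress() { return 0.0f; }
        public void close() throws IOException { in.close(); }
      }
    }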