We are using Spark 0.7.3 compiled (and running) against Hadoop 2.0.0-mr1-cdh4.2.1.
When I read a sequence file in, I get a series of key-value pairs (specifically, the keys are longs and the values are byte arrays). When I use the Scala-based Scoobi library to parse each byte array into a protobuf message, I have no issues. However, when I try to parse the values in these same sequence files into the protobuf messages they were created from, I get the following:

com.google.protobuf.InvalidProtocolBufferException: Protocol message contained an invalid tag (zero)

Has anyone else experienced this before? Is there anything special that must be done when reading in the sequence files?
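For reference, this is roughly what we are doing (the path and the generated message class `MyMessage` are placeholders, not our real names; Spark 0.7.x classes live under the `spark` package rather than `org.apache.spark`):

```scala
import org.apache.hadoop.io.{BytesWritable, LongWritable}
import spark.SparkContext

val sc = new SparkContext("local", "proto-read")

// Read the sequence file as (long, byte array) pairs.
val rdd = sc.sequenceFile[LongWritable, BytesWritable]("/path/to/seqfile")

val parsed = rdd.map { case (_, value) =>
  // This parseFrom call is what throws InvalidProtocolBufferException.
  // Note: BytesWritable.getBytes returns the entire backing buffer, which
  // may be longer than value.getLength (trailing bytes are zero); we have
  // been passing it as-is, which may or may not be related to the error.
  MyMessage.parseFrom(value.getBytes)
}
```

One thing I am unsure about is whether the `getBytes` buffer needs to be truncated to `value.getLength` before parsing, since a protobuf tag byte of zero is exactly what padding zeros would look like to the parser.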
