hadoop::RecordReader::read() throws exception in HadoopPipes::RecordWriter
--------------------------------------------------------------------------
Key: HADOOP-2360
URL: https://issues.apache.org/jira/browse/HADOOP-2360
Project: Hadoop
Issue Type: Bug
Affects Versions: 0.14.3
Reporter: Yiping Han
Priority: Blocker
The Jute record is in the following format:
    class SampleValue
    {
        ustring data;
    }
And our HadoopPipes::RecordWriter::emit() implementation contains code like this:
    void SampleRecordWriterC::emit(const std::string& key, const std::string& value)
    {
        if (key.empty() || value.empty()) {
            return;
        }

        hadoop::StringInStream key_in_stream(const_cast<std::string&>(key));
        hadoop::RecordReader key_record_reader(key_in_stream, hadoop::kCSV);
        EmitKeyT emit_key;
        key_record_reader.read(emit_key);

        hadoop::StringInStream value_in_stream(const_cast<std::string&>(value));
        hadoop::RecordReader value_record_reader(value_in_stream, hadoop::kCSV);
        EmitValueT emit_value;
        value_record_reader.read(emit_value);
    }
This code throws hadoop::IOException at the read() calls.
In the mapper, I emit a fake record with the following code:
    std::string value;
    EmitValueT emit_value;
    emit_value.getData().assign("FakeData");

    hadoop::StringOutStream value_out_stream(value);
    hadoop::RecordWriter value_record_writer(value_out_stream, hadoop::kCSV);
    value_record_writer.write(emit_value);
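One way to narrow this down is to round-trip the serialized bytes in the same process before involving the Pipes framework: if an immediate write-then-read succeeds, the bytes handed to emit() are likely not in the CSV format RecordReader expects. The sketch below is a toy stand-in for that pattern, not the real Jute archive; the framing marker and function names are hypothetical, chosen only to illustrate how a reader that validates its input rejects a string that was never serialized.

    #include <stdexcept>
    #include <string>

    // Toy stand-in for a Jute-style CSV archive. The leading marker is a
    // hypothetical framing byte, NOT the real Jute wire format.
    std::string toySerialize(const std::string& data) {
        return "'" + data;  // writer prepends the framing marker
    }

    // Parser requires the marker and throws on malformed input, much as
    // hadoop::RecordReader::read() throws hadoop::IOException when the
    // buffer was not produced by the matching writer.
    std::string toyParse(const std::string& buf) {
        if (buf.empty() || buf[0] != '\'') {
            throw std::runtime_error("malformed record");
        }
        return buf.substr(1);
    }

With this pattern, toyParse(toySerialize("FakeData")) succeeds, while toyParse("FakeData") throws, which mirrors the reported symptom if emit() is receiving raw bytes rather than CSV-serialized records.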
We haven't updated to the latest version of Hadoop, but I've searched the existing
tickets and didn't find one reporting this problem.
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.