This is maybe not the nicest implementation, since it feels way too
complicated, but it is the only one I have found. Check out encode(),
starting at line 95.
Note that the example encodes data using Confluent's schema registry format
(i.e. 5 extra bytes of header) and does a double copy - I have not found a
way to get rid of that.

https://github.com/bitbouncer/kspp/blob/master/include/kspp/avro/avro_serdes.h
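
In outline, the encode path there amounts to something like this (a
condensed sketch rather than a verbatim copy of the file; the RecordT
type and the schema id are placeholders for whatever your generated
code and registry provide):

#include <avro/Encoder.hh>
#include <avro/Specific.hh>
#include <avro/Stream.hh>

#include <cstdint>
#include <memory>
#include <vector>

// Encode one record with the Confluent schema-registry framing:
// 1 magic byte (0) plus a 4-byte big-endian schema id, followed by
// the Avro binary encoding of the record itself.
template <class RecordT>
std::vector<uint8_t> encode(int32_t schema_id, const RecordT& record) {
    std::unique_ptr<avro::OutputStream> out = avro::memoryOutputStream();
    avro::EncoderPtr e = avro::binaryEncoder();
    e->init(*out);
    avro::encode(*e, record);  // first copy: record -> memory stream
    e->flush();

    std::vector<uint8_t> buf;
    buf.push_back(0);                         // magic byte
    buf.push_back((schema_id >> 24) & 0xFF);  // schema id, big-endian
    buf.push_back((schema_id >> 16) & 0xFF);
    buf.push_back((schema_id >> 8) & 0xFF);
    buf.push_back(schema_id & 0xFF);

    // Second copy: drain the chunked memory stream into one buffer.
    std::unique_ptr<avro::InputStream> in = avro::memoryInputStream(*out);
    const uint8_t* chunk = nullptr;
    size_t len = 0;
    while (in->next(&chunk, &len))
        buf.insert(buf.end(), chunk, chunk + len);
    return buf;
}

The second copy is the one I could not get rid of: the encoder writes into
avro's own chunked buffers, and the bytes have to be copied out once more to
hand the producer a contiguous payload.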

/svante


On Fri, 12 July 2019 at 14:42, steinio <s...@datarespons.no> wrote:

> I am trying to serialize some data, defined by a .json schema, and I
> would like to send the data that DataFileWriter writes.
> DataFileWriter takes a file name and writes the data to a binary file.
> I can get around this by reading the file back into a string and sending
> the string over the stream (kafka.producer).
> This is not really a viable solution for a high-speed producer application,
> and looking at DataFileWriter, it looks like it should also be able to
> take a std::unique_ptr<OutputStream> instead of a file name and write to
> a stream.
> But this gives an error when trying to build the application.
>
> error C2280: 'std::unique_ptr<avro::OutputStream,std::default_delete<_Ty>>::unique_ptr(const std::unique_ptr<_Ty,std::default_delete<_Ty>> &)':
> attempting to reference a deleted function
>         with
>         [
>             _Ty=avro::OutputStream
>         ]
> c:\program files (x86)\microsoft visual studio 14.0\vc\include\memory(1435):
> note: see declaration of 'std::unique_ptr<avro::OutputStream,std::default_delete<_Ty>>::unique_ptr'
>         with
>         [
>             _Ty=avro::OutputStream
>         ]
>
> The error, I guess, is that I am trying to pass a std::unique_ptr as an
> argument, which is not possible since it cannot be copied; I should
> rather pass std::move(myUniquePtr) as the argument instead.
> But this gives me another error:
>
> error C2664: 'avro::DataFileWriter<c::ProcessMsg>::DataFileWriter(const
> avro::DataFileWriter<c::ProcessMsg> &)': cannot convert argument 1 from
> 'std::shared_ptr<avro::OutputStream>' to 'const char *'
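>
> If I read the error correctly, the constructor never even sees a
> std::unique_ptr, because I stored the stream in a std::shared_ptr. What I
> assume should compile instead (just a guess on my part) is:
>
> std::unique_ptr<avro::OutputStream> out = avro::memoryOutputStream();
> avro::DataFileWriter<c::ProcessMsg> dfw(std::move(out), dataSchm);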
>
> There are no examples of how to send data as an object container file that
> includes the header data, so I am just trying and failing here. Is there a
> "correct" way of doing this?
> I see that this is really easy to do in the Java library; it is just:
>
> ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
> DatumWriter<GenericRecord> writer = new GenericDatumWriter<>(schema);
> DataFileWriter<GenericRecord> dataFileWriter = new DataFileWriter<>(writer);
> dataFileWriter.create(schema, outputStream);
>
> dataFileWriter.append(data);
> dataFileWriter.close();
>
> What I have done so far is this:
>
> std::ifstream ifs("data.json");
> avro::ValidSchema dataSchm;
> avro::compileJsonSchema(ifs, dataSchm);
> const char* file = "data.bin";
> std::shared_ptr<avro::OutputStream> out = avro::memoryOutputStream(); // unused; this is what I want to write to
> avro::DataFileWriter<c::ProcessMsg> dfw(file, dataSchm);
> dfw.write(data);
> dfw.close();
> std::ifstream bin(file, std::ios::binary); // read the file straight back in
> std::string str((std::istreambuf_iterator<char>(bin)),
>                 std::istreambuf_iterator<char>());
> builder.payload(str);
> producer.produce(builder);
>
> How can I avoid having to write this to a file, and instead just write the
> binary data directly to an output stream that I can encode and send?
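>
> What I am hoping for is something along these lines (pure guesswork on my
> part, assuming the DataFileWriter constructor that takes a
> std::unique_ptr<avro::OutputStream>, and that avro::memoryInputStream can
> read back what was written to the memory stream):
>
> // Write the container file into memory instead of to disk.
> std::unique_ptr<avro::OutputStream> out = avro::memoryOutputStream();
> avro::OutputStream* raw = out.get(); // keep a handle; dfw takes ownership
> avro::DataFileWriter<c::ProcessMsg> dfw(std::move(out), dataSchm);
> dfw.write(data);
> dfw.flush();
>
> // Copy the bytes out of the memory stream into the Kafka payload.
> std::unique_ptr<avro::InputStream> in = avro::memoryInputStream(*raw);
> const uint8_t* chunk = nullptr;
> size_t len = 0;
> std::string payload;
> while (in->next(&chunk, &len))
>     payload.append(reinterpret_cast<const char*>(chunk), len);
> builder.payload(payload);
> producer.produce(builder);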
>
>
>
> --
> Sent from: http://apache-avro.679487.n3.nabble.com/Avro-Users-f679479.html
>
