I answered on SO, Kelsey.
I believe you should be able to add this to explicitly declare the coder to
use:
// Register an explicit coder for HCatRecord; WritableCoder wraps the
// Hadoop Writable serialization of DefaultHCatRecord.
p.getCoderRegistry()
    .registerCoderForClass(HCatRecord.class,
        WritableCoder.of(DefaultHCatRecord.class));
On Fri, Jul 20, 2018 at 5:05 PM, Kelsey RIDER wrote:
> Hello,
>
>
>
> I wrote
I looked into PaneInfo, but unfortunately it doesn't contain any information
about the session window that would be useful to me (at least for tagging all
the events belonging to the same session with a sessionID). :(
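The closest workaround I can think of is deriving a sessionID from the merged
window itself after the GroupByKey, by reading the window in a downstream
DoFn. A rough sketch (Event/TaggedEvent and the gap duration are made up):

PCollection<KV<String, Iterable<Event>>> sessions =
    events  // PCollection<KV<String, Event>>, keyed e.g. by user
        .apply(Window.<KV<String, Event>>into(
            Sessions.withGapDuration(Duration.standardMinutes(10))))
        .apply(GroupByKey.create());

sessions.apply(ParDo.of(new DoFn<KV<String, Iterable<Event>>, TaggedEvent>() {
  @ProcessElement
  public void process(ProcessContext c, BoundedWindow window) {
    // After merging, the window spans the whole session, so its bounds
    // (plus the key) can serve as the session's ID.
    IntervalWindow session = (IntervalWindow) window;
    String sessionId =
        c.element().getKey() + "-" + session.start() + "-" + session.end();
    for (Event e : c.element().getValue()) {
      c.output(new TaggedEvent(sessionId, e));
    }
  }
}));

That only attaches the ID after the session has been grouped, though, which is
part of why support from the SDK would still help.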
I have no idea about Beam internals or how complex it would be to implement,
but it would
On Fri, Jul 20, 2018 at 2:58 AM Jozef Vilcek wrote:
> Hm, that is an interesting idea, to make the write composite and merge the
> files later. I don't know Beam well yet.
> I will look into it and learn about the Wait.on() transform (I wonder how it
> will work with late firings). Thanks!
>
> But keeps me thinki
Hello,
I wrote it all in this SO post:
https://stackoverflow.com/questions/51443966/simple-hive-write-not-working
Since then, I've started looking at writing my own Coder to handle the
serialization of HCatRecords. But shouldn't this be included?
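For reference, a minimal coder along those lines would look roughly like this
(just a sketch delegating to the Writable serialization; the class name and
the assumption that every record is a DefaultHCatRecord are mine):

public class HCatRecordCoder extends CustomCoder<HCatRecord> {
  @Override
  public void encode(HCatRecord value, OutputStream outStream) throws IOException {
    // Assumes the concrete type is DefaultHCatRecord, which is a Hadoop
    // Writable, so encoding can delegate to write().
    ((DefaultHCatRecord) value).write(new DataOutputStream(outStream));
  }

  @Override
  public HCatRecord decode(InputStream inStream) throws IOException {
    DefaultHCatRecord record = new DefaultHCatRecord();
    record.readFields(new DataInputStream(inStream));
    return record;
  }
}

It could then be registered with
p.getCoderRegistry().registerCoderForClass(HCatRecord.class, new HCatRecordCoder());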
Following the evolution of the settlement arrangements
Alice,
Totally agree with what Lukasz said.
Also, as an alternative solution for job testing, I can suggest installing
Flink locally and running the Beam pipeline with a single CLI command like
“bin/flink run -c <main-class> /path/to/jar --runner=FlinkRunner”.
> On 18 Jul 2018, at 17:09, Lukasz Cwik wrote:
>
> Y
Hm, that is an interesting idea, to make the write composite and merge the
files later. I don't know Beam well yet.
I will look into it and learn about the Wait.on() transform (I wonder how it
will work with late firings). Thanks!
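If I read the docs right, the shape would be something like this (just a
sketch to check my understanding; `lines`, `mergeTriggers` and MergeShardsFn
are made-up names):

WriteFilesResult<Void> written =
    lines.apply(FileIO.<String>write()
        .via(TextIO.sink())
        .to("/data/out")
        .withNumShards(16));

// The names of the shard files the write produced, per window.
PCollection<KV<Void, String>> shardNames =
    written.getPerDestinationOutputFilenames();

// Wait.on releases a window of `mergeTriggers` (hypothetical: one element per
// window naming the final merged file) only after the writes for that window
// are done, so the merge DoFn can safely concatenate the shards.
mergeTriggers
    .apply(Wait.on(shardNames))
    .apply(ParDo.of(new MergeShardsFn()));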
But it keeps me thinking...
Does it make sense to have support for this in the SDK?
Is my use case