Hey, can you provide the full stack trace for the error you're seeing? Also
is this happening consistently?
*+1* to raising a Google ticket where we'll have more visibility.
On Wed, Dec 6, 2023 at 11:33 AM John Casey wrote:
Hmm. It may be best if you raise a ticket with Google support for this. I
can inspect your job directly if you do that, and that will make this more
straightforward.
On Wed, Dec 6, 2023 at 11:24 AM hsy...@gmail.com wrote:
Unfortunately, there's no way to leverage the existing cross-language
connector in Python.
Your options are somewhat limited, in my opinion.
Option 1 (My Recommendation): Implement a DoFn that checks your data
quality before sending it to KafkaIO. If it fails the quality check, send
it to some
I’m just using dataflow engine
On Wed, Dec 6, 2023 at 08:23 John Casey via user
wrote:
Ok John. But if I want to implement an alternative myself, what do you
recommend in order to get the message and send it to another target (you
said it is possible)? Keep in mind that we are using the Kafka connector,
which is a Java transform invoked from Python.
For the moment, yes.
On Wed, Dec 6, 2023 at 11:21 AM Juan Romero wrote:
Well, that is odd. It looks like the underlying client is closed, which is
unexpected.
Do you see any retries in your pipeline? Also, what runner are you using?
@Ahmed Abualsaud this might be interesting to
you too
On Tue, Dec 5, 2023 at 9:39 PM hsy...@gmail.com wrote:
> I'm using version
Thanks John. Is it the same case if I want to write to a Postgres table
with the SQL connector?
On Wed, Dec 6, 2023 at 11:05, John Casey wrote:
It is, but it's not possible to take an existing transform and simply
configure it to do this.
For example (and this is what I'm doing), it's possible to write a
transform that tries to write to Kafka and, upon failure, emits the
failure to an alternate PCollection.
It's not possible (yet)
But is it not possible to get the message that can't reach the target
sink and put it in another target (e.g. a Kafka error topic where we can
verify which messages failed to be delivered to the target)?
On Wed, Dec 6, 2023 at 10:40, John Casey via user wrote:
I'm currently implementing improvements on Kafka, File, Spanner, and
Bigtable IOs.
I'm planning on tackling PubSub and BQ next year.
All of this is still in progress though, so there aren't easy workarounds
for the moment.
On Tue, Dec 5, 2023 at 5:56 PM Robert Bradshaw wrote:
> Currently