l/12708 which will be available in
> the 2.25.0 release (this release is currently underway).
>
> On Mon, Oct 5, 2020 at 11:28 AM Praveen K Viswanathan <
> harish.prav...@gmail.com> wrote:
>
>> Thanks for sharing the tool Tomo. I will give it a try and let you know.
>>
>
it
> a try?
>
> https://github.com/GoogleCloudPlatform/cloud-opensource-java/wiki/Linkage-Checker-Enforcer-Rule
>
> Regards,
> Tomo
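
The wiki linked above describes wiring the Linkage Checker into the Maven enforcer plugin. As a hedged sketch from memory (the plugin and rule versions below are placeholders, and the `implementation` class is an assumption; verify both against the wiki):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <version>3.0.0-M3</version>
  <dependencies>
    <!-- Version is a placeholder; check the wiki for the current release. -->
    <dependency>
      <groupId>com.google.cloud.tools</groupId>
      <artifactId>linkage-checker-enforcer-rules</artifactId>
      <version>1.5.0</version>
    </dependency>
  </dependencies>
  <executions>
    <execution>
      <id>enforce-linkage-checker</id>
      <goals><goal>enforce</goal></goals>
      <configuration>
        <rules>
          <LinkageCheckerRule
              implementation="com.google.cloud.tools.dependencies.enforcer.LinkageCheckerRule"/>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>
```
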
>
> On Fri, Oct 2, 2020 at 21:34 Praveen K Viswanathan <
> harish.prav...@gmail.com> wrote:
>
>> Hi - We have a beam pi
ollectionView
> 2020-10-03 00:42:31,115 INFO
> org.apache.beam.runners.flink.FlinkStreamingPipelineTranslator - |
> leaveCompositeTransform- View.AsList
> 2020-10-03 00:42:31,116 INFO
> org.apache.beam.runners.flink.FlinkStreamingPipelineTranslator - |
> enterCompositeTransform-
a profile for flink-runner as shown below.
>
> <profile>
>   <id>flink-runner</id>
>   <dependencies>
>     <dependency>
>       <groupId>org.apache.beam</groupId>
>       <artifactId>beam-runners-flink-1.10</artifactId>
>       <version>2.21.0</version>
>     </dependency>
>   </dependencies>
> </profile>
>
And the Docker image uses the Flink version below:
FROM flink:1.10.0-scala_2.12
Both our pipeline and our SDF-based IO connector are on Beam 2.23.0.
We would appreciate any guidance on what is causing this exception.
--
Thanks,
Praveen K Viswanathan
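
One thing worth checking: the flink-runner profile pins beam-runners-flink-1.10 at 2.21.0 while the pipeline and connector are on 2.23.0, and mixed Beam versions are a common source of runner translation exceptions. A hedged sketch of keeping the runner version in lock-step with the SDK via a shared property (the property name is illustrative, not from the original pom):

```xml
<properties>
  <beam.version>2.23.0</beam.version>
</properties>

<profile>
  <id>flink-runner</id>
  <dependencies>
    <dependency>
      <groupId>org.apache.beam</groupId>
      <artifactId>beam-runners-flink-1.10</artifactId>
      <version>${beam.version}</version>
    </dependency>
  </dependencies>
</profile>
```
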
> // write PCollection> to stream
>>
>> .apply(MyIO.write()
>>
>> .withStream(outputStream)
>>
>> .withConsumerConfig(config));
>>
>>
>>
>>
>>
>> Without the window transform, we can read from the stream and write to
>> it, however, I don’t see output after the Window transform. Could you
>> please help pin down the issue?
>>
>> Thank you,
>>
>> Gaurav
>>
>
--
Thanks,
Praveen K Viswanathan
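On the windowing question above: in Beam, a GroupByKey (or a sink that contains one) downstream of Window.into(...) only emits when the window's trigger fires, by default when the watermark passes the end of the window, so a stalled watermark shows up as "no output after the Window transform". A self-contained plain-Java sketch of fixed-window assignment (not Beam's API; class and method names are illustrative) for reasoning about when a window can close:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

public class FixedWindows {
    // Map a timestamp (ms) to the start of its fixed window of the given size.
    static long windowStart(long timestampMs, long windowSizeMs) {
        return timestampMs - (timestampMs % windowSizeMs);
    }

    // Bucket (timestamp -> value) events into fixed windows, the way
    // Window.into(FixedWindows.of(...)) + GroupByKey conceptually would.
    // Each window only "closes" once the watermark passes windowStart + size.
    static Map<Long, List<String>> groupIntoWindows(Map<Long, String> events, long windowSizeMs) {
        Map<Long, List<String>> windows = new TreeMap<>();
        for (Map.Entry<Long, String> e : events.entrySet()) {
            windows.computeIfAbsent(windowStart(e.getKey(), windowSizeMs), k -> new ArrayList<>())
                   .add(e.getValue());
        }
        return windows;
    }
}
```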
No worries, the error was due to the DoFn element not being defined as an
instance of SchemaCoder.
On Tue, Jul 7, 2020 at 10:46 PM Praveen K Viswanathan <
harish.prav...@gmail.com> wrote:
> Thanks Luke. I changed the pipeline structure as shown below. I could see
> that this flow would
(Decider) -outA-> PCollection -> ParDo(DoFnA)
>          \outB-> PCollection -> ParDo(DoFnB)
>
> See[1] for how to create a DoFn with multiple outputs.
>
> 1:
> https://beam.apache.org/documentation/pipelines/design-your-pipeline/#a-single
ching case.
--
Thanks,
Praveen K Viswanathan
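The branch sketched above (one Decider ParDo with outputs outA and outB) is built in Beam with TupleTags and a multi-output DoFn, per the linked design page. A plain-Java sketch of just the routing logic (class and tag names are illustrative, not Beam's API):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Predicate;

public class Decider {
    // Route each element to one of two outputs, the way a multi-output DoFn
    // sends elements to TupleTags outA / outB via MultiOutputReceiver.
    static Map<String, List<Integer>> route(List<Integer> input, Predicate<Integer> toA) {
        Map<String, List<Integer>> out = new HashMap<>();
        out.put("outA", new ArrayList<>());
        out.put("outB", new ArrayList<>());
        for (int x : input) {
            out.get(toA.test(x) ? "outA" : "outB").add(x);
        }
        return out;
    }
}
```

Downstream, DoFnA would consume the "outA" collection and DoFnB the "outB" one.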
> 1: https://beam.apache.org/documentation/programming-guide/#side-inputs
> 2: https://beam.apache.org/documentation/patterns/side-inputs/
>
> On Sun, Jun 28, 2020 at 6:24 PM Praveen K Viswanathan <
> harish.prav...@gmail.com> wrote:
>
>>
>> Hi All - I am facing an is
@ProcessElement
public void processElement(@Element Map>> input, OutputReceiver out) {
  Log.of("UpdateFn " + input);
  out.output(new CustomObject());
}
}
--
Thanks,
Praveen K Viswanathan
parallelism
> during processing (for small pipelines this likely won't matter).
>
> On Fri, Jun 26, 2020 at 7:29 AM Praveen K Viswanathan <
> harish.prav...@gmail.com> wrote:
>
>> Hi All - I have a DoFn which generates data (KV pair) for each element
>> that it i
e value from KV
if there is a match in the key
--
Thanks,
Praveen K Viswanathan
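The lookup described above (emit the value from a KV when its key matches) is the side-input join pattern from the linked docs: the lookup map is broadcast as a side input and probed per element. A plain-Java sketch of the per-element logic (names are illustrative, not Beam's API):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class SideInputJoin {
    // For each key in the main input, emit its value from the lookup map
    // (the "side input") when the key is present; skip unmatched keys.
    static Map<String, String> join(List<String> mainInput, Map<String, String> sideInput) {
        Map<String, String> out = new LinkedHashMap<>();
        for (String key : mainInput) {
            String value = sideInput.get(key);
            if (value != null) {
                out.put(key, value);
            }
        }
        return out;
    }
}
```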
sasl.jaas.config",
    "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"user\" password=\"pwd\";";
))
.withConsumerConfigUpdates(ImmutableMap.of(
    ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, true
))
--
Thanks,
Praveen K Viswanathan
n / Transform instance multiple times in the
> graph or you can follow regular development practices where the common code
> is factored into a method and two different DoFns invoke it.
>
> On Tue, Jun 23, 2020 at 4:28 PM Praveen K Viswanathan <
> harish.prav...@gmail.com&g
create a cycle though.
>
> On Mon, Jun 22, 2020 at 6:02 PM Praveen K Viswanathan <
> harish.prav...@gmail.com> wrote:
>
>> Another way to put this question is, how do we write a beam pipeline for
>> an existing pipeline (in Java) that has a dozen custom obje
objects, getters and setters and HashMap *but inside
a DoFn*. Is this the optimal way or does Beam offer something else?
On Mon, Jun 22, 2020 at 3:47 PM Praveen K Viswanathan <
harish.prav...@gmail.com> wrote:
> Hi Luke,
>
> We can say Map 2 as a kind of a template using which you
, 2020 at 8:23 AM Luke Cwik wrote:
> Who reads map 1?
> Can it be stale?
>
> It is unclear what you are trying to do in parallel and why you wouldn't
> stick all this logic into a single DoFn / stateful DoFn.
>
> On Sat, Jun 20, 2020 at 7:14 PM Praveen K Viswanathan <
>
.
These are my initial thoughts/questions and I would like to get some expert
advice on how we typically design such an interleaved transformation in
Apache Beam. Appreciate your valuable insights on this.
--
Thanks,
Praveen K Viswanathan
/transforms/java/aggregation/groupintobatches/
>
> On Fri, Jun 19, 2020 at 8:20 AM Brian Hulette wrote:
>
>> Kenn wrote a blog post showing how to do batched RPCs with the state and
>> timer APIs: https://beam.apache.org/blog/timely-processing/
>>
>> Is that helpful?
>>
> Is that helpful?
>
> Brian
>
> On Thu, Jun 18, 2020 at 5:29 PM Praveen K Viswanathan <
> harish.prav...@gmail.com> wrote:
>
>> Hello Everyone,
>>
>> In my pipeline I have to make a *single RPC call* as well as a *Batched
>> RPC call* to fetch data for e
and could share
sample code or details on how to do this.
--
Thanks,
Praveen K Viswanathan
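For the batched-RPC half of the question, the timely-processing post linked earlier buffers elements in state and flushes on a count threshold or a timer. A plain-Java sketch of the size-based flush (the RPC is stubbed as a Consumer, an assumption for illustration; not Beam's state/timer API):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class BatchedRpc {
    // Buffer elements and flush each full batch through the stubbed RPC,
    // mirroring the buffer-in-state, flush-on-threshold pattern.
    static List<List<String>> process(List<String> elements, int batchSize, Consumer<List<String>> rpc) {
        List<List<String>> batches = new ArrayList<>();
        List<String> buffer = new ArrayList<>();
        for (String e : elements) {
            buffer.add(e);
            if (buffer.size() >= batchSize) {
                List<String> batch = List.copyOf(buffer);
                rpc.accept(batch);
                batches.add(batch);
                buffer.clear();
            }
        }
        if (!buffer.isEmpty()) {  // final flush, as an event-time timer would do for a partial batch
            List<String> batch = List.copyOf(buffer);
            rpc.accept(batch);
            batches.add(batch);
        }
        return batches;
    }
}
```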
--
Thanks,
Praveen K Viswanathan