Can you explain more about "current sinks for Avro and Parquet with
the destination of GCS are not supported"?
We do have AvroIO and ParquetIO (
https://beam.apache.org/documentation/io/connectors/) in Python.
On Wed, Mar 13, 2024 at 5:35 PM Ondřej Pánek wrote:
Hello Beam team!
We're currently onboarding a customer's infrastructure onto the Google Cloud
Platform. The decision was made that one of the technologies they will use is
Dataflow. Let me briefly describe the use case:
They have a Kafka cluster where data from a CDC source is stored. The data
> When I check the expansion service docker container, normally it
> downloads a JAR file and starts the SDK Fn Harness

To clarify the terminology here: I think you meant the Java SDK harness
container, not the expansion service. The expansion service is only needed
during job submission, and your failure is
Hello everyone,
I am working on the Dataflow Template for (imports to) Neo4j and we are
currently in the middle of revisiting the whole Beam pipeline logic.
The main concepts are:
- data sources: typically data from a SQL query or a text file
- import targets: a target is linked to a