Thanks XQ and Evan. I am going to try it out. Thanks for your suggestions.
Regards,
Sumit Desai
On Sat, Dec 23, 2023 at 12:16 AM Evan Galpin wrote:
I assume from the previous messages that GCP Dataflow is being used as the
pipeline runner. Even without Flex Templates, the v2 runner can use docker
containers to install all dependencies from various sources[1]. I have
used docker containers to solve the same problem you mention: installing a
private package.
You can use the same Docker image for both the template launcher and the
Dataflow job. Here is one example:
https://github.com/google/dataflow-ml-starter/blob/main/tensorflow_gpu.flex.Dockerfile#L60
On Fri, Dec 22, 2023 at 8:04 AM Sumit Desai wrote:
Yes, I will have to try it out.
Regards
Sumit Desai
On Fri, Dec 22, 2023 at 3:53 PM Sofia’s World wrote:
I guess so; I am not an expert on using env variables in Dataflow pipelines,
as any config dependencies I need, I pass as job input params.
But perhaps you can configure the variables in your Dockerfile (I am not an
expert in this either), since flex templates use Docker?
https://cloud.google.com
We are using an external non-public package which expects environmental
variables only. If the environmental variables are not found, it will throw
an error. We can't change the source of this package.
Does this mean we will face the same problem with flex templates also?
On Fri, 22 Dec 2023, 3:39 pm Sofia’s World wrote:
The flex template will allow you to pass input params with dynamic values
to your Dataflow job, so you could replace the env variable with that
input? That is, unless you have to have env vars... but from your snippets
it appears you are just using them to configure one of your components?
Hth
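A minimal sketch of Sofia's suggestion, assuming you can set the value before the dependent package is imported. All names here (`--otel-endpoint`, `OTEL_ENDPOINT`, the URL) are placeholders, not the real ones from the thread; in a Beam launcher you would typically declare the option on a `PipelineOptions` subclass, but plain `argparse` shows the shape:

```python
import argparse
import os

# Hypothetical sketch: accept the value as a job input parameter instead of
# a pre-set environment variable.
parser = argparse.ArgumentParser()
parser.add_argument("--otel-endpoint", required=True)

# In a real launcher these args come from sys.argv / the flex template;
# they are hard-coded here so the sketch is self-contained.
args, beam_args = parser.parse_known_args(
    ["--otel-endpoint", "https://collector.example.com",
     "--runner", "DirectRunner"]
)

# Export the value *before* importing any module that reads os.environ at
# import time, then build the pipeline with the remaining beam_args.
os.environ["OTEL_ENDPOINT"] = args.otel_endpoint
print(os.environ["OTEL_ENDPOINT"])
```

The unrecognized `--runner` flag is left in `beam_args`, which is what you would hand to `PipelineOptions`.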
Hi Sofia and XQ,
The application is failing because I have loggers defined in every file, and
the method that creates a logger tries to create an object of
UplightTelemetry. If I use flex templates, will the environmental variables
I supply be loaded before the application gets loaded? If not, it would not
solve my problem.
Dataflow VMs cannot know your local env variables. I think you should use a
custom container:
https://cloud.google.com/dataflow/docs/guides/using-custom-containers. Here
is a sample project: https://github.com/google/dataflow-ml-starter
On Wed, Dec 20, 2023 at 4:57 AM Sofia’s World wrote:
Hello Sumit
Thanks. Sorry... I guess if the value of the env variable is always the
same, you can pass it as a job param? ...though it doesn't sound like a
viable option...
Hth
On Wed, 20 Dec 2023, 09:49 Sumit Desai, wrote:
Hi Sofia,
Thanks for the response. For now, we have decided not to use flex template.
Is there a way to pass environmental variables without using any template?
Thanks & Regards,
Sumit Desai
On Wed, Dec 20, 2023 at 3:16 PM Sofia’s World wrote:
Hi
My 2 cents: have you ever considered using flex templates to run your
pipeline? Then you can pass all your parameters at runtime...
(Apologies in advance if it does not cover your use case...)
On Wed, 20 Dec 2023, 09:35 Sumit Desai via user,
wrote:
Hi all,
I have a Python application which uses Apache Beam with Dataflow as the
runner. The application uses a non-public Python package
'uplight-telemetry', which is configured using 'extra_packages' while
creating the pipeline_options object. This package expects an environmental
variable named 'OTEL_
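The setup described might look roughly like this; a sketch only, where the project, region, bucket, and package tarball path are placeholders, not values from the thread (`extra_packages` is the real Beam setup option the message names):

```python
from apache_beam.options.pipeline_options import PipelineOptions

# Sketch of the setup described above; all IDs and paths are placeholders.
pipeline_options = PipelineOptions(
    runner="DataflowRunner",
    project="my-project",
    region="us-central1",
    temp_location="gs://my-bucket/tmp",
    # Ship the non-public package to the workers. It is installed at worker
    # startup, but os.environ on the worker VMs will NOT contain env
    # variables from the launch machine, which is the problem in this thread.
    extra_packages=["dist/uplight_telemetry-1.0.0.tar.gz"],
)
```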