Hi Eila,

It looks like you're setting the option on the GoogleCloudOptions
class directly. I think you want to set it on an instance of
PipelineOptions that you've viewed as GoogleCloudOptions, like this example
from
https://cloud.google.com/dataflow/docs/guides/specifying-exec-params#configuring-pipelineoptions-for-execution-on-the-cloud-dataflow-service

import apache_beam as beam
from apache_beam.options.pipeline_options import (
    GoogleCloudOptions,
    PipelineOptions,
    StandardOptions,
)

# Create and set your PipelineOptions.
options = PipelineOptions(flags=argv)

# For Cloud execution, specify DataflowRunner and set the Cloud Platform
# project, job name, staging file location, temp file location, and region.
options.view_as(StandardOptions).runner = 'DataflowRunner'
google_cloud_options = options.view_as(GoogleCloudOptions)
google_cloud_options.project = 'my-project-id'
...
# Create the Pipeline with the specified options.
p = beam.Pipeline(options=options)
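To make the distinction concrete, here is a toy illustration in plain Python (this is not Beam's actual implementation; the class and attribute names are made up for the demo). Assigning to the class only creates a class attribute, which an instance's own attribute shadows, so the instance the runner actually reads is unchanged:

```python
# Toy stand-in for an options object; NOT Beam's real internals.
class WorkerOptions:
    def __init__(self, machine_type='n1-standard-1'):
        # Per-instance setting, like options parsed from flags.
        self.machine_type = machine_type

opts = WorkerOptions()

# Assigning on the class (what the original code did) only sets a
# class attribute; the instance's own attribute shadows it.
WorkerOptions.machine_type = 'n1-highcpu-96'
print(opts.machine_type)  # still 'n1-standard-1'

# Assigning on the instance (the object you pass to the pipeline)
# is what actually takes effect.
opts.machine_type = 'n1-highcpu-96'
print(opts.machine_type)  # 'n1-highcpu-96'
```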

Alternatively, you should be able to just specify --worker_machine_type on
the command line if you're parsing the PipelineOptions from sys.argv. Does
that help?
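For the command-line route, an invocation might look like the following (the script name, project, region, and bucket are placeholders for your own values):

```shell
python my_pipeline.py \
  --runner=DataflowRunner \
  --project=my-project-id \
  --region=us-central1 \
  --temp_location=gs://my-bucket/temp \
  --worker_machine_type=n1-highcpu-96
```

Because PipelineOptions parses these flags from sys.argv, the machine type ends up on the options instance the runner reads, rather than on the class.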

Brian

On Tue, May 12, 2020 at 8:30 AM OrielResearch Eila Arich-Landkof <
[email protected]> wrote:

> Hello,
>
> I am trying to check whether the resource settings are actually being
> applied. What would be the right way to do this?
> *The code is:*
> GoogleCloudOptions.worker_machine_type = 'n1-highcpu-96'
>
> and *the Dataflow view is* the following (nothing reflects
> the highcpu machine).
> Please advise.
>
> Thanks,
> Eila
> Resource metrics
> Current vCPUs: 1
> Total vCPU time: 0.07 vCPU hr
> Current memory: 3.75 GB
> Total memory time: 0.264 GB hr
> Current PD: 250 GB
> Total PD time: 17.632 GB hr
> Current SSD PD: 0 B
> Total SSD PD time: 0 GB hr
>
>
> --
> Eila
> <http://www.orielresearch.com>
> Meetup <https://www.meetup.com/Deep-Learning-In-Production/>
>