[ https://issues.apache.org/jira/browse/BEAM-6706?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16773376#comment-16773376 ]

Valentyn Tymofieiev commented on BEAM-6706:
-------------------------------------------

[~arpi], it sounds like you would like to pass a custom container image. On 
the Dataflow side, this functionality is supported only under the portable 
pipeline execution model (which uses the Fn API). Python streaming and Go 
pipelines always use this execution model on the Dataflow runner.

Custom containers are currently always pulled at runtime from GCR, since a 
custom container is not present on the VM image used by the Dataflow worker. 

If the SDK is using the legacy execution mode, for example a typical Java 
batch pipeline, passing a custom worker harness container image is currently 
disallowed by the Dataflow service. 
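For illustration, a custom image could be passed to a portable (Fn API) Python 
streaming pipeline roughly as sketched below. This is a hedged example, not an 
authoritative invocation: the script name, project, and image path are 
placeholders, and the exact flag name may differ between Beam versions.

```shell
# Hypothetical launch of a Python streaming pipeline on Dataflow with a
# custom SDK harness container. All names below are placeholders.
python my_pipeline.py \
  --runner=DataflowRunner \
  --project=my-gcp-project \
  --streaming \
  --worker_harness_container_image=gcr.io/my-project/my-beam-sdk:2.10.0
```

Since the image is pulled at runtime, it must be accessible from the worker 
VMs (e.g. hosted in GCR in the same project or with appropriate permissions).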

> User reports trouble downloading 2.10.0 Dataflow worker image
> -------------------------------------------------------------
>
>                 Key: BEAM-6706
>                 URL: https://issues.apache.org/jira/browse/BEAM-6706
>             Project: Beam
>          Issue Type: Bug
>          Components: runner-dataflow
>            Reporter: Kenneth Knowles
>            Assignee: Kenneth Knowles
>            Priority: Blocker
>
> DataFlow however is throwing all sorts of errors.  For example:
> * Handler for GET 
> /v1.27/images/gcr.io/cloud-dataflow/v1beta3/beam-java-batch:beam-2.10.0/json 
> returned error: No such image: 
> gcr.io/cloud-dataflow/v1beta3/beam-java-batch:beam-2.10.0
> * while reading 'google-dockercfg' metadata: http status code: 404 while 
> fetching url 
> http://metadata.google.internal./computeMetadata/v1/instance/attributes/google-dockercfg
> * Error syncing pod...
> The job gets stuck after starting a worker and after an hour or so it gives 
> up with a failure.  2.9.0 runs fine.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)