Could you try setting --temp_location in your pipeline options to a GCS bucket
that your pipeline has access to?
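
Something like the sketch below is roughly what I mean (the project, region,
bucket and table names are just placeholders, so substitute your own). The
path in your error, gs://tmp-..., has no bucket component, which is what I
would expect if no temp location is being picked up by the workers.

import apache_beam as beam
from apache_beam.options.pipeline_options import (
    GoogleCloudOptions,
    PipelineOptions,
    StandardOptions,
)

options = PipelineOptions()
options.view_as(StandardOptions).runner = 'DataflowRunner'

gcp = options.view_as(GoogleCloudOptions)
gcp.project = 'my-project'     # placeholder project ID
gcp.region = 'us-central1'     # placeholder region
# BigQueryBatchFileLoads stages its intermediate load files under this path,
# so it must point at a bucket the Dataflow workers can write to.
gcp.temp_location = 'gs://my-temp-bucket/tmp'  # placeholder bucket

with beam.Pipeline(options=options) as p:
    (
        p
        | 'Create' >> beam.Create([{'id': 1, 'name': 'example'}])
        | 'WriteToBQ' >> beam.io.WriteToBigQuery(
            'my-project:my_dataset.my_table',  # placeholder table spec
            schema='id:INTEGER,name:STRING',
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )

Passing --temp_location=gs://my-temp-bucket/tmp as a command-line flag should
work as well.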

Cheers
Reza

On Tue, Feb 25, 2020 at 9:23 AM Wenbing Bai <wenbing....@getcruise.com>
wrote:

> Hi there,
>
> I am using WriteToBigQuery in the apache-beam Python SDK 2.16. I get this
> error when I run my pipeline on the Dataflow runner.
>
> RuntimeError: IOError: [Errno 2] Not found:
> gs://tmp-e3271c8deb2f655-00000-of-00001.avro [while running
> 'WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']
>
> Has anyone seen this before? Can I get any hints on where the Dataflow
> workers write the Avro data?
>
> --
> Wenbing Bai
> Senior Software Engineer, MLP
> Cruise
> Pronouns: She/Her
>
