Re: Help needed on Dataflow worker exception of WriteToBigQuery

2020-02-24 Thread Reza Rokni
Could you try setting --tempLocation in your pipeline options to a GCS bucket that your pipeline has access to?

Cheers,
Reza

On Tue, Feb 25, 2020 at 9:23 AM Wenbing Bai wrote:
> Hi there,
>
> I am using WriteToBigQuery in apache-beam Python SDK 2.16. I get this
> error when I run my pipeline in
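A minimal sketch of what that might look like when launching the job from the command line. Note that the Python SDK spells the option --temp_location (the camel-cased --tempLocation is the Java SDK spelling); my_pipeline.py, my-project, and my-bucket are placeholder names:

```shell
# Hypothetical invocation: pass a temp_location pointing at a GCS bucket
# the job's service account can write to. All names below are placeholders.
python my_pipeline.py \
  --runner DataflowRunner \
  --project my-project \
  --region us-central1 \
  --temp_location gs://my-bucket/tmp
```

WriteToBigQuery's batch file-loads path stages temporary files under this location before issuing the BigQuery load job, which is why an unset or inaccessible temp location can surface as a "Not found: gs://..." error at that step.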

Help needed on Dataflow worker exception of WriteToBigQuery

2020-02-24 Thread Wenbing Bai
Hi there,

I am using WriteToBigQuery in apache-beam Python SDK 2.16. I get this error when I run my pipeline in Dataflow Runner:

RuntimeError: IOError: [Errno 2] Not found: gs://tmp-e3271c8deb2f655-0-of-1.avro [while running 'WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile