It seems like you specified gs:/ and not gs://. Typo?
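
For reference, hard-coding the staging location in the pipeline code usually
looks something like the sketch below (the project, bucket, and paths here are
placeholders, not your actual values):

    import org.apache.beam.runners.dataflow.DataflowRunner;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class StagingLocationExample {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args)
                .withValidation()
                .as(DataflowPipelineOptions.class);

        options.setRunner(DataflowRunner.class);
        // The scheme needs both slashes (gs://). With a single slash the value
        // is treated as a relative local path and resolved against the current
        // working directory, which is consistent with the error below where the
        // home directory gets prepended.
        options.setProject("my-gcp-project");                 // placeholder
        options.setStagingLocation("gs://my-bucket/staging"); // placeholder
        options.setTempLocation("gs://my-bucket/temp");       // placeholder

        Pipeline pipeline = Pipeline.create(options);
        // ... add transforms here ...
        pipeline.run();
      }
    }
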
On Mon, Aug 27, 2018 at 2:02 PM Sameer Abhyankar <[email protected]> wrote:

> See this thread to see if it is related to the way the executable jar is
> being created:
>
> https://lists.apache.org/thread.html/0ba685412fa0e21adf59dc50f95d1d3e95fb9a1e59a4218bb00918c6@%3Cuser.beam.apache.org%3E
>
> You are probably dropping or incorrectly merging the service files. I was
> able to work around this issue by using a custom assembly (custom assembly
> descriptor in the maven-assembly-plugin).
>
> On Mon, Aug 27, 2018 at 3:50 PM [email protected] <[email protected]>
> wrote:
>
>> Hi,
>>
>> I am creating a Dataflow job from a configuration file, and I have hard
>> coded the gs staging location in it. I compile an executable jar for my
>> pipeline.
>>
>> I copy the executable jar to a Cloud Shell environment and execute the
>> jar.
>>
>> But my hard-coded staging location is not picked up, and it gives me
>> this error:
>>
>> '/home/aniruddh_sharma/gs:/XXX/yyyy/zzz-ICMmloxNLmCleApYeIIHFA.jar' is
>> inaccessible. Causes: Path
>> "/home/aniruddh_sharma/gs:/XXX/yyyy/zzz-ICMmloxNLmCleApYeIIHFA.jar" is not
>> a valid filepattern. The pattern must be of the form
>> "gs://<bucket>/path/to/file".
>>
>> It somehow prepends my Unix home directory path to the gs staging
>> location. I am not using any environment variable in the Dataflow job to
>> read $HOME. In the jar, the staging location is hard coded with the
>> correct gs URI, but the home path still gets prepended.
>
>
> --
>
> *Sameer Abhyankar*
>
> Strategic Cloud Engineer
>
> [email protected]
>
> +1 (404) 431-7806
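
If it does turn out to be the jar packaging rather than a typo: Beam discovers
file systems through ServiceLoader entries under META-INF/services
(org.apache.beam.sdk.io.FileSystemRegistrar), and those entries are easily
dropped or overwritten when building an uber jar, in which case gs:// paths
stop resolving. Besides the custom assembly descriptor Sameer mentions, one
commonly used alternative is the maven-shade-plugin with its
ServicesResourceTransformer, roughly like this (the version number is only an
example):

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>3.1.0</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
          <configuration>
            <transformers>
              <!-- Merge META-INF/services files so FileSystemRegistrar
                   entries (including the GCS one) survive in the uber jar. -->
              <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
            </transformers>
            <filters>
              <!-- Strip signature files that would otherwise invalidate
                   the shaded jar. -->
              <filter>
                <artifact>*:*</artifact>
                <excludes>
                  <exclude>META-INF/*.SF</exclude>
                  <exclude>META-INF/*.DSA</exclude>
                  <exclude>META-INF/*.RSA</exclude>
                </excludes>
              </filter>
            </filters>
          </configuration>
        </execution>
      </executions>
    </plugin>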
