Re: bigquery issue

2018-10-31 Thread Ismaël Mejía
Hello,

If you think it is a bug (or issue) you can report it at Apache's JIRA
https://issues.apache.org/jira/projects/BEAM/issues

If it is more of a usage-related question, it is probably better to ask
on the user@ mailing list.

Note that reporting issues is a way of contributing to Beam; for
more info on how to contribute, please take a look at:
https://beam.apache.org/contribute/



On Wed, Oct 31, 2018 at 10:12 AM Chaim Turkel  wrote:
>
> Hi,
>   I have an issue with the BigQuery SDK code; where is the correct
> group to send it?
> chaim
>
> --
>
>
> Loans are funded by FinWise Bank, a Utah-chartered bank located in Sandy,
> Utah, member FDIC, Equal Opportunity Lender. Merchant Cash Advances are
> made by Behalf. For more information on ECOA, click here. For important
> information about opening a new account, review Patriot Act procedures
> here. Visit Legal to review our comprehensive program terms, conditions,
> and disclosures.


bigquery issue

2018-10-31 Thread Chaim Turkel
Hi,
  I have an issue with the BigQuery SDK code; where is the correct
group to send it?
chaim



Re: bigquery issue

2018-01-18 Thread Reuven Lax
There are multiple reasons why this operation might fail. One is that
BigQuery has a daily quota on the number of load jobs, so if you exceed the
daily quota all loads will fail for the rest of the day. Looking at the
logs as Luke said should help you diagnose this.
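
[Editor's note: as an illustration of the quota point above, here is a
minimal, generic retry-with-backoff sketch. The helper name and its
callbacks are hypothetical, not part of the Beam or BigQuery API; a real
pipeline would plug in its own submit call and error classifier.]

```python
import time

def submit_with_backoff(submit, is_quota_error,
                        max_attempts=5, base_delay=1.0, sleep=time.sleep):
    """Call `submit` until it succeeds, backing off exponentially on quota errors.

    `submit` is a zero-argument callable that starts the load job;
    `is_quota_error` decides whether a raised exception is retryable.
    Non-quota errors, and the final failed attempt, are re-raised.
    """
    for attempt in range(max_attempts):
        try:
            return submit()
        except Exception as exc:
            if not is_quota_error(exc) or attempt == max_attempts - 1:
                raise
            sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...
```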

On Thu, Jan 18, 2018 at 12:06 AM, Chaim Turkel  wrote:

> I must say I am disappointed in the errors: when I run the same code
> in a different project it works, so there must be some limitation that
> I don't know about.
> chaim
>
> On Tue, Jan 16, 2018 at 8:22 PM, Lukasz Cwik  wrote:
> > Look at the worker logs. This page shows how to log information and how to
> > find what was logged:
> > https://cloud.google.com/dataflow/pipelines/logging#cloud-dataflow-worker-log-example
> > The worker logs contain a lot of information written by Dataflow and also by
> > your code. Note that you may need to change log levels to get enough
> > information:
> > https://cloud.google.com/dataflow/pipelines/logging#SettingLevels
> >
> > Also good to take a look at this generic troubleshooting information:
> > https://cloud.google.com/dataflow/pipelines/troubleshooting-your-pipeline
> >
> > On Mon, Jan 15, 2018 at 12:18 AM, Chaim Turkel  wrote:
> >>
> >> Hi,
> >>   I have a fairly simple pipeline that creates daily snapshots of my
> >> data, and it sometimes fails, but the reason is not obvious:
> >>
> >>
> >> (863777e448a29a5c): Workflow failed. Causes: (863777e448a298ff):
> >>
> >> S41:Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionsReshuffle/GroupByKey/Read+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionsReshuffle/GroupByKey/GroupByWindow+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionsReshuffle/ExpandIterable+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/ParMultiDo(WriteTables)+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/ParMultiDo(WriteTables).WriteTablesMainOutput/extract
> >> table name
> >> +Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/ParMultiDo(WriteTables).WriteTablesMainOutput/count/GroupByKey+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/ParMultiDo(WriteTables).WriteTablesMainOutput/count/Combine.GroupedValues/Partial+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/ParMultiDo(WriteTables).WriteTablesMainOutput/count/GroupByKey/Reify+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/ParMultiDo(WriteTables).WriteTablesMainOutput/count/GroupByKey/Write+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/WithKeys/AddKeys/Map+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/Window.Into()/Window.Assign+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/GroupByKey/Reify+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/GroupByKey/Write
> >> failed., (a412b0e93a586d57): A work item was attempted 4 times without
> >> success. Each time the worker eventually lost contact with the
> >> service. The work item was attempted on:
> >> dailysnapshotoptions-chai-01142358-ef6c-harness-qw6s,
> >> dailysnapshotoptions-chai-01142358-ef6c-harness-j09b,
> >> dailysnapshotoptions-chai-01142358-ef6c-harness-3t9m,
> >> dailysnapshotoptions-chai-01142358-ef6c-harness-372t
> >>
> >> is there any way to get more information?
> >>
> >> the job is:
> >>
> >>
> >>
> >> https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-01-14_23_58_54-6125672650598375925?project=ordinal-ember-163410&organizationId=782381653268
> >>
> >>
> >> chaim


Re: bigquery issue

2018-01-18 Thread Chaim Turkel
I must say I am disappointed in the errors: when I run the same code
in a different project it works, so there must be some limitation that
I don't know about.
chaim

On Tue, Jan 16, 2018 at 8:22 PM, Lukasz Cwik  wrote:
> Look at the worker logs. This page shows how to log information and how to
> find what was logged:
> https://cloud.google.com/dataflow/pipelines/logging#cloud-dataflow-worker-log-example
> The worker logs contain a lot of information written by Dataflow and also by
> your code. Note that you may need to change log levels to get enough
> information:
> https://cloud.google.com/dataflow/pipelines/logging#SettingLevels
>
> Also good to take a look at this generic troubleshooting information:
> https://cloud.google.com/dataflow/pipelines/troubleshooting-your-pipeline
>
> On Mon, Jan 15, 2018 at 12:18 AM, Chaim Turkel  wrote:
>>
>> Hi,
>>   I have a fairly simple pipeline that creates daily snapshots of my
>> data, and it sometimes fails, but the reason is not obvious:
>>
>>
>> (863777e448a29a5c): Workflow failed. Causes: (863777e448a298ff):
>>
>> S41:Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionsReshuffle/GroupByKey/Read+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionsReshuffle/GroupByKey/GroupByWindow+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionsReshuffle/ExpandIterable+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/ParMultiDo(WriteTables)+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/ParMultiDo(WriteTables).WriteTablesMainOutput/extract
>> table name
>> +Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/ParMultiDo(WriteTables).WriteTablesMainOutput/count/GroupByKey+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/ParMultiDo(WriteTables).WriteTablesMainOutput/count/Combine.GroupedValues/Partial+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/ParMultiDo(WriteTables).WriteTablesMainOutput/count/GroupByKey/Reify+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/ParMultiDo(WriteTables).WriteTablesMainOutput/count/GroupByKey/Write+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/WithKeys/AddKeys/Map+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/Window.Into()/Window.Assign+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/GroupByKey/Reify+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/GroupByKey/Write
>> failed., (a412b0e93a586d57): A work item was attempted 4 times without
>> success. Each time the worker eventually lost contact with the
>> service. The work item was attempted on:
>> dailysnapshotoptions-chai-01142358-ef6c-harness-qw6s,
>> dailysnapshotoptions-chai-01142358-ef6c-harness-j09b,
>> dailysnapshotoptions-chai-01142358-ef6c-harness-3t9m,
>> dailysnapshotoptions-chai-01142358-ef6c-harness-372t
>>
>> is there any way to get more information?
>>
>> the job is:
>>
>>
>>
>> https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-01-14_23_58_54-6125672650598375925?project=ordinal-ember-163410&organizationId=782381653268
>>
>>
>> chaim


Re: bigquery issue

2018-01-16 Thread Lukasz Cwik
Look at the worker logs. This page shows how to log information and how to
find what was logged:
https://cloud.google.com/dataflow/pipelines/logging#cloud-dataflow-worker-log-example
The worker logs contain a lot of information written by Dataflow and also
by your code. Note that you may need to change log levels to get enough
information:
https://cloud.google.com/dataflow/pipelines/logging#SettingLevels

Also good to take a look at this generic troubleshooting information:
https://cloud.google.com/dataflow/pipelines/troubleshooting-your-pipeline
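
[Editor's note: building on the log-level advice above, a hedged sketch of
how a Dataflow job might be launched with more verbose worker logging,
using the flags from the pages linked above. The pipeline class, project
id, and override package are illustrative, not taken from this thread.]

```shell
# Raise the default worker log level, and make the BigQuery IO
# package extra verbose (flag names per the Dataflow logging docs;
# com.example.DailySnapshotPipeline and my-project are placeholders).
mvn compile exec:java \
  -Dexec.mainClass=com.example.DailySnapshotPipeline \
  -Dexec.args="--runner=DataflowRunner \
    --project=my-project \
    --defaultWorkerLogLevel=DEBUG \
    --workerLogLevelOverrides={\"org.apache.beam.sdk.io.gcp.bigquery\":\"DEBUG\"}"
```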

On Mon, Jan 15, 2018 at 12:18 AM, Chaim Turkel  wrote:

> Hi,
>   I have a fairly simple pipeline that creates daily snapshots of my
> data, and it sometimes fails, but the reason is not obvious:
>
>
> (863777e448a29a5c): Workflow failed. Causes: (863777e448a298ff):
> S41:Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionsReshuffle/GroupByKey/Read+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionsReshuffle/GroupByKey/GroupByWindow+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionsReshuffle/ExpandIterable+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/ParMultiDo(WriteTables)+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/ParMultiDo(WriteTables).WriteTablesMainOutput/extract
> table name +Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/ParMultiDo(WriteTables).WriteTablesMainOutput/count/GroupByKey+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/ParMultiDo(WriteTables).WriteTablesMainOutput/count/Combine.GroupedValues/Partial+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/ParMultiDo(WriteTables).WriteTablesMainOutput/count/GroupByKey/Reify+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/ParMultiDo(WriteTables).WriteTablesMainOutput/count/GroupByKey/Write+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/WithKeys/AddKeys/Map+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/Window.Into()/Window.Assign+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/GroupByKey/Reify+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/GroupByKey/Write
> failed., (a412b0e93a586d57): A work item was attempted 4 times without
> success. Each time the worker eventually lost contact with the
> service. The work item was attempted on:
> dailysnapshotoptions-chai-01142358-ef6c-harness-qw6s,
> dailysnapshotoptions-chai-01142358-ef6c-harness-j09b,
> dailysnapshotoptions-chai-01142358-ef6c-harness-3t9m,
> dailysnapshotoptions-chai-01142358-ef6c-harness-372t
>
> is there any way to get more information?
>
> the job is:
>
>
> https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-01-14_23_58_54-6125672650598375925?project=ordinal-ember-163410&organizationId=782381653268
>
>
> chaim


bigquery issue

2018-01-15 Thread Chaim Turkel
Hi,
  I have a fairly simple pipeline that creates daily snapshots of my
data, and it sometimes fails, but the reason is not obvious:


(863777e448a29a5c): Workflow failed. Causes: (863777e448a298ff):
S41:Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionsReshuffle/GroupByKey/Read+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionsReshuffle/GroupByKey/GroupByWindow+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionsReshuffle/ExpandIterable+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/ParMultiDo(WriteTables)+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/ParMultiDo(WriteTables).WriteTablesMainOutput/extract
table name 
+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/ParMultiDo(WriteTables).WriteTablesMainOutput/count/GroupByKey+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/ParMultiDo(WriteTables).WriteTablesMainOutput/count/Combine.GroupedValues/Partial+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/ParMultiDo(WriteTables).WriteTablesMainOutput/count/GroupByKey/Reify+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/ParMultiDo(WriteTables).WriteTablesMainOutput/count/GroupByKey/Write+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/WithKeys/AddKeys/Map+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/Window.Into()/Window.Assign+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/GroupByKey/Reify+Account_audit/BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/GroupByKey/Write
failed., (a412b0e93a586d57): A work item was attempted 4 times without
success. Each time the worker eventually lost contact with the
service. The work item was attempted on:
dailysnapshotoptions-chai-01142358-ef6c-harness-qw6s,
dailysnapshotoptions-chai-01142358-ef6c-harness-j09b,
dailysnapshotoptions-chai-01142358-ef6c-harness-3t9m,
dailysnapshotoptions-chai-01142358-ef6c-harness-372t

is there any way to get more information?

the job is:


https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-01-14_23_58_54-6125672650598375925?project=ordinal-ember-163410&organizationId=782381653268


chaim
