RE: Loading audiences via AdWords API / Python SDK

2019-10-03 Thread Google Ads API Forum Advisor Prod
Hi Stephen,

Thank you for reaching out.

It appears that the issue you are encountering relates to the Python client 
library itself rather than the API. I would recommend reaching out directly 
to the client library owners via this link, as they are better equipped to 
assist you with this matter.

I hope this helps.

Best regards,
Peter
Google Ads API Team
ref:_00D1U1174p._5001UKM14S:ref

=~=~=~=~=~=~=~=~=~=~=~=~=~=~=~=~=~=~=~=~=~=~=~=~
Also find us on our blog:
https://googleadsdeveloper.blogspot.com/
=~=~=~=~=~=~=~=~=~=~=~=~=~=~=~=~=~=~=~=~=~=~=~=~

You received this message because you are subscribed to the Google Groups 
"AdWords API and Google Ads API Forum" group.
To post to this group, send email to adwords-api@googlegroups.com
To unsubscribe from this group and stop receiving emails from it, send an email 
to adwords-api+unsubscr...@googlegroups.com
For more options, visit this group at 
http://groups.google.com/group/adwords-api?hl=en
To view this discussion on the web visit 
https://groups.google.com/d/msgid/adwords-api/GDctU0PYSJ2M00nWayPA29ToWPJnKHQrnYgQ%40sfdc.net.


Loading audiences via AdWords API / Python SDK

2019-10-03 Thread 'Stephen Waters' via AdWords API and Google Ads API Forum
Hi,

I am trying to update some audiences (email/phone matches) from BigQuery to 
Google Ads. The largest are in the millions of rows, but less than 10 million 
for now. I am splitting them into 1-million-row chunks, but loading these 
consecutively sometimes takes a long time.

I have been dabbling with threading, trying to load different chunks of an 
audience concurrently, and getting some XML_STREAM_EXC errors.
I couldn't find much about this error, but since things work without 
threading, I assume it's caused either by trying to do too much at once for 
the API to handle, or by some mistake in my code.
Is this threaded approach possible? Or is there another way to get to the 
same result (faster loads)? Below is what I got to, which didn't work.

Thanks

Stephen

import json
import threading
import time

# here's what we're going to do to each chunk
def google_stuff(chunkStart, chunkEnd):
    add_to_audience = gAudienceService.mutateMembers([{
        'operand': {
            'userListId': this_audience_id,
            'membersList': [
                json.loads(x) for x in
                add_query_data[chunkStart:chunkEnd].to_dict(orient='list')['GOOGLE']
            ],
        },
        'operator': 'ADD',
    }])

# Chunk up the volume
chunkStart = 0
chunkIncrement = 100
chunkEnd = chunkStart + chunkIncrement
threads = []
counter = 0

# make some threads
while add_len > 0 and chunkStart < add_len:
    this_thread_name = 'add_users_' + str(counter)
    this_thread = threading.Thread(name=this_thread_name, target=google_stuff,
                                   args=(chunkStart, chunkEnd))
    threads.append(this_thread)
    this_thread.start()
    time.sleep(5)
    chunkStart += chunkIncrement
    chunkEnd += chunkIncrement
    counter += 1

# wait till they all finish
for t in threads:
    t.join()
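[Editor's note: the loop above starts one thread per chunk with no upper bound on concurrency. A minimal sketch of an alternative, using the standard library's concurrent.futures to cap the number of simultaneous uploads, is below. This is illustrative only: upload_chunk is a hypothetical placeholder standing in for the real gAudienceService.mutateMembers call, and max_workers is an assumed tuning knob, not a documented API limit.]

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def chunk_ranges(total_rows, chunk_size):
    """Yield (start, end) index pairs covering total_rows in chunk_size pieces."""
    for start in range(0, total_rows, chunk_size):
        yield start, min(start + chunk_size, total_rows)

def upload_chunk(start, end):
    # Placeholder: in the real script this would call
    # gAudienceService.mutateMembers(...) on rows [start:end].
    return end - start  # pretend we uploaded this many members

def upload_all(total_rows, chunk_size=100, max_workers=4):
    """Submit every chunk to a bounded thread pool and sum the results."""
    uploaded = 0
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = [pool.submit(upload_chunk, start, end)
                   for start, end in chunk_ranges(total_rows, chunk_size)]
        for future in as_completed(futures):
            uploaded += future.result()  # re-raises any worker exception here
    return uploaded
```

Bounding max_workers (instead of one thread per chunk plus a sleep) may also make it easier to see whether XML_STREAM_EXC is a concurrency problem: if the errors disappear at max_workers=1 and reappear at higher values, the API side is the likely culprit.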

To view this discussion on the web visit 
https://groups.google.com/d/msgid/adwords-api/00ca77ec-acde-4fc4-9c10-6017dfa3f444%40googlegroups.com.