Re: Concurrent DataFrame.saveAsTable into non-existent tables fails the second job despite Mode.APPEND

2017-04-20 Thread Subhash Sriram
Would it be an option to just write the results of each job into separate tables and then run a UNION on all of them at the end into a final target table? Just thinking of an alternative!

Thanks,
Subhash

Sent from my iPhone

> On Apr 20, 2017, at 3:48 AM, Rick Moritz wrote:
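The workaround suggested above can be sketched as follows. This is a minimal illustration in Spark's Scala DataFrame API; the table names (`staging_job_1`, `staging_job_2`, `final_target`) are hypothetical placeholders, not names from the thread, and it assumes a Hive-enabled SparkSession:

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

val spark = SparkSession.builder()
  .appName("union-workaround")
  .enableHiveSupport()
  .getOrCreate()

// Each concurrent job writes to its own staging table, so no two jobs
// ever race to create the same table:
//   df1.write.mode(SaveMode.Overwrite).saveAsTable("staging_job_1")
//   df2.write.mode(SaveMode.Overwrite).saveAsTable("staging_job_2")

// Once all jobs have finished, union the staging tables and append the
// combined result into the single target table in one writer call:
val stagingTables = Seq("staging_job_1", "staging_job_2")
val combined = stagingTables
  .map(spark.table)     // load each staging table as a DataFrame
  .reduce(_ union _)    // concatenate them (schemas must match)

combined.write.mode(SaveMode.Append).saveAsTable("final_target")
```

Because only one writer touches `final_target`, the create-if-not-exists race between concurrent `saveAsTable` calls never occurs; the trade-off is the extra storage and the serial union step at the end.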

Concurrent DataFrame.saveAsTable into non-existent tables fails the second job despite Mode.APPEND

2017-04-20 Thread Rick Moritz
Hi List,

I'm wondering if the following behaviour should be considered a bug, or whether it "works as designed": I'm starting multiple concurrent (FIFO-scheduled) jobs in a single SparkContext, some of which write into the same tables. When these tables already exist, it appears as though both