Can you share your code?

On Tue, 17 Oct 2017 at 10:22 pm, Harsh Choudhary <shry.ha...@gmail.com>
wrote:

> Hi
>
> I'm running a Spark job in which I am appending new data to a Parquet
> file. At the end, I make a log entry in my DynamoDB table stating the
> number of records appended, the time, etc. Instead of a single entry,
> multiple entries are being made in the database. Is this because of the
> parallel execution of the code on the workers? If so, how can I fix it so
> that it writes only once?
>
> *Thanks!*
>
> *Cheers!*
>
> Harsh Choudhary
>
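
The duplicate entries described above usually mean the DynamoDB write is
happening in code that runs on the executors (for example inside a map,
foreach, or foreachPartition), so it executes once per task rather than once
per job. Writing the log entry from the driver, after the Parquet append has
completed, keeps it to a single write. A minimal sketch of that pattern,
assuming Scala and the AWS SDK v1 DynamoDB document API; the paths, table
name, and field names are illustrative, not taken from this thread:

import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder
import com.amazonaws.services.dynamodbv2.document.{DynamoDB, Item}
import org.apache.spark.sql.{SaveMode, SparkSession}

object AppendAndLog {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("AppendAndLog").getOrCreate()

    // Illustrative source of the new records.
    val newData = spark.read.json("s3://my-bucket/incoming/")

    // Cache so the count and the write do not scan the source twice.
    newData.cache()
    val recordCount = newData.count()

    // The append itself runs in parallel on the executors.
    newData.write.mode(SaveMode.Append).parquet("s3://my-bucket/parquet-table/")

    // Log exactly once, on the driver, after the append has finished.
    // Doing this inside foreach/foreachPartition would run it once per task.
    val dynamo = new DynamoDB(AmazonDynamoDBClientBuilder.defaultClient())
    val logTable = dynamo.getTable("job_log") // illustrative table name
    logTable.putItem(
      new Item()
        .withPrimaryKey("job_id", java.util.UUID.randomUUID().toString)
        .withNumber("records_appended", java.lang.Long.valueOf(recordCount))
        .withString("logged_at", java.time.Instant.now().toString)
    )

    spark.stop()
  }
}
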
-- 
Best Regards,
Ayan Guha
