Hi Alessandro,
I'd recommend checking the unit tests added in the commit that fixed the
issue (i.e.
https://github.com/apache/spark/commit/a66fe36cee9363b01ee70e469f1c968f633c5713).
You can use them to try to reproduce the issue.
Thanks,
Marco
2018-06-14 15:57 GMT+02:00 Alessandro Liparoti :
>
Good morning,
I am trying to see how this bug affects writes in Spark 2.2.0, but I
cannot reproduce it. Is it then safe to use the following code?
df.write.mode(SaveMode.Overwrite).insertInto("table_name")
Thank you,
*Alessandro Liparoti*
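For reference, the Overwrite-plus-insertInto pattern discussed above can be sketched as below. One caveat worth keeping in mind: `insertInto` resolves columns by position, not by name, so the DataFrame's column order must match the target table's schema. The table name and columns here are hypothetical, and this is only a minimal local-mode sketch, not a reproduction of the bug itself; it assumes a Spark distribution is available on the classpath.

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

object InsertIntoExample {
  def main(args: Array[String]): Unit = {
    // Local Spark session; saveAsTable uses the default warehouse catalog.
    val spark = SparkSession.builder()
      .appName("insertInto-sketch")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Hypothetical target table; in practice it would already exist.
    Seq((1, "a")).toDF("id", "name").write.saveAsTable("table_name")

    val df = Seq((2, "b"), (3, "c")).toDF("id", "name")

    // insertInto matches columns by POSITION, not by name, so select the
    // columns in the table's schema order before overwriting.
    df.select("id", "name")
      .write
      .mode(SaveMode.Overwrite)
      .insertInto("table_name")

    spark.stop()
  }
}
```

The explicit `select` is the usual guard against the positional-resolution pitfall: if the DataFrame's columns happened to be in a different order, the data would be silently written into the wrong columns.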