Hi All, When I try to read a list of Parquet files from S3, my application errors out if even one of the files is absent. When I searched for solutions, most of them suggested filtering the list of files (checking for presence) before calling read. Shouldn't Spark handle this by providing an option to continue without throwing an error? If not, could you point me to the thread where this was discussed?
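For context, the filtering workaround I keep seeing looks roughly like the sketch below. It is only an illustration: the demo uses the local filesystem via `os.path.exists`, but the same helper could take an S3 existence check (e.g. one wrapping a boto3 `head_object` call) as the `exists` argument instead.

```python
import os
import tempfile

def existing_paths(paths, exists):
    """Keep only the paths the storage backend reports as present."""
    return [p for p in paths if exists(p)]

# Demo with the local filesystem; for S3 you would pass in your own
# existence check (hypothetical, not something Spark provides itself):
tmp = tempfile.NamedTemporaryFile(delete=False)
tmp.close()
candidates = [tmp.name, tmp.name + ".does-not-exist"]
present = existing_paths(candidates, os.path.exists)
os.unlink(tmp.name)

# The filtered list is then safe to hand to Spark, e.g.:
# df = spark.read.parquet(*existing_paths(s3_paths, s3_exists))
```

Having to do this myself is exactly what I'd like to avoid, hence the question above.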
Regards, Naresh