[ https://issues.apache.org/jira/browse/SPARK-50483?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Yuming Wang updated SPARK-50483:
--------------------------------
    Summary: BlockMissingException should be thrown even if ignoreCorruptFiles is enabled  (was: Still throwing BlockMissingException even with ignoreCorruptFiles enabled)

> BlockMissingException should be thrown even if ignoreCorruptFiles is enabled
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-50483
>                 URL: https://issues.apache.org/jira/browse/SPARK-50483
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.5.3
>            Reporter: Yuming Wang
>            Priority: Major
>              Labels: pull-request-available
>         Attachments: success-task.png
>
> {noformat}
> 24/11/29 01:56:00 WARN FileScanRDD: Skipped the rest of the content in the corrupted file: path: viewfs://hadoop-cluster/path/to/data/part-00320-7915e327-3214-4585-a44e-f9c58e362b43.c000.snappy.parquet, range: 191727616-281354675, partition values: [empty row]
> org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-169998034-10.210.23.11-1507067630530:blk_83565156183_82548880660 file=/path/to/data/part-00320-7915e327-3214-4585-a44e-f9c58e362b43.c000.snappy.parquet
> No live nodes contain current block
> Block locations:
>   DatanodeInfoWithStorage[10.209.145.174:50010,DS-c7c0a172-5ffa-4f90-bfb5-717fb1e9ecf2,DISK]
>   DatanodeInfoWithStorage[10.3.22.142:50010,DS-a1ba9ac9-dc92-4131-a2c2-9f7d03b97caf,DISK]
>   DatanodeInfoWithStorage[10.209.146.156:50010,DS-71d8ae97-15d3-454e-a715-d9490e184989,DISK]
> Dead nodes:
>   DatanodeInfoWithStorage[10.209.146.156:50010,DS-71d8ae97-15d3-454e-a715-d9490e184989,DISK]
>   DatanodeInfoWithStorage[10.209.145.174:50010,DS-c7c0a172-5ffa-4f90-bfb5-717fb1e9ecf2,DISK]
>   DatanodeInfoWithStorage[10.3.22.142:50010,DS-a1ba9ac9-dc92-4131-a2c2-9f7d03b97caf,DISK]
> {noformat}

--
This message was sent by Atlassian Jira
(v8.20.10#820010)
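The improvement above distinguishes two failure modes: a BlockMissingException means no live datanode could serve the block (a transient cluster-side problem, so the task should fail and be retried), while other IOExceptions during the scan may indicate a genuinely corrupt file that ignoreCorruptFiles is meant to skip. A minimal sketch of that error-handling policy, assuming a stand-in `BlockMissingStub` class for `org.apache.hadoop.hdfs.BlockMissingException` and a hypothetical `readWithPolicy` helper (this is not Spark's actual FileScanRDD code):

```java
import java.io.IOException;
import java.util.Collections;
import java.util.List;
import java.util.concurrent.Callable;

public class IgnoreCorruptFilesSketch {
    // Stand-in for org.apache.hadoop.hdfs.BlockMissingException, which signals
    // that no live replica of a block is reachable (transient), not corruption.
    static class BlockMissingStub extends IOException {
        BlockMissingStub(String msg) { super(msg); }
    }

    // Hypothetical helper illustrating the proposed policy: rethrow
    // block-missing errors unconditionally; skip only genuinely corrupt files.
    static List<String> readWithPolicy(Callable<List<String>> read,
                                       boolean ignoreCorruptFiles) throws Exception {
        try {
            return read.call();
        } catch (BlockMissingStub e) {
            // Transient HDFS failure: always rethrow so the task can be retried,
            // even when spark.sql.files.ignoreCorruptFiles is enabled.
            throw e;
        } catch (IOException e) {
            if (ignoreCorruptFiles) {
                // Genuinely corrupt file: skip the rest of its content.
                return Collections.emptyList();
            }
            throw e;
        }
    }
}
```

The key design point is that swallowing a BlockMissingException silently drops rows that still exist on HDFS, whereas rethrowing it lets Spark's normal task-retry machinery read the block from a recovered replica.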