kbendick commented on a change in pull request #3400:
URL: https://github.com/apache/iceberg/pull/3400#discussion_r739431383
##########
File path:
spark/v3.2/spark/src/main/java/org/apache/iceberg/spark/source/SparkBatchQueryScan.java
##########
@@ -74,17 +100,31 @@
       throw new IllegalArgumentException("Cannot only specify option end-snapshot-id to do incremental scan");
     }
-    // look for split behavior overrides in options
-    this.splitSize = Spark3Util.propertyAsLong(options, SparkReadOptions.SPLIT_SIZE, null);
-    this.splitLookback = Spark3Util.propertyAsInt(options, SparkReadOptions.LOOKBACK, null);
-    this.splitOpenFileCost = Spark3Util.propertyAsLong(options, SparkReadOptions.FILE_OPEN_COST, null);
+    this.splitSize = table instanceof BaseMetadataTable ? readConf.metadataSplitSize() : readConf.splitSize();
+ this.splitLookback = readConf.splitLookback();
+ this.splitOpenFileCost = readConf.splitOpenFileCost();
+ this.runtimeFilterExpressions = Lists.newArrayList();
}
- @Override
- protected List<CombinedScanTask> tasks() {
- if (tasks == null) {
+ private Set<Integer> specIds() {
+ if (specIds == null) {
+ Set<Integer> specIdSet = Sets.newHashSet();
+ for (FileScanTask file : files()) {
+ specIdSet.add(file.spec().specId());
+ }
+ this.specIds = specIdSet;
+ }
+
+ return specIds;
+ }
+
+ private List<FileScanTask> files() {
Review comment:
Yeah, I agree with that. I had to open the file and scroll for a while to
confirm that we return the files as-is when they are already non-null. I'd
personally go with an early return when the files are present, since the
`// lazy cache of files` comment above makes the non-null check read
naturally. But I don't have a strong opinion either way.
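For illustration, here's a minimal sketch of the early-return style I mean. The class and field names (`LazyFileCache`, `loadCount`) are made up for the example and aren't from the PR; the point is just that the cached field is returned immediately when already populated, so the load logic runs at most once:

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical sketch of a lazy cache with an early return.
class LazyFileCache {
  private List<String> files = null; // lazy cache of files
  private int loadCount = 0;         // tracks how many times we actually load

  List<String> files() {
    if (files != null) {
      return files; // early return: cache already populated
    }
    loadCount++;
    // stand-in for the real (expensive) file-planning work
    this.files = Arrays.asList("a.parquet", "b.parquet");
    return files;
  }

  int loadCount() {
    return loadCount;
  }
}
```

Compared with wrapping the whole body in `if (files == null) { ... }`, the early return keeps the common already-cached path flat and obvious at the top of the method.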
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]