Github user paul-rogers commented on a diff in the pull request:
https://github.com/apache/drill/pull/838#discussion_r118045758
--- Diff:
exec/java-exec/src/main/java/org/apache/drill/exec/physical/impl/ScanBatch.java
---
@@ -173,9 +174,8 @@ public IterOutcome next() {
currentReader.allocate(mutator.fieldVectorMap());
} catch (OutOfMemoryException e) {
- logger.debug("Caught Out of Memory Exception", e);
clearFieldVectorMap();
- return IterOutcome.OUT_OF_MEMORY;
+ throw UserException.memoryError(e).build(logger);
--- End diff ---
As it turns out, in addition to fixing ScanBatch, we need a parallel
implementation to support the size-aware vector "writer". That new version was
designed to allow extensive unit testing of all error conditions. We can't do
that as easily with ScanBatch itself because it has messy references to other
parts of Drill, which have their own references, and so on; pretty soon all of
Drill is needed to run the tests, which means using the current injection
framework (which we could do). A sketch of the decoupling idea follows.
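To illustrate, here is a minimal sketch of that decoupling, using hypothetical
names throughout (VectorAllocator, ScanStep, AllocationFailedException, and
UserError are all stand-ins, not real Drill classes): the allocation step is
hidden behind a narrow interface, so a test can inject an allocation failure
and verify that it surfaces as a user-facing error, as in the diff above,
without standing up the rest of the engine.

```java
/** Hypothetical stand-in for Drill's OutOfMemoryException. */
class AllocationFailedException extends RuntimeException {
  AllocationFailedException(String msg) { super(msg); }
}

/** Hypothetical stand-in for UserException: a fatal, user-facing error. */
class UserError extends RuntimeException {
  UserError(String msg, Throwable cause) { super(msg, cause); }
}

/** The narrow seam: only the behavior the scan loop actually needs. */
interface VectorAllocator {
  void allocate() throws AllocationFailedException;
}

/** The logic under test, free of references to the rest of the engine. */
class ScanStep {
  private final VectorAllocator allocator;

  ScanStep(VectorAllocator allocator) { this.allocator = allocator; }

  void next() {
    try {
      allocator.allocate();
    } catch (AllocationFailedException e) {
      // Mirrors the change in the diff: fail loudly with a user error
      // instead of returning an OUT_OF_MEMORY status that callers may
      // silently mishandle.
      throw new UserError("Out of memory while allocating vectors", e);
    }
  }
}

public class ScanStepTest {
  public static void main(String[] args) {
    // Inject the failure directly; no engine setup, no injection framework.
    ScanStep step = new ScanStep(() -> {
      throw new AllocationFailedException("simulated OOM");
    });
    try {
      step.next();
      throw new AssertionError("expected UserError");
    } catch (UserError expected) {
      System.out.println("OOM surfaced as: " + expected.getMessage());
    }
  }
}
```

The design choice is the same one the comment argues for: because the scan
logic depends only on a small interface, every error condition can be driven
from a plain unit test rather than through the full Drill runtime.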