[
https://issues.apache.org/jira/browse/CRUNCH-601?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15427799#comment-15427799
]
Mikael Goldmann commented on CRUNCH-601:
----------------------------------------
[~mkwhitacre] I don't know if you can create this situation without explicitly
using emptyCollection(). I'm quite sure you know more about Spark than I do.
Incidentally, I guess the empty collection, which has size 0, illustrates the
difference between Math.max(1L, newSize) and using parentSize when newSize is
0. Another case would be a small scaleFactor, say 0.1, where it is the
difference between getting 9 and getting 1.
Using Math.max(1L, newSize) would probably make the last test pass as well
(whether or not that's a valid test for Spark, but I think I tried that and it
broke other tests).
> Short PCollections in SparkPipeline get length null.
> ----------------------------------------------------
>
> Key: CRUNCH-601
> URL: https://issues.apache.org/jira/browse/CRUNCH-601
> Project: Crunch
> Issue Type: Bug
> Components: Spark
> Affects Versions: 0.13.0
> Environment: Running in local mode on Mac as well as in a ubuntu
> 14.04 docker container
> Reporter: Mikael Goldmann
> Assignee: Micah Whitacre
> Priority: Minor
> Attachments: CRUNCH-601.patch, CRUNCH-601b.patch,
> SmallCollectionLengthTest.java
>
>
> I'll attach a file with a test that I would expect to pass but which fails.
> It creates five PCollection<String> of lengths 0, 1, 2, 3 and 4, gets the
> lengths, runs the pipeline and prints the lengths. Finally it asserts that
> all lengths are non-null.
> I would expect it to print lengths 0, 1, 2, 3, 4 and pass.
> What it actually does is print lengths null, null, null, 3, 4 and fail.
> I think the underlying cause is calling getSize() on an unmaterialized
> object and assuming that when the estimate getSize() returns is 0, the
> PCollection is guaranteed to be empty, which is false in some cases.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)