Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/11628#discussion_r55936088
  
    --- Diff: core/src/main/scala/org/apache/spark/rdd/PipedRDD.scala ---
    @@ -157,8 +167,16 @@ private[spark] class PipedRDD[T: ClassTag](
         val lines = Source.fromInputStream(proc.getInputStream).getLines()
         new Iterator[String] {
           def next(): String = lines.next()
    +
    +      private def propagateChildThreadException(): Unit = {
    --- End diff --
    
    Yeah I should revise that -- it's fine for `hasNext` to report an exception, 
since it may have to do the read to figure out the answer. But `next` also 
needs to be able to do this. That could be a pretty minor issue in practice, 
since `hasNext` is usually called before `next`, but if it isn't, I guess this 
means the cleanup isn't called, which isn't great.
    
    Yeah, I see `getLines` doesn't block now, and I see what it's trying to do, 
so never mind that part. I suggest this could be a little more straightforward 
if all of the cleanup logic (including exception handling) were pulled out into 
a private method called from `hasNext`, and then `next` also checked the return 
value of `hasNext` (and threw `NoSuchElementException` if needed). That's nice 
form and also solves the problem of `next` needing to do the cleanup.
    
    All that is not directly related to the change you're making, but I wonder 
if it's worth cleaning up while you're modifying this anyway.
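    To be concrete, here's a rough sketch of the shape I mean. This is 
illustrative only, not the actual `PipedRDD` code: `underlying` stands in for 
the process-output lines iterator, and `cleanup` stands in for waiting on the 
child process and rethrowing any exception from its helper threads.

```scala
// Sketch: all cleanup is funneled through a private method called from
// hasNext, and next() delegates to hasNext, so cleanup (and exception
// propagation) also runs on the next()-only call path.
class CleanupIterator(underlying: Iterator[String]) extends Iterator[String] {
  private var cleaned = false

  // In PipedRDD this would wait for the child process, check its exit
  // status, and propagate any child-thread exception.
  private def cleanup(): Unit = {
    if (!cleaned) {
      cleaned = true
    }
  }

  override def hasNext: Boolean = {
    val more = underlying.hasNext
    if (!more) {
      cleanup()  // exhausted: run cleanup exactly once
    }
    more
  }

  override def next(): String = {
    if (!hasNext) {
      // Checking hasNext here means next() also triggers cleanup
      // when called past the end, instead of skipping it.
      throw new NoSuchElementException("no more output from child process")
    }
    underlying.next()
  }
}
```

The point is just that neither caller pattern (`hasNext`-then-`next`, or 
`next` alone) can skip the cleanup path.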

