GitHub user ScrapCodes opened a pull request:

    https://github.com/apache/spark/pull/19096

    [SPARK-21869][SS] A cached Kafka producer should not be closed if any task is using it.

    
    ## What changes were proposed in this pull request?
    By updating the access time for the producer on each iteration, we ensure that a producer is never closed while a long-running task is still using it.
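
    For illustration, here is a minimal sketch of the technique in Scala. It is not Spark's actual CachedKafkaProducer; the names (ProducerCache, getOrCreate, evictExpired, the stub Producer, and the 10-minute timeout) are all hypothetical. Each cache entry carries a last-used timestamp that every lookup refreshes, so the eviction pass skips producers that an active task keeps touching:

```scala
import java.util.concurrent.ConcurrentHashMap
import scala.jdk.CollectionConverters._

object ProducerCache {

  // Hypothetical eviction timeout; the real cache reads a config value.
  private val ExpiryMs = 10 * 60 * 1000L

  // Stand-in for org.apache.kafka.clients.producer.KafkaProducer,
  // so the sketch runs without a Kafka dependency.
  final class Producer(val params: String) {
    def send(record: String): Unit = println(s"[$params] sent: $record")
    def close(): Unit = println(s"[$params] closed")
  }

  // A cache entry pairs the producer with the time it was last handed out.
  private final class Entry(val producer: Producer) {
    @volatile var lastUsedMs: Long = System.currentTimeMillis()
  }

  private val cache = new ConcurrentHashMap[String, Entry]()

  // Look up (or create) the producer for these params and, crucially,
  // refresh its access time on every call -- the core of the fix.
  def getOrCreate(params: String): Producer = {
    val entry = cache.computeIfAbsent(params, p => new Entry(new Producer(p)))
    entry.lastUsedMs = System.currentTimeMillis()
    entry.producer
  }

  // Periodic eviction: close only producers that no task has touched
  // within the timeout, so a producer in active use is never closed.
  def evictExpired(): Unit = {
    val now = System.currentTimeMillis()
    for ((params, entry) <- cache.asScala) {
      if (now - entry.lastUsedMs > ExpiryMs) {
        if (cache.remove(params, entry)) entry.producer.close()
      }
    }
  }
}
```

    A long-running task that calls getOrCreate for each record it writes keeps refreshing lastUsedMs, so evictExpired never closes its producer mid-task; a producer is closed only once no task has touched it for the full timeout.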


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/ScrapCodes/spark SPARK-21869/long-running-kafka-producer

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/19096.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #19096
    
----
commit f21bf1685a5a38d6342bb6b612247a1fac6ef2ff
Author: Prashant Sharma <prash...@in.ibm.com>
Date:   2017-08-31T11:51:32Z

    [SPARK-21869][SS] A cached Kafka producer should not be closed if any task is using it.

----


