GitHub user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/12026
Can one of the admins verify this patch?
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have
GitHub user liyintang commented on the pull request:
https://github.com/apache/spark/pull/12026#issuecomment-203010916
I thought the back pressure/flow control handles how many messages to fetch,
not when to start generating the job. IMHO, adding the jitter in the start
time is more
GitHub user jerryshao commented on the pull request:
https://github.com/apache/spark/pull/12026#issuecomment-202773539
Is it better to handle this with back-pressure or some flow-control
mechanism? I'm just wondering whether this `jitter` will break the internal
semantics of Spark
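The PR's actual diff is not shown in this thread, but the idea under discussion can be sketched roughly. `RecurringTimer` aligns its start time to the next period boundary, so every streaming application with the same batch interval generates jobs at the same instant; adding a random jitter to that aligned start staggers them. A minimal sketch in Python (the function names and `max_jitter_ms` parameter are hypothetical, not Spark's API):

```python
import random

def normalized_start_time(current_ms, period_ms):
    # RecurringTimer-style normalization: round up to the next period
    # boundary, so all timers with the same period fire simultaneously.
    return ((current_ms // period_ms) + 1) * period_ms

def jittered_start_time(current_ms, period_ms, max_jitter_ms):
    # Hypothetical jittered variant: offset the aligned start by a random
    # amount in [0, max_jitter_ms) so concurrent streaming jobs on the
    # same cluster do not all trigger job generation together.
    return normalized_start_time(current_ms, period_ms) + random.randrange(max_jitter_ms)
```

Note the trade-off raised in the review: jitter spreads out job-generation spikes across applications, while back-pressure instead limits how much data each batch ingests; the two address different bottlenecks.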
GitHub user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/12026#issuecomment-202705472
Can one of the admins verify this patch?
GitHub user liyintang opened a pull request:
https://github.com/apache/spark/pull/12026
[SPARK-14230][STREAMING] Config the start time (jitter) for streaming…
## What changes were proposed in this pull request?
Currently, RecurringTimer will normalize the start time. For