[ https://issues.apache.org/jira/browse/SPARK-19524?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15858910#comment-15858910 ]
Egor Pahomov commented on SPARK-19524:
--------------------------------------

[~srowen], based on the documentation, which says "newFilesOnly - Should process only new files and ignore existing files in the directory", I expected that files that already exist in the directory my stream reads from would be processed. That is not what happens. In reality, only files created after a time X are processed. Time X is computed as:

{code}
private val durationToRemember = slideDuration * numBatchesToRemember
val modTimeIgnoreThreshold = math.max(
  initialModTimeIgnoreThreshold, // initial threshold based on newFilesOnly setting
  currentTime - durationToRemember.milliseconds // trailing end of the remember window
)
{code}

First, as far as I understand it, this code contradicts the documentation. Second, it contradicts the name "newFilesOnly" itself. There is probably a motivation behind this code, but I think the only way to find it is to go through the git history and all the related tickets. Sorry that I wasn't more specific the first time.

> newFilesOnly does not work according to docs.
> ----------------------------------------------
>
>                 Key: SPARK-19524
>                 URL: https://issues.apache.org/jira/browse/SPARK-19524
>             Project: Spark
>          Issue Type: Bug
>          Components: DStreams
>    Affects Versions: 2.0.2
>            Reporter: Egor Pahomov
>
> The docs say:
> newFilesOnly
> Should process only new files and ignore existing files in the directory
> It's not working.
> http://stackoverflow.com/questions/29852249/how-spark-streaming-identifies-new-files
> says that it shouldn't work as expected.
> https://github.com/apache/spark/blob/master/streaming/src/main/scala/org/apache/spark/streaming/dstream/FileInputDStream.scala
> is not at all clear about what the code is trying to do.

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
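To make the reported behavior concrete, here is a minimal sketch of the threshold arithmetic from the quoted snippet. This is a standalone illustration, not Spark code: it is written in Java rather than Scala, the variable names mirror FileInputDStream, and all numeric values (batch interval, remember window, timestamps) are made up for the example.

```java
// Hypothetical sketch of the modTimeIgnoreThreshold computation quoted above.
public class NewFilesOnlySketch {

    // The max() from the snippet: the trailing edge of the remember window
    // can override the threshold derived from the newFilesOnly setting.
    public static long threshold(long initialModTimeIgnoreThreshold,
                                 long currentTime,
                                 long durationToRememberMs) {
        return Math.max(initialModTimeIgnoreThreshold,
                        currentTime - durationToRememberMs);
    }

    public static void main(String[] args) {
        long slideDurationMs = 10_000L;       // assumed 10 s batch interval
        long numBatchesToRemember = 6L;       // assumed remember window of 6 batches
        long durationToRememberMs = slideDurationMs * numBatchesToRemember; // 60 s

        long currentTime = 1_000_000L;        // arbitrary "now" in ms

        // newFilesOnly = false is documented as "process existing files too",
        // which would correspond to an initial threshold of 0 (ignore nothing).
        long initialWhenFalse = 0L;

        long x = threshold(initialWhenFalse, currentTime, durationToRememberMs);

        // Even with the initial threshold at 0, any file whose modification
        // time is below currentTime - durationToRemember is still ignored:
        System.out.println(x); // prints 940000, i.e. 1000000 - 60000
    }
}
```

With these numbers, an existing file last modified at, say, 900000 ms falls below the 940000 ms threshold and is skipped, which is the behavior the comment reports as contradicting both the docs and the flag's name.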