Spark Streaming only processes NEW files that appear in the monitored
directory after the stream has started, so you should point
textFileStream at a directory and copy the file into it once the
stream is running.
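
As a minimal sketch of that fix (assuming the files get dropped into a
'../inputs/' directory; the path is just an illustration based on your
example):

    from pyspark import SparkConf, SparkContext
    from pyspark.streaming import StreamingContext

    conf = (SparkConf()
            .setMaster("local")
            .setAppName("My app")
            .set("spark.executor.memory", "1g"))
    sc = SparkContext(conf=conf)
    ssc = StreamingContext(sc, 1)

    # Monitor a DIRECTORY, not a single file; only files created in it
    # after ssc.start() is called will be picked up.
    lines = ssc.textFileStream('../inputs/')
    counts = (lines.flatMap(lambda line: line.split(" "))
                   .map(lambda x: (x, 1))
                   .reduceByKey(lambda a, b: a + b))
    counts.pprint()

    ssc.start()
    # While this is running, copy the file in from another shell, e.g.:
    #   cp 2.txt ../inputs/
    ssc.awaitTermination()

The word counts should then show up in the pprint() output of the next
batch after the file appears.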

On Fri, Sep 4, 2015 at 5:15 AM, Kamilbek <kamilh...@gmail.com> wrote:
> I use Spark 1.3.1 and Python 2.7.
>
> This is my first experience with Spark Streaming.
>
> I tried an example of code that reads data from a file using Spark Streaming.
>
> This is a link to the example:
> https://github.com/apache/spark/blob/master/examples/src/main/python/streaming/hdfs_wordcount.py
>
> My code is the following:
>
>     conf = (SparkConf()
>          .setMaster("local")
>          .setAppName("My app")
>          .set("spark.executor.memory", "1g"))
>     sc = SparkContext(conf = conf)
>     ssc = StreamingContext(sc, 1)
>     lines = ssc.textFileStream('../inputs/2.txt')
>     counts = lines.flatMap(lambda line: line.split(" "))\
>               .map(lambda x: (x, 1))\
>               .reduceByKey(lambda a, b: a+b)
>     counts.pprint()
>     ssc.start()
>     ssc.awaitTermination()
>
>
> The content of the 2.txt file is the following:
>
> a1 b1 c1 d1 e1 f1 g1
> a2 b2 c2 d2 e2 f2 g2
> a3 b3 c3 d3 e3 f3 g3
>
>
> I expect something related to the file content to appear in the console,
> but there is nothing. Nothing except text like this each second:
>
> -------------------------------------------
> Time: 2015-09-03 15:08:18
> -------------------------------------------
>
> and Spark's logs.
>
> Am I doing something wrong? If not, why does it not work?
>
>
>

