Any way to control memory usage when the streaming input's speed is faster than the speed at which Spark Streaming can handle it?

2014-05-20 Thread Francis . Hu
Sparkers, is there a better way to control memory usage when the streaming input's speed is faster than the speed at which Spark Streaming can handle it? Thanks, Francis.Hu
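One knob that targets exactly this situation is a per-receiver rate cap. Note the hedge: spark.streaming.receiver.maxRate was added to Spark after the 0.9.x releases discussed in these threads, so its availability here is an assumption, and the application/object names below are illustrative only.

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object RateCappedApp {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("RateCappedApp")
          .setMaster("spark://master:7077") // placeholder master URL
          // Cap what each receiver ingests per second so data cannot pile up
          // in executor memory faster than the batches drain it (assumed to
          // exist only in Spark releases newer than 0.9.x).
          .set("spark.streaming.receiver.maxRate", "100")
        val ssc = new StreamingContext(conf, Seconds(5)) // 5-second batches
        // ... define the input stream and transformations here ...
        ssc.start()
        ssc.awaitTermination()
      }
    }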

Help me: Out of memory in Spark Streaming

2014-05-16 Thread Francis . Hu
Hi, all. I encountered an OOM when streaming. I send data to Spark Streaming through ZeroMQ at a speed of 600 records per second, but Spark Streaming only handles 10 records per 5 seconds (set in the streaming program). My two workers have 4-core CPUs and 1 GB RAM. These workers always run out of memory.
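For workers with only 1 GB of RAM, the usual levers are the executor memory request and how long Spark keeps old RDDs and metadata around. A minimal sketch follows; the memory and TTL figures are assumptions for illustration, not values from this thread.

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    // Sketch only: values are assumptions sized for a 1 GB worker.
    val conf = new SparkConf()
      .setAppName("ZeromqIngest")
      .set("spark.executor.memory", "512m") // leave headroom for the OS and worker daemon
      .set("spark.cleaner.ttl", "3600")     // periodically forget old RDDs/metadata (seconds)
    val ssc = new StreamingContext(conf, Seconds(5)) // matches the 5-second batches above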

No configuration setting found for key 'akka.zeromq'

2014-05-14 Thread Francis . Hu
Hi, all. When I run the ZeroMQWordCount example on the cluster, the worker log says: Caused by: com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'akka.zeromq'. Actually, I can see that the reference.conf in spark-examples-assembly-0.9.1.jar contains below
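The exception comes from the Typesafe Config library failing to find an akka.zeromq section in the merged classpath configuration. A small diagnostic sketch (the object name is made up for illustration) that checks what the classpath actually resolves to:

    import com.typesafe.config.ConfigFactory

    object CheckZeromqConf {
      def main(args: Array[String]): Unit = {
        // load() merges every reference.conf found on the classpath
        val config = ConfigFactory.load()
        if (config.hasPath("akka.zeromq"))
          println("akka.zeromq found: " + config.getConfig("akka.zeromq"))
        else
          println("akka.zeromq is missing from the merged configuration")
      }
    }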

Re: Re: java.io.FileNotFoundException: /test/spark-0.9.1/work/app-20140505053550-0000/2/stdout (No such file or directory)

2014-05-11 Thread Francis . Hu
I have just resolved the problem by running the master and worker daemons individually on the hosts where they live. If I execute the shell script sbin/start-all.sh, the problem always exists. From: Francis.Hu [mailto:francis...@reachjunction.com] Sent: Tuesday, May 06, 2014 10:31 To: user@spark.apache.org

java.io.FileNotFoundException: /test/spark-0.9.1/work/app-20140505053550-0000/2/stdout (No such file or directory)

2014-05-05 Thread Francis . Hu
Hi, all. We run a Spark cluster with three workers. I created a Spark Streaming application, then ran the project using the below shell command: sbt run spark://192.168.219.129:7077 tcp://192.168.20.118:5556 foo. We looked at the web UI of the workers; jobs failed without any error or
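A hypothetical outline of how the three sbt run arguments above (master URL, ZeroMQ publisher URL, topic) would typically be wired into such an application; the object name and batch interval are illustrative assumptions.

    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object StreamingApp {
      def main(args: Array[String]): Unit = {
        // e.g. args = Array("spark://192.168.219.129:7077", "tcp://192.168.20.118:5556", "foo")
        val Array(master, publisherUrl, topic) = args
        val ssc = new StreamingContext(master, "StreamingApp", Seconds(5))
        // ... attach a ZeroMQ input stream for publisherUrl/topic and the transformations ...
        ssc.start()
        ssc.awaitTermination()
      }
    }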

Re: java.io.FileNotFoundException: /test/spark-0.9.1/work/app-20140505053550-0000/2/stdout (No such file or directory)

2014-05-05 Thread Francis . Hu
In fact the file does not exist, and there is no permission issue. francis@ubuntu-4:/test/spark-0.9.1$ ll work/app-20140505053550-/ total 24 drwxrwxr-x 6 francis francis 4096 May 5 05:35 ./ drwxrwxr-x 11 francis francis 4096 May 5 06:18 ../ drwxrwxr-x 2 francis francis 4096 May 5 05:35 2/

Issue during Spark streaming with ZeroMQ source

2014-04-29 Thread Francis . Hu
Hi, all. I installed spark-0.9.1 and zeromq 4.0.1, and then ran the below examples: ./bin/run-example org.apache.spark.streaming.examples.SimpleZeroMQPublisher tcp://127.0.1.1:1234 foo.bar ./bin/run-example org.apache.spark.streaming.examples.ZeroMQWordCount local[2] tcp://127.0.1.1:1234 foo
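For reference, the core of the ZeroMQWordCount example boils down to roughly the following. The exact signatures are recalled from the 0.9.x ZeroMQUtils API and should be treated as an approximation rather than the example's literal source.

    import akka.util.ByteString
    import akka.zeromq.Subscribe
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.zeromq.ZeroMQUtils

    object ZeroMQWordCountSketch {
      // Decode each frame published by SimpleZeroMQPublisher as a UTF-8 line
      def bytesToStringIterator(bytes: Seq[ByteString]): Iterator[String] =
        bytes.map(_.utf8String).iterator

      def main(args: Array[String]): Unit = {
        val ssc = new StreamingContext("local[2]", "ZeroMQWordCount", Seconds(2))
        // Subscribe to the "foo" topic on the publisher started above
        val lines = ZeroMQUtils.createStream(ssc, "tcp://127.0.1.1:1234",
          Subscribe("foo"), bytesToStringIterator _)
        val wordCounts = lines.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _)
        wordCounts.print()
        ssc.start()
        ssc.awaitTermination()
      }
    }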

Re: java.lang.NoClassDefFoundError: scala/tools/nsc/transform/UnCurry$UnCurryTransformer...

2014-04-07 Thread Francis . Hu
Great!!! When I built it on another disk formatted as ext4, it works now. hadoop@ubuntu-1:~$ df -Th Filesystem Type Size Used Avail Use% Mounted on /dev/sdb6 ext4 135G 8.6G 119G 7% / udev devtmpfs 7.7G 4.0K 7.7G 1% /dev tmpfs