java.lang.NoClassDefFoundError: scala/tools/nsc/transform/UnCurry$UnCurryTransformer...

2014-04-02 Thread Francis . Hu
Hi, All 

 

I am stuck on a NoClassDefFoundError. Any help would be appreciated.

I downloaded the Spark 0.9.0 source and then ran this command to build it:
SPARK_HADOOP_VERSION=2.2.0 SPARK_YARN=true sbt/sbt assembly

There were no errors during the build.

After that I ran spark-shell to test it, and it always fails as below:

Error after running spark-shell
--

testuser@ubuntu-1:~/softs/spark-0.9.0-incubating$ ./bin/spark-shell

 

14/04/02 00:11:52 INFO HttpServer: Using Spark's default log4j profile:
org/apache/spark/log4j-defaults.properties

14/04/02 00:11:52 INFO HttpServer: Starting HTTP Server

error:
     while compiling: <console>
        during phase: uncurry
     library version: version 2.10.3
    compiler version: version 2.10.3
  reconstructed args:

  last tree to typer: EmptyTree
              symbol: null
   symbol definition: null
                 tpe: <notype>
       symbol owners:
      context owners: constructor $repl_$init -> class $repl_$init -> package <root>

== Enclosing template or block ==

"scala" // final package scala, tree.tpe=scala.type

== Expanded type of tree ==
uncaught exception during compilation: java.lang.NoClassDefFoundError

 

Failed to initialize compiler: NoClassDefFoundError.
This is most often remedied by a full clean and recompile.
Otherwise, your classpath may contain bytecode compiled by
different and incompatible versions of scala.

 

java.lang.NoClassDefFoundError: scala/tools/nsc/transform/UnCurry$UnCurryTransformer$$anonfun$14$$anonfun$apply$5$$anonfun$scala$tools$nsc$transform$UnCurry$UnCurryTransformer$$anonfun$$anonfun$$transformInConstructor$1$1
    at scala.tools.nsc.transform.UnCurry$UnCurryTransformer$$anonfun$14$$anonfun$apply$5.scala$tools$nsc$transform$UnCurry$UnCurryTransformer$$anonfun$$anonfun$$transformInConstructor$1(UnCurry.scala:601)
    at scala.tools.nsc.transform.UnCurry$UnCurryTransformer$$anonfun$14$$anonfun$apply$5$$anonfun$16.apply(UnCurry.scala:604)
    at scala.tools.nsc.transform.UnCurry$UnCurryTransformer$$anonfun$14$$anonfun$apply$5$$anonfun$16.apply(UnCurry.scala:604)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.immutable.List.foreach(List.scala:318)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
    at scala.collection.AbstractTraversable.map(Traversable.scala:105)
    at scala.tools.nsc.transform.UnCurry$UnCurryTransformer$$anonfun$14$$anonfun$apply$5.apply(UnCurry.scala:604)
    at scala.tools.nsc.transform.UnCurry$UnCurryTransformer$$anonfun$14$$anonfun$apply$5.apply(UnCurry.scala:597)
    at scala.reflect.api.Trees$Transformer.atOwner(Trees.scala:2936)
    at scala.tools.nsc.transform.TypingTransformers$TypingTransformer.atOwner(TypingTransformers.scala:34)
    at scala.tools.nsc.transform.UnCurry$UnCurryTransformer.mainTransform(UnCurry.scala:595)
    at scala.tools.nsc.transform.UnCurry$UnCurryTransformer.transform(UnCurry.scala:122)
    at scala.tools.nsc.transform.UnCurry$UnCurryTransformer.transform(UnCurry.scala:82)
    at scala.reflect.api.Trees$Transformer$$anonfun$transformStats$1.apply(Trees.scala:2927)
    at scala.reflect.api.Trees$Transformer$$anonfun$transformStats$1.apply(Trees.scala:2925)
    at scala.collection.immutable.List.loop$1(List.scala:170)
    at scala.collection.immutable.List.mapConserve(List.scala:186)
    at scala.reflect.api.Trees$Transformer.transformStats(Trees.scala:2925)
    at scala.reflect.internal.Trees$class.itransform(Trees.scala:1276)
    at scala.reflect.internal.SymbolTable.itransform(SymbolTable.scala:13)
    at scala.reflect.internal.SymbolTable.itransform(SymbolTable.scala:13)
    at scala.reflect.api.Trees$Transformer.transform(Trees.scala:2897)
    at scala.tools.nsc.transform.TypingTransformers$TypingTransformer.scala$tools$nsc$transform$TypingTransformers$TypingTransformer$$super$transform(TypingTransformers.scala:44)
    at scala.tools.nsc.transform.TypingTransformers$TypingTransformer$$anonfun$transform$1.apply(TypingTransformers.scala:44)
    at scala.tools.nsc.transform.TypingTransformers$TypingTransformer$$anonfun$transform$1.apply(TypingTransformers.scala:44)
    at scala.reflect.api.Trees$Transformer.atOwner(Trees.scala:2936)
    at scala.tools.nsc.transform.TypingTransformers$TypingTransformer.atOwner(TypingTransformers.scala:34)
    at scala.tools.nsc.transform.TypingTransformers$TypingTransformer.transform(TypingTransformers.scala:44)
    at scala.tools.nsc.transform.UnCurry$UnCurryTransformer.scala$tools$nsc$transform$UnCurry$UnCurryTransformer$$super$transform(UnCurry.scala:613)
    at scala.tools.nsc.transform.UnCurry$UnCurry

Re: java.lang.NoClassDefFoundError: scala/tools/nsc/transform/UnCurry$UnCurryTransformer...

2014-04-07 Thread Francis . Hu
Great!

When I built it on another disk formatted as ext4, it works now.

hadoop@ubuntu-1:~$ df -Th
Filesystem            Type      Size  Used Avail Use% Mounted on
/dev/sdb6             ext4      135G  8.6G  119G   7% /
udev                  devtmpfs  7.7G  4.0K  7.7G   1% /dev
tmpfs                 tmpfs     3.1G  316K  3.1G   1% /run
none                  tmpfs     5.0M     0  5.0M   0% /run/lock
none                  tmpfs     7.8G  4.0K  7.8G   1% /run/shm
/dev/sda1             ext4      112G  3.7G  103G   4% /faststore
/home/hadoop/.Private ecryptfs  135G  8.6G  119G   7% /home/hadoop

Thanks again, Marcelo Vanzin.


Francis.Hu

-----Original Message-----
From: Marcelo Vanzin [mailto:van...@cloudera.com]
Sent: Saturday, April 05, 2014 1:13
To: user@spark.apache.org
Subject: Re: java.lang.NoClassDefFoundError:
scala/tools/nsc/transform/UnCurry$UnCurryTransformer...

Hi Francis,

This might be a long shot, but do you happen to have built spark on an
encrypted home dir?

(I was running into the same error when I was doing that. Rebuilding
on an unencrypted disk fixed the issue. This is a known issue /
limitation with ecryptfs. It's weird that the build doesn't fail, but
you do get warnings about the long file names.)
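The ecryptfs limitation Marcelo describes can be sanity-checked in shell. This is a minimal sketch under two assumptions: that ecryptfs with filename encryption caps file names at roughly 143 characters, and that your `df` supports `--output=fstype` (GNU coreutils).

```shell
# Does a directory sit on an ecryptfs mount? (assumes GNU df)
is_on_ecryptfs() {
  df --output=fstype "$1" 2>/dev/null | tail -n 1 | grep -q ecryptfs
}

# Would a bare file name exceed ecryptfs's ~143-character name limit?
# Scala's generated $$anonfun class-file names easily go past this.
too_long_for_ecryptfs() {
  [ "${#1}" -gt 143 ]
}
```

For example, piping `find . -name '*.class' -printf '%f\n'` through the length check would list exactly the class files an encrypted home directory cannot store, which matches the long-file-name warnings seen during the build.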


On Wed, Apr 2, 2014 at 3:26 AM, Francis.Hu  wrote:
> I stuck in a NoClassDefFoundError.  Any helps that would be appreciated.
>
> I download spark 0.9.0 source, and then run this command to build it :
> SPARK_HADOOP_VERSION=2.2.0 SPARK_YARN=true sbt/sbt assembly

>
> java.lang.NoClassDefFoundError:
> scala/tools/nsc/transform/UnCurry$UnCurryTransformer$$anonfun$14$$anonfun$apply$5$$anonfun$scala$tools$nsc$transform$UnCurry$UnCurryTransformer$$anonfun$$anonfun$$transformInConstructor$1$1

-- 
Marcelo



Issue during Spark streaming with ZeroMQ source

2014-04-29 Thread Francis . Hu
Hi, all

 

I installed spark-0.9.1 and zeromq 4.0.1, and then ran the examples below:

 

./bin/run-example org.apache.spark.streaming.examples.SimpleZeroMQPublisher tcp://127.0.1.1:1234 foo.bar

./bin/run-example org.apache.spark.streaming.examples.ZeroMQWordCount local[2] tcp://127.0.1.1:1234 foo

 

No messages were received on the ZeroMQWordCount side.

Does anyone know what the issue is?

 

 

Thanks,

Francis

 



Re: Issue during Spark streaming with ZeroMQ source

2014-04-29 Thread Francis . Hu
Thanks, Prashant Sharma

 

 

It works now after downgrading zeromq from 4.0.1 to 2.2.

Do you know whether the next release of Spark will upgrade zeromq?

Many of our programs use zeromq 4.0.1, so it would be better for us if the next
Spark Streaming release could ship with a newer zeromq.

 

 

Francis.

 

From: Prashant Sharma [mailto:scrapco...@gmail.com]
Sent: Tuesday, April 29, 2014 15:53
To: user@spark.apache.org
Subject: Re: Issue during Spark streaming with ZeroMQ source

 

Unfortunately zeromq 4.0.1 is not supported.
https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/streaming/examples/ZeroMQWordCount.scala#L63
documents the expected version; you will need that version of zeromq to see it
work. I have seen it working nicely with zeromq 2.2.0, and if the jzmq
libraries are installed, performance is much better.
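The version constraint above can be checked before launching the example. A minimal shell sketch; how you obtain the installed version string (e.g. `pkg-config --modversion libzmq`) is an assumption about your system:

```shell
# Return success only for the 2.2.x series that the example is known to work with.
zmq_supported() {
  case "$1" in
    2.2*) return 0 ;;
    *)    return 1 ;;
  esac
}

# Example gate before running ZeroMQWordCount (version hard-coded for illustration):
if zmq_supported "4.0.1"; then
  echo "ok to run ZeroMQWordCount"
else
  echo "unsupported zeromq version; expect no messages"
fi
```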




Prashant Sharma

 

On Tue, Apr 29, 2014 at 12:29 PM, Francis.Hu  
wrote:




java.io.FileNotFoundException: /test/spark-0.9.1/work/app-20140505053550-0000/2/stdout (No such file or directory)

2014-05-05 Thread Francis . Hu
Hi, all

 

 

We run a Spark cluster with three workers.

We created a Spark Streaming application, then ran the project using the
command below:

shell> sbt run spark://192.168.219.129:7077 tcp://192.168.20.118:5556 foo

We looked at the workers' web UI: jobs failed without any error or info, but a
FileNotFoundException occurred in the workers' log files, as below.

Is this a known issue in Spark?

 

 

----- in workers' logs/spark-francis-org.apache.spark.deploy.worker.Worker-1-ubuntu-4.out -----

 

14/05/05 02:39:39 WARN AbstractHttpConnection: /logPage/?appId=app-20140505053550-&executorId=2&logType=stdout
java.io.FileNotFoundException: /test/spark-0.9.1/work/app-20140505053550-/2/stdout (No such file or directory)
    at java.io.FileInputStream.open(Native Method)
    at java.io.FileInputStream.<init>(FileInputStream.java:138)
    at org.apache.spark.util.Utils$.offsetBytes(Utils.scala:687)
    at org.apache.spark.deploy.worker.ui.WorkerWebUI.logPage(WorkerWebUI.scala:119)
    at org.apache.spark.deploy.worker.ui.WorkerWebUI$$anonfun$6.apply(WorkerWebUI.scala:52)
    at org.apache.spark.deploy.worker.ui.WorkerWebUI$$anonfun$6.apply(WorkerWebUI.scala:52)
    at org.apache.spark.ui.JettyUtils$$anon$1.handle(JettyUtils.scala:61)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1040)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:976)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)
    at org.eclipse.jetty.server.handler.HandlerList.handle(HandlerList.java:52)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)
    at org.eclipse.jetty.server.Server.handle(Server.java:363)
    at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:483)
    at org.eclipse.jetty.server.AbstractHttpConnection.headerComplete(AbstractHttpConnection.java:920)
    at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.headerComplete(AbstractHttpConnection.java:982)
    at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:635)
    at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:235)
    at org.eclipse.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:82)
    at org.eclipse.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:628)
    at org.eclipse.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:52)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
    at java.lang.Thread.run(Thread.java:722)
14/05/05 02:39:41 WARN AbstractHttpConnection: /logPage/?appId=app-20140505053550-&executorId=9&logType=stderr
java.io.FileNotFoundException: /test/spark-0.9.1/work/app-20140505053550-/9/stderr (No such file or directory)
    at java.io.FileInputStream.open(Native Method)
    at java.io.FileInputStream.<init>(FileInputStream.java:138)
    at org.apache.spark.util.Utils$.offsetBytes(Utils.scala:687)
    at org.apache.spark.deploy.worker.ui.WorkerWebUI.logPage(WorkerWebUI.scala:119)
    at org.apache.spark.deploy.worker.ui.WorkerWebUI$$anonfun$6.apply(WorkerWebUI.scala:52)
    at org.apache.spark.deploy.worker.ui.WorkerWebUI$$anonfun$6.apply(WorkerWebUI.scala:52)
    at org.apache.spark.ui.JettyUtils$$anon$1.handle(JettyUtils.scala:61)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1040)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:976)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)
    at org.eclipse.jetty.server.handler.HandlerList.handle(HandlerList.java:52)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)
    at org.eclipse.jetty.server.Server.handle(Server.java:363)
    at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:483)
    at org.eclipse.jetty.server.AbstractHttpConnection.headerComplete(AbstractHttpConnection.java:920)
    at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.headerComplete(AbstractHttpConnection.java:982)
    at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:635)
    at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:235)
    at org.eclipse.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:82)
    at org.eclipse.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:628)
    at org.eclipse.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:52)

Re: java.io.FileNotFoundException: /test/spark-0.9.1/work/app-20140505053550-0000/2/stdout (No such file or directory)

2014-05-05 Thread Francis . Hu
The file does in fact not exist, and there is no permission issue.

 

francis@ubuntu-4:/test/spark-0.9.1$ ll work/app-20140505053550-/

total 24

drwxrwxr-x  6 francis francis 4096 May  5 05:35 ./

drwxrwxr-x 11 francis francis 4096 May  5 06:18 ../

drwxrwxr-x  2 francis francis 4096 May  5 05:35 2/

drwxrwxr-x  2 francis francis 4096 May  5 05:35 4/

drwxrwxr-x  2 francis francis 4096 May  5 05:35 7/

drwxrwxr-x  2 francis francis 4096 May  5 05:35 9/

 

Francis

 

From: Tathagata Das [mailto:tathagata.das1...@gmail.com]
Sent: Tuesday, May 06, 2014 3:45
To: user@spark.apache.org
Subject: Re: java.io.FileNotFoundException:
/test/spark-0.9.1/work/app-20140505053550-/2/stdout (No such file or
directory)

 

Do those files actually exist? The stdout/stderr files should contain the
output of Spark's executors running on the workers, and it's weird that they
don't exist. Could it be a permission issue, with the directories/files not
being generated because they cannot be?

 

TD

 

On Mon, May 5, 2014 at 3:06 AM, Francis.Hu  wrote:

Re: Re: java.io.FileNotFoundException: /test/spark-0.9.1/work/app-20140505053550-0000/2/stdout (No such file or directory)

2014-05-11 Thread Francis . Hu
I have just resolved the problem by running the master and worker daemons
individually on the hosts where they live.

If I start them with sbin/start-all.sh, the problem always exists.

 

 

From: Francis.Hu [mailto:francis...@reachjunction.com]
Sent: Tuesday, May 06, 2014 10:31
To: user@spark.apache.org
Subject: Re: Re: java.io.FileNotFoundException:
/test/spark-0.9.1/work/app-20140505053550-/2/stdout (No such file or
directory)

 

I looked into the log again; all of the exceptions are FileNotFoundExceptions.
In the web UI there is no further info I can check beyond the basic description
of the job.

I have attached the log file; could you take a look? Thanks.

 

Francis.Hu

 

From: Tathagata Das [mailto:tathagata.das1...@gmail.com]
Sent: Tuesday, May 06, 2014 10:16
To: user@spark.apache.org
Subject: Re: Re: java.io.FileNotFoundException:
/test/spark-0.9.1/work/app-20140505053550-/2/stdout (No such file or
directory)

 

Can you check the Spark worker logs on that machine, either from the web UI or
directly? They should be under /test/spark-XXX/logs/. See if they contain any
errors.

If there is no permission issue, I am not sure why stdout and stderr are not
being generated.

 

TD

 

On Mon, May 5, 2014 at 7:13 PM, Francis.Hu  wrote:


No configuration setting found for key 'akka.zeromq'

2014-05-14 Thread Francis . Hu
Hi, all

 

When I run the ZeroMQWordCount example on the cluster, the worker log says:
Caused by: com.typesafe.config.ConfigException$Missing: No configuration
setting found for key 'akka.zeromq'

Actually, I can see that the reference.conf in
spark-examples-assembly-0.9.1.jar contains the configuration below.

Does anyone know what is happening?

 

#####################################
# Akka ZeroMQ Reference Config File #
#####################################

# This is the reference config file that contains all the default settings.
# Make your edits/overrides in your application.conf.

akka {

  zeromq {

    # The default timeout for a poll on the actual zeromq socket.
    poll-timeout = 100ms

    # Timeout for creating a new socket
    new-socket-timeout = 5s

    socket-dispatcher {
      # A zeromq socket needs to be pinned to the thread that created it.
      # Changing this value results in weird errors and race conditions within
      # zeromq
      executor = thread-pool-executor
      type = "PinnedDispatcher"
      thread-pool-executor.allow-core-timeout = off
    }
  }
}
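One possible explanation worth checking (an assumption, not confirmed by the thread): when several input jars each ship a reference.conf, the assembly merge can clobber one with another, so the key is present in an input jar but missing from the merged file actually loaded at runtime. A small shell helper to inspect the HOCON text extracted from the jar in use:

```shell
# Check whether a HOCON snippet (e.g. from `unzip -p <jar> reference.conf`)
# still contains a zeromq block after jar merging.
conf_has_akka_zeromq() {
  printf '%s\n' "$1" | grep -q 'zeromq[[:space:]]*{'
}
```

For example: `conf_has_akka_zeromq "$(unzip -p spark-examples-assembly-0.9.1.jar reference.conf)" && echo present` (the jar name is taken from the message above; run it against whichever jar the workers actually load).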

 

Exception in worker:

akka.actor.ActorInitializationException: exception during creation
    at akka.actor.ActorInitializationException$.apply(Actor.scala:218)
Caused by: com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'akka.zeromq'
14/05/06 21:26:19 ERROR actor.ActorCell: changing Recreate into Create after akka.actor.ActorInitializationException: exception during creation

 

 

Thanks,

Francis.Hu



help me: Out of memory when spark streaming

2014-05-16 Thread Francis . Hu
Hi, all

I encountered an OOM while streaming.

I send data to Spark Streaming through ZeroMQ at a rate of 600 records per
second, but Spark Streaming only handles 10 records per 5 seconds (set in the
streaming program).

My two workers each have a 4-core CPU and 1 GB of RAM.

These workers always run out of memory after a few moments.

I tried adjusting the JVM GC arguments to speed up the GC process. It changed
performance slightly, but the workers eventually hit OOM anyway.
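A quick back-of-envelope check of the rates above shows why GC tuning cannot help here:

```shell
# 600 records/s arrive; 10 records are handled per 5 s batch (2 records/s),
# so the unprocessed backlog grows by ~598 records every second.
# Unbounded backlog growth eventually exhausts any heap, regardless of GC.
ingest_per_sec=600
handled_per_batch=10
batch_secs=5
backlog_per_sec=$(( ingest_per_sec - handled_per_batch / batch_secs ))
echo "backlog grows by ${backlog_per_sec} records/s"
```

The only real fixes are to process faster, drop records, or throttle the source.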

 

Is there any way to resolve it?

 

It would be appreciated if anyone could help me get it fixed!

 

 

Thanks,

Francis.Hu



any way to control memory usage when streaming input's speed is faster than the speed of handled by spark streaming ?

2014-05-20 Thread Francis . Hu
Sparkers,

Is there a better way to control memory usage when the streaming input arrives
faster than Spark Streaming can process it?
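One option to look at, sketched below under assumptions to verify against your Spark version's docs: later Spark releases (after the 0.9.x line discussed in these threads) added a `spark.streaming.receiver.maxRate` setting that caps how many records per second a receiver pushes into the system; the 0.9-era `SPARK_JAVA_OPTS` mechanism shown for passing it is also an assumption.

```shell
# Cap each receiver at 100 records/s so ingest cannot outrun processing.
# Property name and delivery mechanism are assumptions; check your version.
SPARK_JAVA_OPTS="$SPARK_JAVA_OPTS -Dspark.streaming.receiver.maxRate=100"
export SPARK_JAVA_OPTS
```

With such a cap in place, excess records back up at the source (ZeroMQ) instead of accumulating in the workers' heaps.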

 

Thanks,

Francis.Hu