Hi,

I run the following simple standalone Java Spark app with the Maven command
"exec:java -Dexec.mainClass=SimpleApp":

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;

public class SimpleApp {
    public static void main(String[] args) {
        System.out.println("Reading and Connecting with Spark.....");
        try {
            String logFile = "/home/asif/spark-1.4.0/README.md"; // should be some file on your system
            SparkConf conf = new SparkConf().setAppName("Simple Application").setMaster("local");
            JavaSparkContext sc = new JavaSparkContext(conf);
            JavaRDD<String> logData = sc.textFile(logFile).cache();

            long numAs = logData.filter(new Function<String, Boolean>() {
                public Boolean call(String s) { return s.contains("a"); }
            }).count();

            long numBs = logData.filter(new Function<String, Boolean>() {
                public Boolean call(String s) { return s.contains("b"); }
            }).count();

            System.out.println("Lines with a: " + numAs + ", lines with b: " + numBs);
        } catch (Exception e) {
            System.out.println("Error in connecting with Spark");
        }
    }
}
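As a quick sanity check that the counting logic itself is not the problem, here is the same pair of filters in plain Java streams, using a hypothetical in-memory stand-in for the README lines (no Spark involved):

```java
import java.util.List;

public class CountDemo {
    public static void main(String[] args) {
        // Hypothetical stand-in for the lines of README.md.
        List<String> logData = List.of("Apache Spark", "is a fast engine", "for big data");

        // Same predicates as the Spark filters above.
        long numAs = logData.stream().filter(s -> s.contains("a")).count();
        long numBs = logData.stream().filter(s -> s.contains("b")).count();

        System.out.println("Lines with a: " + numAs + ", lines with b: " + numBs);
        // prints "Lines with a: 3, lines with b: 1"
    }
}
```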

Well, it builds successfully and also gives results, but with a thread
exception. What is the reason for the thread exception, and how can I solve it
in standalone mode? When the same app is submitted with spark-submit, it runs
fine.
Log trace is:

[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building standAloneSparkApp 1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- exec-maven-plugin:1.4.0:java (default-cli) @ standAloneSparkApp ---
Reading and Connecting with Spark.....
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/07/07 03:28:34 INFO SparkContext: Running Spark version 1.4.0
15/07/07 03:28:34 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/07/07 03:28:34 WARN Utils: Your hostname, ubuntu resolves to a loopback address: 127.0.1.1; using 192.***.***.*** instead (on interface eth0)
15/07/07 03:28:34 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
15/07/07 03:28:34 INFO SecurityManager: Changing view acls to: myusername
15/07/07 03:28:34 INFO SecurityManager: Changing modify acls to: myusername
15/07/07 03:28:34 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(myusername); users with modify permissions: Set(myusername)
15/07/07 03:28:35 INFO Slf4jLogger: Slf4jLogger started
15/07/07 03:28:36 INFO Remoting: Starting remoting
15/07/07 03:28:36 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.***.***.***:*****]
15/07/07 03:28:36 INFO Utils: Successfully started service 'sparkDriver' on port 34863.
15/07/07 03:28:36 INFO SparkEnv: Registering MapOutputTracker
15/07/07 03:28:36 INFO SparkEnv: Registering BlockManagerMaster
15/07/07 03:28:36 INFO MemoryStore: MemoryStore started with capacity 534.5 MB
15/07/07 03:28:36 INFO HttpFileServer: HTTP File server directory is /tmp/spark-b8a40d9c-d470-404e-bcd7-5763791ffeca/httpd-6f024c0f-60c4-413b-bb29-eee93e697651
15/07/07 03:28:36 INFO HttpServer: Starting HTTP Server
15/07/07 03:28:36 INFO Utils: Successfully started service 'HTTP file server' on port 46189.
15/07/07 03:28:37 INFO SparkEnv: Registering OutputCommitCoordinator
15/07/07 03:28:37 INFO Utils: Successfully started service 'SparkUI' on port 4040.
15/07/07 03:28:37 INFO SparkUI: Started SparkUI at http://192.***.***.***:****
15/07/07 03:28:37 INFO Executor: Starting executor ID driver on host localhost
15/07/07 03:28:38 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port *****.
15/07/07 03:28:38 INFO NettyBlockTransferService: Server created on *****
15/07/07 03:28:38 INFO BlockManagerMaster: Trying to register BlockManager
15/07/07 03:28:38 INFO BlockManagerMasterEndpoint: Registering block manager localhost:***** with 534.5 MB RAM, BlockManagerId(driver, localhost, *****)
15/07/07 03:28:38 INFO BlockManagerMaster: Registered BlockManager
15/07/07 03:28:39 INFO MemoryStore: ensureFreeSpace(*******) called with curMem=0, maxMem=560497950
15/07/07 03:28:39 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 107.7 KB, free 534.4 MB)
15/07/07 03:28:40 INFO MemoryStore: ensureFreeSpace(10090) called with curMem=110248, maxMem=560497950
15/07/07 03:28:40 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 9.9 KB, free 534.4 MB)
15/07/07 03:28:40 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on localhost:36884 (size: 9.9 KB, free: 534.5 MB)
15/07/07 03:28:40 INFO SparkContext: Created broadcast 0 from textFile at SimpleApp.java:19
15/07/07 03:28:40 INFO FileInputFormat: Total input paths to process : 1
15/07/07 03:28:40 INFO SparkContext: Starting job: count at SimpleApp.java:23
15/07/07 03:28:40 INFO DAGScheduler: Got job 0 (count at SimpleApp.java:23) with 1 output partitions (allowLocal=false)
15/07/07 03:28:40 INFO DAGScheduler: Final stage: ResultStage 0(count at SimpleApp.java:23)
15/07/07 03:28:40 INFO DAGScheduler: Parents of final stage: List()
15/07/07 03:28:40 INFO DAGScheduler: Missing parents: List()
15/07/07 03:28:40 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[2] at filter at SimpleApp.java:21), which has no missing parents
15/07/07 03:28:41 INFO MemoryStore: ensureFreeSpace(3280) called with curMem=120338, maxMem=560497950
15/07/07 03:28:41 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 3.2 KB, free 534.4 MB)
15/07/07 03:28:41 INFO MemoryStore: ensureFreeSpace(1933) called with curMem=123618, maxMem=560497950
15/07/07 03:28:41 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 1933.0 B, free 534.4 MB)
15/07/07 03:28:41 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on localhost:36884 (size: 1933.0 B, free: 534.5 MB)
15/07/07 03:28:41 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:874
15/07/07 03:28:41 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (MapPartitionsRDD[2] at filter at SimpleApp.java:21)
15/07/07 03:28:41 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
15/07/07 03:28:41 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, PROCESS_LOCAL, 1410 bytes)
15/07/07 03:28:41 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
15/07/07 03:28:41 INFO CacheManager: Partition rdd_1_0 not found, computing it
15/07/07 03:28:41 INFO HadoopRDD: Input split: file:/home/asif/spark-1.4.0/README.md:0+3624
15/07/07 03:28:41 INFO deprecation: mapred.tip.id is deprecated. Instead, use mapreduce.task.id
15/07/07 03:28:41 INFO deprecation: mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id
15/07/07 03:28:41 INFO deprecation: mapred.task.is.map is deprecated. Instead, use mapreduce.task.ismap
15/07/07 03:28:41 INFO deprecation: mapred.task.partition is deprecated. Instead, use mapreduce.task.partition
15/07/07 03:28:41 INFO deprecation: mapred.job.id is deprecated. Instead, use mapreduce.job.id
15/07/07 03:28:41 INFO MemoryStore: ensureFreeSpace(*****) called with curMem=125551, maxMem=560497950
15/07/07 03:28:41 INFO MemoryStore: Block rdd_1_0 stored as values in memory (estimated size 11.3 KB, free 534.4 MB)
15/07/07 03:28:41 INFO BlockManagerInfo: Added rdd_1_0 in memory on localhost:36884 (size: 11.3 KB, free: 534.5 MB)
15/07/07 03:28:41 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 2410 bytes result sent to driver
15/07/07 03:28:41 INFO DAGScheduler: ResultStage 0 (count at SimpleApp.java:23) finished in 0.542 s
15/07/07 03:28:41 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 520 ms on localhost (1/1)
15/07/07 03:28:41 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
15/07/07 03:28:41 INFO DAGScheduler: Job 0 finished: count at SimpleApp.java:23, took 0.918158 s
15/07/07 03:28:41 INFO SparkContext: Starting job: count at SimpleApp.java:27
15/07/07 03:28:41 INFO DAGScheduler: Got job 1 (count at SimpleApp.java:27) with 1 output partitions (allowLocal=false)
15/07/07 03:28:41 INFO DAGScheduler: Final stage: ResultStage 1(count at SimpleApp.java:27)
15/07/07 03:28:41 INFO DAGScheduler: Parents of final stage: List()
15/07/07 03:28:41 INFO DAGScheduler: Missing parents: List()
15/07/07 03:28:41 INFO DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[3] at filter at SimpleApp.java:25), which has no missing parents
15/07/07 03:28:41 INFO MemoryStore: ensureFreeSpace(3280) called with curMem=137127, maxMem=560497950
15/07/07 03:28:41 INFO MemoryStore: Block broadcast_2 stored as values in memory (estimated size 3.2 KB, free 534.4 MB)
15/07/07 03:28:41 INFO MemoryStore: ensureFreeSpace(1933) called with curMem=140407, maxMem=560497950
15/07/07 03:28:41 INFO MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 1933.0 B, free 534.4 MB)
15/07/07 03:28:41 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on localhost:36884 (size: 1933.0 B, free: 534.5 MB)
15/07/07 03:28:41 INFO SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:874
15/07/07 03:28:41 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 1 (MapPartitionsRDD[3] at filter at SimpleApp.java:25)
15/07/07 03:28:41 INFO TaskSchedulerImpl: Adding task set 1.0 with 1 tasks
15/07/07 03:28:41 INFO TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, localhost, PROCESS_LOCAL, 1410 bytes)
15/07/07 03:28:41 INFO Executor: Running task 0.0 in stage 1.0 (TID 1)
15/07/07 03:28:41 INFO BlockManager: Found block rdd_1_0 locally
15/07/07 03:28:41 INFO Executor: Finished task 0.0 in stage 1.0 (TID 1). 1830 bytes result sent to driver
15/07/07 03:28:41 INFO DAGScheduler: ResultStage 1 (count at SimpleApp.java:27) finished in 0.020 s
15/07/07 03:28:41 INFO DAGScheduler: Job 1 finished: count at SimpleApp.java:27, took 0.070602 s
Lines with a: 60, lines with b: 29
15/07/07 03:28:41 INFO TaskSetManager: Finished task 0.0 in stage 1.0 (TID 1) in 24 ms on localhost (1/1)
15/07/07 03:28:41 INFO TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool
15/07/07 03:28:41 ERROR ContextCleaner: Error in cleaning thread
java.lang.InterruptedException
        at java.lang.Object.wait(Native Method)
        at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:143)
        at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply$mcV$sp(ContextCleaner.scala:157)
        at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1215)
        at org.apache.spark.ContextCleaner.org$apache$spark$ContextCleaner$$keepCleaning(ContextCleaner.scala:154)
        at org.apache.spark.ContextCleaner$$anon$3.run(ContextCleaner.scala:67)
15/07/07 03:28:41 ERROR Utils: uncaught error in thread SparkListenerBus, stopping SparkContext
java.lang.InterruptedException
        at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1302)
        at java.util.concurrent.Semaphore.acquire(Semaphore.java:312)
        at org.apache.spark.util.AsynchronousListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(AsynchronousListenerBus.scala:65)
        at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1215)
        at org.apache.spark.util.AsynchronousListenerBus$$anon$1.run(AsynchronousListenerBus.scala:63)
15/07/07 03:28:42 INFO SparkUI: Stopped Spark web UI at http://192.***.***.***:****
15/07/07 03:28:42 INFO DAGScheduler: Stopping DAGScheduler
[WARNING] thread Thread[sparkDriver-scheduler-1,5,SimpleApp] was interrupted but is still alive after waiting at least 14999msecs
[WARNING] thread Thread[sparkDriver-scheduler-1,5,SimpleApp] will linger despite being asked to die via interruption
[WARNING] thread Thread[sparkDriver-akka.actor.default-dispatcher-3,5,SimpleApp] will linger despite being asked to die via interruption
[WARNING] thread Thread[sparkDriver-akka.actor.default-dispatcher-4,5,SimpleApp] will linger despite being asked to die via interruption
[WARNING] thread Thread[sparkDriver-akka.actor.default-dispatcher-5,5,SimpleApp] will linger despite being asked to die via interruption
[WARNING] thread Thread[sparkDriver-akka.actor.default-dispatcher-6,5,SimpleApp] will linger despite being asked to die via interruption
[WARNING] thread Thread[sparkDriver-akka.remote.default-remote-dispatcher-7,5,SimpleApp] will linger despite being asked to die via interruption
[WARNING] thread Thread[New I/O worker #1,5,SimpleApp] will linger despite being asked to die via interruption
[WARNING] thread Thread[New I/O worker #2,5,SimpleApp] will linger despite being asked to die via interruption
[WARNING] thread Thread[New I/O boss #3,5,SimpleApp] will linger despite being asked to die via interruption
[WARNING] thread Thread[New I/O worker #4,5,SimpleApp] will linger despite being asked to die via interruption
[WARNING] thread Thread[New I/O worker #5,5,SimpleApp] will linger despite being asked to die via interruption
[WARNING] thread Thread[New I/O server boss #6,5,SimpleApp] will linger despite being asked to die via interruption
[WARNING] thread Thread[sparkDriver-akka.remote.default-remote-dispatcher-15,5,SimpleApp] will linger despite being asked to die via interruption
[WARNING] thread Thread[MAP_OUTPUT_TRACKER cleanup timer,5,SimpleApp] will linger despite being asked to die via interruption
[WARNING] thread Thread[BLOCK_MANAGER cleanup timer,5,SimpleApp] will linger despite being asked to die via interruption
[WARNING] thread Thread[BROADCAST_VARS cleanup timer,5,SimpleApp] will linger despite being asked to die via interruption
[WARNING] thread Thread[qtp1611265991-33 Acceptor0 SocketConnector@0.0.0.0:46189,5,SimpleApp] will linger despite being asked to die via interruption
[WARNING] thread Thread[heartbeat-receiver-event-loop-thread,5,SimpleApp] will linger despite being asked to die via interruption
[WARNING] thread Thread[shuffle-server-0,5,SimpleApp] will linger despite being asked to die via interruption
[WARNING] thread Thread[SparkListenerBus,5,SimpleApp] will linger despite being asked to die via interruption
[WARNING] NOTE: 20 thread(s) did not finish despite being asked to via interruption. This is not a problem with exec:java, it is a problem with the running code. Although not serious, it should be remedied.
[WARNING] Couldn't destroy threadgroup org.codehaus.mojo.exec.ExecJavaMojo$IsolatedThreadGroup[name=SimpleApp,maxpri=10]
java.lang.IllegalThreadStateException
        at java.lang.ThreadGroup.destroy(ThreadGroup.java:778)
        at org.codehaus.mojo.exec.ExecJavaMojo.execute(ExecJavaMojo.java:328)
        at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209)
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
        at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
        at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
        at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
        at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
        at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:320)
        at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156)
        at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537)
        at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196)
        at org.apache.maven.cli.MavenCli.main(MavenCli.java:141)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:290)
        at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:230)
        at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:409)
        at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:352)
        at org.codehaus.classworlds.Launcher.main(Launcher.java:47)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 26.998s
[INFO] Finished at: Tue Jul 07 03:28:56 PDT 2015
[INFO] Final Memory: 20M/60M
[INFO] ------------------------------------------------------------------------
15/07/07 03:28:57 INFO DiskBlockManager: Shutdown hook called
15/07/07 03:28:57 INFO Utils: path = /tmp/spark-b8a40d9c-d470-404e-bcd7-5763791ffeca/blockmgr-422d1fcb-bda4-4a2a-93a6-4d49c28cdf28, already present as root for deletion.
15/07/07 03:28:57 INFO Utils: Shutdown hook called
15/07/07 03:28:57 INFO Utils: Deleting directory /tmp/spark-b8a40d9c-d470-404e-bcd7-5763791ffeca/userFiles-3d0c9f72-b6f0-431f-9ca4-c82d040411c7
15/07/07 03:28:57 INFO Utils: Deleting directory /tmp/spark-b8a40d9c-d470-404e-bcd7-5763791ffeca

Process finished with exit code 0
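For what it's worth, my reading of the [WARNING] lines above: exec:java runs the app inside the Maven JVM and, once main() returns, interrupts whatever non-daemon threads the app left behind; Spark's scheduler/Akka/Netty threads do not exit on interruption, hence the "will linger" warnings and the InterruptedExceptions. A minimal pure-Java sketch of that situation (thread name is made up, no Spark involved):

```java
public class LingeringThreadDemo {
    public static void main(String[] args) throws InterruptedException {
        // A non-daemon thread that swallows interrupts, loosely analogous to
        // Spark's internal threads when the context is never shut down.
        Thread worker = new Thread(() -> {
            while (true) {
                try {
                    Thread.sleep(1000);
                } catch (InterruptedException e) {
                    // Interrupt swallowed: the thread keeps running, i.e. it
                    // "will linger despite being asked to die via interruption".
                }
            }
        }, "lingering-worker");
        worker.start();

        worker.interrupt();   // roughly what exec:java does at shutdown
        Thread.sleep(100);    // give the interrupt time to be delivered
        System.out.println("worker alive after interrupt: " + worker.isAlive());
        // prints "worker alive after interrupt: true"

        // Force exit so this demo terminates despite the lingering thread.
        System.exit(0);
    }
}
```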

--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/How-to-solve-ThreadException-in-Apache-Spark-standalone-Java-Application-tp23675.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
