Hello,

I'm trying to use a native library in Spark.

I am using a simple standalone cluster with one master and one worker.

Following the documentation, I edited spark-defaults.conf to set:

spark.driver.extraClassPath=/opt/eTOX_spark/lib/org.RDKit.jar
spark.driver.extraLibraryPath=/opt/eTOX_spark/lib/
spark.executor.extraLibraryPath=/opt/eTOX_spark/lib/
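
(For completeness: with the settings above the jar is only on the driver classpath, while the executors get only the native library path. If the JNI wrapper classes also need to be visible on the executor classpath, I assume the analogous property would be something like:

```
spark.executor.extraClassPath=/opt/eTOX_spark/lib/org.RDKit.jar
```
)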

The directory /opt/eTOX_spark/lib/ contains three .so files, which are also
wrapped inside org.RDKit.jar.

But when I try to submit a job that uses the native library, I get:

Exception in thread "main" java.lang.UnsatisfiedLinkError:
org.RDKit.RDKFuncsJNI.RWMol_MolFromSmiles__SWIG_3(Ljava/lang/String;)J
    at org.RDKit.RDKFuncsJNI.RWMol_MolFromSmiles__SWIG_3(Native Method)
    at org.RDKit.RWMol.MolFromSmiles(RWMol.java:426)
    at models.spark.sources.eTOX_DB$.main(eTOX.scala:54)
    at models.spark.sources.eTOX_DB.main(eTOX.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:727)
    at
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
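
As far as I understand, this error means the Java-side stub class loaded fine,
but no native implementation was registered in that JVM, i.e. the
System.loadLibrary call for libGraphMolWrap.so never ran (or ran in a different
JVM) before the first native call. A minimal standalone Java sketch reproduces
the same class of failure (molFromSmiles here is a hypothetical native stub I
made up for illustration, not the real RDKit binding):

```java
public class NativeDemo {
    // Hypothetical native stub: declared, but no System.loadLibrary("...")
    // call ever runs, so the JVM has no implementation registered for it.
    public static native long molFromSmiles(String smiles);

    public static void main(String[] args) {
        try {
            // First invocation of an unbound native method fails at call
            // time, just like the RDKit call at eTOX.scala:54 above.
            molFromSmiles("CCO");
        } catch (UnsatisfiedLinkError e) {
            System.out.println("caught: " + e.getClass().getName());
        }
    }
}
```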

I run spark-submit with the following parameters:

 /opt/spark/bin/spark-submit --verbose --class
"models.spark.sources.eTOX_DB"  --master
spark://localhost.localdomain:7077
target/scala-2.10/etox_spark_2.10-1.0.jar
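
(For reference, I believe the same settings could also be passed directly on
the command line instead of through spark-defaults.conf; the flag names below
are from spark-submit --help, with the same paths as above:

```
/opt/spark/bin/spark-submit --verbose \
  --class "models.spark.sources.eTOX_DB" \
  --master spark://localhost.localdomain:7077 \
  --driver-class-path /opt/eTOX_spark/lib/org.RDKit.jar \
  --driver-library-path /opt/eTOX_spark/lib/ \
  --conf spark.executor.extraLibraryPath=/opt/eTOX_spark/lib/ \
  target/scala-2.10/etox_spark_2.10-1.0.jar
```
)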

The full output is:

Using properties file: /opt/spark/conf/spark-defaults.conf
Adding default property: spark.driver.extraLibraryPath=/opt/eTOX_spark/lib/
Adding default property:
spark.driver.extraClassPath=/opt/eTOX_spark/lib/org.RDKit.jar
Adding default property:
spark.executor.extraLibraryPath=/opt/eTOX_spark/lib/
Parsed arguments:
  master                  spark://localhost.localdomain:7077
  deployMode              null
  executorMemory          null
  executorCores           null
  totalExecutorCores      null
  propertiesFile          /opt/spark/conf/spark-defaults.conf
  driverMemory            null
  driverCores             null
  driverExtraClassPath    /opt/eTOX_spark/lib/org.RDKit.jar
  driverExtraLibraryPath  /opt/eTOX_spark/lib/
  driverExtraJavaOptions  null
  supervise               false
  queue                   null
  numExecutors            null
  files                   null
  pyFiles                 null
  archives                null
  mainClass               models.spark.sources.eTOX_DB
  primaryResource
file:/opt/eTOX_spark/target/scala-2.10/etox_spark_2.10-1.0.jar
  name                    models.spark.sources.eTOX_DB
  childArgs               []
  jars                    null
  packages                null
  packagesExclusions      null
  repositories            null
  verbose                 true

Spark properties used, including those specified through
 --conf and those from the properties file
/opt/spark/conf/spark-defaults.conf:
  spark.executor.extraLibraryPath -> /opt/eTOX_spark/lib/
  spark.driver.extraLibraryPath -> /opt/eTOX_spark/lib/
  spark.driver.extraClassPath -> /opt/eTOX_spark/lib/org.RDKit.jar


Main class:
models.spark.sources.eTOX_DB
Arguments:

System properties:
spark.executor.extraLibraryPath -> /opt/eTOX_spark/lib/
spark.driver.extraLibraryPath -> /opt/eTOX_spark/lib/
SPARK_SUBMIT -> true
spark.app.name -> models.spark.sources.eTOX_DB
spark.jars -> file:/opt/eTOX_spark/target/scala-2.10/etox_spark_2.10-1.0.jar
spark.submit.deployMode -> client
spark.master -> spark://localhost.localdomain:7077
spark.driver.extraClassPath -> /opt/eTOX_spark/lib/org.RDKit.jar
Classpath elements:
file:/opt/eTOX_spark/target/scala-2.10/etox_spark_2.10-1.0.jar


Buffer(/opt/jdk1.8.0_45/jre/lib/amd64/libzip.so)
Loading libraries
Buffer(/opt/jdk1.8.0_45/jre/lib/amd64/libzip.so, /opt/eTOX_spark/lib/
libboost_thread.1.48.0.so, /opt/eTOX_spark/lib/libboost_system.1.48.0.so,
/opt/eTOX_spark/lib/libGraphMolWrap.so)
Loading libraries
Using Spark's default log4j profile:
org/apache/spark/log4j-defaults.properties
15/11/25 16:27:32 INFO SparkContext: Running Spark version 1.6.0-SNAPSHOT
15/11/25 16:27:33 WARN NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
15/11/25 16:27:33 WARN Utils: Your hostname, localhost.localdomain resolves
to a loopback address: 127.0.0.1; using 10.0.2.15 instead (on interface
enp0s3)
15/11/25 16:27:33 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to
another address
15/11/25 16:27:33 INFO SecurityManager: Changing view acls to: user
15/11/25 16:27:33 INFO SecurityManager: Changing modify acls to: user
15/11/25 16:27:33 INFO SecurityManager: SecurityManager: authentication
disabled; ui acls disabled; users with view permissions: Set(user); users
with modify permissions: Set(user)
15/11/25 16:27:34 INFO Utils: Successfully started service 'sparkDriver' on
port 35799.
15/11/25 16:27:34 INFO Slf4jLogger: Slf4jLogger started
15/11/25 16:27:34 INFO Remoting: Starting remoting
15/11/25 16:27:34 INFO Remoting: Remoting started; listening on addresses
:[akka.tcp://sparkDriverActorSystem@10.0.2.15:48344]
15/11/25 16:27:34 INFO Utils: Successfully started service
'sparkDriverActorSystem' on port 48344.
15/11/25 16:27:34 INFO SparkEnv: Registering MapOutputTracker
15/11/25 16:27:34 INFO SparkEnv: Registering BlockManagerMaster
15/11/25 16:27:34 INFO DiskBlockManager: Created local directory at
/tmp/blockmgr-ffbf1759-2e79-4ecd-a0d8-5d7f28d3c132
15/11/25 16:27:34 INFO MemoryStore: MemoryStore started with capacity 736.1
MB
15/11/25 16:27:34 INFO HttpFileServer: HTTP File server directory is
/tmp/spark-3f269496-c3aa-4bbc-8955-1ab88a179420/httpd-4ea1a9f9-2ce6-41b3-82ef-96e9094f5aa4
15/11/25 16:27:34 INFO HttpServer: Starting HTTP Server
15/11/25 16:27:35 INFO Utils: Successfully started service 'HTTP file
server' on port 50803.
15/11/25 16:27:35 INFO SparkEnv: Registering OutputCommitCoordinator
15/11/25 16:27:35 INFO Utils: Successfully started service 'SparkUI' on
port 4040.
15/11/25 16:27:35 INFO SparkUI: Started SparkUI at http://10.0.2.15:4040
15/11/25 16:27:35 INFO SparkContext: Added JAR
file:/opt/eTOX_spark/target/scala-2.10/etox_spark_2.10-1.0.jar at
http://10.0.2.15:50803/jars/etox_spark_2.10-1.0.jar with timestamp
1448465255275
15/11/25 16:27:35 INFO AppClient$ClientEndpoint: Connecting to master
spark://localhost.localdomain:7077...
15/11/25 16:27:35 INFO SparkDeploySchedulerBackend: Connected to Spark
cluster with app ID app-20151125162735-0003
15/11/25 16:27:35 INFO AppClient$ClientEndpoint: Executor added:
app-20151125162735-0003/0 on worker-20151125161159-10.0.2.15-48408 (
10.0.2.15:48408) with 4 cores
15/11/25 16:27:35 INFO SparkDeploySchedulerBackend: Granted executor ID
app-20151125162735-0003/0 on hostPort 10.0.2.15:48408 with 4 cores, 1024.0
MB RAM
15/11/25 16:27:35 INFO Utils: Successfully started service
'org.apache.spark.network.netty.NettyBlockTransferService' on port 35769.
15/11/25 16:27:35 INFO NettyBlockTransferService: Server created on 35769
15/11/25 16:27:35 INFO BlockManagerMaster: Trying to register BlockManager
15/11/25 16:27:35 INFO BlockManagerMasterEndpoint: Registering block
manager 10.0.2.15:35769 with 736.1 MB RAM, BlockManagerId(driver,
10.0.2.15, 35769)
15/11/25 16:27:35 INFO AppClient$ClientEndpoint: Executor updated:
app-20151125162735-0003/0 is now LOADING
15/11/25 16:27:35 INFO AppClient$ClientEndpoint: Executor updated:
app-20151125162735-0003/0 is now RUNNING
15/11/25 16:27:35 INFO BlockManagerMaster: Registered BlockManager
15/11/25 16:27:35 INFO SparkDeploySchedulerBackend: SchedulerBackend is
ready for scheduling beginning after reached minRegisteredResourcesRatio:
0.0
15/11/25 16:27:36 INFO ParquetRelation: Listing
file:/opt/etox_reports_2015_1/spark_data/etox/allfindings.parquet on driver
15/11/25 16:27:37 INFO SparkContext: Starting job: load at eTOX.scala:52
15/11/25 16:27:37 INFO DAGScheduler: Got job 0 (load at eTOX.scala:52) with
2 output partitions
15/11/25 16:27:37 INFO DAGScheduler: Final stage: ResultStage 0 (load at
eTOX.scala:52)
15/11/25 16:27:37 INFO DAGScheduler: Parents of final stage: List()
15/11/25 16:27:37 INFO DAGScheduler: Missing parents: List()
15/11/25 16:27:37 INFO DAGScheduler: Submitting ResultStage 0
(MapPartitionsRDD[1] at load at eTOX.scala:52), which has no missing parents
15/11/25 16:27:37 INFO MemoryStore: Ensuring 1048576 bytes of free space
for block broadcast_0(free: 771883008, max: 771883008)
15/11/25 16:27:37 INFO MemoryStore: Ensuring 63000 bytes of free space for
block broadcast_0(free: 771883008, max: 771883008)
15/11/25 16:27:37 INFO MemoryStore: Block broadcast_0 stored as values in
memory (estimated size 61.5 KB, free 61.5 KB)
15/11/25 16:27:37 INFO MemoryStore: Ensuring 21090 bytes of free space for
block broadcast_0_piece0(free: 771820008, max: 771883008)
15/11/25 16:27:37 INFO MemoryStore: Block broadcast_0_piece0 stored as
bytes in memory (estimated size 20.6 KB, free 82.1 KB)
15/11/25 16:27:37 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory
on 10.0.2.15:35769 (size: 20.6 KB, free: 736.1 MB)
15/11/25 16:27:37 INFO SparkContext: Created broadcast 0 from broadcast at
DAGScheduler.scala:1003
15/11/25 16:27:37 INFO DAGScheduler: Submitting 2 missing tasks from
ResultStage 0 (MapPartitionsRDD[1] at load at eTOX.scala:52)
15/11/25 16:27:37 INFO TaskSchedulerImpl: Adding task set 0.0 with 2 tasks
15/11/25 16:27:39 INFO SparkDeploySchedulerBackend: Registered executor
NettyRpcEndpointRef(null) (10.0.2.15:50472) with ID 0
15/11/25 16:27:39 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID
0, 10.0.2.15, partition 0,PROCESS_LOCAL, 2137 bytes)
15/11/25 16:27:39 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID
1, 10.0.2.15, partition 1,PROCESS_LOCAL, 2285 bytes)
15/11/25 16:27:39 INFO BlockManagerMasterEndpoint: Registering block
manager 10.0.2.15:39373 with 736.1 MB RAM, BlockManagerId(0, 10.0.2.15,
39373)
15/11/25 16:27:39 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory
on 10.0.2.15:39373 (size: 20.6 KB, free: 736.1 MB)
15/11/25 16:27:40 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID
0) in 1097 ms on 10.0.2.15 (1/2)
15/11/25 16:27:42 INFO TaskSetManager: Finished task 1.0 in stage 0.0 (TID
1) in 3050 ms on 10.0.2.15 (2/2)
15/11/25 16:27:42 INFO DAGScheduler: ResultStage 0 (load at eTOX.scala:52)
finished in 4.739 s
15/11/25 16:27:42 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks
have all completed, from pool
15/11/25 16:27:42 INFO DAGScheduler: Job 0 finished: load at eTOX.scala:52,
took 5.157662 s
15/11/25 16:27:42 INFO MemoryStore: Ensuring 1048576 bytes of free space
for block broadcast_1(free: 771798918, max: 771883008)
15/11/25 16:27:42 INFO MemoryStore: Ensuring 63280 bytes of free space for
block broadcast_1(free: 771798918, max: 771883008)
15/11/25 16:27:42 INFO MemoryStore: Block broadcast_1 stored as values in
memory (estimated size 61.8 KB, free 143.9 KB)
15/11/25 16:27:42 INFO MemoryStore: Ensuring 19788 bytes of free space for
block broadcast_1_piece0(free: 771735638, max: 771883008)
15/11/25 16:27:42 INFO MemoryStore: Block broadcast_1_piece0 stored as
bytes in memory (estimated size 19.3 KB, free 163.2 KB)
15/11/25 16:27:42 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory
on 10.0.2.15:35769 (size: 19.3 KB, free: 736.1 MB)
15/11/25 16:27:42 INFO SparkContext: Created broadcast 1 from count at
eTOX.scala:53
15/11/25 16:27:42 INFO deprecation: mapred.min.split.size is deprecated.
Instead, use mapreduce.input.fileinputformat.split.minsize
15/11/25 16:27:43 INFO ParquetRelation: Reading Parquet file(s) from
file:/opt/etox_reports_2015_1/spark_data/etox/allfindings.parquet/part-r-00000-a0018ffe-9aaf-4446-a718-79097835d08d.gz.parquet
15/11/25 16:27:43 INFO SparkContext: Starting job: count at eTOX.scala:53
15/11/25 16:27:43 INFO DAGScheduler: Registering RDD 4 (count at
eTOX.scala:53)
15/11/25 16:27:43 INFO DAGScheduler: Got job 1 (count at eTOX.scala:53)
with 1 output partitions
15/11/25 16:27:43 INFO DAGScheduler: Final stage: ResultStage 2 (count at
eTOX.scala:53)
15/11/25 16:27:43 INFO DAGScheduler: Parents of final stage:
List(ShuffleMapStage 1)
15/11/25 16:27:43 INFO DAGScheduler: Missing parents: List(ShuffleMapStage
1)
15/11/25 16:27:43 INFO DAGScheduler: Submitting ShuffleMapStage 1
(MapPartitionsRDD[4] at count at eTOX.scala:53), which has no missing
parents
15/11/25 16:27:43 INFO MemoryStore: Ensuring 1048576 bytes of free space
for block broadcast_2(free: 771715850, max: 771883008)
15/11/25 16:27:43 INFO MemoryStore: Ensuring 12840 bytes of free space for
block broadcast_2(free: 771715850, max: 771883008)
15/11/25 16:27:43 INFO MemoryStore: Block broadcast_2 stored as values in
memory (estimated size 12.5 KB, free 175.8 KB)
15/11/25 16:27:43 INFO MemoryStore: Ensuring 6175 bytes of free space for
block broadcast_2_piece0(free: 771703010, max: 771883008)
15/11/25 16:27:43 INFO MemoryStore: Block broadcast_2_piece0 stored as
bytes in memory (estimated size 6.0 KB, free 181.8 KB)
15/11/25 16:27:43 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory
on 10.0.2.15:35769 (size: 6.0 KB, free: 736.1 MB)
15/11/25 16:27:43 INFO SparkContext: Created broadcast 2 from broadcast at
DAGScheduler.scala:1003
15/11/25 16:27:43 INFO DAGScheduler: Submitting 1 missing tasks from
ShuffleMapStage 1 (MapPartitionsRDD[4] at count at eTOX.scala:53)
15/11/25 16:27:43 INFO TaskSchedulerImpl: Adding task set 1.0 with 1 tasks
15/11/25 16:27:43 INFO TaskSetManager: Starting task 0.0 in stage 1.0 (TID
2, 10.0.2.15, partition 0,PROCESS_LOCAL, 2300 bytes)
15/11/25 16:27:43 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory
on 10.0.2.15:39373 (size: 6.0 KB, free: 736.1 MB)
15/11/25 16:27:43 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory
on 10.0.2.15:39373 (size: 19.3 KB, free: 736.1 MB)
15/11/25 16:27:44 INFO TaskSetManager: Finished task 0.0 in stage 1.0 (TID
2) in 1000 ms on 10.0.2.15 (1/1)
15/11/25 16:27:44 INFO TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks
have all completed, from pool
15/11/25 16:27:44 INFO DAGScheduler: ShuffleMapStage 1 (count at
eTOX.scala:53) finished in 1.003 s
15/11/25 16:27:44 INFO DAGScheduler: looking for newly runnable stages
15/11/25 16:27:44 INFO DAGScheduler: running: Set()
15/11/25 16:27:44 INFO DAGScheduler: waiting: Set(ResultStage 2)
15/11/25 16:27:44 INFO DAGScheduler: failed: Set()
15/11/25 16:27:44 INFO DAGScheduler: Submitting ResultStage 2
(MapPartitionsRDD[7] at count at eTOX.scala:53), which has no missing
parents
15/11/25 16:27:44 INFO MemoryStore: Ensuring 1048576 bytes of free space
for block broadcast_3(free: 771696835, max: 771883008)
15/11/25 16:27:44 INFO MemoryStore: Ensuring 14032 bytes of free space for
block broadcast_3(free: 771696835, max: 771883008)
15/11/25 16:27:44 INFO MemoryStore: Block broadcast_3 stored as values in
memory (estimated size 13.7 KB, free 195.5 KB)
15/11/25 16:27:44 INFO MemoryStore: Ensuring 6698 bytes of free space for
block broadcast_3_piece0(free: 771682803, max: 771883008)
15/11/25 16:27:44 INFO MemoryStore: Block broadcast_3_piece0 stored as
bytes in memory (estimated size 6.5 KB, free 202.1 KB)
15/11/25 16:27:44 INFO BlockManagerInfo: Added broadcast_3_piece0 in memory
on 10.0.2.15:35769 (size: 6.5 KB, free: 736.1 MB)
15/11/25 16:27:44 INFO SparkContext: Created broadcast 3 from broadcast at
DAGScheduler.scala:1003
15/11/25 16:27:44 INFO DAGScheduler: Submitting 1 missing tasks from
ResultStage 2 (MapPartitionsRDD[7] at count at eTOX.scala:53)
15/11/25 16:27:44 INFO TaskSchedulerImpl: Adding task set 2.0 with 1 tasks
15/11/25 16:27:44 INFO TaskSetManager: Starting task 0.0 in stage 2.0 (TID
3, 10.0.2.15, partition 0,NODE_LOCAL, 2060 bytes)
15/11/25 16:27:44 INFO BlockManagerInfo: Added broadcast_3_piece0 in memory
on 10.0.2.15:39373 (size: 6.5 KB, free: 736.1 MB)
15/11/25 16:27:44 INFO MapOutputTrackerMasterEndpoint: Asked to send map
output locations for shuffle 0 to 10.0.2.15:50472
15/11/25 16:27:44 INFO MapOutputTrackerMaster: Size of output statuses for
shuffle 0 is 138 bytes
15/11/25 16:27:44 INFO TaskSetManager: Finished task 0.0 in stage 2.0 (TID
3) in 190 ms on 10.0.2.15 (1/1)
15/11/25 16:27:44 INFO TaskSchedulerImpl: Removed TaskSet 2.0, whose tasks
have all completed, from pool
15/11/25 16:27:44 INFO DAGScheduler: ResultStage 2 (count at eTOX.scala:53)
finished in 0.195 s
15/11/25 16:27:44 INFO DAGScheduler: Job 1 finished: count at
eTOX.scala:53, took 1.249107 s
Num Findings: 2775141
Exception in thread "main" java.lang.UnsatisfiedLinkError:
org.RDKit.RDKFuncsJNI.RWMol_MolFromSmiles__SWIG_3(Ljava/lang/String;)J
    at org.RDKit.RDKFuncsJNI.RWMol_MolFromSmiles__SWIG_3(Native Method)
    at org.RDKit.RWMol.MolFromSmiles(RWMol.java:426)
    at models.spark.sources.eTOX_DB$.main(eTOX.scala:54)
    at models.spark.sources.eTOX_DB.main(eTOX.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:727)
    at
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/11/25 16:27:44 INFO SparkContext: Invoking stop() from shutdown hook
15/11/25 16:27:44 INFO BlockManagerInfo: Removed broadcast_3_piece0 on
10.0.2.15:35769 in memory (size: 6.5 KB, free: 736.1 MB)
15/11/25 16:27:44 INFO BlockManagerInfo: Removed broadcast_3_piece0 on
10.0.2.15:39373 in memory (size: 6.5 KB, free: 736.1 MB)
15/11/25 16:27:44 INFO ContextCleaner: Cleaned accumulator 12
15/11/25 16:27:44 INFO BlockManagerInfo: Removed broadcast_2_piece0 on
10.0.2.15:35769 in memory (size: 6.0 KB, free: 736.1 MB)
15/11/25 16:27:44 INFO BlockManagerInfo: Removed broadcast_2_piece0 on
10.0.2.15:39373 in memory (size: 6.0 KB, free: 736.1 MB)
15/11/25 16:27:44 INFO SparkUI: Stopped Spark web UI at
http://10.0.2.15:4040
15/11/25 16:27:44 INFO ContextCleaner: Cleaned accumulator 11
15/11/25 16:27:44 INFO ContextCleaner: Cleaned shuffle 0
15/11/25 16:27:44 INFO DAGScheduler: Stopping DAGScheduler
15/11/25 16:27:44 INFO SparkDeploySchedulerBackend: Shutting down all
executors
15/11/25 16:27:44 INFO SparkDeploySchedulerBackend: Asking each executor to
shut down
15/11/25 16:27:44 INFO MapOutputTrackerMasterEndpoint:
MapOutputTrackerMasterEndpoint stopped!
15/11/25 16:27:44 INFO MemoryStore: MemoryStore cleared
15/11/25 16:27:44 INFO BlockManager: BlockManager stopped
15/11/25 16:27:44 INFO BlockManagerMaster: BlockManagerMaster stopped
15/11/25 16:27:44 INFO
OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:
OutputCommitCoordinator stopped!
15/11/25 16:27:44 INFO RemoteActorRefProvider$RemotingTerminator: Shutting
down remote daemon.
15/11/25 16:27:44 INFO SparkContext: Successfully stopped SparkContext
15/11/25 16:27:44 INFO ShutdownHookManager: Shutdown hook called
15/11/25 16:27:44 INFO RemoteActorRefProvider$RemotingTerminator: Remote
daemon shut down; proceeding with flushing remote transports.
15/11/25 16:27:44 INFO ShutdownHookManager: Deleting directory
/tmp/spark-3f269496-c3aa-4bbc-8955-1ab88a179420



Oriol.
