Sure - I wanted to check with our admin before sharing. I've attached it now; does this help?
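
For reference, this is the YARN-aggregated log for the whole application. If you want to re-pull it on your side, the standard log-aggregation CLI should work (the redirect target is just an arbitrary filename I picked):

    yarn logs -applicationId application_1479877553404_0174 > application_1479877553404_0174.log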

Many thanks again,

G

Container: container_e34_1479877553404_0174_01_000003 on hdp-node12.xcat.cluster_45454_1481228528201
====================================================================================================
LogType:directory.info
Log Upload Time:Thu Dec 08 20:22:08 +0000 2016
LogLength:5138
Log Contents:
ls -l:
total 28
lrwxrwxrwx 1 my_user_name hadoop   70 Dec  8 20:21 __app__.jar -> /hadoop/yarn/local/usercache/my_user_name/filecache/26/graphx_sp_2.10-1.0.jar
lrwxrwxrwx 1 my_user_name hadoop   63 Dec  8 20:21 __spark__.jar -> /hadoop_1/hadoop/yarn/local/filecache/11/spark-hdp-assembly.jar
lrwxrwxrwx 1 my_user_name hadoop   94 Dec  8 20:21 __spark_conf__ -> /hadoop_1/hadoop/yarn/local/usercache/my_user_name/filecache/25/__spark_conf__2528926660896665250.zip
-rw------- 1 my_user_name hadoop  340 Dec  8 20:21 container_tokens
-rwx------ 1 my_user_name hadoop 6195 Dec  8 20:21 launch_container.sh
drwxr-s--- 2 my_user_name hadoop 4096 Dec  8 20:21 tmp
find -L . -maxdepth 5 -ls:
2752703    4 drwxr-s---   3 my_user_name    hadoop       4096 Dec  8 20:21 .
106430595 184304 -r-xr-xr-x   1 yarn     hadoop   188727178 Dec  5 18:22 ./__spark__.jar
106431527    4 drwx------   2 my_user_name    my_user_name        4096 Dec  8 20:21 ./__spark_conf__
106431559    4 -r-x------   1 my_user_name    my_user_name         951 Dec  8 20:21 ./__spark_conf__/mapred-env.cmd
106431558    4 -r-x------   1 my_user_name    my_user_name        1000 Dec  8 20:21 ./__spark_conf__/ssl-server.xml
106431528    8 -r-x------   1 my_user_name    my_user_name        5410 Dec  8 20:21 ./__spark_conf__/hadoop-env.sh
106431553    4 -r-x------   1 my_user_name    my_user_name        2316 Dec  8 20:21 ./__spark_conf__/ssl-client.xml.example
106431532    4 -r-x------   1 my_user_name    my_user_name        3979 Dec  8 20:21 ./__spark_conf__/hadoop-env.cmd
106431546   12 -r-x------   1 my_user_name    my_user_name        8217 Dec  8 20:21 ./__spark_conf__/hdfs-site.xml
106431545    8 -r-x------   1 my_user_name    my_user_name        5637 Dec  8 20:21 ./__spark_conf__/yarn-env.sh
106431552    4 -r-x------   1 my_user_name    my_user_name        1602 Dec  8 20:21 ./__spark_conf__/health_check
106431537    4 -r-x------   1 my_user_name    my_user_name        1631 Dec  8 20:21 ./__spark_conf__/kms-log4j.properties
106431563    8 -r-x------   1 my_user_name    my_user_name        5511 Dec  8 20:21 ./__spark_conf__/kms-site.xml
106431530    8 -r-x------   1 my_user_name    my_user_name        7353 Dec  8 20:21 ./__spark_conf__/mapred-site.xml
106431548    4 -r-x------   1 my_user_name    my_user_name        1072 Dec  8 20:21 ./__spark_conf__/container-executor.cfg
106431536    0 -r-x------   1 my_user_name    my_user_name           0 Dec  8 20:21 ./__spark_conf__/yarn.exclude
106431562    8 -r-x------   1 my_user_name    my_user_name        4113 Dec  8 20:21 ./__spark_conf__/mapred-queues.xml.template
106431538    4 -r-x------   1 my_user_name    my_user_name        2250 Dec  8 20:21 ./__spark_conf__/yarn-env.cmd
106431547    4 -r-x------   1 my_user_name    my_user_name        1020 Dec  8 20:21 ./__spark_conf__/commons-logging.properties
106431543    4 -r-x------   1 my_user_name    my_user_name         758 Dec  8 20:21 ./__spark_conf__/mapred-site.xml.template
106431554    4 -r-x------   1 my_user_name    my_user_name        1527 Dec  8 20:21 ./__spark_conf__/kms-env.sh
106431556    4 -r-x------   1 my_user_name    my_user_name         760 Dec  8 20:21 ./__spark_conf__/slaves
106431561    4 -r-x------   1 my_user_name    my_user_name         945 Dec  8 20:21 ./__spark_conf__/taskcontroller.cfg
106431542    4 -r-x------   1 my_user_name    my_user_name        2358 Dec  8 20:21 ./__spark_conf__/topology_script.py
106431539    4 -r-x------   1 my_user_name    my_user_name         884 Dec  8 20:21 ./__spark_conf__/ssl-client.xml
106431531    4 -r-x------   1 my_user_name    my_user_name        2207 Dec  8 20:21 ./__spark_conf__/hadoop-metrics2.properties
106431564    4 -r-x------   1 my_user_name    my_user_name         506 Dec  8 20:21 ./__spark_conf__/__spark_conf__.properties
106431550    8 -r-x------   1 my_user_name    my_user_name        4221 Dec  8 20:21 ./__spark_conf__/task-log4j.properties
106431551    4 -r-x------   1 my_user_name    my_user_name         856 Dec  8 20:21 ./__spark_conf__/mapred-env.sh
106431529   12 -r-x------   1 my_user_name    my_user_name        9313 Dec  8 20:21 ./__spark_conf__/log4j.properties
106431541    4 -r-x------   1 my_user_name    my_user_name        3518 Dec  8 20:21 ./__spark_conf__/kms-acls.xml
106431534    8 -r-x------   1 my_user_name    my_user_name        7634 Dec  8 20:21 ./__spark_conf__/core-site.xml
106431557    4 -r-x------   1 my_user_name    my_user_name        2081 Dec  8 20:21 ./__spark_conf__/topology_mappings.data
106431549    4 -r-x------   1 my_user_name    my_user_name        2490 Dec  8 20:21 ./__spark_conf__/hadoop-metrics.properties
106431535    4 -r-x------   1 my_user_name    my_user_name           1 Dec  8 20:21 ./__spark_conf__/dfs.exclude
106431560    4 -r-x------   1 my_user_name    my_user_name        2268 Dec  8 20:21 ./__spark_conf__/ssl-server.xml.example
106431544    4 -r-x------   1 my_user_name    my_user_name        1335 Dec  8 20:21 ./__spark_conf__/configuration.xsl
106431555    4 -r-x------   1 my_user_name    my_user_name        1308 Dec  8 20:21 ./__spark_conf__/hadoop-policy.xml
106431540    4 -r-x------   1 my_user_name    my_user_name        2313 Dec  8 20:21 ./__spark_conf__/capacity-scheduler.xml
106431533   24 -r-x------   1 my_user_name    my_user_name       21013 Dec  8 20:21 ./__spark_conf__/yarn-site.xml
2752706    4 -rw-------   1 my_user_name    hadoop        340 Dec  8 20:21 ./container_tokens
2752704    4 drwxr-s---   2 my_user_name    hadoop       4096 Dec  8 20:21 ./tmp
2752705    8 -rwx------   1 my_user_name    hadoop       6195 Dec  8 20:21 ./launch_container.sh
2752697   16 -r-x------   1 my_user_name    my_user_name       14787 Dec  8 20:21 ./__app__.jar
broken symlinks(find -L . -maxdepth 5 -type l -ls):

End of LogType:directory.info

LogType:launch_container.sh
Log Upload Time:Thu Dec 08 20:22:08 +0000 2016
LogLength:6195
Log Contents:
#!/bin/bash

export SPARK_YARN_STAGING_DIR=".sparkStaging/application_1479877553404_0174"
export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/hdp/current/hadoop-client/conf"}
export JAVA_HOME=${JAVA_HOME:-"/usr/jdk64/jdk1.8.0_60"}
export SPARK_YARN_CACHE_FILES="hdfs://hdp-master1.xcat.cluster:8020/hdp/apps/2.5.0.0-1245/spark/spark-hdp-assembly.jar#__spark__.jar,hdfs://hdp-master1.xcat.cluster:8020/user/my_user_name/.sparkStaging/application_1479877553404_0174/graphx_sp_2.10-1.0.jar#__app__.jar"
export SPARK_LOG_URL_STDOUT="http://hdp-node12.xcat.cluster:8042/node/containerlogs/container_e34_1479877553404_0174_01_000003/my_user_name/stdout?start=-4096"
export NM_HOST="hdp-node12.xcat.cluster"
export SPARK_YARN_CACHE_FILES_FILE_SIZES="188727178,14787"
export SPARK_YARN_CACHE_ARCHIVES_TIME_STAMPS="1481228482907"
export LOGNAME="my_user_name"
export JVM_PID="$$"
export PWD="/hadoop/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174/container_e34_1479877553404_0174_01_000003"
export LOCAL_DIRS="/hadoop/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174,/hadoop_1/hadoop/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174"
export NM_HTTP_PORT="8042"
export LOG_DIRS="/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000003,/hadoop_1/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000003"
export NM_AUX_SERVICE_mapreduce_shuffle="AAA0+gAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=
"
export NM_PORT="45454"
export SPARK_YARN_CACHE_FILES_TIME_STAMPS="1478064369306,1481228482852"
export USER="my_user_name"
export HADOOP_YARN_HOME=${HADOOP_YARN_HOME:-"/usr/hdp/current/hadoop-yarn-nodemanager"}
export CLASSPATH="$PWD:$PWD/__spark_conf__:$PWD/__spark__.jar:$HADOOP_CONF_DIR:/usr/hdp/current/hadoop-client/*:/usr/hdp/current/hadoop-client/lib/*:/usr/hdp/current/hadoop-hdfs-client/*:/usr/hdp/current/hadoop-hdfs-client/lib/*:/usr/hdp/current/hadoop-yarn-client/*:/usr/hdp/current/hadoop-yarn-client/lib/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/2.5.0.0-1245/hadoop/lib/hadoop-lzo-0.6.0.2.5.0.0-1245.jar:/etc/hadoop/conf/secure"
export SPARK_YARN_CACHE_ARCHIVES="hdfs://hdp-master1.xcat.cluster:8020/user/my_user_name/.sparkStaging/application_1479877553404_0174/__spark_conf__2528926660896665250.zip#__spark_conf__"
export SPARK_YARN_CACHE_ARCHIVES_FILE_SIZES="125215"
export SPARK_YARN_MODE="true"
export SPARK_YARN_CACHE_FILES_VISIBILITIES="PUBLIC,PRIVATE"
export HADOOP_TOKEN_FILE_LOCATION="/hadoop/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174/container_e34_1479877553404_0174_01_000003/container_tokens"
export NM_AUX_SERVICE_spark_shuffle=""
export SPARK_USER="my_user_name"
export LOCAL_USER_DIRS="/hadoop/yarn/local/usercache/my_user_name/,/hadoop_1/hadoop/yarn/local/usercache/my_user_name/"
export SPARK_LOG_URL_STDERR="http://hdp-node12.xcat.cluster:8042/node/containerlogs/container_e34_1479877553404_0174_01_000003/my_user_name/stderr?start=-4096"
export SPARK_YARN_CACHE_ARCHIVES_VISIBILITIES="PRIVATE"
export HOME="/home/"
export NM_AUX_SERVICE_spark2_shuffle=""
export CONTAINER_ID="container_e34_1479877553404_0174_01_000003"
export MALLOC_ARENA_MAX="4"
ln -sf "/hadoop_1/hadoop/yarn/local/filecache/11/spark-hdp-assembly.jar" "__spark__.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop_1/hadoop/yarn/local/usercache/my_user_name/filecache/25/__spark_conf__2528926660896665250.zip" "__spark_conf__"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop/yarn/local/usercache/my_user_name/filecache/26/graphx_sp_2.10-1.0.jar" "__app__.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
# Creating copy of launch script
cp "launch_container.sh" "/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000003/launch_container.sh"
chmod 640 "/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000003/launch_container.sh"
# Determining directory contents
echo "ls -l:" 1>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000003/directory.info"
ls -l 1>>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000003/directory.info"
echo "find -L . -maxdepth 5 -ls:" 1>>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000003/directory.info"
find -L . -maxdepth 5 -ls 1>>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000003/directory.info"
echo "broken symlinks(find -L . -maxdepth 5 -type l -ls):" 1>>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000003/directory.info"
find -L . -maxdepth 5 -type l -ls 1>>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000003/directory.info"
exec /bin/bash -c "$JAVA_HOME/bin/java -server -XX:OnOutOfMemoryError='kill %p' -Xms13312m -Xmx13312m -Djava.io.tmpdir=$PWD/tmp '-Dspark.driver.port=33211' '-Dspark.ui.port=0' -Dspark.yarn.app.container.log.dir=/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000003 org.apache.spark.executor.CoarseGrainedExecutorBackend --driver-url spark://CoarseGrainedScheduler@10.143.65.28:33211 --executor-id 2 --hostname hdp-node12.xcat.cluster --cores 1 --app-id application_1479877553404_0174 --user-class-path file:$PWD/__app__.jar 1> /hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000003/stdout 2> /hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000003/stderr"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi

End of LogType:launch_container.sh

LogType:stderr
Log Upload Time:Thu Dec 08 20:22:08 +0000 2016
LogLength:4118
Log Contents:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/hadoop_1/hadoop/yarn/local/filecache/11/spark-hdp-assembly.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.5.0.0-1245/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
16/12/08 20:21:38 INFO executor.CoarseGrainedExecutorBackend: Registered signal handlers for [TERM, HUP, INT]
16/12/08 20:21:38 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/12/08 20:21:39 INFO spark.SecurityManager: Changing view acls to: my_user_name
16/12/08 20:21:39 INFO spark.SecurityManager: Changing modify acls to: my_user_name
16/12/08 20:21:39 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(my_user_name); users with modify permissions: Set(my_user_name)
16/12/08 20:21:40 INFO executor.CoarseGrainedExecutorBackend: Will periodically update credentials from: hdfs://hdp-master1.xcat.cluster:8020/user/my_user_name/.sparkStaging/application_1479877553404_0174/credentials-b4345c5b-2f21-4de7-ab92-fea01c49a1c6
16/12/08 20:21:41 WARN shortcircuit.DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
16/12/08 20:21:41 INFO yarn.ExecutorDelegationTokenUpdater: Scheduling token refresh from HDFS in 69100441 millis.
16/12/08 20:21:41 INFO spark.SecurityManager: Changing view acls to: my_user_name
16/12/08 20:21:41 INFO spark.SecurityManager: Changing modify acls to: my_user_name
16/12/08 20:21:41 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(my_user_name); users with modify permissions: Set(my_user_name)
16/12/08 20:21:42 INFO slf4j.Slf4jLogger: Slf4jLogger started
16/12/08 20:21:42 INFO Remoting: Starting remoting
16/12/08 20:21:42 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkexecutoractorsys...@hdp-node12.xcat.cluster:39124]
16/12/08 20:21:42 INFO util.Utils: Successfully started service 'sparkExecutorActorSystem' on port 39124.
16/12/08 20:21:42 INFO storage.DiskBlockManager: Created local directory at /hadoop_1/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174/blockmgr-a231266c-d020-45f8-87a7-609d731de6e6
16/12/08 20:21:42 INFO storage.DiskBlockManager: Created local directory at /hadoop_1/hadoop/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174/blockmgr-c5d3ecd6-3885-47cf-ad6b-66b66a027ae2
16/12/08 20:21:42 INFO storage.MemoryStore: MemoryStore started with capacity 9.1 GB
16/12/08 20:21:43 INFO executor.CoarseGrainedExecutorBackend: Connecting to driver: spark://CoarseGrainedScheduler@10.143.65.28:33211
16/12/08 20:21:43 INFO executor.CoarseGrainedExecutorBackend: Successfully registered with driver
16/12/08 20:21:43 INFO executor.Executor: Starting executor ID 2 on host hdp-node12.xcat.cluster
16/12/08 20:21:43 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 45874.
16/12/08 20:21:43 INFO netty.NettyBlockTransferService: Server created on 45874
16/12/08 20:21:43 INFO storage.BlockManagerMaster: Trying to register BlockManager
16/12/08 20:21:43 INFO storage.BlockManagerMaster: Registered BlockManager
16/12/08 20:21:44 INFO executor.CoarseGrainedExecutorBackend: Driver commanded a shutdown
16/12/08 20:21:44 INFO storage.MemoryStore: MemoryStore cleared
16/12/08 20:21:44 INFO storage.BlockManager: BlockManager stopped
16/12/08 20:21:44 WARN executor.CoarseGrainedExecutorBackend: An unknown (hdp-node28.xcat.cluster:33211) driver disconnected.
16/12/08 20:21:44 ERROR executor.CoarseGrainedExecutorBackend: Driver 10.143.65.28:33211 disassociated! Shutting down.
16/12/08 20:21:44 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/12/08 20:21:44 INFO util.ShutdownHookManager: Shutdown hook called

End of LogType:stderr

LogType:stdout
Log Upload Time:Thu Dec 08 20:22:08 +0000 2016
LogLength:0
Log Contents:

End of LogType:stdout
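
(The executor above comes up cleanly and is only shut down once the driver at 10.143.65.28:33211 drops the link, so whatever went wrong should be visible in the ApplicationMaster/driver log rather than here. That container, container_e34_1479877553404_0174_01_000001 on hdp-node28, is at the bottom of this dump; it can also be fetched on its own, assuming the -containerId/-nodeAddress options of the Hadoop 2.7 yarn logs CLI:

    yarn logs -applicationId application_1479877553404_0174 -containerId container_e34_1479877553404_0174_01_000001 -nodeAddress hdp-node28.xcat.cluster:45454

The same register-then-shutdown pattern repeats for the other executors below.)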

Container: container_e34_1479877553404_0174_02_000002 on hdp-node26.xcat.cluster_45454_1481228528271
====================================================================================================
LogType:directory.info
Log Upload Time:Thu Dec 08 20:22:08 +0000 2016
LogLength:5061
Log Contents:
ls -l:
total 28
lrwxrwxrwx 1 my_user_name hadoop   70 Dec  8 20:21 __app__.jar -> /hadoop/yarn/local/usercache/my_user_name/filecache/28/graphx_sp_2.10-1.0.jar
lrwxrwxrwx 1 my_user_name hadoop   63 Dec  8 20:21 __spark__.jar -> /hadoop_1/hadoop/yarn/local/filecache/11/spark-hdp-assembly.jar
lrwxrwxrwx 1 my_user_name hadoop   94 Dec  8 20:21 __spark_conf__ -> /hadoop_1/hadoop/yarn/local/usercache/my_user_name/filecache/27/__spark_conf__2528926660896665250.zip
-rw------- 1 my_user_name hadoop  340 Dec  8 20:21 container_tokens
-rwx------ 1 my_user_name hadoop 6212 Dec  8 20:21 launch_container.sh
drwxr-s--- 2 my_user_name hadoop 4096 Dec  8 20:21 tmp
find -L . -maxdepth 5 -ls:
7733681    4 drwxr-s---   3 my_user_name    hadoop       4096 Dec  8 20:21 .
7733682    4 drwxr-s---   2 my_user_name    hadoop       4096 Dec  8 20:21 ./tmp
7733684    4 -rw-------   1 my_user_name    hadoop        340 Dec  8 20:21 ./container_tokens
18219403   16 -r-x------   1 my_user_name    my_user_name       14787 Dec  8 20:21 ./__app__.jar
7733639    4 drwx------   2 my_user_name    my_user_name        4096 Dec  8 20:21 ./__spark_conf__
7733659    4 -r-x------   1 my_user_name    my_user_name        1020 Dec  8 20:21 ./__spark_conf__/commons-logging.properties
7733648    0 -r-x------   1 my_user_name    my_user_name           0 Dec  8 20:21 ./__spark_conf__/yarn.exclude
7733643    4 -r-x------   1 my_user_name    my_user_name        2207 Dec  8 20:21 ./__spark_conf__/hadoop-metrics2.properties
7733671    4 -r-x------   1 my_user_name    my_user_name         951 Dec  8 20:21 ./__spark_conf__/mapred-env.cmd
7733650    4 -r-x------   1 my_user_name    my_user_name        2250 Dec  8 20:21 ./__spark_conf__/yarn-env.cmd
7733642    8 -r-x------   1 my_user_name    my_user_name        7353 Dec  8 20:21 ./__spark_conf__/mapred-site.xml
7733641   12 -r-x------   1 my_user_name    my_user_name        9313 Dec  8 20:21 ./__spark_conf__/log4j.properties
7733673    4 -r-x------   1 my_user_name    my_user_name         945 Dec  8 20:21 ./__spark_conf__/taskcontroller.cfg
7733670    4 -r-x------   1 my_user_name    my_user_name        1000 Dec  8 20:21 ./__spark_conf__/ssl-server.xml
7733672    4 -r-x------   1 my_user_name    my_user_name        2268 Dec  8 20:21 ./__spark_conf__/ssl-server.xml.example
7733676    4 -r-x------   1 my_user_name    my_user_name         506 Dec  8 20:21 ./__spark_conf__/__spark_conf__.properties
7733656    4 -r-x------   1 my_user_name    my_user_name        1335 Dec  8 20:21 ./__spark_conf__/configuration.xsl
7733665    4 -r-x------   1 my_user_name    my_user_name        2316 Dec  8 20:21 ./__spark_conf__/ssl-client.xml.example
7733662    8 -r-x------   1 my_user_name    my_user_name        4221 Dec  8 20:21 ./__spark_conf__/task-log4j.properties
7733640    8 -r-x------   1 my_user_name    my_user_name        5410 Dec  8 20:21 ./__spark_conf__/hadoop-env.sh
7733653    4 -r-x------   1 my_user_name    my_user_name        3518 Dec  8 20:21 ./__spark_conf__/kms-acls.xml
7733666    4 -r-x------   1 my_user_name    my_user_name        1527 Dec  8 20:21 ./__spark_conf__/kms-env.sh
7733668    4 -r-x------   1 my_user_name    my_user_name         760 Dec  8 20:21 ./__spark_conf__/slaves
7733655    4 -r-x------   1 my_user_name    my_user_name         758 Dec  8 20:21 ./__spark_conf__/mapred-site.xml.template
7733658   12 -r-x------   1 my_user_name    my_user_name        8217 Dec  8 20:21 ./__spark_conf__/hdfs-site.xml
7733661    4 -r-x------   1 my_user_name    my_user_name        2490 Dec  8 20:21 ./__spark_conf__/hadoop-metrics.properties
7733651    4 -r-x------   1 my_user_name    my_user_name         884 Dec  8 20:21 ./__spark_conf__/ssl-client.xml
7733646    8 -r-x------   1 my_user_name    my_user_name        7634 Dec  8 20:21 ./__spark_conf__/core-site.xml
7733675    8 -r-x------   1 my_user_name    my_user_name        5511 Dec  8 20:21 ./__spark_conf__/kms-site.xml
7733647    4 -r-x------   1 my_user_name    my_user_name           1 Dec  8 20:21 ./__spark_conf__/dfs.exclude
7733667    4 -r-x------   1 my_user_name    my_user_name        1308 Dec  8 20:21 ./__spark_conf__/hadoop-policy.xml
7733645   24 -r-x------   1 my_user_name    my_user_name       21013 Dec  8 20:21 ./__spark_conf__/yarn-site.xml
7733649    4 -r-x------   1 my_user_name    my_user_name        1631 Dec  8 20:21 ./__spark_conf__/kms-log4j.properties
7733660    4 -r-x------   1 my_user_name    my_user_name        1072 Dec  8 20:21 ./__spark_conf__/container-executor.cfg
7733654    4 -r-x------   1 my_user_name    my_user_name        2358 Dec  8 20:21 ./__spark_conf__/topology_script.py
7733657    8 -r-x------   1 my_user_name    my_user_name        5637 Dec  8 20:21 ./__spark_conf__/yarn-env.sh
7733674    8 -r-x------   1 my_user_name    my_user_name        4113 Dec  8 20:21 ./__spark_conf__/mapred-queues.xml.template
7733663    4 -r-x------   1 my_user_name    my_user_name         856 Dec  8 20:21 ./__spark_conf__/mapred-env.sh
7733664    4 -r-x------   1 my_user_name    my_user_name        1602 Dec  8 20:21 ./__spark_conf__/health_check
7733652    4 -r-x------   1 my_user_name    my_user_name        2313 Dec  8 20:21 ./__spark_conf__/capacity-scheduler.xml
7733644    4 -r-x------   1 my_user_name    my_user_name        3979 Dec  8 20:21 ./__spark_conf__/hadoop-env.cmd
7733669    4 -r-x------   1 my_user_name    my_user_name        2081 Dec  8 20:21 ./__spark_conf__/topology_mappings.data
7602391 184308 -r-xr-xr-x   1 yarn     hadoop   188727178 Dec  6 20:02 ./__spark__.jar
7733683    8 -rwx------   1 my_user_name    hadoop       6212 Dec  8 20:21 ./launch_container.sh
broken symlinks(find -L . -maxdepth 5 -type l -ls):

End of LogType:directory.info

LogType:launch_container.sh
Log Upload Time:Thu Dec 08 20:22:08 +0000 2016
LogLength:6212
Log Contents:
#!/bin/bash

export SPARK_YARN_STAGING_DIR=".sparkStaging/application_1479877553404_0174"
export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/hdp/current/hadoop-client/conf"}
export JAVA_HOME=${JAVA_HOME:-"/usr/jdk64/jdk1.8.0_60"}
export SPARK_YARN_CACHE_FILES="hdfs://hdp-master1.xcat.cluster:8020/hdp/apps/2.5.0.0-1245/spark/spark-hdp-assembly.jar#__spark__.jar,hdfs://hdp-master1.xcat.cluster:8020/user/my_user_name/.sparkStaging/application_1479877553404_0174/graphx_sp_2.10-1.0.jar#__app__.jar"
export SPARK_LOG_URL_STDOUT="http://hdp-node26.xcat.cluster:8042/node/containerlogs/container_e34_1479877553404_0174_02_000002/my_user_name/stdout?start=-4096"
export NM_HOST="hdp-node26.xcat.cluster"
export SPARK_YARN_CACHE_FILES_FILE_SIZES="188727178,14787"
export SPARK_YARN_CACHE_ARCHIVES_TIME_STAMPS="1481228482907"
export LOGNAME="my_user_name"
export JVM_PID="$$"
export PWD="/hadoop_1/hadoop/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174/container_e34_1479877553404_0174_02_000002"
export LOCAL_DIRS="/hadoop/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174,/hadoop_1/hadoop/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174"
export NM_HTTP_PORT="8042"
export LOG_DIRS="/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000002,/hadoop_1/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000002"
export NM_AUX_SERVICE_mapreduce_shuffle="AAA0+gAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=
"
export NM_PORT="45454"
export SPARK_YARN_CACHE_FILES_TIME_STAMPS="1478064369306,1481228482852"
export USER="my_user_name"
export HADOOP_YARN_HOME=${HADOOP_YARN_HOME:-"/usr/hdp/current/hadoop-yarn-nodemanager"}
export CLASSPATH="$PWD:$PWD/__spark_conf__:$PWD/__spark__.jar:$HADOOP_CONF_DIR:/usr/hdp/current/hadoop-client/*:/usr/hdp/current/hadoop-client/lib/*:/usr/hdp/current/hadoop-hdfs-client/*:/usr/hdp/current/hadoop-hdfs-client/lib/*:/usr/hdp/current/hadoop-yarn-client/*:/usr/hdp/current/hadoop-yarn-client/lib/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/2.5.0.0-1245/hadoop/lib/hadoop-lzo-0.6.0.2.5.0.0-1245.jar:/etc/hadoop/conf/secure"
export SPARK_YARN_CACHE_ARCHIVES="hdfs://hdp-master1.xcat.cluster:8020/user/my_user_name/.sparkStaging/application_1479877553404_0174/__spark_conf__2528926660896665250.zip#__spark_conf__"
export SPARK_YARN_CACHE_ARCHIVES_FILE_SIZES="125215"
export SPARK_YARN_MODE="true"
export SPARK_YARN_CACHE_FILES_VISIBILITIES="PUBLIC,PRIVATE"
export HADOOP_TOKEN_FILE_LOCATION="/hadoop_1/hadoop/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174/container_e34_1479877553404_0174_02_000002/container_tokens"
export NM_AUX_SERVICE_spark_shuffle=""
export SPARK_USER="my_user_name"
export LOCAL_USER_DIRS="/hadoop/yarn/local/usercache/my_user_name/,/hadoop_1/hadoop/yarn/local/usercache/my_user_name/"
export SPARK_LOG_URL_STDERR="http://hdp-node26.xcat.cluster:8042/node/containerlogs/container_e34_1479877553404_0174_02_000002/my_user_name/stderr?start=-4096"
export SPARK_YARN_CACHE_ARCHIVES_VISIBILITIES="PRIVATE"
export HOME="/home/"
export NM_AUX_SERVICE_spark2_shuffle=""
export CONTAINER_ID="container_e34_1479877553404_0174_02_000002"
export MALLOC_ARENA_MAX="4"
ln -sf "/hadoop/yarn/local/usercache/my_user_name/filecache/28/graphx_sp_2.10-1.0.jar" "__app__.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop_1/hadoop/yarn/local/usercache/my_user_name/filecache/27/__spark_conf__2528926660896665250.zip" "__spark_conf__"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop_1/hadoop/yarn/local/filecache/11/spark-hdp-assembly.jar" "__spark__.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
# Creating copy of launch script
cp "launch_container.sh" "/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000002/launch_container.sh"
chmod 640 "/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000002/launch_container.sh"
# Determining directory contents
echo "ls -l:" 1>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000002/directory.info"
ls -l 1>>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000002/directory.info"
echo "find -L . -maxdepth 5 -ls:" 1>>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000002/directory.info"
find -L . -maxdepth 5 -ls 1>>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000002/directory.info"
echo "broken symlinks(find -L . -maxdepth 5 -type l -ls):" 1>>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000002/directory.info"
find -L . -maxdepth 5 -type l -ls 1>>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000002/directory.info"
exec /bin/bash -c "$JAVA_HOME/bin/java -server -XX:OnOutOfMemoryError='kill %p' -Xms13312m -Xmx13312m -Djava.io.tmpdir=$PWD/tmp '-Dspark.driver.port=43398' '-Dspark.ui.port=0' -Dspark.yarn.app.container.log.dir=/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000002 org.apache.spark.executor.CoarseGrainedExecutorBackend --driver-url spark://CoarseGrainedScheduler@10.143.65.9:43398 --executor-id 1 --hostname hdp-node26.xcat.cluster --cores 1 --app-id application_1479877553404_0174 --user-class-path file:$PWD/__app__.jar 1> /hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000002/stdout 2> /hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000002/stderr"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi

End of LogType:launch_container.sh

LogType:stderr
Log Upload Time:Thu Dec 08 20:22:08 +0000 2016
LogLength:4115
Log Contents:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/hadoop_1/hadoop/yarn/local/filecache/11/spark-hdp-assembly.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.5.0.0-1245/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
16/12/08 20:22:00 INFO executor.CoarseGrainedExecutorBackend: Registered signal handlers for [TERM, HUP, INT]
16/12/08 20:22:01 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/12/08 20:22:01 INFO spark.SecurityManager: Changing view acls to: my_user_name
16/12/08 20:22:01 INFO spark.SecurityManager: Changing modify acls to: my_user_name
16/12/08 20:22:01 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(my_user_name); users with modify permissions: Set(my_user_name)
16/12/08 20:22:02 INFO executor.CoarseGrainedExecutorBackend: Will periodically update credentials from: hdfs://hdp-master1.xcat.cluster:8020/user/my_user_name/.sparkStaging/application_1479877553404_0174/credentials-b4345c5b-2f21-4de7-ab92-fea01c49a1c6
16/12/08 20:22:03 WARN shortcircuit.DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
16/12/08 20:22:04 INFO yarn.ExecutorDelegationTokenUpdater: Scheduling token refresh from HDFS in 69077955 millis.
16/12/08 20:22:04 INFO spark.SecurityManager: Changing view acls to: my_user_name
16/12/08 20:22:04 INFO spark.SecurityManager: Changing modify acls to: my_user_name
16/12/08 20:22:04 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(my_user_name); users with modify permissions: Set(my_user_name)
16/12/08 20:22:04 INFO slf4j.Slf4jLogger: Slf4jLogger started
16/12/08 20:22:04 INFO Remoting: Starting remoting
16/12/08 20:22:05 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkexecutoractorsys...@hdp-node26.xcat.cluster:42986]
16/12/08 20:22:05 INFO util.Utils: Successfully started service 'sparkExecutorActorSystem' on port 42986.
16/12/08 20:22:05 INFO storage.DiskBlockManager: Created local directory at /hadoop_1/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174/blockmgr-7b665979-bda6-443a-a4ad-d8a435de886e
16/12/08 20:22:05 INFO storage.DiskBlockManager: Created local directory at /hadoop_1/hadoop/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174/blockmgr-9227390d-b699-42cf-a635-a6979ef12265
16/12/08 20:22:05 INFO storage.MemoryStore: MemoryStore started with capacity 9.1 GB
16/12/08 20:22:05 INFO executor.CoarseGrainedExecutorBackend: Connecting to driver: spark://CoarseGrainedScheduler@10.143.65.9:43398
16/12/08 20:22:05 INFO executor.CoarseGrainedExecutorBackend: Successfully registered with driver
16/12/08 20:22:05 INFO executor.Executor: Starting executor ID 1 on host hdp-node26.xcat.cluster
16/12/08 20:22:05 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 44549.
16/12/08 20:22:05 INFO netty.NettyBlockTransferService: Server created on 44549
16/12/08 20:22:05 INFO storage.BlockManagerMaster: Trying to register BlockManager
16/12/08 20:22:05 INFO storage.BlockManagerMaster: Registered BlockManager
16/12/08 20:22:06 INFO executor.CoarseGrainedExecutorBackend: Driver commanded a shutdown
16/12/08 20:22:06 INFO storage.MemoryStore: MemoryStore cleared
16/12/08 20:22:06 INFO storage.BlockManager: BlockManager stopped
16/12/08 20:22:06 WARN executor.CoarseGrainedExecutorBackend: An unknown (hdp-node9.xcat.cluster:43398) driver disconnected.
16/12/08 20:22:06 ERROR executor.CoarseGrainedExecutorBackend: Driver 10.143.65.9:43398 disassociated! Shutting down.
16/12/08 20:22:06 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/12/08 20:22:06 INFO util.ShutdownHookManager: Shutdown hook called

End of LogType:stderr

LogType:stdout
Log Upload Time:Thu Dec 08 20:22:08 +0000 2016
LogLength:0
Log Contents:

End of LogType:stdout

Container: container_e34_1479877553404_0174_02_000003 on hdp-node28.xcat.cluster_45454_1481228528273
====================================================================================================
LogType:directory.info
Log Upload Time:Thu Dec 08 20:22:08 +0000 2016
LogLength:5056
Log Contents:
ls -l:
total 24
lrwxrwxrwx 1 my_user_name hadoop   79 Dec  8 20:21 __app__.jar -> /hadoop_1/hadoop/yarn/local/usercache/my_user_name/filecache/17/graphx_sp_2.10-1.0.jar
lrwxrwxrwx 1 my_user_name hadoop   54 Dec  8 20:21 __spark__.jar -> /hadoop/yarn/local/filecache/12/spark-hdp-assembly.jar
lrwxrwxrwx 1 my_user_name hadoop   85 Dec  8 20:21 __spark_conf__ -> /hadoop/yarn/local/usercache/my_user_name/filecache/16/__spark_conf__2528926660896665250.zip
-rw------- 1 my_user_name hadoop  340 Dec  8 20:21 container_tokens
-rwx------ 1 my_user_name hadoop 6230 Dec  8 20:21 launch_container.sh
drwxr-s--- 2 my_user_name hadoop 4096 Dec  8 20:21 tmp
find -L . -maxdepth 5 -ls:
69468230    4 drwxr-s---   3 my_user_name    hadoop       4096 Dec  8 20:21 .
69468231    4 drwxr-s---   2 my_user_name    hadoop       4096 Dec  8 20:21 ./tmp
7865232    4 drwx------   2 my_user_name    my_user_name        4096 Dec  8 20:21 ./__spark_conf__
7865265    4 -r-x------   1 my_user_name    my_user_name        2268 Dec  8 20:21 ./__spark_conf__/ssl-server.xml.example
7865239    8 -r-x------   1 my_user_name    my_user_name        7634 Dec  8 20:21 ./__spark_conf__/core-site.xml
7865253    4 -r-x------   1 my_user_name    my_user_name        1072 Dec  8 20:21 ./__spark_conf__/container-executor.cfg
7865256    4 -r-x------   1 my_user_name    my_user_name         856 Dec  8 20:21 ./__spark_conf__/mapred-env.sh
7865262    4 -r-x------   1 my_user_name    my_user_name        2081 Dec  8 20:21 ./__spark_conf__/topology_mappings.data
7865260    4 -r-x------   1 my_user_name    my_user_name        1308 Dec  8 20:21 ./__spark_conf__/hadoop-policy.xml
7865255    8 -r-x------   1 my_user_name    my_user_name        4221 Dec  8 20:21 ./__spark_conf__/task-log4j.properties
7865268    8 -r-x------   1 my_user_name    my_user_name        5511 Dec  8 20:21 ./__spark_conf__/kms-site.xml
7865245    4 -r-x------   1 my_user_name    my_user_name        2313 Dec  8 20:21 ./__spark_conf__/capacity-scheduler.xml
7865237    4 -r-x------   1 my_user_name    my_user_name        3979 Dec  8 20:21 ./__spark_conf__/hadoop-env.cmd
7865236    4 -r-x------   1 my_user_name    my_user_name        2207 Dec  8 20:21 ./__spark_conf__/hadoop-metrics2.properties
7865235    8 -r-x------   1 my_user_name    my_user_name        7353 Dec  8 20:21 ./__spark_conf__/mapred-site.xml
7865257    4 -r-x------   1 my_user_name    my_user_name        1602 Dec  8 20:21 ./__spark_conf__/health_check
7865248    4 -r-x------   1 my_user_name    my_user_name         758 Dec  8 20:21 ./__spark_conf__/mapred-site.xml.template
7865247    4 -r-x------   1 my_user_name    my_user_name        2358 Dec  8 20:21 ./__spark_conf__/topology_script.py
7865254    4 -r-x------   1 my_user_name    my_user_name        2490 Dec  8 20:21 ./__spark_conf__/hadoop-metrics.properties
7865252    4 -r-x------   1 my_user_name    my_user_name        1020 Dec  8 20:21 ./__spark_conf__/commons-logging.properties
7865250    8 -r-x------   1 my_user_name    my_user_name        5637 Dec  8 20:21 ./__spark_conf__/yarn-env.sh
7865249    4 -r-x------   1 my_user_name    my_user_name        1335 Dec  8 20:21 ./__spark_conf__/configuration.xsl
7865246    4 -r-x------   1 my_user_name    my_user_name        3518 Dec  8 20:21 ./__spark_conf__/kms-acls.xml
7865269    4 -r-x------   1 my_user_name    my_user_name         506 Dec  8 20:21 ./__spark_conf__/__spark_conf__.properties
7865263    4 -r-x------   1 my_user_name    my_user_name        1000 Dec  8 20:21 ./__spark_conf__/ssl-server.xml
7865259    4 -r-x------   1 my_user_name    my_user_name        1527 Dec  8 20:21 ./__spark_conf__/kms-env.sh
7865233    8 -r-x------   1 my_user_name    my_user_name        5410 Dec  8 20:21 ./__spark_conf__/hadoop-env.sh
7865267    8 -r-x------   1 my_user_name    my_user_name        4113 Dec  8 20:21 ./__spark_conf__/mapred-queues.xml.template
7865238   24 -r-x------   1 my_user_name    my_user_name       21013 Dec  8 20:21 ./__spark_conf__/yarn-site.xml
7865264    4 -r-x------   1 my_user_name    my_user_name         951 Dec  8 20:21 ./__spark_conf__/mapred-env.cmd
7865244    4 -r-x------   1 my_user_name    my_user_name         884 Dec  8 20:21 ./__spark_conf__/ssl-client.xml
7865241    0 -r-x------   1 my_user_name    my_user_name           0 Dec  8 20:21 ./__spark_conf__/yarn.exclude
7865258    4 -r-x------   1 my_user_name    my_user_name        2316 Dec  8 20:21 ./__spark_conf__/ssl-client.xml.example
7865251   12 -r-x------   1 my_user_name    my_user_name        8217 Dec  8 20:21 ./__spark_conf__/hdfs-site.xml
7865240    4 -r-x------   1 my_user_name    my_user_name           1 Dec  8 20:21 ./__spark_conf__/dfs.exclude
7865266    4 -r-x------   1 my_user_name    my_user_name         945 Dec  8 20:21 ./__spark_conf__/taskcontroller.cfg
7865242    4 -r-x------   1 my_user_name    my_user_name        1631 Dec  8 20:21 ./__spark_conf__/kms-log4j.properties
7865234   12 -r-x------   1 my_user_name    my_user_name        9313 Dec  8 20:21 ./__spark_conf__/log4j.properties
7865243    4 -r-x------   1 my_user_name    my_user_name        2250 Dec  8 20:21 ./__spark_conf__/yarn-env.cmd
7865261    4 -r-x------   1 my_user_name    my_user_name         760 Dec  8 20:21 ./__spark_conf__/slaves
69468183   16 -r-x------   1 my_user_name    my_user_name       14787 Dec  8 20:21 ./__app__.jar
69468233    4 -rw-------   1 my_user_name    hadoop        340 Dec  8 20:21 ./container_tokens
69468232    8 -rwx------   1 my_user_name    hadoop       6230 Dec  8 20:21 ./launch_container.sh
7864362 184304 -r-xr-xr-x   1 yarn     hadoop   188727178 Nov 10 10:03 ./__spark__.jar
broken symlinks(find -L . -maxdepth 5 -type l -ls):

End of LogType:directory.info

LogType:launch_container.sh
Log Upload Time:Thu Dec 08 20:22:08 +0000 2016
LogLength:6230
Log Contents:
#!/bin/bash

export SPARK_YARN_STAGING_DIR=".sparkStaging/application_1479877553404_0174"
export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/hdp/current/hadoop-client/conf"}
export JAVA_HOME=${JAVA_HOME:-"/usr/jdk64/jdk1.8.0_60"}
export SPARK_YARN_CACHE_FILES="hdfs://hdp-master1.xcat.cluster:8020/hdp/apps/2.5.0.0-1245/spark/spark-hdp-assembly.jar#__spark__.jar,hdfs://hdp-master1.xcat.cluster:8020/user/my_user_name/.sparkStaging/application_1479877553404_0174/graphx_sp_2.10-1.0.jar#__app__.jar"
export SPARK_LOG_URL_STDOUT="http://hdp-node28.xcat.cluster:8042/node/containerlogs/container_e34_1479877553404_0174_02_000003/my_user_name/stdout?start=-4096"
export NM_HOST="hdp-node28.xcat.cluster"
export SPARK_YARN_CACHE_FILES_FILE_SIZES="188727178,14787"
export SPARK_YARN_CACHE_ARCHIVES_TIME_STAMPS="1481228482907"
export LOGNAME="my_user_name"
export JVM_PID="$$"
export PWD="/hadoop_1/hadoop/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174/container_e34_1479877553404_0174_02_000003"
export LOCAL_DIRS="/hadoop/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174,/hadoop_1/hadoop/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174"
export NM_HTTP_PORT="8042"
export LOG_DIRS="/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000003,/hadoop_1/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000003"
export NM_AUX_SERVICE_mapreduce_shuffle="AAA0+gAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=
"
export NM_PORT="45454"
export SPARK_YARN_CACHE_FILES_TIME_STAMPS="1478064369306,1481228482852"
export USER="my_user_name"
export HADOOP_YARN_HOME=${HADOOP_YARN_HOME:-"/usr/hdp/current/hadoop-yarn-nodemanager"}
export CLASSPATH="$PWD:$PWD/__spark_conf__:$PWD/__spark__.jar:$HADOOP_CONF_DIR:/usr/hdp/current/hadoop-client/*:/usr/hdp/current/hadoop-client/lib/*:/usr/hdp/current/hadoop-hdfs-client/*:/usr/hdp/current/hadoop-hdfs-client/lib/*:/usr/hdp/current/hadoop-yarn-client/*:/usr/hdp/current/hadoop-yarn-client/lib/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/2.5.0.0-1245/hadoop/lib/hadoop-lzo-0.6.0.2.5.0.0-1245.jar:/etc/hadoop/conf/secure"
export SPARK_YARN_CACHE_ARCHIVES="hdfs://hdp-master1.xcat.cluster:8020/user/my_user_name/.sparkStaging/application_1479877553404_0174/__spark_conf__2528926660896665250.zip#__spark_conf__"
export SPARK_YARN_CACHE_ARCHIVES_FILE_SIZES="125215"
export SPARK_YARN_MODE="true"
export SPARK_YARN_CACHE_FILES_VISIBILITIES="PUBLIC,PRIVATE"
export HADOOP_TOKEN_FILE_LOCATION="/hadoop_1/hadoop/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174/container_e34_1479877553404_0174_02_000003/container_tokens"
export NM_AUX_SERVICE_spark_shuffle=""
export SPARK_USER="my_user_name"
export LOCAL_USER_DIRS="/hadoop/yarn/local/usercache/my_user_name/,/hadoop_1/hadoop/yarn/local/usercache/my_user_name/"
export SPARK_LOG_URL_STDERR="http://hdp-node28.xcat.cluster:8042/node/containerlogs/container_e34_1479877553404_0174_02_000003/my_user_name/stderr?start=-4096"
export SPARK_YARN_CACHE_ARCHIVES_VISIBILITIES="PRIVATE"
export HOME="/home/"
export NM_AUX_SERVICE_spark2_shuffle=""
export CONTAINER_ID="container_e34_1479877553404_0174_02_000003"
export MALLOC_ARENA_MAX="4"
ln -sf "/hadoop_1/hadoop/yarn/local/usercache/my_user_name/filecache/17/graphx_sp_2.10-1.0.jar" "__app__.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop/yarn/local/usercache/my_user_name/filecache/16/__spark_conf__2528926660896665250.zip" "__spark_conf__"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop/yarn/local/filecache/12/spark-hdp-assembly.jar" "__spark__.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
# Creating copy of launch script
cp "launch_container.sh" "/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000003/launch_container.sh"
chmod 640 "/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000003/launch_container.sh"
# Determining directory contents
echo "ls -l:" 1>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000003/directory.info"
ls -l 1>>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000003/directory.info"
echo "find -L . -maxdepth 5 -ls:" 1>>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000003/directory.info"
find -L . -maxdepth 5 -ls 1>>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000003/directory.info"
echo "broken symlinks(find -L . -maxdepth 5 -type l -ls):" 1>>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000003/directory.info"
find -L . -maxdepth 5 -type l -ls 1>>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000003/directory.info"
exec /bin/bash -c "$JAVA_HOME/bin/java -server -XX:OnOutOfMemoryError='kill %p' -Xms13312m -Xmx13312m -Djava.io.tmpdir=$PWD/tmp '-Dspark.driver.port=43398' '-Dspark.ui.port=0' -Dspark.yarn.app.container.log.dir=/hadoop_1/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000003 org.apache.spark.executor.CoarseGrainedExecutorBackend --driver-url spark://CoarseGrainedScheduler@10.143.65.9:43398 --executor-id 2 --hostname hdp-node28.xcat.cluster --cores 1 --app-id application_1479877553404_0174 --user-class-path file:$PWD/__app__.jar 1> /hadoop_1/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000003/stdout 2> /hadoop_1/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000003/stderr"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi

End of LogType:launch_container.sh

LogType:stderr
Log Upload Time:Thu Dec 08 20:22:08 +0000 2016
LogLength:4108
Log Contents:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/hadoop_1/yarn/local/filecache/12/spark-hdp-assembly.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.5.0.0-1245/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
16/12/08 20:21:56 INFO executor.CoarseGrainedExecutorBackend: Registered signal handlers for [TERM, HUP, INT]
16/12/08 20:21:56 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/12/08 20:21:57 INFO spark.SecurityManager: Changing view acls to: my_user_name
16/12/08 20:21:57 INFO spark.SecurityManager: Changing modify acls to: my_user_name
16/12/08 20:21:57 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(my_user_name); users with modify permissions: Set(my_user_name)
16/12/08 20:21:58 INFO executor.CoarseGrainedExecutorBackend: Will periodically update credentials from: hdfs://hdp-master1.xcat.cluster:8020/user/my_user_name/.sparkStaging/application_1479877553404_0174/credentials-b4345c5b-2f21-4de7-ab92-fea01c49a1c6
16/12/08 20:21:58 WARN shortcircuit.DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
16/12/08 20:21:59 INFO yarn.ExecutorDelegationTokenUpdater: Scheduling token refresh from HDFS in 69082991 millis.
16/12/08 20:21:59 INFO spark.SecurityManager: Changing view acls to: my_user_name
16/12/08 20:21:59 INFO spark.SecurityManager: Changing modify acls to: my_user_name
16/12/08 20:21:59 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(my_user_name); users with modify permissions: Set(my_user_name)
16/12/08 20:21:59 INFO slf4j.Slf4jLogger: Slf4jLogger started
16/12/08 20:21:59 INFO Remoting: Starting remoting
16/12/08 20:22:00 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkexecutoractorsys...@hdp-node28.xcat.cluster:44512]
16/12/08 20:22:00 INFO util.Utils: Successfully started service 'sparkExecutorActorSystem' on port 44512.
16/12/08 20:22:00 INFO storage.DiskBlockManager: Created local directory at /hadoop_1/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174/blockmgr-ebf3ca45-f7f6-42be-b0ef-9b8c8fcc731b
16/12/08 20:22:00 INFO storage.DiskBlockManager: Created local directory at /hadoop_1/hadoop/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174/blockmgr-1854d52c-ab30-4147-a27e-cb94247df02e
16/12/08 20:22:00 INFO storage.MemoryStore: MemoryStore started with capacity 9.1 GB
16/12/08 20:22:01 INFO executor.CoarseGrainedExecutorBackend: Connecting to driver: spark://CoarseGrainedScheduler@10.143.65.9:43398
16/12/08 20:22:01 INFO executor.CoarseGrainedExecutorBackend: Successfully registered with driver
16/12/08 20:22:01 INFO executor.Executor: Starting executor ID 2 on host hdp-node28.xcat.cluster
16/12/08 20:22:01 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 36248.
16/12/08 20:22:01 INFO netty.NettyBlockTransferService: Server created on 36248
16/12/08 20:22:01 INFO storage.BlockManagerMaster: Trying to register BlockManager
16/12/08 20:22:01 INFO storage.BlockManagerMaster: Registered BlockManager
16/12/08 20:22:06 INFO executor.CoarseGrainedExecutorBackend: Driver commanded a shutdown
16/12/08 20:22:06 INFO storage.MemoryStore: MemoryStore cleared
16/12/08 20:22:06 INFO storage.BlockManager: BlockManager stopped
16/12/08 20:22:06 WARN executor.CoarseGrainedExecutorBackend: An unknown (hdp-node9.xcat.cluster:43398) driver disconnected.
16/12/08 20:22:06 ERROR executor.CoarseGrainedExecutorBackend: Driver 10.143.65.9:43398 disassociated! Shutting down.
16/12/08 20:22:06 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/12/08 20:22:06 INFO util.ShutdownHookManager: Shutdown hook called

End of LogType:stderr

LogType:stdout
Log Upload Time:Thu Dec 08 20:22:08 +0000 2016
LogLength:0
Log Contents:

End of LogType:stdout

Container: container_e34_1479877553404_0174_01_000001 on hdp-node28.xcat.cluster_45454_1481228528273
====================================================================================================
LogType:directory.info
Log Upload Time:Thu Dec 08 20:22:08 +0000 2016
LogLength:5316
Log Contents:
ls -l:
total 24
lrwxrwxrwx 1 my_user_name hadoop   79 Dec  8 20:21 __app__.jar -> /hadoop_1/hadoop/yarn/local/usercache/my_user_name/filecache/17/graphx_sp_2.10-1.0.jar
lrwxrwxrwx 1 my_user_name hadoop   54 Dec  8 20:21 __spark__.jar -> /hadoop/yarn/local/filecache/12/spark-hdp-assembly.jar
lrwxrwxrwx 1 my_user_name hadoop   85 Dec  8 20:21 __spark_conf__ -> /hadoop/yarn/local/usercache/my_user_name/filecache/16/__spark_conf__2528926660896665250.zip
-rw------- 1 my_user_name hadoop  322 Dec  8 20:21 container_tokens
lrwxrwxrwx 1 my_user_name hadoop   54 Dec  8 20:21 keytab-a663e8b4-c007-4964-b12b-2d698412a057 -> /hadoop/yarn/local/usercache/my_user_name/filecache/18/keytab
-rwx------ 1 my_user_name hadoop 6185 Dec  8 20:21 launch_container.sh
drwxr-s--- 2 my_user_name hadoop 4096 Dec  8 20:21 tmp
find -L . -maxdepth 5 -ls:
69468226    4 drwxr-s---   3 my_user_name    hadoop       4096 Dec  8 20:21 .
69468227    4 drwxr-s---   2 my_user_name    hadoop       4096 Dec  8 20:21 ./tmp
7865232    4 drwx------   2 my_user_name    my_user_name        4096 Dec  8 20:21 ./__spark_conf__
7865265    4 -r-x------   1 my_user_name    my_user_name        2268 Dec  8 20:21 ./__spark_conf__/ssl-server.xml.example
7865239    8 -r-x------   1 my_user_name    my_user_name        7634 Dec  8 20:21 ./__spark_conf__/core-site.xml
7865253    4 -r-x------   1 my_user_name    my_user_name        1072 Dec  8 20:21 ./__spark_conf__/container-executor.cfg
7865256    4 -r-x------   1 my_user_name    my_user_name         856 Dec  8 20:21 ./__spark_conf__/mapred-env.sh
7865262    4 -r-x------   1 my_user_name    my_user_name        2081 Dec  8 20:21 ./__spark_conf__/topology_mappings.data
7865260    4 -r-x------   1 my_user_name    my_user_name        1308 Dec  8 20:21 ./__spark_conf__/hadoop-policy.xml
7865255    8 -r-x------   1 my_user_name    my_user_name        4221 Dec  8 20:21 ./__spark_conf__/task-log4j.properties
7865268    8 -r-x------   1 my_user_name    my_user_name        5511 Dec  8 20:21 ./__spark_conf__/kms-site.xml
7865245    4 -r-x------   1 my_user_name    my_user_name        2313 Dec  8 20:21 ./__spark_conf__/capacity-scheduler.xml
7865237    4 -r-x------   1 my_user_name    my_user_name        3979 Dec  8 20:21 ./__spark_conf__/hadoop-env.cmd
7865236    4 -r-x------   1 my_user_name    my_user_name        2207 Dec  8 20:21 ./__spark_conf__/hadoop-metrics2.properties
7865235    8 -r-x------   1 my_user_name    my_user_name        7353 Dec  8 20:21 ./__spark_conf__/mapred-site.xml
7865257    4 -r-x------   1 my_user_name    my_user_name        1602 Dec  8 20:21 ./__spark_conf__/health_check
7865248    4 -r-x------   1 my_user_name    my_user_name         758 Dec  8 20:21 ./__spark_conf__/mapred-site.xml.template
7865247    4 -r-x------   1 my_user_name    my_user_name        2358 Dec  8 20:21 ./__spark_conf__/topology_script.py
7865254    4 -r-x------   1 my_user_name    my_user_name        2490 Dec  8 20:21 ./__spark_conf__/hadoop-metrics.properties
7865252    4 -r-x------   1 my_user_name    my_user_name        1020 Dec  8 20:21 ./__spark_conf__/commons-logging.properties
7865250    8 -r-x------   1 my_user_name    my_user_name        5637 Dec  8 20:21 ./__spark_conf__/yarn-env.sh
7865249    4 -r-x------   1 my_user_name    my_user_name        1335 Dec  8 20:21 ./__spark_conf__/configuration.xsl
7865246    4 -r-x------   1 my_user_name    my_user_name        3518 Dec  8 20:21 ./__spark_conf__/kms-acls.xml
7865269    4 -r-x------   1 my_user_name    my_user_name         506 Dec  8 20:21 ./__spark_conf__/__spark_conf__.properties
7865263    4 -r-x------   1 my_user_name    my_user_name        1000 Dec  8 20:21 ./__spark_conf__/ssl-server.xml
7865259    4 -r-x------   1 my_user_name    my_user_name        1527 Dec  8 20:21 ./__spark_conf__/kms-env.sh
7865233    8 -r-x------   1 my_user_name    my_user_name        5410 Dec  8 20:21 ./__spark_conf__/hadoop-env.sh
7865267    8 -r-x------   1 my_user_name    my_user_name        4113 Dec  8 20:21 ./__spark_conf__/mapred-queues.xml.template
7865238   24 -r-x------   1 my_user_name    my_user_name       21013 Dec  8 20:21 ./__spark_conf__/yarn-site.xml
7865264    4 -r-x------   1 my_user_name    my_user_name         951 Dec  8 20:21 ./__spark_conf__/mapred-env.cmd
7865244    4 -r-x------   1 my_user_name    my_user_name         884 Dec  8 20:21 ./__spark_conf__/ssl-client.xml
7865241    0 -r-x------   1 my_user_name    my_user_name           0 Dec  8 20:21 ./__spark_conf__/yarn.exclude
7865258    4 -r-x------   1 my_user_name    my_user_name        2316 Dec  8 20:21 ./__spark_conf__/ssl-client.xml.example
7865251   12 -r-x------   1 my_user_name    my_user_name        8217 Dec  8 20:21 ./__spark_conf__/hdfs-site.xml
7865240    4 -r-x------   1 my_user_name    my_user_name           1 Dec  8 20:21 ./__spark_conf__/dfs.exclude
7865266    4 -r-x------   1 my_user_name    my_user_name         945 Dec  8 20:21 ./__spark_conf__/taskcontroller.cfg
7865242    4 -r-x------   1 my_user_name    my_user_name        1631 Dec  8 20:21 ./__spark_conf__/kms-log4j.properties
7865234   12 -r-x------   1 my_user_name    my_user_name        9313 Dec  8 20:21 ./__spark_conf__/log4j.properties
7865243    4 -r-x------   1 my_user_name    my_user_name        2250 Dec  8 20:21 ./__spark_conf__/yarn-env.cmd
7865261    4 -r-x------   1 my_user_name    my_user_name         760 Dec  8 20:21 ./__spark_conf__/slaves
69468183   16 -r-x------   1 my_user_name    my_user_name       14787 Dec  8 20:21 ./__app__.jar
7865231    4 -r-x------   1 my_user_name    my_user_name          75 Dec  8 20:21 ./keytab-a663e8b4-c007-4964-b12b-2d698412a057
69468229    4 -rw-------   1 my_user_name    hadoop        322 Dec  8 20:21 ./container_tokens
69468228    8 -rwx------   1 my_user_name    hadoop       6185 Dec  8 20:21 ./launch_container.sh
7864362 184304 -r-xr-xr-x   1 yarn     hadoop   188727178 Nov 10 10:03 ./__spark__.jar
broken symlinks(find -L . -maxdepth 5 -type l -ls):

End of LogType:directory.info

LogType:launch_container.sh
Log Upload Time:Thu Dec 08 20:22:08 +0000 2016
LogLength:6185
Log Contents:
#!/bin/bash

export SPARK_YARN_STAGING_DIR=".sparkStaging/application_1479877553404_0174"
export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/hdp/current/hadoop-client/conf"}
export MAX_APP_ATTEMPTS="2"
export JAVA_HOME=${JAVA_HOME:-"/usr/jdk64/jdk1.8.0_60"}
export SPARK_YARN_CACHE_FILES="hdfs://hdp-master1.xcat.cluster:8020/hdp/apps/2.5.0.0-1245/spark/spark-hdp-assembly.jar#__spark__.jar,hdfs://hdp-master1.xcat.cluster:8020/user/my_user_name/.sparkStaging/application_1479877553404_0174/graphx_sp_2.10-1.0.jar#__app__.jar"
export APP_SUBMIT_TIME_ENV="1481228483091"
export NM_HOST="hdp-node28.xcat.cluster"
export SPARK_YARN_CACHE_FILES_FILE_SIZES="188727178,14787"
export SPARK_YARN_CACHE_ARCHIVES_TIME_STAMPS="1481228482907"
export LOGNAME="my_user_name"
export JVM_PID="$$"
export PWD="/hadoop_1/hadoop/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174/container_e34_1479877553404_0174_01_000001"
export LOCAL_DIRS="/hadoop/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174,/hadoop_1/hadoop/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174"
export APPLICATION_WEB_PROXY_BASE="/proxy/application_1479877553404_0174"
export NM_HTTP_PORT="8042"
export LOG_DIRS="/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000001,/hadoop_1/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000001"
export NM_AUX_SERVICE_mapreduce_shuffle="AAA0+gAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=
"
export NM_PORT="45454"
export SPARK_YARN_CACHE_FILES_TIME_STAMPS="1478064369306,1481228482852"
export USER="my_user_name"
export HADOOP_YARN_HOME=${HADOOP_YARN_HOME:-"/usr/hdp/current/hadoop-yarn-nodemanager"}
export SPARK_YARN_CACHE_ARCHIVES="hdfs://hdp-master1.xcat.cluster:8020/user/my_user_name/.sparkStaging/application_1479877553404_0174/__spark_conf__2528926660896665250.zip#__spark_conf__"
export CLASSPATH="$PWD:$PWD/__spark_conf__:$PWD/__spark__.jar:$HADOOP_CONF_DIR:/usr/hdp/current/hadoop-client/*:/usr/hdp/current/hadoop-client/lib/*:/usr/hdp/current/hadoop-hdfs-client/*:/usr/hdp/current/hadoop-hdfs-client/lib/*:/usr/hdp/current/hadoop-yarn-client/*:/usr/hdp/current/hadoop-yarn-client/lib/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/2.5.0.0-1245/hadoop/lib/hadoop-lzo-0.6.0.2.5.0.0-1245.jar:/etc/hadoop/conf/secure"
export SPARK_YARN_CACHE_ARCHIVES_FILE_SIZES="125215"
export SPARK_YARN_MODE="true"
export SPARK_YARN_CACHE_FILES_VISIBILITIES="PUBLIC,PRIVATE"
export HADOOP_TOKEN_FILE_LOCATION="/hadoop_1/hadoop/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174/container_e34_1479877553404_0174_01_000001/container_tokens"
export NM_AUX_SERVICE_spark_shuffle=""
export SPARK_USER="my_user_name"
export LOCAL_USER_DIRS="/hadoop/yarn/local/usercache/my_user_name/,/hadoop_1/hadoop/yarn/local/usercache/my_user_name/"
export SPARK_YARN_CACHE_ARCHIVES_VISIBILITIES="PRIVATE"
export HOME="/home/"
export NM_AUX_SERVICE_spark2_shuffle=""
export CONTAINER_ID="container_e34_1479877553404_0174_01_000001"
export MALLOC_ARENA_MAX="4"
ln -sf "/hadoop_1/hadoop/yarn/local/usercache/my_user_name/filecache/17/graphx_sp_2.10-1.0.jar" "__app__.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop/yarn/local/usercache/my_user_name/filecache/16/__spark_conf__2528926660896665250.zip" "__spark_conf__"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop/yarn/local/filecache/12/spark-hdp-assembly.jar" "__spark__.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop/yarn/local/usercache/my_user_name/filecache/18/keytab" "keytab-a663e8b4-c007-4964-b12b-2d698412a057"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
# Creating copy of launch script
cp "launch_container.sh" "/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000001/launch_container.sh"
chmod 640 "/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000001/launch_container.sh"
# Determining directory contents
echo "ls -l:" 1>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000001/directory.info"
ls -l 1>>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000001/directory.info"
echo "find -L . -maxdepth 5 -ls:" 1>>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000001/directory.info"
find -L . -maxdepth 5 -ls 1>>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000001/directory.info"
echo "broken symlinks(find -L . -maxdepth 5 -type l -ls):" 1>>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000001/directory.info"
find -L . -maxdepth 5 -type l -ls 1>>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000001/directory.info"
exec /bin/bash -c "$JAVA_HOME/bin/java -server -Xmx1024m -Djava.io.tmpdir=$PWD/tmp -Dhdp.version=2.5.0.0-1245 -Dspark.yarn.app.container.log.dir=/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000001 org.apache.spark.deploy.yarn.ApplicationMaster --class 'graphx_sp' --jar file:/home/my_user_name/Aoife/spark-abm/target/scala-2.10/graphx_sp_2.10-1.0.jar --executor-memory 13312m --executor-cores 1 --properties-file $PWD/__spark_conf__/__spark_conf__.properties 1> /hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000001/stdout 2> /hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000001/stderr"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi

End of LogType:launch_container.sh
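
For reference, the original submit command is not in these logs. A command consistent with the AM launch context above — class graphx_sp, 13312m single-core executors, and a localized keytab — would look roughly like the sketch below; REALM and the keytab path are placeholders, not values taken from the logs:

    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --class graphx_sp \
      --principal my_user_name@REALM \
      --keytab /path/to/my_user_name.keytab \
      --num-executors 2 \
      --executor-cores 1 \
      --executor-memory 13312m \
      /home/my_user_name/Aoife/spark-abm/target/scala-2.10/graphx_sp_2.10-1.0.jar

The --principal/--keytab pair is what produces the keytab-a663e8b4-... symlink in the container directory above and the "Scheduling login from keytab" line in the AM's stderr below.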

LogType:stderr
Log Upload Time:Thu Dec 08 20:22:08 +0000 2016
LogLength:20037
Log Contents:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in 
[jar:file:/hadoop_1/yarn/local/filecache/12/spark-hdp-assembly.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in 
[jar:file:/usr/hdp/2.5.0.0-1245/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
16/12/08 20:21:27 INFO yarn.ApplicationMaster: Registered signal handlers for 
[TERM, HUP, INT]
16/12/08 20:21:28 WARN util.NativeCodeLoader: Unable to load native-hadoop 
library for your platform... using builtin-java classes where applicable
16/12/08 20:21:28 INFO yarn.ApplicationMaster: ApplicationAttemptId: 
appattempt_1479877553404_0174_000001
16/12/08 20:21:30 WARN shortcircuit.DomainSocketFactory: The short-circuit 
local reads feature cannot be used because libhadoop cannot be loaded.
16/12/08 20:21:30 INFO spark.SecurityManager: Changing view acls to: 
my_user_name
16/12/08 20:21:30 INFO spark.SecurityManager: Changing modify acls to: 
my_user_name
16/12/08 20:21:30 INFO spark.SecurityManager: SecurityManager: authentication 
disabled; ui acls disabled; users with view permissions: Set(my_user_name); 
users with modify permissions: Set(my_user_name)
16/12/08 20:21:30 INFO yarn.AMDelegationTokenRenewer: Scheduling login from 
keytab in 64791596 millis.
16/12/08 20:21:30 INFO yarn.ApplicationMaster: Starting the user application in 
a separate Thread
16/12/08 20:21:30 INFO yarn.ApplicationMaster: Waiting for spark context 
initialization
16/12/08 20:21:30 INFO yarn.ApplicationMaster: Waiting for spark context 
initialization ... 
16/12/08 20:21:30 INFO spark.SparkContext: Running Spark version 1.6.2
16/12/08 20:21:30 INFO spark.SecurityManager: Changing view acls to: 
my_user_name
16/12/08 20:21:30 INFO spark.SecurityManager: Changing modify acls to: 
my_user_name
16/12/08 20:21:30 INFO spark.SecurityManager: SecurityManager: authentication 
disabled; ui acls disabled; users with view permissions: Set(my_user_name); 
users with modify permissions: Set(my_user_name)
16/12/08 20:21:30 INFO util.Utils: Successfully started service 'sparkDriver' 
on port 33211.
16/12/08 20:21:31 INFO slf4j.Slf4jLogger: Slf4jLogger started
16/12/08 20:21:31 INFO Remoting: Starting remoting
16/12/08 20:21:31 INFO Remoting: Remoting started; listening on addresses 
:[akka.tcp://sparkDriverActorSystem@10.143.65.28:42080]
16/12/08 20:21:31 INFO util.Utils: Successfully started service 
'sparkDriverActorSystem' on port 42080.
16/12/08 20:21:31 INFO spark.SparkEnv: Registering MapOutputTracker
16/12/08 20:21:31 INFO spark.SparkEnv: Registering BlockManagerMaster
16/12/08 20:21:31 INFO storage.DiskBlockManager: Created local directory at 
/hadoop_1/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174/blockmgr-360291cd-32f7-452f-9c4a-db79992b1d9b
16/12/08 20:21:31 INFO storage.DiskBlockManager: Created local directory at 
/hadoop_1/hadoop/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174/blockmgr-2fc5160d-33fd-40f9-a260-71b55917ec02
16/12/08 20:21:31 INFO storage.MemoryStore: MemoryStore started with capacity 
505.5 MB
16/12/08 20:21:31 INFO spark.SparkEnv: Registering OutputCommitCoordinator
16/12/08 20:21:31 INFO ui.JettyUtils: Adding filter: 
org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
16/12/08 20:21:32 INFO server.Server: jetty-8.y.z-SNAPSHOT
16/12/08 20:21:32 INFO server.AbstractConnector: Started 
SelectChannelConnector@0.0.0.0:45342
16/12/08 20:21:32 INFO util.Utils: Successfully started service 'SparkUI' on 
port 45342.
16/12/08 20:21:32 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at 
http://10.143.65.28:45342
16/12/08 20:21:32 INFO cluster.YarnClusterScheduler: Created 
YarnClusterScheduler
16/12/08 20:21:32 INFO cluster.SchedulerExtensionServices: Starting Yarn 
extension services with app application_1479877553404_0174 and attemptId 
Some(appattempt_1479877553404_0174_000001)
16/12/08 20:21:32 INFO util.Utils: Successfully started service 
'org.apache.spark.network.netty.NettyBlockTransferService' on port 37599.
16/12/08 20:21:32 INFO netty.NettyBlockTransferService: Server created on 37599
16/12/08 20:21:32 INFO storage.BlockManagerMaster: Trying to register 
BlockManager
16/12/08 20:21:32 INFO storage.BlockManagerMasterEndpoint: Registering block 
manager 10.143.65.28:37599 with 505.5 MB RAM, BlockManagerId(driver, 
10.143.65.28, 37599)
16/12/08 20:21:32 INFO storage.BlockManagerMaster: Registered BlockManager
16/12/08 20:21:32 INFO cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: 
ApplicationMaster registered as 
NettyRpcEndpointRef(spark://YarnAM@10.143.65.28:33211)
16/12/08 20:21:32 INFO client.RMProxy: Connecting to ResourceManager at 
hdp-master1.xcat.cluster/10.143.65.254:8030
16/12/08 20:21:32 INFO yarn.YarnRMClient: Registering the ApplicationMaster
16/12/08 20:21:32 INFO yarn.YarnAllocator: Will request 2 executor containers, 
each with 1 cores and 14643 MB memory including 1331 MB overhead
16/12/08 20:21:32 INFO yarn.YarnAllocator: Container request (host: Any, 
capability: <memory:14643, vCores:1>)
16/12/08 20:21:32 INFO yarn.YarnAllocator: Container request (host: Any, 
capability: <memory:14643, vCores:1>)
16/12/08 20:21:32 INFO yarn.ApplicationMaster: Started progress reporter thread 
with (heartbeat : 3000, initial allocation : 200) intervals
16/12/08 20:21:33 INFO impl.AMRMClientImpl: Received new token for : 
hdp-node30.xcat.cluster:45454
16/12/08 20:21:33 INFO impl.AMRMClientImpl: Received new token for : 
hdp-node12.xcat.cluster:45454
16/12/08 20:21:33 INFO yarn.YarnAllocator: Launching container 
container_e34_1479877553404_0174_01_000002 for on host hdp-node30.xcat.cluster
16/12/08 20:21:33 INFO yarn.YarnAllocator: Launching ExecutorRunnable. 
driverUrl: spark://CoarseGrainedScheduler@10.143.65.28:33211,  
executorHostname: hdp-node30.xcat.cluster
16/12/08 20:21:33 INFO yarn.YarnAllocator: Launching container 
container_e34_1479877553404_0174_01_000003 for on host hdp-node12.xcat.cluster
16/12/08 20:21:33 INFO yarn.ExecutorRunnable: Starting Executor Container
16/12/08 20:21:33 INFO yarn.YarnAllocator: Launching ExecutorRunnable. 
driverUrl: spark://CoarseGrainedScheduler@10.143.65.28:33211,  
executorHostname: hdp-node12.xcat.cluster
16/12/08 20:21:33 INFO yarn.ExecutorRunnable: Starting Executor Container
16/12/08 20:21:33 INFO yarn.YarnAllocator: Received 2 containers from YARN, 
launching executors on 2 of them.
16/12/08 20:21:33 INFO impl.ContainerManagementProtocolProxy: 
yarn.client.max-cached-nodemanagers-proxies : 0
16/12/08 20:21:33 INFO impl.ContainerManagementProtocolProxy: 
yarn.client.max-cached-nodemanagers-proxies : 0
16/12/08 20:21:33 INFO yarn.ExecutorRunnable: Setting up ContainerLaunchContext
16/12/08 20:21:33 INFO yarn.ExecutorRunnable: Setting up ContainerLaunchContext
16/12/08 20:21:33 INFO yarn.ExecutorRunnable: Preparing Local resources
16/12/08 20:21:33 INFO yarn.ExecutorRunnable: Preparing Local resources
16/12/08 20:21:33 INFO yarn.ExecutorRunnable: Prepared Local resources 
Map(__app__.jar -> resource { scheme: "hdfs" host: "hdp-master1.xcat.cluster" 
port: 8020 file: 
"/user/my_user_name/.sparkStaging/application_1479877553404_0174/graphx_sp_2.10-1.0.jar"
 } size: 14787 timestamp: 1481228482852 type: FILE visibility: PRIVATE, 
__spark__.jar -> resource { scheme: "hdfs" host: "hdp-master1.xcat.cluster" 
port: 8020 file: "/hdp/apps/2.5.0.0-1245/spark/spark-hdp-assembly.jar" } size: 
188727178 timestamp: 1478064369306 type: FILE visibility: PUBLIC, 
__spark_conf__ -> resource { scheme: "hdfs" host: "hdp-master1.xcat.cluster" 
port: 8020 file: 
"/user/my_user_name/.sparkStaging/application_1479877553404_0174/__spark_conf__2528926660896665250.zip"
 } size: 125215 timestamp: 1481228482907 type: ARCHIVE visibility: PRIVATE)
16/12/08 20:21:33 INFO yarn.ExecutorRunnable: Prepared Local resources 
Map(__app__.jar -> resource { scheme: "hdfs" host: "hdp-master1.xcat.cluster" 
port: 8020 file: 
"/user/my_user_name/.sparkStaging/application_1479877553404_0174/graphx_sp_2.10-1.0.jar"
 } size: 14787 timestamp: 1481228482852 type: FILE visibility: PRIVATE, 
__spark__.jar -> resource { scheme: "hdfs" host: "hdp-master1.xcat.cluster" 
port: 8020 file: "/hdp/apps/2.5.0.0-1245/spark/spark-hdp-assembly.jar" } size: 
188727178 timestamp: 1478064369306 type: FILE visibility: PUBLIC, 
__spark_conf__ -> resource { scheme: "hdfs" host: "hdp-master1.xcat.cluster" 
port: 8020 file: 
"/user/my_user_name/.sparkStaging/application_1479877553404_0174/__spark_conf__2528926660896665250.zip"
 } size: 125215 timestamp: 1481228482907 type: ARCHIVE visibility: PRIVATE)
16/12/08 20:21:33 INFO yarn.Client: Using the spark assembly jar on HDFS 
because you are using HDP, 
defaultSparkAssembly:hdfs://hdp-master1.xcat.cluster:8020/hdp/apps/2.5.0.0-1245/spark/spark-hdp-assembly.jar
16/12/08 20:21:33 INFO yarn.Client: Using the spark assembly jar on HDFS 
because you are using HDP, 
defaultSparkAssembly:hdfs://hdp-master1.xcat.cluster:8020/hdp/apps/2.5.0.0-1245/spark/spark-hdp-assembly.jar
16/12/08 20:21:33 INFO yarn.ExecutorRunnable: 
===============================================================================
YARN executor launch context:
  env:
    CLASSPATH -> 
{{PWD}}<CPS>{{PWD}}/__spark_conf__<CPS>{{PWD}}/__spark__.jar<CPS>$HADOOP_CONF_DIR<CPS>/usr/hdp/current/hadoop-client/*<CPS>/usr/hdp/current/hadoop-client/lib/*<CPS>/usr/hdp/current/hadoop-hdfs-client/*<CPS>/usr/hdp/current/hadoop-hdfs-client/lib/*<CPS>/usr/hdp/current/hadoop-yarn-client/*<CPS>/usr/hdp/current/hadoop-yarn-client/lib/*<CPS>$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/2.5.0.0-1245/hadoop/lib/hadoop-lzo-0.6.0.2.5.0.0-1245.jar:/etc/hadoop/conf/secure
    SPARK_YARN_CACHE_ARCHIVES -> 
hdfs://hdp-master1.xcat.cluster:8020/user/my_user_name/.sparkStaging/application_1479877553404_0174/__spark_conf__2528926660896665250.zip#__spark_conf__
    SPARK_LOG_URL_STDERR -> 
http://hdp-node30.xcat.cluster:8042/node/containerlogs/container_e34_1479877553404_0174_01_000002/my_user_name/stderr?start=-4096
    SPARK_YARN_CACHE_FILES_FILE_SIZES -> 188727178,14787
    SPARK_YARN_STAGING_DIR -> .sparkStaging/application_1479877553404_0174
    SPARK_YARN_CACHE_ARCHIVES_FILE_SIZES -> 125215
    SPARK_YARN_CACHE_FILES_VISIBILITIES -> PUBLIC,PRIVATE
    SPARK_USER -> my_user_name
    SPARK_YARN_CACHE_ARCHIVES_TIME_STAMPS -> 1481228482907
    SPARK_YARN_MODE -> true
    SPARK_YARN_CACHE_FILES_TIME_STAMPS -> 1478064369306,1481228482852
    SPARK_LOG_URL_STDOUT -> 
http://hdp-node30.xcat.cluster:8042/node/containerlogs/container_e34_1479877553404_0174_01_000002/my_user_name/stdout?start=-4096
    SPARK_YARN_CACHE_FILES -> 
hdfs://hdp-master1.xcat.cluster:8020/hdp/apps/2.5.0.0-1245/spark/spark-hdp-assembly.jar#__spark__.jar,hdfs://hdp-master1.xcat.cluster:8020/user/my_user_name/.sparkStaging/application_1479877553404_0174/graphx_sp_2.10-1.0.jar#__app__.jar
    SPARK_YARN_CACHE_ARCHIVES_VISIBILITIES -> PRIVATE

  command:
    {{JAVA_HOME}}/bin/java -server -XX:OnOutOfMemoryError='kill %p' -Xms13312m 
-Xmx13312m -Djava.io.tmpdir={{PWD}}/tmp '-Dspark.driver.port=33211' 
'-Dspark.ui.port=0' -Dspark.yarn.app.container.log.dir=<LOG_DIR> 
org.apache.spark.executor.CoarseGrainedExecutorBackend --driver-url 
spark://CoarseGrainedScheduler@10.143.65.28:33211 --executor-id 1 --hostname 
hdp-node30.xcat.cluster --cores 1 --app-id application_1479877553404_0174 
--user-class-path file:$PWD/__app__.jar 1> <LOG_DIR>/stdout 2> <LOG_DIR>/stderr
===============================================================================
      
16/12/08 20:21:33 INFO yarn.ExecutorRunnable: 
===============================================================================
YARN executor launch context:
  env:
    CLASSPATH -> 
{{PWD}}<CPS>{{PWD}}/__spark_conf__<CPS>{{PWD}}/__spark__.jar<CPS>$HADOOP_CONF_DIR<CPS>/usr/hdp/current/hadoop-client/*<CPS>/usr/hdp/current/hadoop-client/lib/*<CPS>/usr/hdp/current/hadoop-hdfs-client/*<CPS>/usr/hdp/current/hadoop-hdfs-client/lib/*<CPS>/usr/hdp/current/hadoop-yarn-client/*<CPS>/usr/hdp/current/hadoop-yarn-client/lib/*<CPS>$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/2.5.0.0-1245/hadoop/lib/hadoop-lzo-0.6.0.2.5.0.0-1245.jar:/etc/hadoop/conf/secure
    SPARK_YARN_CACHE_ARCHIVES -> 
hdfs://hdp-master1.xcat.cluster:8020/user/my_user_name/.sparkStaging/application_1479877553404_0174/__spark_conf__2528926660896665250.zip#__spark_conf__
    SPARK_LOG_URL_STDERR -> 
http://hdp-node12.xcat.cluster:8042/node/containerlogs/container_e34_1479877553404_0174_01_000003/my_user_name/stderr?start=-4096
    SPARK_YARN_CACHE_FILES_FILE_SIZES -> 188727178,14787
    SPARK_YARN_STAGING_DIR -> .sparkStaging/application_1479877553404_0174
    SPARK_YARN_CACHE_ARCHIVES_FILE_SIZES -> 125215
    SPARK_YARN_CACHE_FILES_VISIBILITIES -> PUBLIC,PRIVATE
    SPARK_USER -> my_user_name
    SPARK_YARN_CACHE_ARCHIVES_TIME_STAMPS -> 1481228482907
    SPARK_YARN_MODE -> true
    SPARK_YARN_CACHE_FILES_TIME_STAMPS -> 1478064369306,1481228482852
    SPARK_LOG_URL_STDOUT -> 
http://hdp-node12.xcat.cluster:8042/node/containerlogs/container_e34_1479877553404_0174_01_000003/my_user_name/stdout?start=-4096
    SPARK_YARN_CACHE_FILES -> 
hdfs://hdp-master1.xcat.cluster:8020/hdp/apps/2.5.0.0-1245/spark/spark-hdp-assembly.jar#__spark__.jar,hdfs://hdp-master1.xcat.cluster:8020/user/my_user_name/.sparkStaging/application_1479877553404_0174/graphx_sp_2.10-1.0.jar#__app__.jar
    SPARK_YARN_CACHE_ARCHIVES_VISIBILITIES -> PRIVATE

  command:
    {{JAVA_HOME}}/bin/java -server -XX:OnOutOfMemoryError='kill %p' -Xms13312m 
-Xmx13312m -Djava.io.tmpdir={{PWD}}/tmp '-Dspark.driver.port=33211' 
'-Dspark.ui.port=0' -Dspark.yarn.app.container.log.dir=<LOG_DIR> 
org.apache.spark.executor.CoarseGrainedExecutorBackend --driver-url 
spark://CoarseGrainedScheduler@10.143.65.28:33211 --executor-id 2 --hostname 
hdp-node12.xcat.cluster --cores 1 --app-id application_1479877553404_0174 
--user-class-path file:$PWD/__app__.jar 1> <LOG_DIR>/stdout 2> <LOG_DIR>/stderr
===============================================================================
      
16/12/08 20:21:33 INFO impl.ContainerManagementProtocolProxy: Opening proxy : 
hdp-node12.xcat.cluster:45454
16/12/08 20:21:33 INFO impl.ContainerManagementProtocolProxy: Opening proxy : 
hdp-node30.xcat.cluster:45454
16/12/08 20:21:42 INFO cluster.YarnClusterSchedulerBackend: Registered executor 
NettyRpcEndpointRef(null) (hdp-node30.xcat.cluster:50002) with ID 1
16/12/08 20:21:43 INFO storage.BlockManagerMasterEndpoint: Registering block 
manager hdp-node30.xcat.cluster:45953 with 9.1 GB RAM, BlockManagerId(1, 
hdp-node30.xcat.cluster, 45953)
16/12/08 20:21:43 INFO cluster.YarnClusterSchedulerBackend: Registered executor 
NettyRpcEndpointRef(null) (hdp-node12.xcat.cluster:37572) with ID 2
16/12/08 20:21:43 INFO storage.BlockManagerMasterEndpoint: Registering block 
manager hdp-node12.xcat.cluster:45874 with 9.1 GB RAM, BlockManagerId(2, 
hdp-node12.xcat.cluster, 45874)
16/12/08 20:21:43 INFO cluster.YarnClusterSchedulerBackend: SchedulerBackend is 
ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
16/12/08 20:21:43 INFO cluster.YarnClusterScheduler: 
YarnClusterScheduler.postStartHook done
16/12/08 20:21:44 ERROR yarn.ApplicationMaster: User class threw exception: 
org.apache.hadoop.security.AccessControlException: Authentication required
org.apache.hadoop.security.AccessControlException: Authentication required
        at 
org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:457)
        at 
org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$200(WebHdfsFileSystem.java:113)
        at 
org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.runWithRetry(WebHdfsFileSystem.java:738)
        at 
org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.access$100(WebHdfsFileSystem.java:582)
        at 
org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner$1.run(WebHdfsFileSystem.java:612)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
        at 
org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.run(WebHdfsFileSystem.java:608)
        at 
org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getDelegationToken(WebHdfsFileSystem.java:1507)
        at 
org.apache.hadoop.fs.FileSystem.collectDelegationTokens(FileSystem.java:545)
        at 
org.apache.hadoop.fs.FileSystem.addDelegationTokens(FileSystem.java:523)
        at 
org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodesInternal(TokenCache.java:140)
        at 
org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodesInternal(TokenCache.java:100)
        at 
org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodes(TokenCache.java:80)
        at 
org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:206)
        at 
org.apache.hadoop.mapred.SequenceFileInputFormat.listStatus(SequenceFileInputFormat.java:45)
        at 
org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:315)
        at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:199)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:242)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:240)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:240)
        at 
org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:242)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:240)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:240)
        at 
org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:242)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:240)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:240)
        at org.apache.spark.graphx.EdgeRDD.getPartitions(EdgeRDD.scala:48)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:242)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:240)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:240)
        at 
org.apache.spark.graphx.impl.EdgeRDDImpl$$anonfun$1.apply(EdgeRDDImpl.scala:49)
        at 
org.apache.spark.graphx.impl.EdgeRDDImpl$$anonfun$1.apply(EdgeRDDImpl.scala:49)
        at scala.Option.orElse(Option.scala:257)
        at org.apache.spark.graphx.impl.EdgeRDDImpl.<init>(EdgeRDDImpl.scala:49)
        at 
org.apache.spark.graphx.EdgeRDD$.fromEdgePartitions(EdgeRDD.scala:124)
        at org.apache.spark.graphx.EdgeRDD$.fromEdges(EdgeRDD.scala:113)
        at org.apache.spark.graphx.impl.GraphImpl$.apply(GraphImpl.scala:334)
        at org.apache.spark.graphx.Graph$.apply(Graph.scala:576)
        at graphx_sp$.main(graphx_sp.scala:52)
        at graphx_sp.main(graphx_sp.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at 
org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:559)

End of LogType:stderr
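
The stack trace above is where the job actually dies: the first action on the graph at graphx_sp.scala:52 makes FileInputFormat list the input paths, TokenCache then asks the NameNode's WebHDFS endpoint for a delegation token, and the NameNode answers "Authentication required". That is the usual symptom of touching a webhdfs:// URI on a Kerberos-secured cluster without SPNEGO. A quick way to reproduce from an edge node — assuming curl built with GSS support and 50070 as the NameNode HTTP port, which is the HDP default but is not shown in these logs:

    # without SPNEGO -> the same "Authentication required" the AM got
    curl -i "http://hdp-master1.xcat.cluster:50070/webhdfs/v1/user/my_user_name?op=GETFILESTATUS"
    # with a Kerberos ticket (kinit) and SPNEGO negotiation -> succeeds
    curl -i --negotiate -u : "http://hdp-master1.xcat.cluster:50070/webhdfs/v1/user/my_user_name?op=GETFILESTATUS"

If the input paths in graphx_sp.scala use the webhdfs:// scheme, pointing them at the RPC scheme instead (e.g. hdfs://hdp-master1.xcat.cluster:8020/...) should let the job use the delegation tokens YARN already localized for the container (see HADOOP_TOKEN_FILE_LOCATION in launch_container.sh above).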

LogType:stdout
Log Upload Time:Thu Dec 08 20:22:08 +0000 2016
LogLength:81
Log Contents:
Loading files from HDFS
vertices RDD loaded from HDFS
edges RDD loaded from HDFS

End of LogType:stdout

Container: container_e34_1479877553404_0174_01_000002 on 
hdp-node30.xcat.cluster_45454_1481228528183
====================================================================================================
LogType:directory.info
Log Upload Time:Thu Dec 08 20:22:08 +0000 2016
LogLength:5147
Log Contents:
ls -l:
total 28
lrwxrwxrwx 1 my_user_name hadoop   70 Dec  8 20:21 __app__.jar -> 
/hadoop/yarn/local/usercache/my_user_name/filecache/38/graphx_sp_2.10-1.0.jar
lrwxrwxrwx 1 my_user_name hadoop   63 Dec  8 20:21 __spark__.jar -> 
/hadoop_1/hadoop/yarn/local/filecache/25/spark-hdp-assembly.jar
lrwxrwxrwx 1 my_user_name hadoop   94 Dec  8 20:21 __spark_conf__ -> 
/hadoop_1/hadoop/yarn/local/usercache/my_user_name/filecache/37/__spark_conf__2528926660896665250.zip
-rw------- 1 my_user_name hadoop  340 Dec  8 20:21 container_tokens
-rwx------ 1 my_user_name hadoop 6240 Dec  8 20:21 launch_container.sh
drwxr-s--- 2 my_user_name hadoop 4096 Dec  8 20:21 tmp
find -L . -maxdepth 5 -ls:
102236257    4 drwxr-s---   3 my_user_name    hadoop       4096 Dec  8 20:21 .
102236299    4 drwxr-s---   2 my_user_name    hadoop       4096 Dec  8 20:21 
./tmp
102237256 184304 -r-xr-xr-x   1 yarn     hadoop   188727178 Dec  3 15:17 
./__spark__.jar
20316290   16 -r-x------   1 my_user_name    my_user_name       14787 Dec  8 
20:21 ./__app__.jar
102236259    4 drwx------   2 my_user_name    my_user_name        4096 Dec  8 
20:21 ./__spark_conf__
102236276    4 -r-x------   1 my_user_name    my_user_name        1335 Dec  8 
20:21 ./__spark_conf__/configuration.xsl
102236281    4 -r-x------   1 my_user_name    my_user_name        2490 Dec  8 
20:21 ./__spark_conf__/hadoop-metrics.properties
102236283    4 -r-x------   1 my_user_name    my_user_name         856 Dec  8 
20:21 ./__spark_conf__/mapred-env.sh
102236262    8 -r-x------   1 my_user_name    my_user_name        7353 Dec  8 
20:21 ./__spark_conf__/mapred-site.xml
102236294    4 -r-x------   1 my_user_name    my_user_name        2268 Dec  8 
20:21 ./__spark_conf__/ssl-server.xml.example
102236266    8 -r-x------   1 my_user_name    my_user_name        7634 Dec  8 
20:21 ./__spark_conf__/core-site.xml
102236271    4 -r-x------   1 my_user_name    my_user_name         884 Dec  8 
20:21 ./__spark_conf__/ssl-client.xml
102236293    4 -r-x------   1 my_user_name    my_user_name         951 Dec  8 
20:21 ./__spark_conf__/mapred-env.cmd
102236272    4 -r-x------   1 my_user_name    my_user_name        2313 Dec  8 
20:21 ./__spark_conf__/capacity-scheduler.xml
102236269    4 -r-x------   1 my_user_name    my_user_name        1631 Dec  8 
20:21 ./__spark_conf__/kms-log4j.properties
102236286    4 -r-x------   1 my_user_name    my_user_name        1602 Dec  8 
20:21 ./__spark_conf__/health_check
102236273    4 -r-x------   1 my_user_name    my_user_name        3518 Dec  8 
20:21 ./__spark_conf__/kms-acls.xml
102236274    4 -r-x------   1 my_user_name    my_user_name        2358 Dec  8 
20:21 ./__spark_conf__/topology_script.py
102236291    4 -r-x------   1 my_user_name    my_user_name        2081 Dec  8 
20:21 ./__spark_conf__/topology_mappings.data
102236267    4 -r-x------   1 my_user_name    my_user_name           1 Dec  8 
20:21 ./__spark_conf__/dfs.exclude
102236260    8 -r-x------   1 my_user_name    my_user_name        5410 Dec  8 
20:21 ./__spark_conf__/hadoop-env.sh
102236292    4 -r-x------   1 my_user_name    my_user_name        1000 Dec  8 
20:21 ./__spark_conf__/ssl-server.xml
102236277    8 -r-x------   1 my_user_name    my_user_name        5637 Dec  8 
20:21 ./__spark_conf__/yarn-env.sh
102236270    4 -r-x------   1 my_user_name    my_user_name        2250 Dec  8 
20:21 ./__spark_conf__/yarn-env.cmd
102236275    4 -r-x------   1 my_user_name    my_user_name         758 Dec  8 
20:21 ./__spark_conf__/mapred-site.xml.template
102236261   12 -r-x------   1 my_user_name    my_user_name        9313 Dec  8 
20:21 ./__spark_conf__/log4j.properties
102236263    4 -r-x------   1 my_user_name    my_user_name        2207 Dec  8 
20:21 ./__spark_conf__/hadoop-metrics2.properties
102236298    4 -r-x------   1 my_user_name    my_user_name         506 Dec  8 
20:21 ./__spark_conf__/__spark_conf__.properties
102236288    4 -r-x------   1 my_user_name    my_user_name        1527 Dec  8 
20:21 ./__spark_conf__/kms-env.sh
102236282    8 -r-x------   1 my_user_name    my_user_name        4221 Dec  8 
20:21 ./__spark_conf__/task-log4j.properties
102236278   12 -r-x------   1 my_user_name    my_user_name        8217 Dec  8 
20:21 ./__spark_conf__/hdfs-site.xml
102236295    4 -r-x------   1 my_user_name    my_user_name         945 Dec  8 
20:21 ./__spark_conf__/taskcontroller.cfg
102236268    0 -r-x------   1 my_user_name    my_user_name           0 Dec  8 
20:21 ./__spark_conf__/yarn.exclude
102236290    4 -r-x------   1 my_user_name    my_user_name         760 Dec  8 
20:21 ./__spark_conf__/slaves
102236279    4 -r-x------   1 my_user_name    my_user_name        1020 Dec  8 
20:21 ./__spark_conf__/commons-logging.properties
102236265   24 -r-x------   1 my_user_name    my_user_name       21013 Dec  8 
20:21 ./__spark_conf__/yarn-site.xml
102236280    4 -r-x------   1 my_user_name    my_user_name        1072 Dec  8 
20:21 ./__spark_conf__/container-executor.cfg
102236289    4 -r-x------   1 my_user_name    my_user_name        1308 Dec  8 
20:21 ./__spark_conf__/hadoop-policy.xml
102236287    4 -r-x------   1 my_user_name    my_user_name        2316 Dec  8 
20:21 ./__spark_conf__/ssl-client.xml.example
102236296    8 -r-x------   1 my_user_name    my_user_name        4113 Dec  8 
20:21 ./__spark_conf__/mapred-queues.xml.template
102236264    4 -r-x------   1 my_user_name    my_user_name        3979 Dec  8 
20:21 ./__spark_conf__/hadoop-env.cmd
102236297    8 -r-x------   1 my_user_name    my_user_name        5511 Dec  8 
20:21 ./__spark_conf__/kms-site.xml
102236301    4 -rw-------   1 my_user_name    hadoop        340 Dec  8 20:21 
./container_tokens
102236300    8 -rwx------   1 my_user_name    hadoop       6240 Dec  8 20:21 
./launch_container.sh
broken symlinks(find -L . -maxdepth 5 -type l -ls):

End of LogType:directory.info

LogType:launch_container.sh
Log Upload Time:Thu Dec 08 20:22:08 +0000 2016
LogLength:6240
Log Contents:
#!/bin/bash

export SPARK_YARN_STAGING_DIR=".sparkStaging/application_1479877553404_0174"
export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/hdp/current/hadoop-client/conf"}
export JAVA_HOME=${JAVA_HOME:-"/usr/jdk64/jdk1.8.0_60"}
export SPARK_YARN_CACHE_FILES="hdfs://hdp-master1.xcat.cluster:8020/hdp/apps/2.5.0.0-1245/spark/spark-hdp-assembly.jar#__spark__.jar,hdfs://hdp-master1.xcat.cluster:8020/user/my_user_name/.sparkStaging/application_1479877553404_0174/graphx_sp_2.10-1.0.jar#__app__.jar"
export SPARK_LOG_URL_STDOUT="http://hdp-node30.xcat.cluster:8042/node/containerlogs/container_e34_1479877553404_0174_01_000002/my_user_name/stdout?start=-4096"
export NM_HOST="hdp-node30.xcat.cluster"
export SPARK_YARN_CACHE_FILES_FILE_SIZES="188727178,14787"
export SPARK_YARN_CACHE_ARCHIVES_TIME_STAMPS="1481228482907"
export LOGNAME="my_user_name"
export JVM_PID="$$"
export PWD="/hadoop_1/hadoop/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174/container_e34_1479877553404_0174_01_000002"
export LOCAL_DIRS="/hadoop/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174,/hadoop_1/hadoop/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174"
export NM_HTTP_PORT="8042"
export LOG_DIRS="/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000002,/hadoop_1/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000002"
export NM_AUX_SERVICE_mapreduce_shuffle="AAA0+gAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=
"
export NM_PORT="45454"
export SPARK_YARN_CACHE_FILES_TIME_STAMPS="1478064369306,1481228482852"
export USER="my_user_name"
export HADOOP_YARN_HOME=${HADOOP_YARN_HOME:-"/usr/hdp/current/hadoop-yarn-nodemanager"}
export CLASSPATH="$PWD:$PWD/__spark_conf__:$PWD/__spark__.jar:$HADOOP_CONF_DIR:/usr/hdp/current/hadoop-client/*:/usr/hdp/current/hadoop-client/lib/*:/usr/hdp/current/hadoop-hdfs-client/*:/usr/hdp/current/hadoop-hdfs-client/lib/*:/usr/hdp/current/hadoop-yarn-client/*:/usr/hdp/current/hadoop-yarn-client/lib/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/2.5.0.0-1245/hadoop/lib/hadoop-lzo-0.6.0.2.5.0.0-1245.jar:/etc/hadoop/conf/secure"
export SPARK_YARN_CACHE_ARCHIVES="hdfs://hdp-master1.xcat.cluster:8020/user/my_user_name/.sparkStaging/application_1479877553404_0174/__spark_conf__2528926660896665250.zip#__spark_conf__"
export SPARK_YARN_CACHE_ARCHIVES_FILE_SIZES="125215"
export SPARK_YARN_MODE="true"
export SPARK_YARN_CACHE_FILES_VISIBILITIES="PUBLIC,PRIVATE"
export HADOOP_TOKEN_FILE_LOCATION="/hadoop_1/hadoop/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174/container_e34_1479877553404_0174_01_000002/container_tokens"
export NM_AUX_SERVICE_spark_shuffle=""
export SPARK_USER="my_user_name"
export LOCAL_USER_DIRS="/hadoop/yarn/local/usercache/my_user_name/,/hadoop_1/hadoop/yarn/local/usercache/my_user_name/"
export SPARK_LOG_URL_STDERR="http://hdp-node30.xcat.cluster:8042/node/containerlogs/container_e34_1479877553404_0174_01_000002/my_user_name/stderr?start=-4096"
export SPARK_YARN_CACHE_ARCHIVES_VISIBILITIES="PRIVATE"
export HOME="/home/"
export NM_AUX_SERVICE_spark2_shuffle=""
export CONTAINER_ID="container_e34_1479877553404_0174_01_000002"
export MALLOC_ARENA_MAX="4"
ln -sf "/hadoop_1/hadoop/yarn/local/usercache/my_user_name/filecache/37/__spark_conf__2528926660896665250.zip" "__spark_conf__"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop/yarn/local/usercache/my_user_name/filecache/38/graphx_sp_2.10-1.0.jar" "__app__.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop_1/hadoop/yarn/local/filecache/25/spark-hdp-assembly.jar" "__spark__.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
# Creating copy of launch script
cp "launch_container.sh" "/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000002/launch_container.sh"
chmod 640 "/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000002/launch_container.sh"
# Determining directory contents
echo "ls -l:" 1>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000002/directory.info"
ls -l 1>>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000002/directory.info"
echo "find -L . -maxdepth 5 -ls:" 1>>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000002/directory.info"
find -L . -maxdepth 5 -ls 1>>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000002/directory.info"
echo "broken symlinks(find -L . -maxdepth 5 -type l -ls):" 1>>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000002/directory.info"
find -L . -maxdepth 5 -type l -ls 1>>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000002/directory.info"
exec /bin/bash -c "$JAVA_HOME/bin/java -server -XX:OnOutOfMemoryError='kill %p' -Xms13312m -Xmx13312m -Djava.io.tmpdir=$PWD/tmp '-Dspark.driver.port=33211' '-Dspark.ui.port=0' -Dspark.yarn.app.container.log.dir=/hadoop_1/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000002 org.apache.spark.executor.CoarseGrainedExecutorBackend --driver-url spark://CoarseGrainedScheduler@10.143.65.28:33211 --executor-id 1 --hostname hdp-node30.xcat.cluster --cores 1 --app-id application_1479877553404_0174 --user-class-path file:$PWD/__app__.jar 1> /hadoop_1/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000002/stdout 2> /hadoop_1/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_01_000002/stderr"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi

End of LogType:launch_container.sh

LogType:stderr
Log Upload Time:Thu Dec 08 20:22:08 +0000 2016
LogLength:4118
Log Contents:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in 
[jar:file:/hadoop_1/hadoop/yarn/local/filecache/25/spark-hdp-assembly.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in 
[jar:file:/usr/hdp/2.5.0.0-1245/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
16/12/08 20:21:38 INFO executor.CoarseGrainedExecutorBackend: Registered signal 
handlers for [TERM, HUP, INT]
16/12/08 20:21:38 WARN util.NativeCodeLoader: Unable to load native-hadoop 
library for your platform... using builtin-java classes where applicable
16/12/08 20:21:38 INFO spark.SecurityManager: Changing view acls to: 
my_user_name
16/12/08 20:21:38 INFO spark.SecurityManager: Changing modify acls to: 
my_user_name
16/12/08 20:21:38 INFO spark.SecurityManager: SecurityManager: authentication 
disabled; ui acls disabled; users with view permissions: Set(my_user_name); 
users with modify permissions: Set(my_user_name)
16/12/08 20:21:40 INFO executor.CoarseGrainedExecutorBackend: Will periodically 
update credentials from: 
hdfs://hdp-master1.xcat.cluster:8020/user/my_user_name/.sparkStaging/application_1479877553404_0174/credentials-b4345c5b-2f21-4de7-ab92-fea01c49a1c6
16/12/08 20:21:41 WARN shortcircuit.DomainSocketFactory: The short-circuit 
local reads feature cannot be used because libhadoop cannot be loaded.
16/12/08 20:21:41 INFO yarn.ExecutorDelegationTokenUpdater: Scheduling token 
refresh from HDFS in 69100735 millis.
16/12/08 20:21:41 INFO spark.SecurityManager: Changing view acls to: 
my_user_name
16/12/08 20:21:41 INFO spark.SecurityManager: Changing modify acls to: 
my_user_name
16/12/08 20:21:41 INFO spark.SecurityManager: SecurityManager: authentication 
disabled; ui acls disabled; users with view permissions: Set(my_user_name); 
users with modify permissions: Set(my_user_name)
16/12/08 20:21:42 INFO slf4j.Slf4jLogger: Slf4jLogger started
16/12/08 20:21:42 INFO Remoting: Starting remoting
16/12/08 20:21:42 INFO Remoting: Remoting started; listening on addresses 
:[akka.tcp://sparkExecutorActorSystem@hdp-node30.xcat.cluster:35963]
16/12/08 20:21:42 INFO util.Utils: Successfully started service 
'sparkExecutorActorSystem' on port 35963.
16/12/08 20:21:42 INFO storage.DiskBlockManager: Created local directory at 
/hadoop_1/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174/blockmgr-aa96fcb1-a2c7-414d-a662-31000c32fbc5
16/12/08 20:21:42 INFO storage.DiskBlockManager: Created local directory at 
/hadoop_1/hadoop/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174/blockmgr-ec8be19d-4081-40c1-aa10-1d5c2f841f5e
16/12/08 20:21:42 INFO storage.MemoryStore: MemoryStore started with capacity 
9.1 GB
16/12/08 20:21:42 INFO executor.CoarseGrainedExecutorBackend: Connecting to 
driver: spark://CoarseGrainedScheduler@10.143.65.28:33211
16/12/08 20:21:42 INFO executor.CoarseGrainedExecutorBackend: Successfully 
registered with driver
16/12/08 20:21:42 INFO executor.Executor: Starting executor ID 1 on host 
hdp-node30.xcat.cluster
16/12/08 20:21:43 INFO util.Utils: Successfully started service 
'org.apache.spark.network.netty.NettyBlockTransferService' on port 45953.
16/12/08 20:21:43 INFO netty.NettyBlockTransferService: Server created on 45953
16/12/08 20:21:43 INFO storage.BlockManagerMaster: Trying to register 
BlockManager
16/12/08 20:21:43 INFO storage.BlockManagerMaster: Registered BlockManager
16/12/08 20:21:44 INFO executor.CoarseGrainedExecutorBackend: Driver commanded 
a shutdown
16/12/08 20:21:44 INFO storage.MemoryStore: MemoryStore cleared
16/12/08 20:21:44 INFO storage.BlockManager: BlockManager stopped
16/12/08 20:21:44 WARN executor.CoarseGrainedExecutorBackend: An unknown 
(hdp-node28.xcat.cluster:33211) driver disconnected.
16/12/08 20:21:44 ERROR executor.CoarseGrainedExecutorBackend: Driver 
10.143.65.28:33211 disassociated! Shutting down.
16/12/08 20:21:44 INFO util.ShutdownHookManager: Shutdown hook called
16/12/08 20:21:44 INFO remote.RemoteActorRefProvider$RemotingTerminator: 
Shutting down remote daemon.

End of LogType:stderr

LogType:stdout
Log Upload Time:Thu Dec 08 20:22:08 +0000 2016
LogLength:0
Log Contents:

End of LogType:stdout

Container: container_e34_1479877553404_0174_02_000001 on 
hdp-node9.xcat.cluster_45454_1481228528357
===================================================================================================
LogType:directory.info
Log Upload Time:Thu Dec 08 20:22:08 +0000 2016
LogLength:5356
Log Contents:
ls -l:
total 24
lrwxrwxrwx 1 my_user_name hadoop   79 Dec  8 20:21 __app__.jar -> 
/hadoop_1/hadoop/yarn/local/usercache/my_user_name/filecache/24/graphx_sp_2.10-1.0.jar
lrwxrwxrwx 1 my_user_name hadoop   54 Dec  8 20:21 __spark__.jar -> 
/hadoop/yarn/local/filecache/12/spark-hdp-assembly.jar
lrwxrwxrwx 1 my_user_name hadoop   85 Dec  8 20:21 __spark_conf__ -> 
/hadoop/yarn/local/usercache/my_user_name/filecache/23/__spark_conf__2528926660896665250.zip
-rw------- 1 my_user_name hadoop  322 Dec  8 20:21 container_tokens
lrwxrwxrwx 1 my_user_name hadoop   54 Dec  8 20:21 
keytab-a663e8b4-c007-4964-b12b-2d698412a057 -> 
/hadoop/yarn/local/usercache/my_user_name/filecache/25/keytab
-rwx------ 1 my_user_name hadoop 6211 Dec  8 20:21 launch_container.sh
drwxr-s--- 2 my_user_name hadoop 4096 Dec  8 20:21 tmp
find -L . -maxdepth 5 -ls:
54264016    4 drwxr-s---   3 my_user_name    hadoop       4096 Dec  8 20:21 .
54264019    8 -rwx------   1 my_user_name    hadoop       6211 Dec  8 20:21 
./launch_container.sh
54264020    4 -rw-------   1 my_user_name    hadoop        322 Dec  8 20:21 
./container_tokens
22937743    4 drwx------   2 my_user_name    my_user_name        4096 Dec  8 
20:21 ./__spark_conf__
22937750    8 -r-x------   1 my_user_name    my_user_name        7634 Dec  8 
20:21 ./__spark_conf__/core-site.xml
22937744    8 -r-x------   1 my_user_name    my_user_name        5410 Dec  8 
20:21 ./__spark_conf__/hadoop-env.sh
22937774    4 -r-x------   1 my_user_name    my_user_name        1000 Dec  8 
20:21 ./__spark_conf__/ssl-server.xml
22937766    8 -r-x------   1 my_user_name    my_user_name        4221 Dec  8 
20:21 ./__spark_conf__/task-log4j.properties
22937771    4 -r-x------   1 my_user_name    my_user_name        1308 Dec  8 
20:21 ./__spark_conf__/hadoop-policy.xml
22937773    4 -r-x------   1 my_user_name    my_user_name        2081 Dec  8 
20:21 ./__spark_conf__/topology_mappings.data
22937747    4 -r-x------   1 my_user_name    my_user_name        2207 Dec  8 
20:21 ./__spark_conf__/hadoop-metrics2.properties
22937764    4 -r-x------   1 my_user_name    my_user_name        1072 Dec  8 
20:21 ./__spark_conf__/container-executor.cfg
22937765    4 -r-x------   1 my_user_name    my_user_name        2490 Dec  8 
20:21 ./__spark_conf__/hadoop-metrics.properties
22937757    4 -r-x------   1 my_user_name    my_user_name        3518 Dec  8 
20:21 ./__spark_conf__/kms-acls.xml
22937763    4 -r-x------   1 my_user_name    my_user_name        1020 Dec  8 
20:21 ./__spark_conf__/commons-logging.properties
22937775    4 -r-x------   1 my_user_name    my_user_name         951 Dec  8 
20:21 ./__spark_conf__/mapred-env.cmd
22937746    8 -r-x------   1 my_user_name    my_user_name        7353 Dec  8 
20:21 ./__spark_conf__/mapred-site.xml
22937776    4 -r-x------   1 my_user_name    my_user_name        2268 Dec  8 
20:21 ./__spark_conf__/ssl-server.xml.example
22937749   24 -r-x------   1 my_user_name    my_user_name       21013 Dec  8 
20:21 ./__spark_conf__/yarn-site.xml
22937748    4 -r-x------   1 my_user_name    my_user_name        3979 Dec  8 
20:21 ./__spark_conf__/hadoop-env.cmd
22937745   12 -r-x------   1 my_user_name    my_user_name        9313 Dec  8 
20:21 ./__spark_conf__/log4j.properties
22937778    8 -r-x------   1 my_user_name    my_user_name        4113 Dec  8 
20:21 ./__spark_conf__/mapred-queues.xml.template
22937758    4 -r-x------   1 my_user_name    my_user_name        2358 Dec  8 
20:21 ./__spark_conf__/topology_script.py
22937768    4 -r-x------   1 my_user_name    my_user_name        1602 Dec  8 
20:21 ./__spark_conf__/health_check
22937759    4 -r-x------   1 my_user_name    my_user_name         758 Dec  8 
20:21 ./__spark_conf__/mapred-site.xml.template
22937761    8 -r-x------   1 my_user_name    my_user_name        5637 Dec  8 
20:21 ./__spark_conf__/yarn-env.sh
22937756    4 -r-x------   1 my_user_name    my_user_name        2313 Dec  8 
20:21 ./__spark_conf__/capacity-scheduler.xml
22937779    8 -r-x------   1 my_user_name    my_user_name        5511 Dec  8 
20:21 ./__spark_conf__/kms-site.xml
22937770    4 -r-x------   1 my_user_name    my_user_name        1527 Dec  8 
20:21 ./__spark_conf__/kms-env.sh
22937753    4 -r-x------   1 my_user_name    my_user_name        1631 Dec  8 
20:21 ./__spark_conf__/kms-log4j.properties
22937777    4 -r-x------   1 my_user_name    my_user_name         945 Dec  8 
20:21 ./__spark_conf__/taskcontroller.cfg
22937772    4 -r-x------   1 my_user_name    my_user_name         760 Dec  8 
20:21 ./__spark_conf__/slaves
22937769    4 -r-x------   1 my_user_name    my_user_name        2316 Dec  8 
20:21 ./__spark_conf__/ssl-client.xml.example
22937754    4 -r-x------   1 my_user_name    my_user_name        2250 Dec  8 
20:21 ./__spark_conf__/yarn-env.cmd
22937780    4 -r-x------   1 my_user_name    my_user_name         506 Dec  8 
20:21 ./__spark_conf__/__spark_conf__.properties
22937752    0 -r-x------   1 my_user_name    my_user_name           0 Dec  8 
20:21 ./__spark_conf__/yarn.exclude
22937751    4 -r-x------   1 my_user_name    my_user_name           1 Dec  8 
20:21 ./__spark_conf__/dfs.exclude
22937762   12 -r-x------   1 my_user_name    my_user_name        8217 Dec  8 
20:21 ./__spark_conf__/hdfs-site.xml
22937760    4 -r-x------   1 my_user_name    my_user_name        1335 Dec  8 
20:21 ./__spark_conf__/configuration.xsl
22937767    4 -r-x------   1 my_user_name    my_user_name         856 Dec  8 
20:21 ./__spark_conf__/mapred-env.sh
22937755    4 -r-x------   1 my_user_name    my_user_name         884 Dec  8 
20:21 ./__spark_conf__/ssl-client.xml
22806554 184304 -r-xr-xr-x   1 yarn     hadoop   188727178 Dec  5 16:50 
./__spark__.jar
54264018    4 drwxr-s---   2 my_user_name    hadoop       4096 Dec  8 20:21 
./tmp
54263971   16 -r-x------   1 my_user_name    my_user_name       14787 Dec  8 
20:21 ./__app__.jar
22937742    4 -r-x------   1 my_user_name    my_user_name          75 Dec  8 
20:21 ./keytab-a663e8b4-c007-4964-b12b-2d698412a057
broken symlinks(find -L . -maxdepth 5 -type l -ls):

End of LogType:directory.info

LogType:launch_container.sh
Log Upload Time:Thu Dec 08 20:22:08 +0000 2016
LogLength:6211
Log Contents:
#!/bin/bash

export SPARK_YARN_STAGING_DIR=".sparkStaging/application_1479877553404_0174"
export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/hdp/current/hadoop-client/conf"}
export MAX_APP_ATTEMPTS="2"
export JAVA_HOME=${JAVA_HOME:-"/usr/jdk64/jdk1.8.0_60"}
export 
SPARK_YARN_CACHE_FILES="hdfs://hdp-master1.xcat.cluster:8020/hdp/apps/2.5.0.0-1245/spark/spark-hdp-assembly.jar#__spark__.jar,hdfs://hdp-master1.xcat.cluster:8020/user/my_user_name/.sparkStaging/application_1479877553404_0174/graphx_sp_2.10-1.0.jar#__app__.jar"
export APP_SUBMIT_TIME_ENV="1481228483091"
export NM_HOST="hdp-node9.xcat.cluster"
export SPARK_YARN_CACHE_FILES_FILE_SIZES="188727178,14787"
export SPARK_YARN_CACHE_ARCHIVES_TIME_STAMPS="1481228482907"
export LOGNAME="my_user_name"
export JVM_PID="$$"
export 
PWD="/hadoop_1/hadoop/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174/container_e34_1479877553404_0174_02_000001"
export 
LOCAL_DIRS="/hadoop/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174,/hadoop_1/hadoop/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174"
export APPLICATION_WEB_PROXY_BASE="/proxy/application_1479877553404_0174"
export NM_HTTP_PORT="8042"
export 
LOG_DIRS="/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000001,/hadoop_1/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000001"
export 
NM_AUX_SERVICE_mapreduce_shuffle="AAA0+gAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=
"
export NM_PORT="45454"
export SPARK_YARN_CACHE_FILES_TIME_STAMPS="1478064369306,1481228482852"
export USER="my_user_name"
export 
HADOOP_YARN_HOME=${HADOOP_YARN_HOME:-"/usr/hdp/current/hadoop-yarn-nodemanager"}
export 
SPARK_YARN_CACHE_ARCHIVES="hdfs://hdp-master1.xcat.cluster:8020/user/my_user_name/.sparkStaging/application_1479877553404_0174/__spark_conf__2528926660896665250.zip#__spark_conf__"
export 
CLASSPATH="$PWD:$PWD/__spark_conf__:$PWD/__spark__.jar:$HADOOP_CONF_DIR:/usr/hdp/current/hadoop-client/*:/usr/hdp/current/hadoop-client/lib/*:/usr/hdp/current/hadoop-hdfs-client/*:/usr/hdp/current/hadoop-hdfs-client/lib/*:/usr/hdp/current/hadoop-yarn-client/*:/usr/hdp/current/hadoop-yarn-client/lib/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/2.5.0.0-1245/hadoop/lib/hadoop-lzo-0.6.0.2.5.0.0-1245.jar:/etc/hadoop/conf/secure"
export SPARK_YARN_CACHE_ARCHIVES_FILE_SIZES="125215"
export SPARK_YARN_MODE="true"
export SPARK_YARN_CACHE_FILES_VISIBILITIES="PUBLIC,PRIVATE"
export 
HADOOP_TOKEN_FILE_LOCATION="/hadoop_1/hadoop/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174/container_e34_1479877553404_0174_02_000001/container_tokens"
export NM_AUX_SERVICE_spark_shuffle=""
export SPARK_USER="my_user_name"
export 
LOCAL_USER_DIRS="/hadoop/yarn/local/usercache/my_user_name/,/hadoop_1/hadoop/yarn/local/usercache/my_user_name/"
export SPARK_YARN_CACHE_ARCHIVES_VISIBILITIES="PRIVATE"
export HOME="/home/"
export NM_AUX_SERVICE_spark2_shuffle=""
export CONTAINER_ID="container_e34_1479877553404_0174_02_000001"
export MALLOC_ARENA_MAX="4"
ln -sf 
"/hadoop_1/hadoop/yarn/local/usercache/my_user_name/filecache/24/graphx_sp_2.10-1.0.jar"
 "__app__.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf 
"/hadoop/yarn/local/usercache/my_user_name/filecache/23/__spark_conf__2528926660896665250.zip"
 "__spark_conf__"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop/yarn/local/filecache/12/spark-hdp-assembly.jar" "__spark__.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop/yarn/local/usercache/my_user_name/filecache/25/keytab" 
"keytab-a663e8b4-c007-4964-b12b-2d698412a057"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
# Creating copy of launch script
cp "launch_container.sh" 
"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000001/launch_container.sh"
chmod 640 
"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000001/launch_container.sh"
# Determining directory contents
echo "ls -l:" 
1>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000001/directory.info"
ls -l 
1>>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000001/directory.info"
echo "find -L . -maxdepth 5 -ls:" 
1>>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000001/directory.info"
find -L . -maxdepth 5 -ls 
1>>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000001/directory.info"
echo "broken symlinks(find -L . -maxdepth 5 -type l -ls):" 
1>>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000001/directory.info"
find -L . -maxdepth 5 -type l -ls 
1>>"/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000001/directory.info"
exec /bin/bash -c "$JAVA_HOME/bin/java -server -Xmx1024m -Djava.io.tmpdir=$PWD/tmp -Dhdp.version=2.5.0.0-1245 -Dspark.yarn.app.container.log.dir=/hadoop_1/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000001 org.apache.spark.deploy.yarn.ApplicationMaster --class 'graphx_sp' --jar file:/home/my_user_name/Aoife/spark-abm/target/scala-2.10/graphx_sp_2.10-1.0.jar --executor-memory 13312m --executor-cores 1 --properties-file $PWD/__spark_conf__/__spark_conf__.properties 1> /hadoop_1/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000001/stdout 2> /hadoop_1/hadoop/yarn/log/application_1479877553404_0174/container_e34_1479877553404_0174_02_000001/stderr"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi

End of LogType:launch_container.sh

LogType:stderr
Log Upload Time:Thu Dec 08 20:22:08 +0000 2016
LogLength:20028
Log Contents:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in 
[jar:file:/hadoop_1/yarn/local/filecache/12/spark-hdp-assembly.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in 
[jar:file:/usr/hdp/2.5.0.0-1245/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
16/12/08 20:21:50 INFO yarn.ApplicationMaster: Registered signal handlers for 
[TERM, HUP, INT]
16/12/08 20:21:50 WARN util.NativeCodeLoader: Unable to load native-hadoop 
library for your platform... using builtin-java classes where applicable
16/12/08 20:21:50 INFO yarn.ApplicationMaster: ApplicationAttemptId: 
appattempt_1479877553404_0174_000002
16/12/08 20:21:52 WARN shortcircuit.DomainSocketFactory: The short-circuit 
local reads feature cannot be used because libhadoop cannot be loaded.
16/12/08 20:21:52 INFO spark.SecurityManager: Changing view acls to: 
my_user_name
16/12/08 20:21:52 INFO spark.SecurityManager: Changing modify acls to: 
my_user_name
16/12/08 20:21:52 INFO spark.SecurityManager: SecurityManager: authentication 
disabled; ui acls disabled; users with view permissions: Set(my_user_name); 
users with modify permissions: Set(my_user_name)
16/12/08 20:21:52 INFO yarn.AMDelegationTokenRenewer: Scheduling login from 
keytab in 64769540 millis.
16/12/08 20:21:52 INFO yarn.ApplicationMaster: Starting the user application in 
a separate Thread
16/12/08 20:21:52 INFO yarn.ApplicationMaster: Waiting for spark context 
initialization
16/12/08 20:21:52 INFO yarn.ApplicationMaster: Waiting for spark context 
initialization ... 
16/12/08 20:21:52 INFO spark.SparkContext: Running Spark version 1.6.2
16/12/08 20:21:52 INFO spark.SecurityManager: Changing view acls to: 
my_user_name
16/12/08 20:21:52 INFO spark.SecurityManager: Changing modify acls to: 
my_user_name
16/12/08 20:21:52 INFO spark.SecurityManager: SecurityManager: authentication 
disabled; ui acls disabled; users with view permissions: Set(my_user_name); 
users with modify permissions: Set(my_user_name)
16/12/08 20:21:52 INFO util.Utils: Successfully started service 'sparkDriver' 
on port 43398.
16/12/08 20:21:53 INFO slf4j.Slf4jLogger: Slf4jLogger started
16/12/08 20:21:53 INFO Remoting: Starting remoting
16/12/08 20:21:53 INFO Remoting: Remoting started; listening on addresses 
:[akka.tcp://sparkDriverActorSystem@10.143.65.9:33744]
16/12/08 20:21:53 INFO util.Utils: Successfully started service 
'sparkDriverActorSystem' on port 33744.
16/12/08 20:21:53 INFO spark.SparkEnv: Registering MapOutputTracker
16/12/08 20:21:53 INFO spark.SparkEnv: Registering BlockManagerMaster
16/12/08 20:21:53 INFO storage.DiskBlockManager: Created local directory at 
/hadoop_1/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174/blockmgr-64b021a8-9c80-43ac-b455-4d26ead94015
16/12/08 20:21:53 INFO storage.DiskBlockManager: Created local directory at 
/hadoop_1/hadoop/yarn/local/usercache/my_user_name/appcache/application_1479877553404_0174/blockmgr-b058584b-7cca-4f0a-9134-e45cd4e6f7d9
16/12/08 20:21:53 INFO storage.MemoryStore: MemoryStore started with capacity 
511.9 MB
16/12/08 20:21:53 INFO spark.SparkEnv: Registering OutputCommitCoordinator
16/12/08 20:21:54 INFO ui.JettyUtils: Adding filter: 
org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
16/12/08 20:21:54 INFO server.Server: jetty-8.y.z-SNAPSHOT
16/12/08 20:21:54 INFO server.AbstractConnector: Started 
SelectChannelConnector@0.0.0.0:44070
16/12/08 20:21:54 INFO util.Utils: Successfully started service 'SparkUI' on 
port 44070.
16/12/08 20:21:54 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at 
http://10.143.65.9:44070
16/12/08 20:21:54 INFO cluster.YarnClusterScheduler: Created 
YarnClusterScheduler
16/12/08 20:21:54 INFO cluster.SchedulerExtensionServices: Starting Yarn 
extension services with app application_1479877553404_0174 and attemptId 
Some(appattempt_1479877553404_0174_000002)
16/12/08 20:21:54 INFO util.Utils: Successfully started service 
'org.apache.spark.network.netty.NettyBlockTransferService' on port 43402.
16/12/08 20:21:54 INFO netty.NettyBlockTransferService: Server created on 43402
16/12/08 20:21:54 INFO storage.BlockManagerMaster: Trying to register 
BlockManager
16/12/08 20:21:54 INFO storage.BlockManagerMasterEndpoint: Registering block 
manager 10.143.65.9:43402 with 511.9 MB RAM, BlockManagerId(driver, 
10.143.65.9, 43402)
16/12/08 20:21:54 INFO storage.BlockManagerMaster: Registered BlockManager
16/12/08 20:21:54 INFO cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: 
ApplicationMaster registered as 
NettyRpcEndpointRef(spark://YarnAM@10.143.65.9:43398)
16/12/08 20:21:54 INFO client.RMProxy: Connecting to ResourceManager at 
hdp-master1.xcat.cluster/10.143.65.254:8030
16/12/08 20:21:54 INFO yarn.YarnRMClient: Registering the ApplicationMaster
16/12/08 20:21:54 INFO yarn.YarnAllocator: Will request 2 executor containers, 
each with 1 cores and 14643 MB memory including 1331 MB overhead
16/12/08 20:21:54 INFO yarn.YarnAllocator: Container request (host: Any, 
capability: <memory:14643, vCores:1>)
16/12/08 20:21:54 INFO yarn.YarnAllocator: Container request (host: Any, 
capability: <memory:14643, vCores:1>)
16/12/08 20:21:54 INFO yarn.ApplicationMaster: Started progress reporter thread 
with (heartbeat : 3000, initial allocation : 200) intervals
16/12/08 20:21:55 INFO impl.AMRMClientImpl: Received new token for : 
hdp-node26.xcat.cluster:45454
16/12/08 20:21:55 INFO impl.AMRMClientImpl: Received new token for : 
hdp-node28.xcat.cluster:45454
16/12/08 20:21:55 INFO yarn.YarnAllocator: Launching container 
container_e34_1479877553404_0174_02_000002 for on host hdp-node26.xcat.cluster
16/12/08 20:21:55 INFO yarn.YarnAllocator: Launching ExecutorRunnable. 
driverUrl: spark://CoarseGrainedScheduler@10.143.65.9:43398,  executorHostname: 
hdp-node26.xcat.cluster
16/12/08 20:21:55 INFO yarn.YarnAllocator: Launching container 
container_e34_1479877553404_0174_02_000003 for on host hdp-node28.xcat.cluster
16/12/08 20:21:55 INFO yarn.ExecutorRunnable: Starting Executor Container
16/12/08 20:21:55 INFO yarn.YarnAllocator: Launching ExecutorRunnable. 
driverUrl: spark://CoarseGrainedScheduler@10.143.65.9:43398,  executorHostname: 
hdp-node28.xcat.cluster
16/12/08 20:21:55 INFO yarn.ExecutorRunnable: Starting Executor Container
16/12/08 20:21:55 INFO yarn.YarnAllocator: Received 2 containers from YARN, 
launching executors on 2 of them.
16/12/08 20:21:55 INFO impl.ContainerManagementProtocolProxy: 
yarn.client.max-cached-nodemanagers-proxies : 0
16/12/08 20:21:55 INFO impl.ContainerManagementProtocolProxy: 
yarn.client.max-cached-nodemanagers-proxies : 0
16/12/08 20:21:55 INFO yarn.ExecutorRunnable: Setting up ContainerLaunchContext
16/12/08 20:21:55 INFO yarn.ExecutorRunnable: Setting up ContainerLaunchContext
16/12/08 20:21:55 INFO yarn.ExecutorRunnable: Preparing Local resources
16/12/08 20:21:55 INFO yarn.ExecutorRunnable: Preparing Local resources
16/12/08 20:21:55 INFO yarn.ExecutorRunnable: Prepared Local resources 
Map(__app__.jar -> resource { scheme: "hdfs" host: "hdp-master1.xcat.cluster" 
port: 8020 file: 
"/user/my_user_name/.sparkStaging/application_1479877553404_0174/graphx_sp_2.10-1.0.jar"
 } size: 14787 timestamp: 1481228482852 type: FILE visibility: PRIVATE, 
__spark__.jar -> resource { scheme: "hdfs" host: "hdp-master1.xcat.cluster" 
port: 8020 file: "/hdp/apps/2.5.0.0-1245/spark/spark-hdp-assembly.jar" } size: 
188727178 timestamp: 1478064369306 type: FILE visibility: PUBLIC, 
__spark_conf__ -> resource { scheme: "hdfs" host: "hdp-master1.xcat.cluster" 
port: 8020 file: 
"/user/my_user_name/.sparkStaging/application_1479877553404_0174/__spark_conf__2528926660896665250.zip"
 } size: 125215 timestamp: 1481228482907 type: ARCHIVE visibility: PRIVATE)
16/12/08 20:21:55 INFO yarn.ExecutorRunnable: Prepared Local resources 
Map(__app__.jar -> resource { scheme: "hdfs" host: "hdp-master1.xcat.cluster" 
port: 8020 file: 
"/user/my_user_name/.sparkStaging/application_1479877553404_0174/graphx_sp_2.10-1.0.jar"
 } size: 14787 timestamp: 1481228482852 type: FILE visibility: PRIVATE, 
__spark__.jar -> resource { scheme: "hdfs" host: "hdp-master1.xcat.cluster" 
port: 8020 file: "/hdp/apps/2.5.0.0-1245/spark/spark-hdp-assembly.jar" } size: 
188727178 timestamp: 1478064369306 type: FILE visibility: PUBLIC, 
__spark_conf__ -> resource { scheme: "hdfs" host: "hdp-master1.xcat.cluster" 
port: 8020 file: 
"/user/my_user_name/.sparkStaging/application_1479877553404_0174/__spark_conf__2528926660896665250.zip"
 } size: 125215 timestamp: 1481228482907 type: ARCHIVE visibility: PRIVATE)
16/12/08 20:21:55 INFO yarn.Client: Using the spark assembly jar on HDFS 
because you are using HDP, 
defaultSparkAssembly:hdfs://hdp-master1.xcat.cluster:8020/hdp/apps/2.5.0.0-1245/spark/spark-hdp-assembly.jar
16/12/08 20:21:55 INFO yarn.Client: Using the spark assembly jar on HDFS 
because you are using HDP, 
defaultSparkAssembly:hdfs://hdp-master1.xcat.cluster:8020/hdp/apps/2.5.0.0-1245/spark/spark-hdp-assembly.jar
16/12/08 20:21:55 INFO yarn.ExecutorRunnable: 
===============================================================================
YARN executor launch context:
  env:
    CLASSPATH -> 
{{PWD}}<CPS>{{PWD}}/__spark_conf__<CPS>{{PWD}}/__spark__.jar<CPS>$HADOOP_CONF_DIR<CPS>/usr/hdp/current/hadoop-client/*<CPS>/usr/hdp/current/hadoop-client/lib/*<CPS>/usr/hdp/current/hadoop-hdfs-client/*<CPS>/usr/hdp/current/hadoop-hdfs-client/lib/*<CPS>/usr/hdp/current/hadoop-yarn-client/*<CPS>/usr/hdp/current/hadoop-yarn-client/lib/*<CPS>$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/2.5.0.0-1245/hadoop/lib/hadoop-lzo-0.6.0.2.5.0.0-1245.jar:/etc/hadoop/conf/secure
    SPARK_YARN_CACHE_ARCHIVES -> 
hdfs://hdp-master1.xcat.cluster:8020/user/my_user_name/.sparkStaging/application_1479877553404_0174/__spark_conf__2528926660896665250.zip#__spark_conf__
    SPARK_LOG_URL_STDERR -> 
http://hdp-node28.xcat.cluster:8042/node/containerlogs/container_e34_1479877553404_0174_02_000003/my_user_name/stderr?start=-4096
    SPARK_YARN_CACHE_FILES_FILE_SIZES -> 188727178,14787
    SPARK_YARN_STAGING_DIR -> .sparkStaging/application_1479877553404_0174
    SPARK_YARN_CACHE_ARCHIVES_FILE_SIZES -> 125215
    SPARK_YARN_CACHE_FILES_VISIBILITIES -> PUBLIC,PRIVATE
    SPARK_USER -> my_user_name
    SPARK_YARN_CACHE_ARCHIVES_TIME_STAMPS -> 1481228482907
    SPARK_YARN_MODE -> true
    SPARK_YARN_CACHE_FILES_TIME_STAMPS -> 1478064369306,1481228482852
    SPARK_LOG_URL_STDOUT -> 
http://hdp-node28.xcat.cluster:8042/node/containerlogs/container_e34_1479877553404_0174_02_000003/my_user_name/stdout?start=-4096
    SPARK_YARN_CACHE_FILES -> 
hdfs://hdp-master1.xcat.cluster:8020/hdp/apps/2.5.0.0-1245/spark/spark-hdp-assembly.jar#__spark__.jar,hdfs://hdp-master1.xcat.cluster:8020/user/my_user_name/.sparkStaging/application_1479877553404_0174/graphx_sp_2.10-1.0.jar#__app__.jar
    SPARK_YARN_CACHE_ARCHIVES_VISIBILITIES -> PRIVATE

  command:
    {{JAVA_HOME}}/bin/java -server -XX:OnOutOfMemoryError='kill %p' -Xms13312m 
-Xmx13312m -Djava.io.tmpdir={{PWD}}/tmp '-Dspark.driver.port=43398' 
'-Dspark.ui.port=0' -Dspark.yarn.app.container.log.dir=<LOG_DIR> 
org.apache.spark.executor.CoarseGrainedExecutorBackend --driver-url 
spark://CoarseGrainedScheduler@10.143.65.9:43398 --executor-id 2 --hostname 
hdp-node28.xcat.cluster --cores 1 --app-id application_1479877553404_0174 
--user-class-path file:$PWD/__app__.jar 1> <LOG_DIR>/stdout 2> <LOG_DIR>/stderr
===============================================================================
      
16/12/08 20:21:55 INFO yarn.ExecutorRunnable: 
===============================================================================
YARN executor launch context:
  env:
    CLASSPATH -> 
{{PWD}}<CPS>{{PWD}}/__spark_conf__<CPS>{{PWD}}/__spark__.jar<CPS>$HADOOP_CONF_DIR<CPS>/usr/hdp/current/hadoop-client/*<CPS>/usr/hdp/current/hadoop-client/lib/*<CPS>/usr/hdp/current/hadoop-hdfs-client/*<CPS>/usr/hdp/current/hadoop-hdfs-client/lib/*<CPS>/usr/hdp/current/hadoop-yarn-client/*<CPS>/usr/hdp/current/hadoop-yarn-client/lib/*<CPS>$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/2.5.0.0-1245/hadoop/lib/hadoop-lzo-0.6.0.2.5.0.0-1245.jar:/etc/hadoop/conf/secure
    SPARK_YARN_CACHE_ARCHIVES -> 
hdfs://hdp-master1.xcat.cluster:8020/user/my_user_name/.sparkStaging/application_1479877553404_0174/__spark_conf__2528926660896665250.zip#__spark_conf__
    SPARK_LOG_URL_STDERR -> 
http://hdp-node26.xcat.cluster:8042/node/containerlogs/container_e34_1479877553404_0174_02_000002/my_user_name/stderr?start=-4096
    SPARK_YARN_CACHE_FILES_FILE_SIZES -> 188727178,14787
    SPARK_YARN_STAGING_DIR -> .sparkStaging/application_1479877553404_0174
    SPARK_YARN_CACHE_ARCHIVES_FILE_SIZES -> 125215
    SPARK_YARN_CACHE_FILES_VISIBILITIES -> PUBLIC,PRIVATE
    SPARK_USER -> my_user_name
    SPARK_YARN_CACHE_ARCHIVES_TIME_STAMPS -> 1481228482907
    SPARK_YARN_MODE -> true
    SPARK_YARN_CACHE_FILES_TIME_STAMPS -> 1478064369306,1481228482852
    SPARK_LOG_URL_STDOUT -> 
http://hdp-node26.xcat.cluster:8042/node/containerlogs/container_e34_1479877553404_0174_02_000002/my_user_name/stdout?start=-4096
    SPARK_YARN_CACHE_FILES -> 
hdfs://hdp-master1.xcat.cluster:8020/hdp/apps/2.5.0.0-1245/spark/spark-hdp-assembly.jar#__spark__.jar,hdfs://hdp-master1.xcat.cluster:8020/user/my_user_name/.sparkStaging/application_1479877553404_0174/graphx_sp_2.10-1.0.jar#__app__.jar
    SPARK_YARN_CACHE_ARCHIVES_VISIBILITIES -> PRIVATE

  command:
    {{JAVA_HOME}}/bin/java -server -XX:OnOutOfMemoryError='kill %p' -Xms13312m 
-Xmx13312m -Djava.io.tmpdir={{PWD}}/tmp '-Dspark.driver.port=43398' 
'-Dspark.ui.port=0' -Dspark.yarn.app.container.log.dir=<LOG_DIR> 
org.apache.spark.executor.CoarseGrainedExecutorBackend --driver-url 
spark://CoarseGrainedScheduler@10.143.65.9:43398 --executor-id 1 --hostname 
hdp-node26.xcat.cluster --cores 1 --app-id application_1479877553404_0174 
--user-class-path file:$PWD/__app__.jar 1> <LOG_DIR>/stdout 2> <LOG_DIR>/stderr
===============================================================================
      
16/12/08 20:21:55 INFO impl.ContainerManagementProtocolProxy: Opening proxy : 
hdp-node28.xcat.cluster:45454
16/12/08 20:21:55 INFO impl.ContainerManagementProtocolProxy: Opening proxy : 
hdp-node26.xcat.cluster:45454
16/12/08 20:22:01 INFO cluster.YarnClusterSchedulerBackend: Registered executor 
NettyRpcEndpointRef(null) (hdp-node28.xcat.cluster:33382) with ID 2
16/12/08 20:22:01 INFO storage.BlockManagerMasterEndpoint: Registering block 
manager hdp-node28.xcat.cluster:36248 with 9.1 GB RAM, BlockManagerId(2, 
hdp-node28.xcat.cluster, 36248)
16/12/08 20:22:05 INFO cluster.YarnClusterSchedulerBackend: Registered executor 
NettyRpcEndpointRef(null) (hdp-node26.xcat.cluster:55140) with ID 1
16/12/08 20:22:05 INFO cluster.YarnClusterSchedulerBackend: SchedulerBackend is 
ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
16/12/08 20:22:05 INFO cluster.YarnClusterScheduler: 
YarnClusterScheduler.postStartHook done
16/12/08 20:22:05 INFO storage.BlockManagerMasterEndpoint: Registering block 
manager hdp-node26.xcat.cluster:44549 with 9.1 GB RAM, BlockManagerId(1, 
hdp-node26.xcat.cluster, 44549)
16/12/08 20:22:06 ERROR yarn.ApplicationMaster: User class threw exception: org.apache.hadoop.security.AccessControlException: Authentication required
org.apache.hadoop.security.AccessControlException: Authentication required
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:457)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$200(WebHdfsFileSystem.java:113)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.runWithRetry(WebHdfsFileSystem.java:738)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.access$100(WebHdfsFileSystem.java:582)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner$1.run(WebHdfsFileSystem.java:612)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.run(WebHdfsFileSystem.java:608)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getDelegationToken(WebHdfsFileSystem.java:1507)
        at org.apache.hadoop.fs.FileSystem.collectDelegationTokens(FileSystem.java:545)
        at org.apache.hadoop.fs.FileSystem.addDelegationTokens(FileSystem.java:523)
        at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodesInternal(TokenCache.java:140)
        at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodesInternal(TokenCache.java:100)
        at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodes(TokenCache.java:80)
        at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:206)
        at org.apache.hadoop.mapred.SequenceFileInputFormat.listStatus(SequenceFileInputFormat.java:45)
        at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:315)
        at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:199)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:242)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:240)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:240)
        at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:242)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:240)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:240)
        at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:242)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:240)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:240)
        at org.apache.spark.graphx.EdgeRDD.getPartitions(EdgeRDD.scala:48)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:242)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:240)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:240)
        at org.apache.spark.graphx.impl.EdgeRDDImpl$$anonfun$1.apply(EdgeRDDImpl.scala:49)
        at org.apache.spark.graphx.impl.EdgeRDDImpl$$anonfun$1.apply(EdgeRDDImpl.scala:49)
        at scala.Option.orElse(Option.scala:257)
        at org.apache.spark.graphx.impl.EdgeRDDImpl.<init>(EdgeRDDImpl.scala:49)
        at org.apache.spark.graphx.EdgeRDD$.fromEdgePartitions(EdgeRDD.scala:124)
        at org.apache.spark.graphx.EdgeRDD$.fromEdges(EdgeRDD.scala:113)
        at org.apache.spark.graphx.impl.GraphImpl$.apply(GraphImpl.scala:334)
        at org.apache.spark.graphx.Graph$.apply(Graph.scala:576)
        at graphx_sp$.main(graphx_sp.scala:52)
        at graphx_sp.main(graphx_sp.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:559)

End of LogType:stderr

LogType:stdout
Log Upload Time:Thu Dec 08 20:22:08 +0000 2016
LogLength:81
Log Contents:
Loading files from HDFS
vertices RDD loaded from HDFS
edges RDD loaded from HDFS

End of LogType:stdout
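
A note on reading these back, in case it points at the problem. The launch script above localises a keytab (the "ln -sf .../filecache/25/keytab" line) and stderr shows "Scheduling login from keytab", so the keytab/principal side of the submission itself looks to be in place. Also, the stdout messages ("vertices RDD loaded from HDFS") print before any data is actually read: those loads are lazy, and the first real HDFS access happens when GraphX computes partitions inside Graph.apply, which is exactly where the trace fails. The failing frame is WebHdfsFileSystem.getDelegationToken, so I suspect my input paths use the webhdfs:// scheme and no WebHDFS delegation token was fetched at submit time. A sketch of what I plan to try next (the realm and keytab path are placeholders, and I am assuming, without the logs confirming it, that spark.yarn.access.namenodes, Spark 1.6's property for listing filesystems that need delegation tokens, also accepts webhdfs:// URIs):

# keep the --principal/--keytab already being passed (placeholder values
# here), and additionally ask Spark for a WebHDFS token at submit time
spark-submit --master yarn --deploy-mode cluster \
  --class graphx_sp \
  --principal my_user_name@MY.REALM \
  --keytab /path/to/my_user_name.keytab \
  --conf "spark.yarn.access.namenodes=webhdfs://hdp-master1.xcat.cluster:50070" \
  target/scala-2.10/graphx_sp_2.10-1.0.jar

The simpler alternative would be to switch the paths in graphx_sp.scala from webhdfs:// to plain hdfs://hdp-master1.xcat.cluster:8020/..., since the token for the RPC namenode is obtained by default in yarn-cluster mode.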

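Independently of Spark, the same WebHDFS endpoint can be exercised from an edge node to check whether it accepts my Kerberos credentials at all. A minimal sketch, again with MY.REALM as a placeholder and 50070 assumed to be the default namenode HTTP port:

# obtain a TGT, then make a SPNEGO-authenticated WebHDFS call
kinit my_user_name@MY.REALM
curl --negotiate -u : \
  "http://hdp-master1.xcat.cluster:50070/webhdfs/v1/user/my_user_name?op=GETFILESTATUS"

If that returns FileStatus JSON, Kerberos against WebHDFS works and the issue is delegation-token handling inside the job; if it also answers "Authentication required", the problem is upstream of Spark entirely.
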
> On 8 Dec 2016, at 20:18, Marcelo Vanzin <van...@cloudera.com> wrote:
> 
> Then you probably have a configuration error somewhere. Since you
> haven't actually posted the error you're seeing, it's kinda hard to
> help any further.
> 
> On Thu, Dec 8, 2016 at 11:17 AM, Gerard Casey <gerardhughca...@gmail.com> 
> wrote:
>> Right. I’m confident that is setup correctly.
>> 
>> I can run the SparkPi test script. The main difference between it and my 
>> application is that it doesn’t access HDFS.
>> 
>>> On 8 Dec 2016, at 18:43, Marcelo Vanzin <van...@cloudera.com> wrote:
>>> 
>>> On Wed, Dec 7, 2016 at 11:54 PM, Gerard Casey <gerardhughca...@gmail.com> 
>>> wrote:
>>>> To be specific, where exactly should spark.authenticate be set to true?
>>> 
>>> spark.authenticate has nothing to do with kerberos. It's for
>>> authentication between different Spark processes belonging to the same
>>> app.
>>> 
>>> --
>>> Marcelo
>>> 
>>> ---------------------------------------------------------------------
>>> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>>> 
>> 
> 
> 
> 
> -- 
> Marcelo
> 
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
> 

