I get the following error message when I start the pyspark shell.
The config has the following settings:
# spark.master            spark://master:7077
# spark.eventLog.enabled  true
# spark.eventLog.dir      hdfs://namenode:8021/directory
# spark.serializer        org.apache.spark.serializer.KryoSerializer
spark.eventLog.dir=/user/spark/applicationHistory
spark.eventLog.enabled=true
spark.yarn.historyServer.address=name101-car.ldcint.com:10020
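
For what it's worth, here is a rough sketch of how I would expect the same settings to look when set programmatically from pyspark (just for illustration; the hdfs:// URI below is only a placeholder based on the commented-out template line, not my actual namenode address):

from pyspark import SparkConf, SparkContext

conf = (SparkConf()
        .setAppName("PySparkShell")
        .set("spark.eventLog.enabled", "true")
        # With no scheme on the path, it ends up resolved as a local
        # file: path, which is what shows up in the error further down.
        # The hdfs:// URI here is only a placeholder.
        .set("spark.eventLog.dir",
             "hdfs://namenode:8021/user/spark/applicationHistory"))

sc = SparkContext(conf=conf)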


[pzilaro@name101-car conf]$ pyspark
Python 2.6.6 (r266:84292, Jan 22 2014, 09:42:36)
[GCC 4.4.7 20120313 (Red Hat 4.4.7-4)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
15/03/26 13:46:06 INFO spark.SecurityManager: Changing view acls to: pzilaro
15/03/26 13:46:06 INFO spark.SecurityManager: Changing modify acls to: pzilaro
15/03/26 13:46:06 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(pzilaro); users with modify permissions: Set(pzilaro)
15/03/26 13:46:07 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/03/26 13:46:07 INFO Remoting: Starting remoting
15/03/26 13:46:07 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkdri...@name101-car.ldcint.com:48040]
15/03/26 13:46:07 INFO Remoting: Remoting now listens on addresses: [akka.tcp://sparkdri...@name101-car.ldcint.com:48040]
15/03/26 13:46:07 INFO util.Utils: Successfully started service 'sparkDriver' on port 48040.
15/03/26 13:46:07 INFO spark.SparkEnv: Registering MapOutputTracker
15/03/26 13:46:07 INFO spark.SparkEnv: Registering BlockManagerMaster
15/03/26 13:46:07 INFO storage.DiskBlockManager: Created local directory at /tmp/spark-local-20150326134607-072e
15/03/26 13:46:07 INFO storage.MemoryStore: MemoryStore started with capacity 265.4 MB
15/03/26 13:46:08 INFO spark.HttpFileServer: HTTP File server directory is /tmp/spark-2f342a3a-c5bb-474d-867b-8bd5b9f9d1ac
15/03/26 13:46:08 INFO spark.HttpServer: Starting HTTP Server
15/03/26 13:46:08 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/03/26 13:46:08 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:55296
15/03/26 13:46:08 INFO util.Utils: Successfully started service 'HTTP file server' on port 55296.
15/03/26 13:46:08 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/03/26 13:46:08 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
15/03/26 13:46:08 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
15/03/26 13:46:08 INFO ui.SparkUI: Started SparkUI at http://name101-car.ldcint.com:4040
15/03/26 13:46:08 INFO util.AkkaUtils: Connecting to HeartbeatReceiver: akka.tcp://sparkdri...@name101-car.ldcint.com:48040/user/HeartbeatReceiver
15/03/26 13:46:08 INFO netty.NettyBlockTransferService: Server created on 55241
15/03/26 13:46:08 INFO storage.BlockManagerMaster: Trying to register BlockManager
15/03/26 13:46:08 INFO storage.BlockManagerMasterActor: Registering block manager localhost:55241 with 265.4 MB RAM, BlockManagerId(<driver>, localhost, 55241)
15/03/26 13:46:08 INFO storage.BlockManagerMaster: Registered BlockManager
Traceback (most recent call last):
  File "/usr/lib/spark/python/pyspark/shell.py", line 45, in <module>
    sc = SparkContext(appName="PySparkShell", pyFiles=add_files)
  File "/usr/lib/spark/python/pyspark/context.py", line 105, in __init__
    conf, jsc)
  File "/usr/lib/spark/python/pyspark/context.py", line 153, in _do_init
    self._jsc = jsc or self._initialize_context(self._conf._jconf)
  File "/usr/lib/spark/python/pyspark/context.py", line 201, in
_initialize_context
    return self._jvm.JavaSparkContext(jconf)
  File
"/usr/lib/spark/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line
701, in __call__
  File "/usr/lib/spark/python/lib/py4j-0.8.2.1-src.zip/py4j/protocol.py",
line 300, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling
None.org.apache.spark.api.java.JavaSparkContext.
: java.io.IOException: Error in creating log directory:
file:/user/spark/applicationHistory//local-1427402768636
        at
org.apache.spark.util.FileLogger.createLogDir(FileLogger.scala:133)
        at org.apache.spark.util.FileLogger.start(FileLogger.scala:115)
        at
org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:74)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:353)
        at
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
Method)
        at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
        at
py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
        at py4j.Gateway.invoke(Gateway.java:214)
        at
py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
        at
py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
        at py4j.GatewayConnection.run(GatewayConnection.java:207)
        at java.lang.Thread.run(Thread.java:745)



