Thanks, that helped.
We can't use Spark 1.3 because Cassandra DSE doesn't support it.

2015-04-17 21:48 GMT+02:00 Imran Rashid <iras...@cloudera.com>:

> Are you calling sc.stop() at the end of your applications?
>
> The history server only displays completed applications, but if you don't
> call sc.stop(), it doesn't know that those applications have been stopped.
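>
> For example, a minimal sketch (a standalone Scala app submitted with spark-submit;
> the app name, job body, and event-log settings here are just illustrative and could
> equally live in spark-defaults.conf):
>
> import org.apache.spark.{SparkConf, SparkContext}
>
> object MyJob {
>   def main(args: Array[String]): Unit = {
>     val conf = new SparkConf()
>       .setAppName("MyJob")
>       .set("spark.eventLog.enabled", "true")
>       .set("spark.eventLog.dir", "file:/var/log/spark/applicationHistory")
>     val sc = new SparkContext(conf)
>     try {
>       sc.parallelize(1 to 100).count()  // your actual job here
>     } finally {
>       // without this, the application is never marked complete and the
>       // history server won't list it
>       sc.stop()
>     }
>   }
> }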
>
> Note that in Spark 1.3, the history server can also display running
> applications (including applications that have completed but that it still
> thinks are running), which improves things a little bit.
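>
> A quick way to check your existing runs (assuming the 1.2 on-disk layout, where a
> stopped application's event-log directory gets an APPLICATION_COMPLETE marker file):
>
> ls /var/log/spark/applicationHistory/app-*/APPLICATION_COMPLETE
>
> Any application directory missing that marker won't be listed by the history server.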
>
> On Fri, Apr 17, 2015 at 10:13 AM, Serega Sheypak <serega.shey...@gmail.com> wrote:
>
>> Hi, I started the history server.
>> Here is the UI output:
>>
>>
>>    - *Event log directory:* file:/var/log/spark/applicationHistory/
>>
>> No completed applications found!
>>
>> Did you specify the correct logging directory? Please verify your setting
>> of spark.history.fs.logDirectory and whether you have the permissions to
>> access it.
>> It is also possible that your application did not run to completion or
>> did not stop the SparkContext.
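>>
>> (For reference, my understanding is that these settings can also live in
>> conf/spark-defaults.conf, e.g.:
>>
>> spark.eventLog.enabled           true
>> spark.eventLog.dir               file:/var/log/spark/applicationHistory
>> spark.history.fs.logDirectory    file:/var/log/spark/applicationHistory
>>
>> In my case spark.history.fs.logDirectory is passed as a -D option instead; see the
>> history-server launch command further down.)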
>>
>> Spark 1.2.0
>>
>> I go to the node where the server runs and run:
>>
>> ls -la /var/log/spark/applicationHistory/
>> total 44
>> drwxrwxrwx 11 root      root    4096 Apr 17 14:50 .
>> drwxrwxrwx  3 cassandra root    4096 Apr 16 15:31 ..
>> drwxrwxrwx  2 vagrant   vagrant 4096 Apr 17 10:06 app-20150417100630-0000
>> drwxrwxrwx  2 vagrant   vagrant 4096 Apr 17 11:01 app-20150417110140-0001
>> drwxrwxrwx  2 vagrant   vagrant 4096 Apr 17 11:12 app-20150417111216-0002
>> drwxrwxrwx  2 vagrant   vagrant 4096 Apr 17 11:14 app-20150417111441-0003
>> drwxrwx---  2 vagrant   vagrant 4096 Apr 17 11:20 *app-20150417112028-0004*
>> drwxrwx---  2 vagrant   vagrant 4096 Apr 17 14:17 *app-20150417141733-0005*
>> drwxrwx---  2 vagrant   vagrant 4096 Apr 17 14:32 *app-20150417143237-0006*
>> drwxrwx---  2 vagrant   vagrant 4096 Apr 17 14:49 *app-20150417144902-0007*
>> drwxrwx---  2 vagrant   vagrant 4096 Apr 17 14:50 *app-20150417145025-0008*
>>
>>
>> So the logs are there, but the history server doesn't display them.
>>
>> I've checked the workers; they also point to that dir. When I run an app, I
>> see a new log appear.
>>
>>
>> Here is the history-server log output:
>>
>> vagrant@dsenode01:/usr/lib/spark/logs$ cat spark-root-org.apache.spark.deploy.history.HistoryServer-1-dsenode01.out
>>
>> Spark assembly has been built with Hive, including Datanucleus jars on
>> classpath
>>
>> Spark Command: java -cp
>> ::/usr/lib/spark/sbin/../conf:/usr/lib/spark/lib/spark-assembly-1.2.0-hadoop2.4.0.jar:/usr/lib/spark/lib/datanucleus-api-jdo-3.2.6.jar:/usr/lib/spark/lib/datanucleus-rdbms-3.2.9.jar:/usr/lib/spark/lib/datanucleus-core-3.2.10.jar
>> -XX:MaxPermSize=128m -Dspark.akka.logLifecycleEvents=true
>> -Dspark.history.fs.logDirectory=/var/log/spark/applicationHistory
>> -Dspark.eventLog.enabled=true -Xms512m -Xmx512m
>> org.apache.spark.deploy.history.HistoryServer
>>
>> ========================================
>>
>>
>> Using Spark's default log4j profile:
>> org/apache/spark/log4j-defaults.properties
>>
>> 15/04/17 09:55:21 INFO HistoryServer: Registered signal handlers for
>> [TERM, HUP, INT]
>>
>> 15/04/17 09:55:21 INFO SecurityManager: Changing view acls to: root
>>
>> 15/04/17 09:55:21 INFO SecurityManager: Changing modify acls to: root
>>
>> 15/04/17 09:55:21 INFO SecurityManager: SecurityManager: authentication
>> disabled; ui acls disabled; users with view permissions: Set(root); users
>> with modify permissions: Set(root)
>>
>> 15/04/17 09:55:22 WARN NativeCodeLoader: Unable to load native-hadoop
>> library for your platform... using builtin-java classes where applicable
>>
>> 15/04/17 09:55:24 INFO Utils: Successfully started service on port 18080.
>>
>> 15/04/17 09:55:24 INFO HistoryServer: Started HistoryServer at
>> http://dsenode01:18080
>>
>>
>> What could be wrong with it?
>>
>
>
