I've successfully installed edx and insights without any errors. However, 
when I run the analytics tasks, it always fails with pending task, looking 
into the log I saw this error thus I guess it may be the reason. I run 
insights on EC2 m3.large instance
 I have also tried cleaning up warehouse and rerun tasks but no help. The 
full log attached.

Really appreciate if anyone can help out.

Many thanks,
Andy

-- 
You received this message because you are subscribed to the Google Groups 
"General Open edX discussion" group.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/edx-code/7fd24537-de71-4e2f-b80e-e07da1776172%40googlegroups.com.
2017-02-10 06:21:30,241 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.sql.SQLException: Directory 
/var/lib/analytics-tasks/analyticstack/repo/TempStatsStore cannot be created.
2017-02-10 06:21:30,241 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
2017-02-10 06:21:30,241 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown
 Source)
2017-02-10 06:21:30,242 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
2017-02-10 06:21:30,242 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
2017-02-10 06:21:30,242 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown 
Source)
2017-02-10 06:21:30,243 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown 
Source)
2017-02-10 06:21:30,243 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
2017-02-10 06:21:30,243 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
2017-02-10 06:21:30,243 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
2017-02-10 06:21:30,244 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
2017-02-10 06:21:30,244 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
2017-02-10 06:21:30,244 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
2017-02-10 06:21:30,244 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
2017-02-10 06:21:30,245 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
2017-02-10 06:21:30,245 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:664)
2017-02-10 06:21:30,248 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:270)
2017-02-10 06:21:30,250 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.stats.jdbc.JDBCStatsPublisher.init(JDBCStatsPublisher.java:265)
2017-02-10 06:21:30,250 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:435)
2017-02-10 06:21:30,251 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:138)
2017-02-10 06:21:30,251 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:144)
2017-02-10 06:21:30,251 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
2017-02-10 06:21:30,252 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1355)
2017-02-10 06:21:30,252 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1139)
2017-02-10 06:21:30,252 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.run(Driver.java:945)
2017-02-10 06:21:30,252 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
2017-02-10 06:21:30,253 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
2017-02-10 06:21:30,253 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
2017-02-10 06:21:30,253 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:348)
2017-02-10 06:21:30,254 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:446)
2017-02-10 06:21:30,254 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:456)
2017-02-10 06:21:30,254 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:712)
2017-02-10 06:21:30,254 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
2017-02-10 06:21:30,255 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2017-02-10 06:21:30,255 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2017-02-10 06:21:30,255 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2017-02-10 06:21:30,255 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.lang.reflect.Method.invoke(Method.java:497)
2017-02-10 06:21:30,256 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.util.RunJar.main(RunJar.java:212)
2017-02-10 06:21:30,256 INFO 19790 [luigi-interface] hadoop.py:273 - Caused by: 
ERROR XBM0H: Directory 
/var/lib/analytics-tasks/analyticstack/repo/TempStatsStore cannot be created.
2017-02-10 06:21:30,256 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
2017-02-10 06:21:30,257 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown 
Source)
2017-02-10 06:21:30,257 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.security.AccessController.doPrivileged(Native Method)
2017-02-10 06:21:30,259 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown
 Source)
2017-02-10 06:21:30,259 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
2017-02-10 06:21:30,259 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown
 Source)
2017-02-10 06:21:30,259 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown 
Source)
2017-02-10 06:21:30,260 INFO 19790 [luigi-interface] hadoop.py:273 - ... 30 more
2017-02-10 06:21:30,260 INFO 19790 [luigi-interface] hadoop.py:273 - 
============= end nested exception, level (2) ===========
2017-02-10 06:21:30,260 INFO 19790 [luigi-interface] hadoop.py:273 - 
============= begin nested exception, level (3) ===========
2017-02-10 06:21:30,261 INFO 19790 [luigi-interface] hadoop.py:273 - ERROR 
XBM0H: Directory /var/lib/analytics-tasks/analyticstack/repo/TempStatsStore 
cannot be created.
2017-02-10 06:21:30,261 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
2017-02-10 06:21:30,262 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown 
Source)
2017-02-10 06:21:30,268 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.security.AccessController.doPrivileged(Native Method)
2017-02-10 06:21:30,271 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown
 Source)
2017-02-10 06:21:30,271 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
2017-02-10 06:21:30,273 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown
 Source)
2017-02-10 06:21:30,274 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown 
Source)
2017-02-10 06:21:30,275 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
2017-02-10 06:21:30,275 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
2017-02-10 06:21:30,276 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
2017-02-10 06:21:30,276 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
2017-02-10 06:21:30,279 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
2017-02-10 06:21:30,279 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
2017-02-10 06:21:30,281 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
2017-02-10 06:21:30,282 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:664)
2017-02-10 06:21:30,283 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:270)
2017-02-10 06:21:30,285 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.stats.jdbc.JDBCStatsPublisher.init(JDBCStatsPublisher.java:265)
2017-02-10 06:21:30,285 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:435)
2017-02-10 06:21:30,286 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:138)
2017-02-10 06:21:30,287 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:144)
2017-02-10 06:21:30,287 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
2017-02-10 06:21:30,287 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1355)
2017-02-10 06:21:30,287 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1139)
2017-02-10 06:21:30,288 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.run(Driver.java:945)
2017-02-10 06:21:30,289 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
2017-02-10 06:21:30,290 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
2017-02-10 06:21:30,290 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
2017-02-10 06:21:30,290 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:348)
2017-02-10 06:21:30,290 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:446)
2017-02-10 06:21:30,291 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:456)
2017-02-10 06:21:30,292 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:712)
2017-02-10 06:21:30,292 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
2017-02-10 06:21:30,298 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2017-02-10 06:21:30,298 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2017-02-10 06:21:30,300 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2017-02-10 06:21:30,300 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.lang.reflect.Method.invoke(Method.java:497)
2017-02-10 06:21:30,300 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.util.RunJar.main(RunJar.java:212)
2017-02-10 06:21:30,301 INFO 19790 [luigi-interface] hadoop.py:273 - 
============= end nested exception, level (3) ===========
2017-02-10 06:21:30,306 INFO 19790 [luigi-interface] hadoop.py:273 - Cleanup 
action completed
2017-02-10 06:21:30,564 INFO 19790 [luigi-interface] hadoop.py:273 - Starting 
Job = job_1486680388499_0022, Tracking URL = 
http://ip-172-31-62-193:8088/proxy/application_1486680388499_0022/
2017-02-10 06:21:30,565 INFO 19790 [luigi-interface] hadoop.py:273 - Kill 
Command = /edx/app/hadoop/hadoop-2.3.0/bin/hadoop job  -kill 
job_1486680388499_0022
2017-02-10 06:21:43,187 INFO 19790 [luigi-interface] hadoop.py:273 - Hadoop job 
information for Stage-2: number of mappers: 1; number of reducers: 1
2017-02-10 06:21:43,250 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:21:43,248 Stage-2 map = 0%,  reduce = 0%
2017-02-10 06:21:51,025 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:21:51,023 Stage-2 map = 100%,  reduce = 0%, Cumulative CPU 1.23 sec
2017-02-10 06:21:52,077 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:21:52,074 Stage-2 map = 100%,  reduce = 0%, Cumulative CPU 1.23 sec
2017-02-10 06:21:53,135 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:21:53,132 Stage-2 map = 100%,  reduce = 0%, Cumulative CPU 1.23 sec
2017-02-10 06:21:54,205 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:21:54,194 Stage-2 map = 100%,  reduce = 0%, Cumulative CPU 1.23 sec
2017-02-10 06:21:55,277 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:21:55,274 Stage-2 map = 100%,  reduce = 0%, Cumulative CPU 1.23 sec
2017-02-10 06:21:56,368 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:21:56,347 Stage-2 map = 100%,  reduce = 0%, Cumulative CPU 1.23 sec
2017-02-10 06:21:57,424 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:21:57,423 Stage-2 map = 100%,  reduce = 0%, Cumulative CPU 1.23 sec
2017-02-10 06:21:58,510 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:21:58,509 Stage-2 map = 100%,  reduce = 0%, Cumulative CPU 1.23 sec
2017-02-10 06:21:59,607 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:21:59,584 Stage-2 map = 100%,  reduce = 0%, Cumulative CPU 1.23 sec
2017-02-10 06:22:00,657 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:22:00,654 Stage-2 map = 100%,  reduce = 100%, Cumulative CPU 3.16 sec
2017-02-10 06:22:01,702 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:22:01,700 Stage-2 map = 100%,  reduce = 100%, Cumulative CPU 3.16 sec
2017-02-10 06:22:01,704 INFO 19790 [luigi-interface] hadoop.py:273 - MapReduce 
Total cumulative CPU time: 3 seconds 160 msec
2017-02-10 06:22:01,744 INFO 19790 [luigi-interface] hadoop.py:273 - Ended Job 
= job_1486680388499_0022
2017-02-10 06:22:01,762 INFO 19790 [luigi-interface] hadoop.py:273 - Loading 
data to table default.course_enrollment_birth_year_daily partition 
(dt=2017-02-10)
2017-02-10 06:22:02,677 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:22:02.677 GMT Thread[main,5,main] Cleanup action starting
2017-02-10 06:22:02,678 INFO 19790 [luigi-interface] hadoop.py:273 - ERROR 
XBM0H: Directory /var/lib/analytics-tasks/analyticstack/repo/TempStatsStore 
cannot be created.
2017-02-10 06:22:02,678 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
2017-02-10 06:22:02,678 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown 
Source)
2017-02-10 06:22:02,685 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.security.AccessController.doPrivileged(Native Method)
2017-02-10 06:22:02,686 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown
 Source)
2017-02-10 06:22:02,686 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
2017-02-10 06:22:02,686 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown
 Source)
2017-02-10 06:22:02,687 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown 
Source)
2017-02-10 06:22:02,687 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
2017-02-10 06:22:02,687 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
2017-02-10 06:22:02,687 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
2017-02-10 06:22:02,688 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
2017-02-10 06:22:02,688 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
2017-02-10 06:22:02,688 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
2017-02-10 06:22:02,689 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
2017-02-10 06:22:02,689 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:664)
2017-02-10 06:22:02,689 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:270)
2017-02-10 06:22:02,689 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.Utilities.connectWithRetry(Utilities.java:2295)
2017-02-10 06:22:02,690 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.stats.jdbc.JDBCStatsAggregator.connect(JDBCStatsAggregator.java:85)
2017-02-10 06:22:02,690 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.StatsTask.aggregateStats(StatsTask.java:298)
2017-02-10 06:22:02,691 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.StatsTask.execute(StatsTask.java:252)
2017-02-10 06:22:02,691 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:144)
2017-02-10 06:22:02,692 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
2017-02-10 06:22:02,697 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1355)
2017-02-10 06:22:02,697 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1139)
2017-02-10 06:22:02,697 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.run(Driver.java:945)
2017-02-10 06:22:02,698 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
2017-02-10 06:22:02,698 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
2017-02-10 06:22:02,705 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
2017-02-10 06:22:02,706 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:348)
2017-02-10 06:22:02,713 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:446)
2017-02-10 06:22:02,714 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:456)
2017-02-10 06:22:02,714 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:712)
2017-02-10 06:22:02,714 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
2017-02-10 06:22:02,715 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2017-02-10 06:22:02,715 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2017-02-10 06:22:02,715 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2017-02-10 06:22:02,715 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.lang.reflect.Method.invoke(Method.java:497)
2017-02-10 06:22:02,716 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.util.RunJar.main(RunJar.java:212)
2017-02-10 06:22:02,716 INFO 19790 [luigi-interface] hadoop.py:273 - Cleanup 
action completed
2017-02-10 06:22:02,716 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:22:02.699 GMT Thread[main,5,main] Cleanup action starting
2017-02-10 06:22:02,718 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.sql.SQLException: Failed to create database 'TempStatsStore', see the next 
exception for details.
2017-02-10 06:22:02,718 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
2017-02-10 06:22:02,718 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
2017-02-10 06:22:02,718 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
2017-02-10 06:22:02,723 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
2017-02-10 06:22:02,723 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
2017-02-10 06:22:02,723 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
2017-02-10 06:22:02,724 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
2017-02-10 06:22:02,724 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
2017-02-10 06:22:02,724 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
2017-02-10 06:22:02,724 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
2017-02-10 06:22:02,725 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:664)
2017-02-10 06:22:02,727 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:270)
2017-02-10 06:22:02,737 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.Utilities.connectWithRetry(Utilities.java:2295)
2017-02-10 06:22:02,738 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.stats.jdbc.JDBCStatsAggregator.connect(JDBCStatsAggregator.java:85)
2017-02-10 06:22:02,739 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.StatsTask.aggregateStats(StatsTask.java:298)
2017-02-10 06:22:02,739 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.StatsTask.execute(StatsTask.java:252)
2017-02-10 06:22:02,739 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:144)
2017-02-10 06:22:02,740 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
2017-02-10 06:22:02,740 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1355)
2017-02-10 06:22:02,740 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1139)
2017-02-10 06:22:02,741 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.run(Driver.java:945)
2017-02-10 06:22:02,741 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
2017-02-10 06:22:02,741 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
2017-02-10 06:22:02,742 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
2017-02-10 06:22:02,749 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:348)
2017-02-10 06:22:02,749 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:446)
2017-02-10 06:22:02,749 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:456)
2017-02-10 06:22:02,750 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:712)
2017-02-10 06:22:02,750 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
2017-02-10 06:22:02,750 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2017-02-10 06:22:02,751 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2017-02-10 06:22:02,751 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2017-02-10 06:22:02,751 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.lang.reflect.Method.invoke(Method.java:497)
2017-02-10 06:22:02,751 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.util.RunJar.main(RunJar.java:212)
2017-02-10 06:22:02,752 INFO 19790 [luigi-interface] hadoop.py:273 - Caused by: 
java.sql.SQLException: Failed to create database 'TempStatsStore', see the next 
exception for details.
2017-02-10 06:22:02,753 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
2017-02-10 06:22:02,761 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown
 Source)
2017-02-10 06:22:02,762 INFO 19790 [luigi-interface] hadoop.py:273 - ... 34 more
2017-02-10 06:22:02,762 INFO 19790 [luigi-interface] hadoop.py:273 - Caused by: 
java.sql.SQLException: Directory 
/var/lib/analytics-tasks/analyticstack/repo/TempStatsStore cannot be created.
2017-02-10 06:22:02,762 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
2017-02-10 06:22:02,763 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown
 Source)
2017-02-10 06:22:02,763 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
2017-02-10 06:22:02,763 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
2017-02-10 06:22:02,763 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown 
Source)
2017-02-10 06:22:02,764 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown 
Source)
2017-02-10 06:22:02,764 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
2017-02-10 06:22:02,764 INFO 19790 [luigi-interface] hadoop.py:273 - ... 31 more
2017-02-10 06:22:02,765 INFO 19790 [luigi-interface] hadoop.py:273 - Caused by: 
ERROR XBM0H: Directory 
/var/lib/analytics-tasks/analyticstack/repo/TempStatsStore cannot be created.
2017-02-10 06:22:02,765 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
2017-02-10 06:22:02,769 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown 
Source)
2017-02-10 06:22:02,770 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.security.AccessController.doPrivileged(Native Method)
2017-02-10 06:22:02,770 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown
 Source)
2017-02-10 06:22:02,770 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
2017-02-10 06:22:02,771 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown
 Source)
2017-02-10 06:22:02,771 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown 
Source)
2017-02-10 06:22:02,771 INFO 19790 [luigi-interface] hadoop.py:273 - ... 31 more
2017-02-10 06:22:02,792 INFO 19790 [luigi-interface] hadoop.py:273 - 
============= begin nested exception, level (1) ===========
2017-02-10 06:22:02,805 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.sql.SQLException: Directory 
/var/lib/analytics-tasks/analyticstack/repo/TempStatsStore cannot be created.
2017-02-10 06:22:02,806 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
2017-02-10 06:22:02,806 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
2017-02-10 06:22:02,806 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown 
Source)
2017-02-10 06:22:02,807 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown 
Source)
2017-02-10 06:22:02,807 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
2017-02-10 06:22:02,807 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
2017-02-10 06:22:02,808 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
2017-02-10 06:22:02,808 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
2017-02-10 06:22:02,808 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
2017-02-10 06:22:02,808 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
2017-02-10 06:22:02,809 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
2017-02-10 06:22:02,809 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
2017-02-10 06:22:02,809 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:664)
2017-02-10 06:22:02,812 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:270)
2017-02-10 06:22:02,813 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.Utilities.connectWithRetry(Utilities.java:2295)
2017-02-10 06:22:02,813 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.stats.jdbc.JDBCStatsAggregator.connect(JDBCStatsAggregator.java:85)
2017-02-10 06:22:02,813 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.StatsTask.aggregateStats(StatsTask.java:298)
2017-02-10 06:22:02,813 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.StatsTask.execute(StatsTask.java:252)
2017-02-10 06:22:02,814 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:144)
2017-02-10 06:22:02,814 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
2017-02-10 06:22:02,814 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1355)
2017-02-10 06:22:02,815 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1139)
2017-02-10 06:22:02,815 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.run(Driver.java:945)
2017-02-10 06:22:02,828 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
2017-02-10 06:22:02,828 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
2017-02-10 06:22:02,829 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
2017-02-10 06:22:02,829 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:348)
2017-02-10 06:22:02,830 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:446)
2017-02-10 06:22:02,830 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:456)
2017-02-10 06:22:02,830 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:712)
2017-02-10 06:22:02,830 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
2017-02-10 06:22:02,831 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2017-02-10 06:22:02,831 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2017-02-10 06:22:02,831 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2017-02-10 06:22:02,831 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.lang.reflect.Method.invoke(Method.java:497)
2017-02-10 06:22:02,831 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.util.RunJar.main(RunJar.java:212)
2017-02-10 06:22:02,832 INFO 19790 [luigi-interface] hadoop.py:273 - Caused by: 
java.sql.SQLException: Directory 
/var/lib/analytics-tasks/analyticstack/repo/TempStatsStore cannot be created.
2017-02-10 06:22:02,836 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
2017-02-10 06:22:02,836 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown
 Source)
2017-02-10 06:22:02,851 INFO 19790 [luigi-interface] hadoop.py:273 - ... 36 more
2017-02-10 06:22:02,852 INFO 19790 [luigi-interface] hadoop.py:273 - Caused by: 
ERROR XBM0H: Directory 
/var/lib/analytics-tasks/analyticstack/repo/TempStatsStore cannot be created.
2017-02-10 06:22:02,852 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
2017-02-10 06:22:02,852 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown 
Source)
2017-02-10 06:22:02,852 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.security.AccessController.doPrivileged(Native Method)
2017-02-10 06:22:02,853 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown
 Source)
2017-02-10 06:22:02,853 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
2017-02-10 06:22:02,853 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown
 Source)
2017-02-10 06:22:02,853 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown 
Source)
2017-02-10 06:22:02,854 INFO 19790 [luigi-interface] hadoop.py:273 - ... 31 more
2017-02-10 06:22:02,854 INFO 19790 [luigi-interface] hadoop.py:273 - 
============= end nested exception, level (1) ===========
2017-02-10 06:22:02,854 INFO 19790 [luigi-interface] hadoop.py:273 - 
============= begin nested exception, level (2) ===========
2017-02-10 06:22:02,854 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.sql.SQLException: Directory 
/var/lib/analytics-tasks/analyticstack/repo/TempStatsStore cannot be created.
2017-02-10 06:22:02,855 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
2017-02-10 06:22:02,855 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown
 Source)
2017-02-10 06:22:02,855 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
2017-02-10 06:22:02,855 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
2017-02-10 06:22:02,856 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown 
Source)
2017-02-10 06:22:02,856 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown 
Source)
2017-02-10 06:22:02,856 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
2017-02-10 06:22:02,856 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
2017-02-10 06:22:02,861 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
2017-02-10 06:22:02,862 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
2017-02-10 06:22:02,862 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
2017-02-10 06:22:02,862 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
2017-02-10 06:22:02,873 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
2017-02-10 06:22:02,873 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
2017-02-10 06:22:02,874 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:664)
2017-02-10 06:22:02,874 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:270)
2017-02-10 06:22:02,874 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.Utilities.connectWithRetry(Utilities.java:2295)
2017-02-10 06:22:02,874 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.stats.jdbc.JDBCStatsAggregator.connect(JDBCStatsAggregator.java:85)
2017-02-10 06:22:02,875 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.StatsTask.aggregateStats(StatsTask.java:298)
2017-02-10 06:22:02,875 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.StatsTask.execute(StatsTask.java:252)
2017-02-10 06:22:02,875 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:144)
2017-02-10 06:22:02,875 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
2017-02-10 06:22:02,876 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1355)
2017-02-10 06:22:02,876 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1139)
2017-02-10 06:22:02,876 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.run(Driver.java:945)
2017-02-10 06:22:02,876 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
2017-02-10 06:22:02,876 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
2017-02-10 06:22:02,877 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
2017-02-10 06:22:02,877 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:348)
2017-02-10 06:22:02,889 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:446)
2017-02-10 06:22:02,890 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:456)
2017-02-10 06:22:02,890 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:712)
2017-02-10 06:22:02,890 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
2017-02-10 06:22:02,890 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2017-02-10 06:22:02,891 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2017-02-10 06:22:02,891 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2017-02-10 06:22:02,892 INFO 19790 [luigi-interface] worker.py:296 - [pid 
19790] Worker Worker(salt=997523257, host=ip-172-31-62-193, username=hadoop, 
pid=19790) done      
HiveTableFromParameterQueryTask(warehouse_path=hdfs://localhost:9000/edx-analytics-pipeline/warehouse/,
 insert_query=
            SELECT
                ce.date,
                ce.course_id,
                p.year_of_birth,
                SUM(ce.at_end),
                COUNT(ce.user_id)
            FROM course_enrollment ce
            LEFT OUTER JOIN auth_userprofile p ON p.user_id = ce.user_id
            WHERE ce.date = '2017-02-09'
            GROUP BY
                ce.date,
                ce.course_id,
                p.year_of_birth
        , table=course_enrollment_birth_year_daily, columns=(('date', 
'STRING'), ('course_id', 'STRING'), ('birth_year', 'INT'), ('count', 'INT'), 
('cumulative_count', 'INT')), partition=dt=2017-02-10)
2017-02-10 06:22:02,894 INFO 19790 [luigi-interface] worker.py:282 - [pid 
19790] Worker Worker(salt=997523257, host=ip-172-31-62-193, username=hadoop, 
pid=19790) running   
HiveTableFromParameterQueryTask(warehouse_path=hdfs://localhost:9000/edx-analytics-pipeline/warehouse/,
 insert_query=
            SELECT
                ce.date,
                ce.course_id,
                IF(p.gender != '', p.gender, NULL),
                SUM(ce.at_end),
                COUNT(ce.user_id)
            FROM course_enrollment ce
            LEFT OUTER JOIN auth_userprofile p ON p.user_id = ce.user_id
            GROUP BY
                ce.date,
                ce.course_id,
                IF(p.gender != '', p.gender, NULL)
        , table=course_enrollment_gender_daily, columns=(('date', 'STRING'), 
('course_id', 'STRING'), ('gender', 'STRING'), ('count', 'INT'), 
('cumulative_count', 'INT')), partition=dt=2017-02-10)
2017-02-10 06:22:07,362 INFO 19790 [luigi-interface] hive.py:338 - Creating 
parent directory 
'/edx-analytics-pipeline/warehouse/course_enrollment_gender_daily'
2017-02-10 06:22:10,269 INFO 19790 [luigi-interface] hive.py:358 - ['hive', 
'-f', '/tmp/tmpSyiRsN', '--hiveconf', 
"mapred.job.name=HiveTableFromParameterQueryTask(warehouse_path=hdfs://localhost:9000/edx-analytics-pipeline/warehouse/,
 insert_query=\n            SELECT\n                ce.date,\n                
ce.course_id,\n                IF(p.gender != '', p.gender, NULL),\n            
    SUM(ce.at_end),\n                COUNT(ce.user_id)\n            FROM 
course_enrollment ce\n            LEFT OUTER JOIN auth_userprofile p ON 
p.user_id = ce.user_id\n            GROUP BY\n                ce.date,\n        
        ce.course_id,\n                IF(p.gender != '', p.gender, NULL)\n     
   , table=course_enrollment_gender_daily, columns=(('date', 'STRING'), 
('course_id', 'STRING'), ('gender', 'STRING'), ('count', 'INT'), 
('cumulative_count', 'INT')), partition=dt=2017-02-10)"]
2017-02-10 06:22:10,270 INFO 19790 [luigi-interface] hadoop.py:242 - hive -f 
/tmp/tmpSyiRsN --hiveconf 
mapred.job.name=HiveTableFromParameterQueryTask(warehouse_path=hdfs://localhost:9000/edx-analytics-pipeline/warehouse/,
 insert_query=
            SELECT
                ce.date,
                ce.course_id,
                IF(p.gender != '', p.gender, NULL),
                SUM(ce.at_end),
                COUNT(ce.user_id)
            FROM course_enrollment ce
            LEFT OUTER JOIN auth_userprofile p ON p.user_id = ce.user_id
            GROUP BY
                ce.date,
                ce.course_id,
                IF(p.gender != '', p.gender, NULL)
        , table=course_enrollment_gender_daily, columns=(('date', 'STRING'), 
('course_id', 'STRING'), ('gender', 'STRING'), ('count', 'INT'), 
('cumulative_count', 'INT')), partition=dt=2017-02-10)
2017-02-10 06:22:12,655 INFO 19790 [luigi-interface] hadoop.py:273 - 17/02/10 
06:22:12 INFO Configuration.deprecation: mapred.input.dir.recursive is 
deprecated. Instead, use mapreduce.input.fileinputformat.input.dir.recursive
2017-02-10 06:22:12,661 INFO 19790 [luigi-interface] hadoop.py:273 - 17/02/10 
06:22:12 INFO Configuration.deprecation: mapred.max.split.size is deprecated. 
Instead, use mapreduce.input.fileinputformat.split.maxsize
2017-02-10 06:22:12,662 INFO 19790 [luigi-interface] hadoop.py:273 - 17/02/10 
06:22:12 INFO Configuration.deprecation: mapred.min.split.size is deprecated. 
Instead, use mapreduce.input.fileinputformat.split.minsize
2017-02-10 06:22:12,662 INFO 19790 [luigi-interface] hadoop.py:273 - 17/02/10 
06:22:12 INFO Configuration.deprecation: mapred.min.split.size.per.rack is 
deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.rack
2017-02-10 06:22:12,663 INFO 19790 [luigi-interface] hadoop.py:273 - 17/02/10 
06:22:12 INFO Configuration.deprecation: mapred.min.split.size.per.node is 
deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.node
2017-02-10 06:22:12,663 INFO 19790 [luigi-interface] hadoop.py:273 - 17/02/10 
06:22:12 INFO Configuration.deprecation: mapred.reduce.tasks is deprecated. 
Instead, use mapreduce.job.reduces
2017-02-10 06:22:12,664 INFO 19790 [luigi-interface] hadoop.py:273 - 17/02/10 
06:22:12 INFO Configuration.deprecation: 
mapred.reduce.tasks.speculative.execution is deprecated. Instead, use 
mapreduce.reduce.speculative
2017-02-10 06:22:13,146 INFO 19790 [luigi-interface] hadoop.py:273 - 17/02/10 
06:22:13 WARN conf.Configuration: 
org.apache.hadoop.hive.conf.LoopingByteArrayInputStream@711f39f9:an attempt to 
override final parameter: mapreduce.job.end-notification.max.retry.interval;  
Ignoring.
2017-02-10 06:22:13,163 INFO 19790 [luigi-interface] hadoop.py:273 - 17/02/10 
06:22:13 WARN conf.Configuration: 
org.apache.hadoop.hive.conf.LoopingByteArrayInputStream@711f39f9:an attempt to 
override final parameter: mapreduce.job.end-notification.max.attempts;  
Ignoring.
2017-02-10 06:22:13,167 INFO 19790 [luigi-interface] hadoop.py:273 - 17/02/10 
06:22:13 INFO Configuration.deprecation: mapred.job.name is deprecated. 
Instead, use mapreduce.job.name
2017-02-10 06:22:13,485 INFO 19790 [luigi-interface] hadoop.py:273 - Logging 
initialized using configuration in 
jar:file:/edx/app/hadoop/hive-0.11.0-bin/lib/hive-common-0.11.0.jar!/hive-log4j.properties
2017-02-10 06:22:13,499 INFO 19790 [luigi-interface] hadoop.py:273 - Hive 
history 
file=/tmp/hadoop/[email protected]_201702100622_1035106644.txt
2017-02-10 06:22:13,781 INFO 19790 [luigi-interface] hadoop.py:273 - SLF4J: 
Class path contains multiple SLF4J bindings.
2017-02-10 06:22:13,782 INFO 19790 [luigi-interface] hadoop.py:273 - SLF4J: 
Found binding in 
[jar:file:/edx/app/hadoop/hadoop-2.3.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2017-02-10 06:22:13,782 INFO 19790 [luigi-interface] hadoop.py:273 - SLF4J: 
Found binding in 
[jar:file:/edx/app/hadoop/hive-0.11.0-bin/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2017-02-10 06:22:13,783 INFO 19790 [luigi-interface] hadoop.py:273 - SLF4J: See 
http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2017-02-10 06:22:13,786 INFO 19790 [luigi-interface] hadoop.py:273 - SLF4J: 
Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2017-02-10 06:22:18,909 INFO 19790 [luigi-interface] hadoop.py:273 - OK
2017-02-10 06:22:18,926 INFO 19790 [luigi-interface] hadoop.py:273 - Time 
taken: 4.934 seconds
2017-02-10 06:22:21,479 INFO 19790 [luigi-interface] hadoop.py:273 - OK
2017-02-10 06:22:21,479 INFO 19790 [luigi-interface] hadoop.py:273 - Time 
taken: 2.552 seconds
2017-02-10 06:22:22,002 INFO 19790 [luigi-interface] hadoop.py:273 - OK
2017-02-10 06:22:22,003 INFO 19790 [luigi-interface] hadoop.py:273 - Time 
taken: 0.523 seconds
2017-02-10 06:22:22,629 INFO 19790 [luigi-interface] hadoop.py:273 - OK
2017-02-10 06:22:22,630 INFO 19790 [luigi-interface] hadoop.py:273 - Time 
taken: 0.627 seconds
2017-02-10 06:22:24,635 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.InstantiationException: org.antlr.runtime.CommonToken
2017-02-10 06:22:24,636 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:24,636 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.RuntimeException: failed to evaluate: <unbound>=Class.new();
2017-02-10 06:22:24,637 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:24,638 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.InstantiationException: org.antlr.runtime.CommonToken
2017-02-10 06:22:24,639 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:24,639 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.RuntimeException: failed to evaluate: <unbound>=Class.new();
2017-02-10 06:22:24,639 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:24,641 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.InstantiationException: org.antlr.runtime.CommonToken
2017-02-10 06:22:24,641 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:24,641 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.RuntimeException: failed to evaluate: <unbound>=Class.new();
2017-02-10 06:22:24,642 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:24,647 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.InstantiationException: org.antlr.runtime.CommonToken
2017-02-10 06:22:24,647 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:24,648 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.RuntimeException: failed to evaluate: <unbound>=Class.new();
2017-02-10 06:22:24,648 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:24,650 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.InstantiationException: org.antlr.runtime.CommonToken
2017-02-10 06:22:24,650 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:24,651 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.RuntimeException: failed to evaluate: <unbound>=Class.new();
2017-02-10 06:22:24,651 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:24,664 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.InstantiationException: org.antlr.runtime.CommonToken
2017-02-10 06:22:24,665 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:24,665 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.RuntimeException: failed to evaluate: <unbound>=Class.new();
2017-02-10 06:22:24,665 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:25,134 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.InstantiationException: org.antlr.runtime.CommonToken
2017-02-10 06:22:25,135 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:25,135 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.RuntimeException: failed to evaluate: <unbound>=Class.new();
2017-02-10 06:22:25,136 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:25,137 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.InstantiationException: org.antlr.runtime.CommonToken
2017-02-10 06:22:25,137 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:25,138 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.RuntimeException: failed to evaluate: <unbound>=Class.new();
2017-02-10 06:22:25,138 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:25,140 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.InstantiationException: org.antlr.runtime.CommonToken
2017-02-10 06:22:25,140 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:25,140 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.RuntimeException: failed to evaluate: <unbound>=Class.new();
2017-02-10 06:22:25,141 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:25,147 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.InstantiationException: org.antlr.runtime.CommonToken
2017-02-10 06:22:25,147 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:25,147 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.RuntimeException: failed to evaluate: <unbound>=Class.new();
2017-02-10 06:22:25,148 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:25,149 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.InstantiationException: org.antlr.runtime.CommonToken
2017-02-10 06:22:25,150 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:25,150 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.RuntimeException: failed to evaluate: <unbound>=Class.new();
2017-02-10 06:22:25,150 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:25,152 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.InstantiationException: org.antlr.runtime.CommonToken
2017-02-10 06:22:25,152 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:25,152 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.RuntimeException: failed to evaluate: <unbound>=Class.new();
2017-02-10 06:22:25,153 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:25,160 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.InstantiationException: org.antlr.runtime.CommonToken
2017-02-10 06:22:25,160 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:25,161 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.RuntimeException: failed to evaluate: <unbound>=Class.new();
2017-02-10 06:22:25,161 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:25,162 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.InstantiationException: org.antlr.runtime.CommonToken
2017-02-10 06:22:25,163 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:25,163 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.RuntimeException: failed to evaluate: <unbound>=Class.new();
2017-02-10 06:22:25,163 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:25,167 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.InstantiationException: org.antlr.runtime.CommonToken
2017-02-10 06:22:25,167 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:25,168 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.RuntimeException: failed to evaluate: <unbound>=Class.new();
2017-02-10 06:22:25,168 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:25,171 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.InstantiationException: org.antlr.runtime.CommonToken
2017-02-10 06:22:25,171 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:25,172 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.lang.RuntimeException: failed to evaluate: <unbound>=Class.new();
2017-02-10 06:22:25,172 INFO 19790 [luigi-interface] hadoop.py:273 - Continuing 
...
2017-02-10 06:22:26,073 INFO 19790 [luigi-interface] hadoop.py:273 - Total 
MapReduce jobs = 3
2017-02-10 06:22:26,084 INFO 19790 [luigi-interface] hadoop.py:273 - Stage-7 is 
filtered out by condition resolver.
2017-02-10 06:22:26,085 INFO 19790 [luigi-interface] hadoop.py:273 - Stage-1 is 
selected by condition resolver.
2017-02-10 06:22:26,088 INFO 19790 [luigi-interface] hadoop.py:273 - Launching 
Job 1 out of 3
2017-02-10 06:22:26,105 INFO 19790 [luigi-interface] hadoop.py:273 - Number of 
reduce tasks not specified. Estimated from input data size: 1
2017-02-10 06:22:26,106 INFO 19790 [luigi-interface] hadoop.py:273 - In order 
to change the average load for a reducer (in bytes):
2017-02-10 06:22:26,107 INFO 19790 [luigi-interface] hadoop.py:273 - set 
hive.exec.reducers.bytes.per.reducer=<number>
2017-02-10 06:22:26,107 INFO 19790 [luigi-interface] hadoop.py:273 - In order 
to limit the maximum number of reducers:
2017-02-10 06:22:26,108 INFO 19790 [luigi-interface] hadoop.py:273 - set 
hive.exec.reducers.max=<number>
2017-02-10 06:22:26,109 INFO 19790 [luigi-interface] hadoop.py:273 - In order 
to set a constant number of reducers:
2017-02-10 06:22:26,109 INFO 19790 [luigi-interface] hadoop.py:273 - set 
mapred.reduce.tasks=<number>
2017-02-10 06:22:29,213 INFO 19790 [luigi-interface] hadoop.py:273 - Starting 
Job = job_1486680388499_0023, Tracking URL = 
http://ip-172-31-62-193:8088/proxy/application_1486680388499_0023/
2017-02-10 06:22:29,214 INFO 19790 [luigi-interface] hadoop.py:273 - Kill 
Command = /edx/app/hadoop/hadoop-2.3.0/bin/hadoop job  -kill 
job_1486680388499_0023
2017-02-10 06:22:38,197 INFO 19790 [luigi-interface] hadoop.py:273 - Hadoop job 
information for Stage-1: number of mappers: 2; number of reducers: 1
2017-02-10 06:22:38,389 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:22:38,375 Stage-1 map = 0%,  reduce = 0%
2017-02-10 06:22:49,641 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:22:49,628 Stage-1 map = 100%,  reduce = 0%, Cumulative CPU 3.06 sec
2017-02-10 06:22:50,710 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:22:50,701 Stage-1 map = 100%,  reduce = 0%, Cumulative CPU 3.06 sec
2017-02-10 06:22:51,778 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:22:51,767 Stage-1 map = 100%,  reduce = 0%, Cumulative CPU 3.06 sec
2017-02-10 06:22:52,912 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:22:52,896 Stage-1 map = 100%,  reduce = 0%, Cumulative CPU 3.06 sec
2017-02-10 06:22:54,045 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:22:54,040 Stage-1 map = 100%,  reduce = 0%, Cumulative CPU 3.06 sec
2017-02-10 06:22:55,174 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:22:55,170 Stage-1 map = 100%,  reduce = 0%, Cumulative CPU 3.06 sec
2017-02-10 06:22:56,253 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:22:56,243 Stage-1 map = 100%,  reduce = 0%, Cumulative CPU 3.06 sec
2017-02-10 06:22:57,346 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:22:57,341 Stage-1 map = 100%,  reduce = 0%, Cumulative CPU 3.06 sec
2017-02-10 06:22:58,418 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:22:58,414 Stage-1 map = 100%,  reduce = 0%, Cumulative CPU 3.06 sec
2017-02-10 06:22:59,546 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:22:59,539 Stage-1 map = 100%,  reduce = 100%, Cumulative CPU 5.16 sec
2017-02-10 06:23:00,616 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:23:00,605 Stage-1 map = 100%,  reduce = 100%, Cumulative CPU 5.16 sec
2017-02-10 06:23:01,720 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:23:01,715 Stage-1 map = 100%,  reduce = 100%, Cumulative CPU 5.16 sec
2017-02-10 06:23:01,722 INFO 19790 [luigi-interface] hadoop.py:273 - MapReduce 
Total cumulative CPU time: 5 seconds 160 msec
2017-02-10 06:23:01,821 INFO 19790 [luigi-interface] hadoop.py:273 - Ended Job 
= job_1486680388499_0023
2017-02-10 06:23:01,921 INFO 19790 [luigi-interface] hadoop.py:273 - Launching 
Job 2 out of 3
2017-02-10 06:23:01,948 INFO 19790 [luigi-interface] hadoop.py:273 - Number of 
reduce tasks not specified. Estimated from input data size: 1
2017-02-10 06:23:01,955 INFO 19790 [luigi-interface] hadoop.py:273 - In order 
to change the average load for a reducer (in bytes):
2017-02-10 06:23:01,955 INFO 19790 [luigi-interface] hadoop.py:273 - set 
hive.exec.reducers.bytes.per.reducer=<number>
2017-02-10 06:23:01,956 INFO 19790 [luigi-interface] hadoop.py:273 - In order 
to limit the maximum number of reducers:
2017-02-10 06:23:01,956 INFO 19790 [luigi-interface] hadoop.py:273 - set 
hive.exec.reducers.max=<number>
2017-02-10 06:23:01,957 INFO 19790 [luigi-interface] hadoop.py:273 - In order 
to set a constant number of reducers:
2017-02-10 06:23:01,957 INFO 19790 [luigi-interface] hadoop.py:273 - set 
mapred.reduce.tasks=<number>
2017-02-10 06:23:03,217 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:23:03.211 GMT Thread[main,5,main] java.io.FileNotFoundException: derby.log 
(Permission denied)
2017-02-10 06:23:03,638 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:23:03.609 GMT Thread[main,5,main] Cleanup action starting
2017-02-10 06:23:03,638 INFO 19790 [luigi-interface] hadoop.py:273 - ERROR 
XBM0H: Directory /var/lib/analytics-tasks/analyticstack/repo/TempStatsStore 
cannot be created.
2017-02-10 06:23:03,638 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
2017-02-10 06:23:03,639 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown 
Source)
2017-02-10 06:23:03,639 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.security.AccessController.doPrivileged(Native Method)
2017-02-10 06:23:03,639 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown
 Source)
2017-02-10 06:23:03,640 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
2017-02-10 06:23:03,640 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown
 Source)
2017-02-10 06:23:03,640 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown 
Source)
2017-02-10 06:23:03,641 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
2017-02-10 06:23:03,641 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
2017-02-10 06:23:03,641 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
2017-02-10 06:23:03,641 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
2017-02-10 06:23:03,642 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
2017-02-10 06:23:03,642 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
2017-02-10 06:23:03,642 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
2017-02-10 06:23:03,643 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:664)
2017-02-10 06:23:03,643 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:270)
2017-02-10 06:23:03,643 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.stats.jdbc.JDBCStatsPublisher.init(JDBCStatsPublisher.java:265)
2017-02-10 06:23:03,644 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:435)
2017-02-10 06:23:03,644 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:138)
2017-02-10 06:23:03,653 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:144)
2017-02-10 06:23:03,654 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
2017-02-10 06:23:03,654 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1355)
2017-02-10 06:23:03,654 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1139)
2017-02-10 06:23:03,655 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.run(Driver.java:945)
2017-02-10 06:23:03,655 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
2017-02-10 06:23:03,657 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
2017-02-10 06:23:03,657 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
2017-02-10 06:23:03,658 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:348)
2017-02-10 06:23:03,658 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:446)
2017-02-10 06:23:03,658 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:456)
2017-02-10 06:23:03,659 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:712)
2017-02-10 06:23:03,659 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
2017-02-10 06:23:03,659 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2017-02-10 06:23:03,660 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2017-02-10 06:23:03,662 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2017-02-10 06:23:03,669 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.lang.reflect.Method.invoke(Method.java:497)
2017-02-10 06:23:03,669 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.util.RunJar.main(RunJar.java:212)
2017-02-10 06:23:03,670 INFO 19790 [luigi-interface] hadoop.py:273 - Cleanup 
action completed
2017-02-10 06:23:03,685 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:23:03.661 GMT Thread[main,5,main] Cleanup action starting
2017-02-10 06:23:03,686 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.sql.SQLException: Failed to create database 'TempStatsStore', see the next 
exception for details.
2017-02-10 06:23:03,686 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
2017-02-10 06:23:03,686 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
2017-02-10 06:23:03,687 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
2017-02-10 06:23:03,687 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
2017-02-10 06:23:03,687 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
2017-02-10 06:23:03,688 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
2017-02-10 06:23:03,692 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
2017-02-10 06:23:03,697 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
2017-02-10 06:23:03,698 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
2017-02-10 06:23:03,698 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
2017-02-10 06:23:03,698 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:664)
2017-02-10 06:23:03,699 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:270)
2017-02-10 06:23:03,699 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.stats.jdbc.JDBCStatsPublisher.init(JDBCStatsPublisher.java:265)
2017-02-10 06:23:03,699 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:435)
2017-02-10 06:23:03,699 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:138)
2017-02-10 06:23:03,701 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:144)
2017-02-10 06:23:03,702 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
2017-02-10 06:23:03,714 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1355)
2017-02-10 06:23:03,714 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1139)
2017-02-10 06:23:03,714 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.run(Driver.java:945)
2017-02-10 06:23:03,715 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
2017-02-10 06:23:03,715 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
2017-02-10 06:23:03,715 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
2017-02-10 06:23:03,716 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:348)
2017-02-10 06:23:03,716 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:446)
2017-02-10 06:23:03,716 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:456)
2017-02-10 06:23:03,716 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:712)
2017-02-10 06:23:03,717 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
2017-02-10 06:23:03,717 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2017-02-10 06:23:03,717 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2017-02-10 06:23:03,718 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2017-02-10 06:23:03,733 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.lang.reflect.Method.invoke(Method.java:497)
2017-02-10 06:23:03,734 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.util.RunJar.main(RunJar.java:212)
2017-02-10 06:23:03,734 INFO 19790 [luigi-interface] hadoop.py:273 - Caused by: 
java.sql.SQLException: Failed to create database 'TempStatsStore', see the next 
exception for details.
2017-02-10 06:23:03,735 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
2017-02-10 06:23:03,735 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown
 Source)
2017-02-10 06:23:03,735 INFO 19790 [luigi-interface] hadoop.py:273 - ... 33 more
2017-02-10 06:23:03,736 INFO 19790 [luigi-interface] hadoop.py:273 - Caused by: 
java.sql.SQLException: Directory 
/var/lib/analytics-tasks/analyticstack/repo/TempStatsStore cannot be created.
2017-02-10 06:23:03,736 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
2017-02-10 06:23:03,737 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown
 Source)
2017-02-10 06:23:03,740 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
2017-02-10 06:23:03,740 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
2017-02-10 06:23:03,741 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown 
Source)
2017-02-10 06:23:03,750 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown 
Source)
2017-02-10 06:23:03,769 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
2017-02-10 06:23:03,770 INFO 19790 [luigi-interface] hadoop.py:273 - ... 30 more
2017-02-10 06:23:03,770 INFO 19790 [luigi-interface] hadoop.py:273 - Caused by: 
ERROR XBM0H: Directory 
/var/lib/analytics-tasks/analyticstack/repo/TempStatsStore cannot be created.
2017-02-10 06:23:03,771 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
2017-02-10 06:23:03,771 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown 
Source)
2017-02-10 06:23:03,771 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.security.AccessController.doPrivileged(Native Method)
2017-02-10 06:23:03,772 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown
 Source)
2017-02-10 06:23:03,772 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
2017-02-10 06:23:03,772 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown
 Source)
2017-02-10 06:23:03,773 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown 
Source)
2017-02-10 06:23:03,775 INFO 19790 [luigi-interface] hadoop.py:273 - ... 30 more
2017-02-10 06:23:03,781 INFO 19790 [luigi-interface] hadoop.py:273 - 
============= begin nested exception, level (1) ===========
2017-02-10 06:23:03,782 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.sql.SQLException: Directory 
/var/lib/analytics-tasks/analyticstack/repo/TempStatsStore cannot be created.
2017-02-10 06:23:03,782 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
2017-02-10 06:23:03,782 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
2017-02-10 06:23:03,783 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown 
Source)
2017-02-10 06:23:03,783 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown 
Source)
2017-02-10 06:23:03,783 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
2017-02-10 06:23:03,784 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
2017-02-10 06:23:03,784 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
2017-02-10 06:23:03,784 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
2017-02-10 06:23:03,801 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
2017-02-10 06:23:03,802 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
2017-02-10 06:23:03,802 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
2017-02-10 06:23:03,802 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
2017-02-10 06:23:03,803 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:664)
2017-02-10 06:23:03,803 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:270)
2017-02-10 06:23:03,803 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.stats.jdbc.JDBCStatsPublisher.init(JDBCStatsPublisher.java:265)
2017-02-10 06:23:03,804 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:435)
2017-02-10 06:23:03,804 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:138)
2017-02-10 06:23:03,804 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:144)
2017-02-10 06:23:03,804 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
2017-02-10 06:23:03,805 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1355)
2017-02-10 06:23:03,805 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1139)
2017-02-10 06:23:03,805 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.run(Driver.java:945)
2017-02-10 06:23:03,806 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
2017-02-10 06:23:03,806 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
2017-02-10 06:23:03,806 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
2017-02-10 06:23:03,807 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:348)
2017-02-10 06:23:03,807 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:446)
2017-02-10 06:23:03,807 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:456)
2017-02-10 06:23:03,808 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:712)
2017-02-10 06:23:03,808 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
2017-02-10 06:23:03,819 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2017-02-10 06:23:03,819 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2017-02-10 06:23:03,820 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2017-02-10 06:23:03,820 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.lang.reflect.Method.invoke(Method.java:497)
2017-02-10 06:23:03,820 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.util.RunJar.main(RunJar.java:212)
2017-02-10 06:23:03,821 INFO 19790 [luigi-interface] hadoop.py:273 - Caused by: 
java.sql.SQLException: Directory 
/var/lib/analytics-tasks/analyticstack/repo/TempStatsStore cannot be created.
2017-02-10 06:23:03,823 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
2017-02-10 06:23:03,824 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown
 Source)
2017-02-10 06:23:03,824 INFO 19790 [luigi-interface] hadoop.py:273 - ... 35 more
2017-02-10 06:23:03,824 INFO 19790 [luigi-interface] hadoop.py:273 - Caused by: 
ERROR XBM0H: Directory 
/var/lib/analytics-tasks/analyticstack/repo/TempStatsStore cannot be created.
2017-02-10 06:23:03,833 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
2017-02-10 06:23:03,834 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown 
Source)
2017-02-10 06:23:03,834 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.security.AccessController.doPrivileged(Native Method)
2017-02-10 06:23:03,834 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown
 Source)
2017-02-10 06:23:03,837 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
2017-02-10 06:23:03,837 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown
 Source)
2017-02-10 06:23:03,838 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown 
Source)
2017-02-10 06:23:03,838 INFO 19790 [luigi-interface] hadoop.py:273 - ... 30 more
2017-02-10 06:23:03,838 INFO 19790 [luigi-interface] hadoop.py:273 - 
============= end nested exception, level (1) ===========
2017-02-10 06:23:03,839 INFO 19790 [luigi-interface] hadoop.py:273 - 
============= begin nested exception, level (2) ===========
2017-02-10 06:23:03,839 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.sql.SQLException: Directory 
/var/lib/analytics-tasks/analyticstack/repo/TempStatsStore cannot be created.
2017-02-10 06:23:03,840 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
2017-02-10 06:23:03,849 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown
 Source)
2017-02-10 06:23:03,863 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
2017-02-10 06:23:03,873 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
2017-02-10 06:23:03,874 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown 
Source)
2017-02-10 06:23:03,874 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown 
Source)
2017-02-10 06:23:03,874 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
2017-02-10 06:23:03,875 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
2017-02-10 06:23:03,875 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
2017-02-10 06:23:03,875 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
2017-02-10 06:23:03,875 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
2017-02-10 06:23:03,876 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
2017-02-10 06:23:03,876 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
2017-02-10 06:23:03,876 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
2017-02-10 06:23:03,877 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:664)
2017-02-10 06:23:03,877 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:270)
2017-02-10 06:23:03,877 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.stats.jdbc.JDBCStatsPublisher.init(JDBCStatsPublisher.java:265)
2017-02-10 06:23:03,878 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:435)
2017-02-10 06:23:03,878 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:138)
2017-02-10 06:23:03,878 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:144)
2017-02-10 06:23:03,879 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
2017-02-10 06:23:03,879 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1355)
2017-02-10 06:23:03,879 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1139)
2017-02-10 06:23:03,880 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.run(Driver.java:945)
2017-02-10 06:23:03,880 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
2017-02-10 06:23:03,880 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
2017-02-10 06:23:03,886 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
2017-02-10 06:23:03,887 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:348)
2017-02-10 06:23:03,887 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:446)
2017-02-10 06:23:03,888 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:456)
2017-02-10 06:23:03,888 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:712)
2017-02-10 06:23:03,888 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
2017-02-10 06:23:03,889 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2017-02-10 06:23:03,889 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2017-02-10 06:23:03,896 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2017-02-10 06:23:03,896 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.lang.reflect.Method.invoke(Method.java:497)
2017-02-10 06:23:03,896 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.util.RunJar.main(RunJar.java:212)
2017-02-10 06:23:03,897 INFO 19790 [luigi-interface] hadoop.py:273 - Caused by: 
ERROR XBM0H: Directory 
/var/lib/analytics-tasks/analyticstack/repo/TempStatsStore cannot be created.
2017-02-10 06:23:03,897 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
2017-02-10 06:23:03,897 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown 
Source)
2017-02-10 06:23:03,911 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.security.AccessController.doPrivileged(Native Method)
2017-02-10 06:23:03,911 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown
 Source)
2017-02-10 06:23:03,912 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
2017-02-10 06:23:03,912 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown
 Source)
2017-02-10 06:23:03,912 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown 
Source)
2017-02-10 06:23:03,913 INFO 19790 [luigi-interface] hadoop.py:273 - ... 30 more
2017-02-10 06:23:03,913 INFO 19790 [luigi-interface] hadoop.py:273 - 
============= end nested exception, level (2) ===========
2017-02-10 06:23:03,919 INFO 19790 [luigi-interface] hadoop.py:273 - 
============= begin nested exception, level (3) ===========
2017-02-10 06:23:03,919 INFO 19790 [luigi-interface] hadoop.py:273 - ERROR 
XBM0H: Directory /var/lib/analytics-tasks/analyticstack/repo/TempStatsStore 
cannot be created.
2017-02-10 06:23:03,920 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
2017-02-10 06:23:03,920 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown 
Source)
2017-02-10 06:23:03,922 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.security.AccessController.doPrivileged(Native Method)
2017-02-10 06:23:03,922 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown
 Source)
2017-02-10 06:23:03,927 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
2017-02-10 06:23:03,929 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown
 Source)
2017-02-10 06:23:03,929 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown 
Source)
2017-02-10 06:23:03,929 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
2017-02-10 06:23:03,934 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
2017-02-10 06:23:03,934 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
2017-02-10 06:23:03,934 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
2017-02-10 06:23:03,935 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
2017-02-10 06:23:03,935 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
2017-02-10 06:23:03,935 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
2017-02-10 06:23:03,935 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:664)
2017-02-10 06:23:03,936 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:270)
2017-02-10 06:23:03,960 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.stats.jdbc.JDBCStatsPublisher.init(JDBCStatsPublisher.java:265)
2017-02-10 06:23:03,960 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:435)
2017-02-10 06:23:03,961 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:138)
2017-02-10 06:23:03,961 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:144)
2017-02-10 06:23:03,962 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
2017-02-10 06:23:03,962 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1355)
2017-02-10 06:23:03,962 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1139)
2017-02-10 06:23:03,962 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.run(Driver.java:945)
2017-02-10 06:23:03,969 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
2017-02-10 06:23:03,970 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
2017-02-10 06:23:03,970 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
2017-02-10 06:23:03,970 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:348)
2017-02-10 06:23:03,973 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:446)
2017-02-10 06:23:03,974 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:456)
2017-02-10 06:23:03,981 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:712)
2017-02-10 06:23:03,982 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
2017-02-10 06:23:03,982 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2017-02-10 06:23:03,982 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2017-02-10 06:23:03,983 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2017-02-10 06:23:03,983 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.lang.reflect.Method.invoke(Method.java:497)
2017-02-10 06:23:03,983 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.util.RunJar.main(RunJar.java:212)
2017-02-10 06:23:03,984 INFO 19790 [luigi-interface] hadoop.py:273 - 
============= end nested exception, level (3) ===========
2017-02-10 06:23:03,987 INFO 19790 [luigi-interface] hadoop.py:273 - Cleanup 
action completed
2017-02-10 06:23:05,599 INFO 19790 [luigi-interface] hadoop.py:273 - Starting 
Job = job_1486680388499_0024, Tracking URL = 
http://ip-172-31-62-193:8088/proxy/application_1486680388499_0024/
2017-02-10 06:23:05,599 INFO 19790 [luigi-interface] hadoop.py:273 - Kill 
Command = /edx/app/hadoop/hadoop-2.3.0/bin/hadoop job  -kill 
job_1486680388499_0024
2017-02-10 06:23:15,415 INFO 19790 [luigi-interface] hadoop.py:273 - Hadoop job 
information for Stage-2: number of mappers: 1; number of reducers: 1
2017-02-10 06:23:15,557 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:23:15,553 Stage-2 map = 0%,  reduce = 0%
2017-02-10 06:23:24,004 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:23:24,002 Stage-2 map = 100%,  reduce = 0%, Cumulative CPU 1.34 sec
2017-02-10 06:23:25,057 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:23:25,039 Stage-2 map = 100%,  reduce = 0%, Cumulative CPU 1.34 sec
2017-02-10 06:23:26,121 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:23:26,108 Stage-2 map = 100%,  reduce = 0%, Cumulative CPU 1.34 sec
2017-02-10 06:23:27,211 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:23:27,210 Stage-2 map = 100%,  reduce = 0%, Cumulative CPU 1.34 sec
2017-02-10 06:23:28,282 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:23:28,279 Stage-2 map = 100%,  reduce = 0%, Cumulative CPU 1.34 sec
2017-02-10 06:23:29,347 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:23:29,345 Stage-2 map = 100%,  reduce = 0%, Cumulative CPU 1.34 sec
2017-02-10 06:23:30,389 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:23:30,388 Stage-2 map = 100%,  reduce = 0%, Cumulative CPU 1.34 sec
2017-02-10 06:23:31,476 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:23:31,474 Stage-2 map = 100%,  reduce = 0%, Cumulative CPU 1.34 sec
2017-02-10 06:23:32,611 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:23:32,610 Stage-2 map = 100%,  reduce = 100%, Cumulative CPU 3.16 sec
2017-02-10 06:23:33,652 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:23:33,641 Stage-2 map = 100%,  reduce = 100%, Cumulative CPU 3.16 sec
2017-02-10 06:23:33,654 INFO 19790 [luigi-interface] hadoop.py:273 - MapReduce 
Total cumulative CPU time: 3 seconds 160 msec
2017-02-10 06:23:33,703 INFO 19790 [luigi-interface] hadoop.py:273 - Ended Job 
= job_1486680388499_0024
2017-02-10 06:23:33,717 INFO 19790 [luigi-interface] hadoop.py:273 - Loading 
data to table default.course_enrollment_gender_daily partition (dt=2017-02-10)
2017-02-10 06:23:34,170 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:23:34.170 GMT Thread[main,5,main] Cleanup action starting
2017-02-10 06:23:34,170 INFO 19790 [luigi-interface] hadoop.py:273 - ERROR 
XBM0H: Directory /var/lib/analytics-tasks/analyticstack/repo/TempStatsStore 
cannot be created.
2017-02-10 06:23:34,171 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
2017-02-10 06:23:34,171 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown 
Source)
2017-02-10 06:23:34,171 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.security.AccessController.doPrivileged(Native Method)
2017-02-10 06:23:34,171 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown
 Source)
2017-02-10 06:23:34,172 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
2017-02-10 06:23:34,172 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown
 Source)
2017-02-10 06:23:34,172 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown 
Source)
2017-02-10 06:23:34,172 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
2017-02-10 06:23:34,172 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
2017-02-10 06:23:34,173 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
2017-02-10 06:23:34,173 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
2017-02-10 06:23:34,173 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
2017-02-10 06:23:34,173 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
2017-02-10 06:23:34,173 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
2017-02-10 06:23:34,174 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:664)
2017-02-10 06:23:34,174 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:270)
2017-02-10 06:23:34,174 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.Utilities.connectWithRetry(Utilities.java:2295)
2017-02-10 06:23:34,174 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.stats.jdbc.JDBCStatsAggregator.connect(JDBCStatsAggregator.java:85)
2017-02-10 06:23:34,174 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.StatsTask.aggregateStats(StatsTask.java:298)
2017-02-10 06:23:34,175 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.StatsTask.execute(StatsTask.java:252)
2017-02-10 06:23:34,175 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:144)
2017-02-10 06:23:34,175 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
2017-02-10 06:23:34,175 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1355)
2017-02-10 06:23:34,175 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1139)
2017-02-10 06:23:34,176 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.run(Driver.java:945)
2017-02-10 06:23:34,176 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
2017-02-10 06:23:34,176 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
2017-02-10 06:23:34,176 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
2017-02-10 06:23:34,176 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:348)
2017-02-10 06:23:34,177 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:446)
2017-02-10 06:23:34,179 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:456)
2017-02-10 06:23:34,179 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:712)
2017-02-10 06:23:34,179 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
2017-02-10 06:23:34,179 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2017-02-10 06:23:34,180 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2017-02-10 06:23:34,182 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2017-02-10 06:23:34,182 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.lang.reflect.Method.invoke(Method.java:497)
2017-02-10 06:23:34,183 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.util.RunJar.main(RunJar.java:212)
2017-02-10 06:23:34,183 INFO 19790 [luigi-interface] hadoop.py:273 - Cleanup 
action completed
2017-02-10 06:23:34,183 INFO 19790 [luigi-interface] hadoop.py:273 - 2017-02-10 
06:23:34.178 GMT Thread[main,5,main] Cleanup action starting
2017-02-10 06:23:34,183 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.sql.SQLException: Failed to create database 'TempStatsStore', see the next 
exception for details.
2017-02-10 06:23:34,183 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
2017-02-10 06:23:34,184 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
2017-02-10 06:23:34,184 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
2017-02-10 06:23:34,184 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
2017-02-10 06:23:34,184 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
2017-02-10 06:23:34,184 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
2017-02-10 06:23:34,185 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
2017-02-10 06:23:34,185 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
2017-02-10 06:23:34,186 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
2017-02-10 06:23:34,186 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
2017-02-10 06:23:34,187 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:664)
2017-02-10 06:23:34,187 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:270)
2017-02-10 06:23:34,187 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.Utilities.connectWithRetry(Utilities.java:2295)
2017-02-10 06:23:34,187 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.stats.jdbc.JDBCStatsAggregator.connect(JDBCStatsAggregator.java:85)
2017-02-10 06:23:34,187 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.StatsTask.aggregateStats(StatsTask.java:298)
2017-02-10 06:23:34,193 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.StatsTask.execute(StatsTask.java:252)
2017-02-10 06:23:34,194 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:144)
2017-02-10 06:23:34,194 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
2017-02-10 06:23:34,194 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1355)
2017-02-10 06:23:34,194 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1139)
2017-02-10 06:23:34,194 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.run(Driver.java:945)
2017-02-10 06:23:34,195 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
2017-02-10 06:23:34,195 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
2017-02-10 06:23:34,195 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
2017-02-10 06:23:34,195 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:348)
2017-02-10 06:23:34,196 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:446)
2017-02-10 06:23:34,196 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:456)
2017-02-10 06:23:34,196 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:712)
2017-02-10 06:23:34,197 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
2017-02-10 06:23:34,197 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2017-02-10 06:23:34,197 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2017-02-10 06:23:34,197 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2017-02-10 06:23:34,197 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.lang.reflect.Method.invoke(Method.java:497)
2017-02-10 06:23:34,198 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.util.RunJar.main(RunJar.java:212)
2017-02-10 06:23:34,198 INFO 19790 [luigi-interface] hadoop.py:273 - Caused by: 
java.sql.SQLException: Failed to create database 'TempStatsStore', see the next 
exception for details.
2017-02-10 06:23:34,198 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
2017-02-10 06:23:34,198 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown
 Source)
2017-02-10 06:23:34,198 INFO 19790 [luigi-interface] hadoop.py:273 - ... 34 more
2017-02-10 06:23:34,201 INFO 19790 [luigi-interface] hadoop.py:273 - Caused by: 
java.sql.SQLException: Directory 
/var/lib/analytics-tasks/analyticstack/repo/TempStatsStore cannot be created.
2017-02-10 06:23:34,201 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
2017-02-10 06:23:34,201 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown
 Source)
2017-02-10 06:23:34,201 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
2017-02-10 06:23:34,202 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
2017-02-10 06:23:34,202 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown 
Source)
2017-02-10 06:23:34,209 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown 
Source)
2017-02-10 06:23:34,210 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
2017-02-10 06:23:34,210 INFO 19790 [luigi-interface] hadoop.py:273 - ... 31 more
2017-02-10 06:23:34,210 INFO 19790 [luigi-interface] hadoop.py:273 - Caused by: 
ERROR XBM0H: Directory 
/var/lib/analytics-tasks/analyticstack/repo/TempStatsStore cannot be created.
2017-02-10 06:23:34,211 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
2017-02-10 06:23:34,211 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown 
Source)
2017-02-10 06:23:34,212 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.security.AccessController.doPrivileged(Native Method)
2017-02-10 06:23:34,213 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown
 Source)
2017-02-10 06:23:34,213 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
2017-02-10 06:23:34,214 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown
 Source)
2017-02-10 06:23:34,214 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown 
Source)
2017-02-10 06:23:34,214 INFO 19790 [luigi-interface] hadoop.py:273 - ... 31 more
2017-02-10 06:23:34,214 INFO 19790 [luigi-interface] hadoop.py:273 - 
============= begin nested exception, level (1) ===========
2017-02-10 06:23:34,214 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.sql.SQLException: Directory 
/var/lib/analytics-tasks/analyticstack/repo/TempStatsStore cannot be created.
2017-02-10 06:23:34,215 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
2017-02-10 06:23:34,215 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
2017-02-10 06:23:34,215 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown 
Source)
2017-02-10 06:23:34,215 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown 
Source)
2017-02-10 06:23:34,215 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
2017-02-10 06:23:34,216 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
2017-02-10 06:23:34,216 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
2017-02-10 06:23:34,216 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
2017-02-10 06:23:34,216 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
2017-02-10 06:23:34,216 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
2017-02-10 06:23:34,217 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
2017-02-10 06:23:34,217 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
2017-02-10 06:23:34,217 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:664)
2017-02-10 06:23:34,217 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:270)
2017-02-10 06:23:34,217 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.Utilities.connectWithRetry(Utilities.java:2295)
2017-02-10 06:23:34,218 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.stats.jdbc.JDBCStatsAggregator.connect(JDBCStatsAggregator.java:85)
2017-02-10 06:23:34,218 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.StatsTask.aggregateStats(StatsTask.java:298)
2017-02-10 06:23:34,218 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.StatsTask.execute(StatsTask.java:252)
2017-02-10 06:23:34,218 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:144)
2017-02-10 06:23:34,218 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
2017-02-10 06:23:34,219 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1355)
2017-02-10 06:23:34,219 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1139)
2017-02-10 06:23:34,219 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.run(Driver.java:945)
2017-02-10 06:23:34,219 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
2017-02-10 06:23:34,220 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
2017-02-10 06:23:34,220 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
2017-02-10 06:23:34,220 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:348)
2017-02-10 06:23:34,220 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:446)
2017-02-10 06:23:34,221 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:456)
2017-02-10 06:23:34,221 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:712)
2017-02-10 06:23:34,221 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
2017-02-10 06:23:34,221 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2017-02-10 06:23:34,221 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2017-02-10 06:23:34,222 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2017-02-10 06:23:34,222 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.lang.reflect.Method.invoke(Method.java:497)
2017-02-10 06:23:34,222 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.util.RunJar.main(RunJar.java:212)
2017-02-10 06:23:34,222 INFO 19790 [luigi-interface] hadoop.py:273 - Caused by: 
java.sql.SQLException: Directory 
/var/lib/analytics-tasks/analyticstack/repo/TempStatsStore cannot be created.
2017-02-10 06:23:34,222 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
2017-02-10 06:23:34,223 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown
 Source)
2017-02-10 06:23:34,223 INFO 19790 [luigi-interface] hadoop.py:273 - ... 36 more
2017-02-10 06:23:34,223 INFO 19790 [luigi-interface] hadoop.py:273 - Caused by: 
ERROR XBM0H: Directory 
/var/lib/analytics-tasks/analyticstack/repo/TempStatsStore cannot be created.
2017-02-10 06:23:34,223 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
2017-02-10 06:23:34,223 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown 
Source)
2017-02-10 06:23:34,224 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.security.AccessController.doPrivileged(Native Method)
2017-02-10 06:23:34,224 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown
 Source)
2017-02-10 06:23:34,224 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
2017-02-10 06:23:34,224 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown
 Source)
2017-02-10 06:23:34,225 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown 
Source)
2017-02-10 06:23:34,225 INFO 19790 [luigi-interface] hadoop.py:273 - ... 31 more
2017-02-10 06:23:34,225 INFO 19790 [luigi-interface] hadoop.py:273 - 
============= end nested exception, level (1) ===========
2017-02-10 06:23:34,225 INFO 19790 [luigi-interface] hadoop.py:273 - 
============= begin nested exception, level (2) ===========
2017-02-10 06:23:34,225 INFO 19790 [luigi-interface] hadoop.py:273 - 
java.sql.SQLException: Directory 
/var/lib/analytics-tasks/analyticstack/repo/TempStatsStore cannot be created.
2017-02-10 06:23:34,226 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
2017-02-10 06:23:34,226 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown
 Source)
2017-02-10 06:23:34,226 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
2017-02-10 06:23:34,226 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
2017-02-10 06:23:34,226 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown 
Source)
2017-02-10 06:23:34,227 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown 
Source)
2017-02-10 06:23:34,227 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
2017-02-10 06:23:34,227 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
2017-02-10 06:23:34,227 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
2017-02-10 06:23:34,227 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
2017-02-10 06:23:34,228 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
2017-02-10 06:23:34,228 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
2017-02-10 06:23:34,228 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
2017-02-10 06:23:34,228 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
2017-02-10 06:23:34,228 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:664)
2017-02-10 06:23:34,228 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:270)
2017-02-10 06:23:34,229 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.Utilities.connectWithRetry(Utilities.java:2295)
2017-02-10 06:23:34,229 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.stats.jdbc.JDBCStatsAggregator.connect(JDBCStatsAggregator.java:85)
2017-02-10 06:23:34,229 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.StatsTask.aggregateStats(StatsTask.java:298)
2017-02-10 06:23:34,237 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.StatsTask.execute(StatsTask.java:252)
2017-02-10 06:23:34,237 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:144)
2017-02-10 06:23:34,238 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
2017-02-10 06:23:34,238 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1355)
2017-02-10 06:23:34,238 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1139)
2017-02-10 06:23:34,238 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.run(Driver.java:945)
2017-02-10 06:23:34,238 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
2017-02-10 06:23:34,239 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
2017-02-10 06:23:34,239 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
2017-02-10 06:23:34,239 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:348)
2017-02-10 06:23:34,239 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:446)
2017-02-10 06:23:34,239 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:456)
2017-02-10 06:23:34,240 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:712)
2017-02-10 06:23:34,240 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
2017-02-10 06:23:34,240 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2017-02-10 06:23:34,240 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2017-02-10 06:23:34,240 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2017-02-10 06:23:34,241 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.lang.reflect.Method.invoke(Method.java:497)
2017-02-10 06:23:34,241 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.util.RunJar.main(RunJar.java:212)
2017-02-10 06:23:34,241 INFO 19790 [luigi-interface] hadoop.py:273 - Caused by: 
ERROR XBM0H: Directory 
/var/lib/analytics-tasks/analyticstack/repo/TempStatsStore cannot be created.
2017-02-10 06:23:34,241 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
2017-02-10 06:23:34,241 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown 
Source)
2017-02-10 06:23:34,242 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.security.AccessController.doPrivileged(Native Method)
2017-02-10 06:23:34,242 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown
 Source)
2017-02-10 06:23:34,242 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
2017-02-10 06:23:34,242 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown
 Source)
2017-02-10 06:23:34,242 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown 
Source)
2017-02-10 06:23:34,249 INFO 19790 [luigi-interface] hadoop.py:273 - ... 31 more
2017-02-10 06:23:34,249 INFO 19790 [luigi-interface] hadoop.py:273 - 
============= end nested exception, level (2) ===========
2017-02-10 06:23:34,250 INFO 19790 [luigi-interface] hadoop.py:273 - 
============= begin nested exception, level (3) ===========
2017-02-10 06:23:34,250 INFO 19790 [luigi-interface] hadoop.py:273 - ERROR 
XBM0H: Directory /var/lib/analytics-tasks/analyticstack/repo/TempStatsStore 
cannot be created.
2017-02-10 06:23:34,250 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
2017-02-10 06:23:34,250 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown 
Source)
2017-02-10 06:23:34,251 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.security.AccessController.doPrivileged(Native Method)
2017-02-10 06:23:34,251 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown
 Source)
2017-02-10 06:23:34,251 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
2017-02-10 06:23:34,251 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown
 Source)
2017-02-10 06:23:34,251 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown 
Source)
2017-02-10 06:23:34,251 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
2017-02-10 06:23:34,252 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
2017-02-10 06:23:34,252 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
2017-02-10 06:23:34,252 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
2017-02-10 06:23:34,252 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
2017-02-10 06:23:34,252 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
2017-02-10 06:23:34,252 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
2017-02-10 06:23:34,253 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:664)
2017-02-10 06:23:34,253 INFO 19790 [luigi-interface] hadoop.py:273 - at 
java.sql.DriverManager.getConnection(DriverManager.java:270)
2017-02-10 06:23:34,253 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.Utilities.connectWithRetry(Utilities.java:2295)
2017-02-10 06:23:34,253 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.stats.jdbc.JDBCStatsAggregator.connect(JDBCStatsAggregator.java:85)
2017-02-10 06:23:34,253 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.StatsTask.aggregateStats(StatsTask.java:298)
2017-02-10 06:23:34,254 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.StatsTask.execute(StatsTask.java:252)
2017-02-10 06:23:34,254 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:144)
2017-02-10 06:23:34,254 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
2017-02-10 06:23:34,254 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1355)
2017-02-10 06:23:34,254 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1139)
2017-02-10 06:23:34,255 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.ql.Driver.run(Driver.java:945)
2017-02-10 06:23:34,255 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
2017-02-10 06:23:34,255 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
2017-02-10 06:23:34,255 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
2017-02-10 06:23:34,255 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:348)
2017-02-10 06:23:34,256 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:446)
2017-02-10 06:23:34,256 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:456)
2017-02-10 06:23:34,256 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:712)
2017-02-10 06:23:34,256 INFO 19790 [luigi-interface] hadoop.py:273 - at 
org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
2017-02-10 06:23:34,256 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2017-02-10 06:23:34,257 INFO 19790 [luigi-interface] hadoop.py:273 - at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2017-02-10 06:23:34,258 INFO 19790 [luigi-interface] worker.py:296 - [pid 
19790] Worker Worker(salt=997523257, host=ip-172-31-62-193, username=hadoop, 
pid=19790) done      
HiveTableFromParameterQueryTask(warehouse_path=hdfs://localhost:9000/edx-analytics-pipeline/warehouse/,
 insert_query=
            SELECT
                ce.date,
                ce.course_id,
                IF(p.gender != '', p.gender, NULL),
                SUM(ce.at_end),
                COUNT(ce.user_id)
            FROM course_enrollment ce
            LEFT OUTER JOIN auth_userprofile p ON p.user_id = ce.user_id
            GROUP BY
                ce.date,
                ce.course_id,
                IF(p.gender != '', p.gender, NULL)
        , table=course_enrollment_gender_daily, columns=(('date', 'STRING'), 
('course_id', 'STRING'), ('gender', 'STRING'), ('count', 'INT'), 
('cumulative_count', 'INT')), partition=dt=2017-02-10)
2017-02-10 06:23:34,265 INFO 19790 [luigi-interface] worker.py:337 - Done
2017-02-10 06:23:34,265 INFO 19790 [luigi-interface] worker.py:338 - There are 
no more tasks to run at this time
2017-02-10 06:23:34,265 INFO 19790 [luigi-interface] worker.py:343 - There are 
11 pending tasks possibly being run by other workers
2017-02-10 06:23:34,277 INFO 19790 [luigi-interface] worker.py:117 - Worker 
Worker(salt=997523257, host=ip-172-31-62-193, username=hadoop, pid=19790) was 
stopped. Shutting down Keep-Alive thread
Connection to localhost closed.
Exiting with status = 0

Reply via email to