[jira] [Commented] (PHOENIX-3460) Phoenix Spark plugin cannot find table with a Namespace prefix

2017-10-31 Thread Thomas D'Silva (JIRA)

[ 
https://issues.apache.org/jira/browse/PHOENIX-3460?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16227667#comment-16227667
 ] 

Thomas D'Silva commented on PHOENIX-3460:
-

[~shiroari] Can you provide the changes you made in PhoenixRuntime to get 
around this issue?

> Phoenix Spark plugin cannot find table with a Namespace prefix
> --
>
> Key: PHOENIX-3460
> URL: https://issues.apache.org/jira/browse/PHOENIX-3460
> Project: Phoenix
>  Issue Type: Bug
>Affects Versions: 4.8.0
> Environment: HDP 2.5
>Reporter: Xindian Long
>  Labels: namespaces, phoenix, spark
> Fix For: 4.7.0
>
>
> I am testing some code that uses the Phoenix Spark plugin to read a Phoenix 
> table with a namespace prefix in the table name (the table is created as a 
> Phoenix table, not an HBase table), but it returns a TableNotFoundException.
> The table is clearly there, because I can query it using plain Phoenix SQL 
> through SQuirreL. In addition, querying it with Spark SQL works without any 
> problem.
> I am running on the HDP 2.5 platform, with Phoenix 4.7.0.2.5.0.0-1245.
> The problem does not exist at all when I run the same code on an HDP 2.4 
> cluster, with Phoenix 4.4.
> Nor does the problem occur when I query a table without a namespace prefix 
> in the table name, on HDP 2.5.
> The log is in the attached file: tableNoFound.txt
> My testing code is also attached.
> The weird thing is that in the attached code, if I run testSpark alone it 
> throws the above exception, but if I run testJdbc first, followed by 
> testSpark, both of them work.
> After changing the table creation to
> create table ACME.ENDPOINT_STATUS
> the phoenix-spark plugin seems to work. I also found some weird behavior: 
> if I run both of the following,
> create table ACME.ENDPOINT_STATUS ...
> create table "ACME:ENDPOINT_STATUS" ...
> both tables show up in Phoenix; the first one shows as schema ACME, table 
> name ENDPOINT_STATUS, and the latter shows as schema none, table name 
> ACME:ENDPOINT_STATUS.
> However, in HBase I only see one table, ACME:ENDPOINT_STATUS. In addition, 
> upserts into the table ACME.ENDPOINT_STATUS show up in the other table, and 
> the other way around.
>  
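The catalog state described above can be modeled with a small self-contained sketch (illustrative only; the {{physicalName}} helper is hypothetical and merely mimics the mapping behavior described in this report, not Phoenix's actual code):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Simplified, hypothetical model of the reported catalog state: two Phoenix
// logical tables that end up backed by the same physical HBase table.
public class CatalogSketch {
    // Maps a (schema, table) pair to a physical HBase table name following
    // the behavior described in the report: a schema becomes an HBase
    // namespace (':' separator), while a quoted name already containing ':'
    // is kept verbatim with no schema.
    static String physicalName(String schema, String table) {
        return schema.isEmpty() ? table : schema + ":" + table;
    }

    public static void main(String[] args) {
        Map<String, String> catalog = new LinkedHashMap<String, String>();
        // create table ACME.ENDPOINT_STATUS -> schema ACME, table ENDPOINT_STATUS
        catalog.put("ACME.ENDPOINT_STATUS", physicalName("ACME", "ENDPOINT_STATUS"));
        // create table "ACME:ENDPOINT_STATUS" -> schema none, colon kept in the name
        catalog.put("\"ACME:ENDPOINT_STATUS\"", physicalName("", "ACME:ENDPOINT_STATUS"));
        // Both logical entries resolve to the same HBase table, which would
        // explain why upserts through one table appear in the other.
        System.out.println(catalog);
    }
}
```

Under this model, both CREATE TABLE statements collide on the single HBase table ACME:ENDPOINT_STATUS, matching the reporter's observation.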



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Commented] (PHOENIX-3460) Phoenix Spark plugin cannot find table with a Namespace prefix

2017-09-13 Thread Ankit Singhal (JIRA)

[ 
https://issues.apache.org/jira/browse/PHOENIX-3460?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16164357#comment-16164357
 ] 

Ankit Singhal commented on PHOENIX-3460:


Thanks [~shiroari] for reporting this and doing the analysis. I'll take a look 
at the consistency of the APIs.
In the meantime, can you confirm whether you are seeing the same problem with 
the latest version of Phoenix as well (I see that PHOENIX-3944 fixed some of 
the problems for mapped Phoenix tables)?



[jira] [Commented] (PHOENIX-3460) Phoenix Spark plugin cannot find table with a Namespace prefix

2017-09-06 Thread Stas Sukhanov (JIRA)

[ 
https://issues.apache.org/jira/browse/PHOENIX-3460?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16155492#comment-16155492
 ] 

Stas Sukhanov commented on PHOENIX-3460:


Hi, I have the same problem and did some investigation. There is a problem in 
{{org.apache.phoenix.util.PhoenixRuntime}}, in the 
[getTable|https://github.com/apache/phoenix/blob/master/phoenix-core/src/main/java/org/apache/phoenix/util/PhoenixRuntime.java#L442]
 and 
[generateColumnInfo|https://github.com/apache/phoenix/blob/master/phoenix-core/src/main/java/org/apache/phoenix/util/PhoenixRuntime.java#L469]
 methods, when one uses a namespace rather than a schema (e.g. "namespace:table").

The code in phoenix-spark calls {{generateColumnInfo}}, which removes the 
quotes by calling {{SchemaUtil.normalizeFullTableName(tableName)}} and passes 
the result to {{getTable}}. When {{getTable}} fails to find the table in the 
cache, it goes to the fallback (see the catch block). Without the quotes, that 
block treats the namespace as a schema and fails by throwing an exception with 
the original table name. Unfortunately, there is no good workaround. One option 
is to call {{MetaDataClient.updateCache}} manually beforehand to fill the 
cache; then {{getTable}} works on the driver, but you will most likely still 
get the exception on the workers.

In our project we included phoenix-core in a shaded jar and replaced 
{{PhoenixRuntime}} with our own implementation that doesn't convert the 
namespace to a schema.
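The failure mode described above can be illustrated with a self-contained sketch (a simplified model of the behavior, not Phoenix's actual code; the helper names are hypothetical):

```java
// Simplified, hypothetical model of the quote-stripping problem described
// above: once the quotes are removed, the fallback name resolution never
// recognizes ':' as a namespace separator.
public class NameResolutionSketch {
    // Mimics quote removal during normalization:
    // "ACME:ENDPOINT_STATUS" (quoted) -> ACME:ENDPOINT_STATUS (bare)
    static String normalize(String name) {
        if (name.startsWith("\"") && name.endsWith("\"")) {
            return name.substring(1, name.length() - 1);
        }
        return name.toUpperCase();
    }

    // Fallback resolution: the bare name is split on '.', so a ':'
    // separator is never seen and no schema is recognized; the whole
    // string (namespace included) is treated as a table name.
    static String[] resolve(String normalized) {
        int dot = normalized.indexOf('.');
        if (dot >= 0) {
            return new String[] { normalized.substring(0, dot),
                                  normalized.substring(dot + 1) };
        }
        return new String[] { "", normalized };
    }

    public static void main(String[] args) {
        String bare = normalize("\"ACME:ENDPOINT_STATUS\"");
        String[] parts = resolve(bare);
        // The lookup then fails with tableName=ACME:ENDPOINT_STATUS,
        // consistent with the exception in the attached log.
        System.out.println("schema='" + parts[0] + "' table='" + parts[1] + "'");
    }
}
```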





[jira] [Commented] (PHOENIX-3460) Phoenix Spark plugin cannot find table with a Namespace prefix

2017-09-04 Thread Aritomo Abe (JIRA)

[ 
https://issues.apache.org/jira/browse/PHOENIX-3460?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16152659#comment-16152659
 ] 

Aritomo Abe commented on PHOENIX-3460:
--

I have the same problem, but I cannot just use "." to solve this issue. My 
schema was created using the following syntax:

create table if not exists "test_namespace:test_table" (id varchar not null primary key);

I can work with the table through JDBC, the Phoenix console, etc., but not 
from Spark:

org.apache.phoenix.schema.TableNotFoundException: ERROR 1012 (42M03): Table undefined. tableName=test_namespace:test_table

The Spark code is really simple:

{code:java}
import org.apache.hadoop.conf.Configuration
import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext
import org.apache.phoenix.spark._

val configuration = new Configuration()
val sqlContext = new SQLContext(sc)

val df = sqlContext.phoenixTableAsDataFrame(
  "\"test_namespace:test_table\"", Array("id"), conf = configuration
)
{code}




[jira] [Commented] (PHOENIX-3460) Phoenix Spark plugin cannot find table with a Namespace prefix

2017-01-02 Thread Ankit Singhal (JIRA)

[ 
https://issues.apache.org/jira/browse/PHOENIX-3460?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15792656#comment-15792656
 ] 

Ankit Singhal commented on PHOENIX-3460:


Was the table name provided at creation time without double quotes? If yes, 
then you should run your queries with the table name without double quotes and 
a dot between schema and table, 
like this: Select ENDPOINT_ID, CITY from ACME.ENDPOINT_STATUS Where city = 
'ACME City'


If that is not the case, can you share a dump of the SYSTEM.CATALOG and 
SYSTEM:CATALOG tables (by running a scan from the hbase shell), and confirm 
whether you have reverted any namespace property?
{code}
org.apache.phoenix.schema.TableNotFoundException: ERROR 1012 (42M03): Table undefined. tableName=ACME:ENDPOINT_STATUS
	at org.apache.phoenix.schema.PMetaDataImpl.getTableRef(PMetaDataImpl.java:265)
	at org.apache.phoenix.jdbc.PhoenixConnection.getTable(PhoenixConnection.java:449)
	at org.apache.phoenix.util.PhoenixRuntime.getTable(PhoenixRuntime.java:407)
	at org.apache.phoenix.util.PhoenixRuntime.generateColumnInfo(PhoenixRuntime.java:433)
	at 
{code}



[jira] [Commented] (PHOENIX-3460) Phoenix Spark plugin cannot find table with a Namespace prefix

2016-11-07 Thread Xindian Long (JIRA)

[ 
https://issues.apache.org/jira/browse/PHOENIX-3460?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15644768#comment-15644768
 ] 

Xindian Long commented on PHOENIX-3460:
---

Test code:

{code:java}
import java.util.HashMap;
import java.util.Map;

import org.apache.log4j.Logger;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.DataFrameReader;
import org.apache.spark.sql.SQLContext;

public class Application {
    static private Logger log = Logger.getLogger(Application.class);

    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("NMS Tuning Engine");
        JavaSparkContext sc = new JavaSparkContext(conf);

        //testJdbc(sc);
        testSpark(sc);
    }

    static public void testSpark(JavaSparkContext sc) {
        //SparkContextBuilder.buildSparkContext("Simple Application", "local");

        // One JVM can only have one Spark Context now
        Map<String, String> options = new HashMap<String, String>();
        SQLContext sqlContext = new SQLContext(sc);

        String tableStr = "\"ACME:ENDPOINT_STATUS\"";
        String dataSrcUrl = "jdbc:phoenix:luna-sdp-nms-01.davis.sensus.lab:2181:/hbase-unsecure";
        options.put("zkUrl", dataSrcUrl);
        options.put("table", tableStr);
        log.info("Phoenix DB URL: " + dataSrcUrl + " tableStr: " + tableStr);

        DataFrame df = null;
        try {
            df = sqlContext.read().format("org.apache.phoenix.spark").options(options).load();
            df.explain(true);
        } catch (Exception ex) {
            log.error("sql error: ", ex);
        }

        try {
            log.info("Count By phoenix spark plugin: " + df.count());
        } catch (Exception ex) {
            log.error("dataframe error: ", ex);
        }
    }

    static public void testJdbc(JavaSparkContext sc) {
        Map<String, String> options = new HashMap<String, String>();
        SQLContext sqlContext = new SQLContext(sc);

        if (sc == null || sqlContext == null || options == null) {
            log.info("NULL sc, sqlContext, or options");
        }

        String qry2 = "(Select ENDPOINT_ID, CITY from \"ACME:ENDPOINT_STATUS\" Where city = 'ACME City')";
        String dataSrcUrl = "jdbc:phoenix:luna-sdp-nms-01.davis.sensus.lab:2181:/hbase-unsecure";
        options.put("url", dataSrcUrl);
        options.put("dbtable", qry2);
        log.info("Phoenix DB URL: " + dataSrcUrl + "\nquery: " + qry2);

        DataFrame df = null;
        try {
            DataFrameReader dfRd = sqlContext.read().format("jdbc").options(options);

            if (dfRd == null) {
                log.error("NULL DataFrameReader Object dfRd in getEndPointDataByJdbc");
            }
            df = dfRd.load();
            df.explain(true);

        } catch (Exception ex) {
            log.error("sql error: ", ex);
        }

        try {
            log.info("Count By Jdbc: " + df.count());
        } catch (Exception ex) {
            log.error("dataframe error: ", ex);
        }
    }
}
{code}

[jira] [Commented] (PHOENIX-3460) Phoenix Spark plugin cannot find table with a Namespace prefix

2016-11-07 Thread Xindian Long (JIRA)

[ 
https://issues.apache.org/jira/browse/PHOENIX-3460?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15644770#comment-15644770
 ] 

Xindian Long commented on PHOENIX-3460:
---

Exception log:

{code}
16/11/03 16:32:25 INFO ZooKeeper: Initiating client connection, connectString=luna-sdp-nms-01.davis.sensus.lab:2181 sessionTimeout=9 watcher=org.apache.hadoop.hbase.zookeeper.PendingWatcher@27898e13
16/11/03 16:32:25 INFO ClientCnxn: Opening socket connection to server 10.22.13.19/10.22.13.19:2181. Will not attempt to authenticate using SASL (unknown error)
16/11/03 16:32:25 INFO ClientCnxn: Socket connection established to 10.22.13.19/10.22.13.19:2181, initiating session
16/11/03 16:32:25 INFO ClientCnxn: Session establishment complete on server 10.22.13.19/10.22.13.19:2181, sessionid = 0x1582610cca900a6, negotiated timeout = 4
16/11/03 16:32:25 INFO Metrics: Initializing metrics system: phoenix
16/11/03 16:32:25 WARN MetricsConfig: Cannot locate configuration: tried hadoop-metrics2-phoenix.properties,hadoop-metrics2.properties
16/11/03 16:32:25 INFO MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
16/11/03 16:32:25 INFO MetricsSystemImpl: phoenix metrics system started
16/11/03 16:32:26 ERROR Application: sql error: 
org.apache.phoenix.schema.TableNotFoundException: ERROR 1012 (42M03): Table undefined. tableName=ACME:ENDPOINT_STATUS
	at org.apache.phoenix.schema.PMetaDataImpl.getTableRef(PMetaDataImpl.java:265)
	at org.apache.phoenix.jdbc.PhoenixConnection.getTable(PhoenixConnection.java:449)
	at org.apache.phoenix.util.PhoenixRuntime.getTable(PhoenixRuntime.java:407)
	at org.apache.phoenix.util.PhoenixRuntime.generateColumnInfo(PhoenixRuntime.java:433)
	at org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil.getSelectColumnMetadataList(PhoenixConfigurationUtil.java:279)
	at org.apache.phoenix.spark.PhoenixRDD.toDataFrame(PhoenixRDD.scala:106)
	at org.apache.phoenix.spark.PhoenixRelation.schema(PhoenixRelation.scala:57)
	at org.apache.spark.sql.execution.datasources.LogicalRelation.<init>(LogicalRelation.scala:37)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:125)
	at com.sensus.NMSEngineOnHadoop.Application.testSpark(Application.java:150)
	at com.sensus.NMSEngineOnHadoop.Application.main(Application.java:129)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/11/03 16:32:26 ERROR Application: dataframe error: 
java.lang.NullPointerException
	at com.sensus.NMSEngineOnHadoop.Application.testSpark(Application.java:157)
	at com.sensus.NMSEngineOnHadoop.Application.main(Application.java:129)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/11/03 16:32:26 INFO SparkContext: Invoking stop() from shutdown hook
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static/sql,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/execution/json,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/execution,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/json,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/metrics/json,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
16/11/03 16:32:26 INFO ContextHandler: stopped
{code}