Hi Spark Users,

I hope everyone here is doing well.

I am trying to read data from SAS through Spark SQL and write it into HDFS.
Initially, I started with a pure Java program; please find the program and
logs in the attached file sas_pure_java.txt. The program ran successfully
and returned the data from SAS. Please note the highlighted part of the
log below.

My SAS dataset has 4 rows, and the output of the query is:

[2016-06-10 10:35:21,584] INFO stmt(1.1)#executeQuery SELECT
a.sr_no,a.start_dt,a.end_dt FROM sasLib.run_control a; created result set
1.1.1; time= 0.122 secs (com.sas.rio.MVAStatement:590)

[2016-06-10 10:35:21,630] INFO rs(1.1.1)#next (first call to next); time=
0.045 secs (com.sas.rio.MVAResultSet:773)

1,'2016-01-01','2016-01-31'

2,'2016-02-01','2016-02-29'

3,'2016-03-01','2016-03-31'

4,'2016-04-01','2016-04-30'


Please find the full logs attached to this email in the file sas_pure_java.txt.

_______________________


Now I am trying to do the same via Spark SQL. Please find my program and
logs attached to this email in the file sas_spark_sql.txt.

The connection to the SAS dataset is established successfully, but please
note the highlighted log below.

[2016-06-10 10:29:05,834] INFO conn(2)#prepareStatement sql=SELECT
"SR_NO","start_dt","end_dt" FROM sasLib.run_control ; prepared statement
2.1; time= 0.038 secs (com.sas.rio.MVAConnection:538)

[2016-06-10 10:29:05,935] INFO ps(2.1)#executeQuery SELECT
"SR_NO","start_dt","end_dt" FROM sasLib.run_control ; created result set
2.1.1; time= 0.102 secs (com.sas.rio.MVAStatement:590)
Please find the full logs attached to this email in the file sas_spark_sql.txt.

I am using the same driver in both the pure Java and Spark SQL programs,
but the query generated by Spark SQL has double quotes around the column
names (highlighted above). SAS SQL appears to treat those double-quoted
tokens as character literals rather than column identifiers, so the
resulting output for that query looks like this:

+-----+--------+------+
|  _c0|     _c1|   _c2|
+-----+--------+------+
|SR_NO|start_dt|end_dt|
|SR_NO|start_dt|end_dt|
|SR_NO|start_dt|end_dt|
|SR_NO|start_dt|end_dt|
+-----+--------+------+

Since both programs use the same driver, com.sas.rio.MVADriver, I expected
the output to be the same as my pure Java program's output. But something
else is happening behind the scenes: Spark's JDBC data source builds the
SELECT statement itself, and it quotes the column names while doing so.

Any insights into this issue would be appreciated. Thanks for your time.


Regards,

Ajay
Spark Code to read SAS dataset
---------------------------------


package com.test.sas.connections;

import java.sql.SQLException;
import java.util.Properties;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.hive.HiveContext;

public class SparkSasConnectionTest {

    public static void main(String[] args) throws SQLException,
            ClassNotFoundException {

        SparkConf sc = new SparkConf()
                .setAppName("SASSparkJdbcTest")
                .setMaster("local");
        @SuppressWarnings("resource")
        JavaSparkContext jsc = new JavaSparkContext(sc);
        HiveContext hiveContext = new HiveContext(jsc.sc());

        // Connection properties for the SAS IOM JDBC driver.
        Properties props = new Properties();
        props.setProperty("user", "Ajay");
        props.setProperty("password", "Ajay");
        props.setProperty("librefs", "sasLib '/export/home/Ajay'");
        props.setProperty("usesspi", "none");
        props.setProperty("encryptionPolicy", "required");
        props.setProperty("encryptionAlgorithms", "AES");

        // Register the SAS driver with java.sql.DriverManager.
        Class.forName("com.sas.rio.MVADriver");

        // Spark builds and issues the SELECT statement itself here.
        DataFrame jdbcDF = hiveContext.read().jdbc(
                "jdbc:sasiom://remote1.system.com:8594",
                "sasLib.run_control",
                props);
        jdbcDF.show();
    }
}





Spark Log
-------------

[2016-06-10 10:28:26,812] INFO Running Spark version 1.5.0 
(org.apache.spark.SparkContext:59)
[2016-06-10 10:28:27,024] WARN Unable to load native-hadoop library for your 
platform... using builtin-java classes where applicable 
(org.apache.hadoop.util.NativeCodeLoader:62)
[2016-06-10 10:28:27,588] INFO Changing view acls to: Ajay 
(org.apache.spark.SecurityManager:59)
[2016-06-10 10:28:27,589] INFO Changing modify acls to: Ajay 
(org.apache.spark.SecurityManager:59)
[2016-06-10 10:28:27,589] INFO SecurityManager: authentication disabled; ui 
acls disabled; users with view permissions: Set(Ajay); users with modify 
permissions: Set(AE10302) (org.apache.spark.SecurityManager:59)
[2016-06-10 10:28:28,012] INFO Slf4jLogger started 
(akka.event.slf4j.Slf4jLogger:80)
[2016-06-10 10:28:28,039] INFO Starting remoting (Remoting:74)
[2016-06-10 10:28:28,153] INFO Remoting started; listening on addresses 
:[akka.tcp://sparkDriver@10.0.0.9:49499] (Remoting:74)
[2016-06-10 10:28:28,158] INFO Successfully started service 'sparkDriver' on 
port 49499. (org.apache.spark.util.Utils:59)
[2016-06-10 10:28:28,177] INFO Registering MapOutputTracker 
(org.apache.spark.SparkEnv:59)
[2016-06-10 10:28:28,186] INFO Registering BlockManagerMaster 
(org.apache.spark.SparkEnv:59)
[2016-06-10 10:28:28,202] INFO Created local directory at 
/private/var/folders/2s/4vzbdgz15sj8c1kk6j2xhlfc0000gp/T/blockmgr-dfd9a2c8-0513-40cc-8b91-cf1bc931a761
 (org.apache.spark.storage.DiskBlockManager:59)
[2016-06-10 10:28:28,211] INFO MemoryStore started with capacity 1966.1 MB 
(org.apache.spark.storage.MemoryStore:59)
[2016-06-10 10:28:28,242] INFO HTTP File server directory is 
/private/var/folders/2s/4vzbdgz15sj8c1kk6j2xhlfc0000gp/T/spark-d05d1925-dc66-4b8a-b08c-4b04ed696ff4/httpd-4b6cb960-2f9e-477c-b4ef-ec7d484d2202
 (org.apache.spark.HttpFileServer:59)
[2016-06-10 10:28:28,245] INFO Starting HTTP Server 
(org.apache.spark.HttpServer:59)
[2016-06-10 10:28:28,275] INFO jetty-8.y.z-SNAPSHOT 
(org.spark-project.jetty.server.Server:272)
[2016-06-10 10:28:28,290] INFO Started SocketConnector@0.0.0.0:49500 
(org.spark-project.jetty.server.AbstractConnector:338)
[2016-06-10 10:28:28,290] INFO Successfully started service 'HTTP file server' 
on port 49500. (org.apache.spark.util.Utils:59)
[2016-06-10 10:28:28,298] INFO Registering OutputCommitCoordinator 
(org.apache.spark.SparkEnv:59)
[2016-06-10 10:28:28,414] INFO jetty-8.y.z-SNAPSHOT 
(org.spark-project.jetty.server.Server:272)
[2016-06-10 10:28:28,424] INFO Started SelectChannelConnector@0.0.0.0:4040 
(org.spark-project.jetty.server.AbstractConnector:338)
[2016-06-10 10:28:28,424] INFO Successfully started service 'SparkUI' on port 
4040. (org.apache.spark.util.Utils:59)
[2016-06-10 10:28:28,425] INFO Started SparkUI at http://10.0.0.9:4040 
(org.apache.spark.ui.SparkUI:59)
[2016-06-10 10:28:28,481] WARN Using default name DAGScheduler for source 
because spark.app.id is not set. (org.apache.spark.metrics.MetricsSystem:71)
[2016-06-10 10:28:28,483] INFO Starting executor ID driver on host localhost 
(org.apache.spark.executor.Executor:59)
[2016-06-10 10:28:28,602] INFO Successfully started service 
'org.apache.spark.network.netty.NettyBlockTransferService' on port 49501. 
(org.apache.spark.util.Utils:59)
[2016-06-10 10:28:28,602] INFO Server created on 49501 
(org.apache.spark.network.netty.NettyBlockTransferService:59)
[2016-06-10 10:28:28,603] INFO Trying to register BlockManager 
(org.apache.spark.storage.BlockManagerMaster:59)
[2016-06-10 10:28:28,605] INFO Registering block manager localhost:49501 with 
1966.1 MB RAM, BlockManagerId(driver, localhost, 49501) 
(org.apache.spark.storage.BlockManagerMasterEndpoint:59)
[2016-06-10 10:28:28,606] INFO Registered BlockManager 
(org.apache.spark.storage.BlockManagerMaster:59)
[2016-06-10 10:28:29,381] INFO Initializing execution hive, version 1.2.1 
(org.apache.spark.sql.hive.HiveContext:59)
[2016-06-10 10:28:29,420] INFO Inspected Hadoop version: 2.6.0 
(org.apache.spark.sql.hive.client.ClientWrapper:59)
[2016-06-10 10:28:29,421] INFO Loaded 
org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.6.0 
(org.apache.spark.sql.hive.client.ClientWrapper:59)
[2016-06-10 10:28:29,614] INFO 0: Opening raw store with implemenation 
class:org.apache.hadoop.hive.metastore.ObjectStore 
(org.apache.hadoop.hive.metastore.HiveMetaStore:589)
[2016-06-10 10:28:29,635] INFO ObjectStore, initialize called 
(org.apache.hadoop.hive.metastore.ObjectStore:289)
[2016-06-10 10:28:29,741] INFO Property hive.metastore.integral.jdo.pushdown 
unknown - will be ignored (DataNucleus.Persistence:77)
[2016-06-10 10:28:29,741] INFO Property datanucleus.cache.level2 unknown - will 
be ignored (DataNucleus.Persistence:77)
[2016-06-10 10:28:30,922] INFO Setting MetaStore object pin classes with 
hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
 (org.apache.hadoop.hive.metastore.ObjectStore:370)
[2016-06-10 10:28:31,640] INFO The class 
"org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as 
"embedded-only" so does not have its own datastore table. 
(DataNucleus.Datastore:77)
[2016-06-10 10:28:31,641] INFO The class 
"org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so 
does not have its own datastore table. (DataNucleus.Datastore:77)
[2016-06-10 10:28:32,180] INFO The class 
"org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as 
"embedded-only" so does not have its own datastore table. 
(DataNucleus.Datastore:77)
[2016-06-10 10:28:32,180] INFO The class 
"org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so 
does not have its own datastore table. (DataNucleus.Datastore:77)
[2016-06-10 10:28:32,318] INFO Using direct SQL, underlying DB is DERBY 
(org.apache.hadoop.hive.metastore.MetaStoreDirectSql:139)
[2016-06-10 10:28:32,320] INFO Initialized ObjectStore 
(org.apache.hadoop.hive.metastore.ObjectStore:272)
[2016-06-10 10:28:32,396] WARN Version information not found in metastore. 
hive.metastore.schema.verification is not enabled so recording the schema 
version 1.2.0 (org.apache.hadoop.hive.metastore.ObjectStore:6666)
[2016-06-10 10:28:32,490] WARN Failed to get database default, returning 
NoSuchObjectException (org.apache.hadoop.hive.metastore.ObjectStore:568)
[2016-06-10 10:28:32,991] INFO Added admin role in metastore 
(org.apache.hadoop.hive.metastore.HiveMetaStore:663)
[2016-06-10 10:28:32,992] INFO Added public role in metastore 
(org.apache.hadoop.hive.metastore.HiveMetaStore:672)
[2016-06-10 10:28:33,036] INFO No user is added in admin role, since config is 
empty (org.apache.hadoop.hive.metastore.HiveMetaStore:712)
[2016-06-10 10:28:33,120] INFO 0: get_all_databases 
(org.apache.hadoop.hive.metastore.HiveMetaStore:746)
[2016-06-10 10:28:33,121] INFO ugi=AE10302      ip=unknown-ip-addr      
cmd=get_all_databases    
(org.apache.hadoop.hive.metastore.HiveMetaStore.audit:371)
[2016-06-10 10:28:33,134] INFO 0: get_functions: db=default pat=* 
(org.apache.hadoop.hive.metastore.HiveMetaStore:746)
[2016-06-10 10:28:33,134] INFO ugi=AE10302      ip=unknown-ip-addr      
cmd=get_functions: db=default pat=*      
(org.apache.hadoop.hive.metastore.HiveMetaStore.audit:371)
[2016-06-10 10:28:33,135] INFO The class 
"org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as 
"embedded-only" so does not have its own datastore table. 
(DataNucleus.Datastore:77)
[2016-06-10 10:28:33,520] INFO Created local directory: 
/var/folders/2s/4vzbdgz15sj8c1kk6j2xhlfc0000gp/T/c1c91882-ecf3-4ae5-b873-555617a0f7b0_resources
 (org.apache.hadoop.hive.ql.session.SessionState:641)
[2016-06-10 10:28:33,665] INFO Created HDFS directory: 
/tmp/hive/AE10302/c1c91882-ecf3-4ae5-b873-555617a0f7b0 
(org.apache.hadoop.hive.ql.session.SessionState:641)
[2016-06-10 10:28:33,815] INFO Created local directory: 
/var/folders/2s/4vzbdgz15sj8c1kk6j2xhlfc0000gp/T/AE10302/c1c91882-ecf3-4ae5-b873-555617a0f7b0
 (org.apache.hadoop.hive.ql.session.SessionState:641)
[2016-06-10 10:28:33,964] INFO Created HDFS directory: 
/tmp/hive/AE10302/c1c91882-ecf3-4ae5-b873-555617a0f7b0/_tmp_space.db 
(org.apache.hadoop.hive.ql.session.SessionState:641)
[2016-06-10 10:28:34,023] INFO default warehouse location is 
/user/hive/warehouse (org.apache.spark.sql.hive.HiveContext:59)
[2016-06-10 10:28:34,032] INFO Initializing HiveMetastoreConnection version 
1.2.1 using Spark classes. (org.apache.spark.sql.hive.HiveContext:59)
[2016-06-10 10:28:34,075] INFO Inspected Hadoop version: 2.6.0 
(org.apache.spark.sql.hive.client.ClientWrapper:59)
[2016-06-10 10:28:34,100] INFO Loaded 
org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.6.0 
(org.apache.spark.sql.hive.client.ClientWrapper:59)
[2016-06-10 10:28:34,598] WARN Unable to load native-hadoop library for your 
platform... using builtin-java classes where applicable 
(org.apache.hadoop.util.NativeCodeLoader:62)
[2016-06-10 10:28:34,664] INFO 0: Opening raw store with implemenation 
class:org.apache.hadoop.hive.metastore.ObjectStore 
(org.apache.hadoop.hive.metastore.HiveMetaStore:589)
[2016-06-10 10:28:34,683] INFO ObjectStore, initialize called 
(org.apache.hadoop.hive.metastore.ObjectStore:289)
[2016-06-10 10:28:34,778] INFO Property hive.metastore.integral.jdo.pushdown 
unknown - will be ignored (DataNucleus.Persistence:77)
[2016-06-10 10:28:34,778] INFO Property datanucleus.cache.level2 unknown - will 
be ignored (DataNucleus.Persistence:77)
[2016-06-10 10:28:35,400] INFO Setting MetaStore object pin classes with 
hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
 (org.apache.hadoop.hive.metastore.ObjectStore:370)
[2016-06-10 10:28:36,032] INFO The class 
"org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as 
"embedded-only" so does not have its own datastore table. 
(DataNucleus.Datastore:77)
[2016-06-10 10:28:36,032] INFO The class 
"org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so 
does not have its own datastore table. (DataNucleus.Datastore:77)
[2016-06-10 10:28:36,161] INFO The class 
"org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as 
"embedded-only" so does not have its own datastore table. 
(DataNucleus.Datastore:77)
[2016-06-10 10:28:36,161] INFO The class 
"org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so 
does not have its own datastore table. (DataNucleus.Datastore:77)
[2016-06-10 10:28:36,219] INFO Reading in results for query 
"org.datanucleus.store.rdbms.query.SQLQuery@0" since the connection used is 
closing (DataNucleus.Query:77)
[2016-06-10 10:28:36,221] INFO Using direct SQL, underlying DB is DERBY 
(org.apache.hadoop.hive.metastore.MetaStoreDirectSql:139)
[2016-06-10 10:28:36,223] INFO Initialized ObjectStore 
(org.apache.hadoop.hive.metastore.ObjectStore:272)
[2016-06-10 10:28:36,352] INFO Added admin role in metastore 
(org.apache.hadoop.hive.metastore.HiveMetaStore:663)
[2016-06-10 10:28:36,353] INFO Added public role in metastore 
(org.apache.hadoop.hive.metastore.HiveMetaStore:672)
[2016-06-10 10:28:36,379] INFO No user is added in admin role, since config is 
empty (org.apache.hadoop.hive.metastore.HiveMetaStore:712)
[2016-06-10 10:28:36,446] INFO 0: get_all_databases 
(org.apache.hadoop.hive.metastore.HiveMetaStore:746)
[2016-06-10 10:28:36,447] INFO ugi=AE10302      ip=unknown-ip-addr      
cmd=get_all_databases    
(org.apache.hadoop.hive.metastore.HiveMetaStore.audit:371)
[2016-06-10 10:28:36,462] INFO 0: get_functions: db=default pat=* 
(org.apache.hadoop.hive.metastore.HiveMetaStore:746)
[2016-06-10 10:28:36,462] INFO ugi=AE10302      ip=unknown-ip-addr      
cmd=get_functions: db=default pat=*      
(org.apache.hadoop.hive.metastore.HiveMetaStore.audit:371)
[2016-06-10 10:28:36,463] INFO The class 
"org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as 
"embedded-only" so does not have its own datastore table. 
(DataNucleus.Datastore:77)
[2016-06-10 10:28:37,210] INFO Created local directory: 
/var/folders/2s/4vzbdgz15sj8c1kk6j2xhlfc0000gp/T/f46996ff-5f3b-4abd-953b-ae83dc9e3ce4_resources
 (org.apache.hadoop.hive.ql.session.SessionState:641)
[2016-06-10 10:28:37,371] INFO Created HDFS directory: 
/tmp/hive/AE10302/f46996ff-5f3b-4abd-953b-ae83dc9e3ce4 
(org.apache.hadoop.hive.ql.session.SessionState:641)
[2016-06-10 10:28:37,532] INFO Created local directory: 
/var/folders/2s/4vzbdgz15sj8c1kk6j2xhlfc0000gp/T/AE10302/f46996ff-5f3b-4abd-953b-ae83dc9e3ce4
 (org.apache.hadoop.hive.ql.session.SessionState:641)
[2016-06-10 10:28:37,695] INFO Created HDFS directory: 
/tmp/hive/AE10302/f46996ff-5f3b-4abd-953b-ae83dc9e3ce4/_tmp_space.db 
(org.apache.hadoop.hive.ql.session.SessionState:641)
[2016-06-10 10:28:37,716] INFO MVADriver#connect (driver version is 9.4) 
(com.sas.rio.MVADriver:267)
[2016-06-10 10:28:53,193] INFO conn(1): 
URL=jdbc:sasiom://remote1.system.com:8594 (com.sas.rio.MVAConnection:285)
[2016-06-10 10:28:53,194] INFO conn(1):  librefs=sasLib '/export/home/Ajay' 
(com.sas.rio.MVAConnection:296)
[2016-06-10 10:28:53,194] INFO conn(1):  port=8594 
(com.sas.rio.MVAConnection:296)
[2016-06-10 10:28:53,194] INFO conn(1):  password=xxxx 
(com.sas.rio.MVAConnection:296)
[2016-06-10 10:28:53,194] INFO conn(1):  encryptionPolicy=required 
(com.sas.rio.MVAConnection:296)
[2016-06-10 10:28:53,194] INFO conn(1):  host=remote1.system.com 
(com.sas.rio.MVAConnection:296)
[2016-06-10 10:28:53,194] INFO conn(1):  userName=Ajay 
(com.sas.rio.MVAConnection:296)
[2016-06-10 10:28:53,194] INFO conn(1):  encryptionAlgorithms=AES 
(com.sas.rio.MVAConnection:296)
[2016-06-10 10:28:53,245] WARN ps(1.1) "log4j.configuration" configuration not 
reloaded.  Please specify a URL in the log4j.configuration JVM option. 
(com.sas.rio.MVAStatement:450)
[2016-06-10 10:28:53,247] INFO conn(1)#prepareStatement sql=SELECT * FROM 
sasLib.run_control WHERE 1=0; prepared statement 1.1; time= 0.05 secs 
(com.sas.rio.MVAConnection:538)
[2016-06-10 10:28:53,483] INFO ps(1.1)#executeQuery SELECT * FROM 
sasLib.run_control WHERE 1=0; created result set 1.1.1; time= 0.237 secs 
(com.sas.rio.MVAStatement:590)
[2016-06-10 10:28:53,494] INFO rs(1.1.1)#close (com.sas.rio.MVAResultSet:812)
[2016-06-10 10:28:53,544] INFO conn(1)#close (com.sas.rio.MVAConnection:669)
[2016-06-10 10:28:53,544] INFO ps(1.1)#close (com.sas.rio.MVAStatement:663)
[2016-06-10 10:28:53,971] INFO Starting job: show at 
SparkSasConnectionTest.java:42 (org.apache.spark.SparkContext:59)
[2016-06-10 10:28:53,984] INFO Got job 0 (show at 
SparkSasConnectionTest.java:42) with 1 output partitions 
(org.apache.spark.scheduler.DAGScheduler:59)
[2016-06-10 10:28:53,984] INFO Final stage: ResultStage 0(show at 
SparkSasConnectionTest.java:42) (org.apache.spark.scheduler.DAGScheduler:59)
[2016-06-10 10:28:53,985] INFO Parents of final stage: List() 
(org.apache.spark.scheduler.DAGScheduler:59)
[2016-06-10 10:28:53,985] INFO Missing parents: List() 
(org.apache.spark.scheduler.DAGScheduler:59)
[2016-06-10 10:28:53,991] INFO Submitting ResultStage 0 (MapPartitionsRDD[1] at 
show at SparkSasConnectionTest.java:42), which has no missing parents 
(org.apache.spark.scheduler.DAGScheduler:59)
[2016-06-10 10:28:54,063] INFO ensureFreeSpace(5664) called with curMem=0, 
maxMem=2061647216 (org.apache.spark.storage.MemoryStore:59)
[2016-06-10 10:28:54,065] INFO Block broadcast_0 stored as values in memory 
(estimated size 5.5 KB, free 1966.1 MB) 
(org.apache.spark.storage.MemoryStore:59)
[2016-06-10 10:28:54,072] INFO ensureFreeSpace(2848) called with curMem=5664, 
maxMem=2061647216 (org.apache.spark.storage.MemoryStore:59)
[2016-06-10 10:28:54,072] INFO Block broadcast_0_piece0 stored as bytes in 
memory (estimated size 2.8 KB, free 1966.1 MB) 
(org.apache.spark.storage.MemoryStore:59)
[2016-06-10 10:28:54,074] INFO Added broadcast_0_piece0 in memory on 
localhost:49501 (size: 2.8 KB, free: 1966.1 MB) 
(org.apache.spark.storage.BlockManagerInfo:59)
[2016-06-10 10:28:54,075] INFO Created broadcast 0 from broadcast at 
DAGScheduler.scala:861 (org.apache.spark.SparkContext:59)
[2016-06-10 10:28:54,078] INFO Submitting 1 missing tasks from ResultStage 0 
(MapPartitionsRDD[1] at show at SparkSasConnectionTest.java:42) 
(org.apache.spark.scheduler.DAGScheduler:59)
[2016-06-10 10:28:54,079] INFO Adding task set 0.0 with 1 tasks 
(org.apache.spark.scheduler.TaskSchedulerImpl:59)
[2016-06-10 10:28:54,105] INFO Starting task 0.0 in stage 0.0 (TID 0, 
localhost, PROCESS_LOCAL, 1929 bytes) 
(org.apache.spark.scheduler.TaskSetManager:59)
[2016-06-10 10:28:54,110] INFO Running task 0.0 in stage 0.0 (TID 0) 
(org.apache.spark.executor.Executor:59)
[2016-06-10 10:28:54,129] INFO MVADriver#connect (driver version is 9.4) 
(com.sas.rio.MVADriver:267)
[2016-06-10 10:29:05,794] INFO conn(2): 
URL=jdbc:sasiom://remote1.system.com:8594 (com.sas.rio.MVAConnection:285)
[2016-06-10 10:29:05,795] INFO conn(2):  librefs=sasLib '/export/home/Ajay' 
(com.sas.rio.MVAConnection:296)
[2016-06-10 10:29:05,795] INFO conn(2):  port=8594 
(com.sas.rio.MVAConnection:296)
[2016-06-10 10:29:05,795] INFO conn(2):  password=xxxx 
(com.sas.rio.MVAConnection:296)
[2016-06-10 10:29:05,795] INFO conn(2):  encryptionPolicy=required 
(com.sas.rio.MVAConnection:296)
[2016-06-10 10:29:05,795] INFO conn(2):  host=remote1.system.com 
(com.sas.rio.MVAConnection:296)
[2016-06-10 10:29:05,795] INFO conn(2):  userName=Ajay 
(com.sas.rio.MVAConnection:296)
[2016-06-10 10:29:05,795] INFO conn(2):  encryptionAlgorithms=AES 
(com.sas.rio.MVAConnection:296)
[2016-06-10 10:29:05,833] WARN ps(2.1) "log4j.configuration" configuration not 
reloaded.  Please specify a URL in the log4j.configuration JVM option. 
(com.sas.rio.MVAStatement:450)
[2016-06-10 10:29:05,834] INFO conn(2)#prepareStatement sql=SELECT 
"SR_NO","start_dt","end_dt" FROM sasLib.run_control ; prepared statement 2.1; 
time= 0.038 secs (com.sas.rio.MVAConnection:538)
[2016-06-10 10:29:05,935] INFO ps(2.1)#executeQuery SELECT 
"SR_NO","start_dt","end_dt" FROM sasLib.run_control ; created result set 2.1.1; 
time= 0.102 secs (com.sas.rio.MVAStatement:590)
[2016-06-10 10:29:05,982] INFO rs(2.1.1)#next (first call to next); time= 0.039 
secs (com.sas.rio.MVAResultSet:773)
[2016-06-10 10:29:05,988] INFO rs(2.1.1)#close (com.sas.rio.MVAResultSet:812)
[2016-06-10 10:29:06,026] INFO ps(2.1)#close (com.sas.rio.MVAStatement:663)
[2016-06-10 10:29:06,027] INFO conn(2)#close (com.sas.rio.MVAConnection:669)
[2016-06-10 10:29:06,095] INFO closed connection 
(org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD:59)
[2016-06-10 10:29:06,099] ERROR Exception in task 0.0 in stage 0.0 (TID 0) 
(org.apache.spark.executor.Executor:96)
+-----+--------+------+
|  _c0|     _c1|   _c2|
+-----+--------+------+
|SR_NO|start_dt|end_dt|
|SR_NO|start_dt|end_dt|
|SR_NO|start_dt|end_dt|
|SR_NO|start_dt|end_dt|
+-----+--------+------+
[2016-06-10 10:29:06,129] INFO Invoking stop() from shutdown hook 
(org.apache.spark.SparkContext:59)
[2016-06-10 10:29:06,155] INFO stopped 
o.s.j.s.ServletContextHandler{/static/sql,null} 
(org.spark-project.jetty.server.handler.ContextHandler:843)
[2016-06-10 10:29:06,155] INFO stopped 
o.s.j.s.ServletContextHandler{/SQL/execution/json,null} 
(org.spark-project.jetty.server.handler.ContextHandler:843)
[2016-06-10 10:29:06,155] INFO stopped 
o.s.j.s.ServletContextHandler{/SQL/execution,null} 
(org.spark-project.jetty.server.handler.ContextHandler:843)
[2016-06-10 10:29:06,155] INFO stopped 
o.s.j.s.ServletContextHandler{/SQL/json,null} 
(org.spark-project.jetty.server.handler.ContextHandler:843)
[2016-06-10 10:29:06,155] INFO stopped o.s.j.s.ServletContextHandler{/SQL,null} 
(org.spark-project.jetty.server.handler.ContextHandler:843)
[2016-06-10 10:29:06,155] INFO stopped 
o.s.j.s.ServletContextHandler{/metrics/json,null} 
(org.spark-project.jetty.server.handler.ContextHandler:843)
[2016-06-10 10:29:06,155] INFO stopped 
o.s.j.s.ServletContextHandler{/stages/stage/kill,null} 
(org.spark-project.jetty.server.handler.ContextHandler:843)
[2016-06-10 10:29:06,156] INFO stopped o.s.j.s.ServletContextHandler{/api,null} 
(org.spark-project.jetty.server.handler.ContextHandler:843)
[2016-06-10 10:29:06,156] INFO stopped o.s.j.s.ServletContextHandler{/,null} 
(org.spark-project.jetty.server.handler.ContextHandler:843)
[2016-06-10 10:29:06,156] INFO stopped 
o.s.j.s.ServletContextHandler{/static,null} 
(org.spark-project.jetty.server.handler.ContextHandler:843)
[2016-06-10 10:29:06,156] INFO stopped 
o.s.j.s.ServletContextHandler{/executors/threadDump/json,null} 
(org.spark-project.jetty.server.handler.ContextHandler:843)
[2016-06-10 10:29:06,156] INFO stopped 
o.s.j.s.ServletContextHandler{/executors/threadDump,null} 
(org.spark-project.jetty.server.handler.ContextHandler:843)
[2016-06-10 10:29:06,156] INFO stopped 
o.s.j.s.ServletContextHandler{/executors/json,null} 
(org.spark-project.jetty.server.handler.ContextHandler:843)
[2016-06-10 10:29:06,156] INFO stopped 
o.s.j.s.ServletContextHandler{/executors,null} 
(org.spark-project.jetty.server.handler.ContextHandler:843)
[2016-06-10 10:29:06,156] INFO stopped 
o.s.j.s.ServletContextHandler{/environment/json,null} 
(org.spark-project.jetty.server.handler.ContextHandler:843)
[2016-06-10 10:29:06,156] INFO stopped 
o.s.j.s.ServletContextHandler{/environment,null} 
(org.spark-project.jetty.server.handler.ContextHandler:843)
[2016-06-10 10:29:06,157] INFO stopped 
o.s.j.s.ServletContextHandler{/storage/rdd/json,null} 
(org.spark-project.jetty.server.handler.ContextHandler:843)
[2016-06-10 10:29:06,157] INFO stopped 
o.s.j.s.ServletContextHandler{/storage/rdd,null} 
(org.spark-project.jetty.server.handler.ContextHandler:843)
[2016-06-10 10:29:06,157] INFO stopped 
o.s.j.s.ServletContextHandler{/storage/json,null} 
(org.spark-project.jetty.server.handler.ContextHandler:843)
[2016-06-10 10:29:06,157] INFO stopped 
o.s.j.s.ServletContextHandler{/storage,null} 
(org.spark-project.jetty.server.handler.ContextHandler:843)
[2016-06-10 10:29:06,157] INFO stopped 
o.s.j.s.ServletContextHandler{/stages/pool/json,null} 
(org.spark-project.jetty.server.handler.ContextHandler:843)
[2016-06-10 10:29:06,157] INFO stopped 
o.s.j.s.ServletContextHandler{/stages/pool,null} 
(org.spark-project.jetty.server.handler.ContextHandler:843)
[2016-06-10 10:29:06,157] INFO stopped 
o.s.j.s.ServletContextHandler{/stages/stage/json,null} 
(org.spark-project.jetty.server.handler.ContextHandler:843)
[2016-06-10 10:29:06,157] INFO stopped 
o.s.j.s.ServletContextHandler{/stages/stage,null} 
(org.spark-project.jetty.server.handler.ContextHandler:843)
[2016-06-10 10:29:06,157] INFO stopped 
o.s.j.s.ServletContextHandler{/stages/json,null} 
(org.spark-project.jetty.server.handler.ContextHandler:843)
[2016-06-10 10:29:06,157] INFO stopped 
o.s.j.s.ServletContextHandler{/stages,null} 
(org.spark-project.jetty.server.handler.ContextHandler:843)
[2016-06-10 10:29:06,158] INFO stopped 
o.s.j.s.ServletContextHandler{/jobs/job/json,null} 
(org.spark-project.jetty.server.handler.ContextHandler:843)
[2016-06-10 10:29:06,158] INFO stopped 
o.s.j.s.ServletContextHandler{/jobs/job,null} 
(org.spark-project.jetty.server.handler.ContextHandler:843)
[2016-06-10 10:29:06,158] INFO stopped 
o.s.j.s.ServletContextHandler{/jobs/json,null} 
(org.spark-project.jetty.server.handler.ContextHandler:843)
[2016-06-10 10:29:06,158] INFO stopped 
o.s.j.s.ServletContextHandler{/jobs,null} 
(org.spark-project.jetty.server.handler.ContextHandler:843)
[2016-06-10 10:29:06,210] INFO Stopped Spark web UI at http://10.0.0.9:4040 
(org.apache.spark.ui.SparkUI:59)
[2016-06-10 10:29:06,213] INFO Stopping DAGScheduler 
(org.apache.spark.scheduler.DAGScheduler:59)
[2016-06-10 10:29:06,272] INFO MapOutputTrackerMasterEndpoint stopped! 
(org.apache.spark.MapOutputTrackerMasterEndpoint:59)
[2016-06-10 10:29:06,277] INFO MemoryStore cleared 
(org.apache.spark.storage.MemoryStore:59)
[2016-06-10 10:29:06,277] INFO BlockManager stopped 
(org.apache.spark.storage.BlockManager:59)
[2016-06-10 10:29:06,277] INFO BlockManagerMaster stopped 
(org.apache.spark.storage.BlockManagerMaster:59)
[2016-06-10 10:29:06,279] INFO OutputCommitCoordinator stopped! 
(org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:59)
[2016-06-10 10:29:06,279] INFO Successfully stopped SparkContext 
(org.apache.spark.SparkContext:59)
[2016-06-10 10:29:06,279] INFO Shutdown hook called 
(org.apache.spark.util.ShutdownHookManager:59)
[2016-06-10 10:29:06,280] INFO Deleting directory 
/private/var/folders/2s/4vzbdgz15sj8c1kk6j2xhlfc0000gp/T/spark-d05d1925-dc66-4b8a-b08c-4b04ed696ff4
 (org.apache.spark.util.ShutdownHookManager:59)
[2016-06-10 10:29:06,280] INFO Deleting directory 
/private/var/folders/2s/4vzbdgz15sj8c1kk6j2xhlfc0000gp/T/spark-eaa15c2f-25ab-4898-86f8-0cc00130188a
 (org.apache.spark.util.ShutdownHookManager:59)
[2016-06-10 10:29:06,286] INFO Shutting down remote daemon. 
(akka.remote.RemoteActorRefProvider$RemotingTerminator:74)
[2016-06-10 10:29:06,287] INFO Remote daemon shut down; proceeding with 
flushing remote transports. 
(akka.remote.RemoteActorRefProvider$RemotingTerminator:74)
Java Code to read SAS dataset
---------------------------------

package com.test.sas.connections;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.Properties;

public class SASJDBCTester {

    public static void main(String[] args) throws SQLException,
            ClassNotFoundException {
        // Connection properties for the SAS IOM JDBC driver.
        Properties props = new Properties();
        props.setProperty("user", "Ajay");
        props.setProperty("password", "Ajay");
        props.setProperty("librefs", "sasLib '/export/home/Ajay'");
        props.setProperty("usesspi", "none");
        props.setProperty("encryptionPolicy", "required");
        props.setProperty("encryptionAlgorithms", "AES");

        // Register the SAS driver with java.sql.DriverManager.
        Class.forName("com.sas.rio.MVADriver");

        try (Connection connection = DriverManager.getConnection(
                     "jdbc:sasiom://remote1.system.com:8594", props);
             Statement statement = connection.createStatement()) {
            // The SELECT is written by hand here, with no quoting around
            // the column names.
            ResultSet result = statement.executeQuery(
                    "SELECT a.sr_no, a.start_dt, a.end_dt"
                            + " FROM sasLib.run_control a");
            while (result.next()) {
                System.out.println(result.getString(1) + ","
                        + result.getString(2) + "," + result.getString(3));
            }
        }
    }
}

Java Logs
-------------

[2016-06-10 10:35:00,125] INFO MVADriver#connect (driver version is 9.4) 
(com.sas.rio.MVADriver:267)
[2016-06-10 10:35:21,413] INFO conn(1): 
URL=jdbc:sasiom://remote1.system.com:8594 (com.sas.rio.MVAConnection:285)
[2016-06-10 10:35:21,413] INFO conn(1):  librefs=sasLib '/export/home/Ajay' 
(com.sas.rio.MVAConnection:296)
[2016-06-10 10:35:21,415] INFO conn(1):  port=8594 
(com.sas.rio.MVAConnection:296)
[2016-06-10 10:35:21,415] INFO conn(1):  password=xxxx 
(com.sas.rio.MVAConnection:296)
[2016-06-10 10:35:21,415] INFO conn(1):  encryptionPolicy=required 
(com.sas.rio.MVAConnection:296)
[2016-06-10 10:35:21,415] INFO conn(1):  host=remote1.system.com 
(com.sas.rio.MVAConnection:296)
[2016-06-10 10:35:21,415] INFO conn(1):  userName=Ajay 
(com.sas.rio.MVAConnection:296)
[2016-06-10 10:35:21,415] INFO conn(1):  encryptionAlgorithms=AES 
(com.sas.rio.MVAConnection:296)
[2016-06-10 10:35:21,456] WARN stmt(1.1) "log4j.configuration" configuration 
not reloaded.  Please specify a URL in the log4j.configuration JVM option. 
(com.sas.rio.MVAStatement:450)
[2016-06-10 10:35:21,462] INFO conn(1)#createStatement type=TYPE_FORWARD_ONLY, 
concur=CONCUR_READ_ONLY; created statement 1.1; time= 0.041 secs 
(com.sas.rio.MVAConnection:487)
[2016-06-10 10:35:21,584] INFO stmt(1.1)#executeQuery SELECT 
a.sr_no,a.start_dt,a.end_dt FROM sasLib.run_control a; created result set 
1.1.1; time= 0.122 secs (com.sas.rio.MVAStatement:590)
[2016-06-10 10:35:21,630] INFO rs(1.1.1)#next (first call to next); time= 0.045 
secs (com.sas.rio.MVAResultSet:773)
1,'2016-01-01','2016-01-31'
2,'2016-02-01','2016-02-29'
3,'2016-03-01','2016-03-31'
4,'2016-04-01','2016-04-30'
[2016-06-10 10:35:21,665] INFO #next (end of result set) 5 
(com.sas.rio.MVAResultSet:782)
[2016-06-10 10:35:21,665] INFO stmt(1.1)#close (com.sas.rio.MVAStatement:663)
[2016-06-10 10:35:21,665] INFO rs(1.1.1)#close (com.sas.rio.MVAResultSet:812)
[2016-06-10 10:35:21,711] INFO conn(1)#close (com.sas.rio.MVAConnection:669)


