[ https://issues.apache.org/jira/browse/SPARK-36765?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17416519#comment-17416519 ]

Gabor Somogyi commented on SPARK-36765:
---------------------------------------

It was a long time ago when I did that, and as far as I remember it took me 
almost a month to make it work, so it was definitely a horror task!
My memory is cloudy because it was not yesterday, but I remember something 
like this:

The exception generally indicates that the driver cannot find the appropriate 
sqljdbc_auth library on the JVM library path. To correct the problem, one can 
use the java -D option to set the "java.library.path" system property. Worth 
mentioning that the full (absolute) path must be set, otherwise it did not 
work.
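A minimal sketch of what I mean (the directory below is a hypothetical location for the sqljdbc_auth native library, not anything from my actual setup):

```python
# Hypothetical absolute path where the sqljdbc_auth native library lives;
# a relative path did not work for me, it must be the full path.
native_lib_dir = "/opt/mssql/native"

# Extra JVM options to pass to both driver and executors so the MSSQL
# driver can load the library via System.loadLibrary("sqljdbc_auth").
jvm_options = {
    "spark.driver.extraJavaOptions": f"-Djava.library.path={native_lib_dir}",
    "spark.executor.extraJavaOptions": f"-Djava.library.path={native_lib_dir}",
}

# These would then be applied when building the session, e.g.:
# spark = (SparkSession.builder
#          .config("spark.driver.extraJavaOptions",
#                  jvm_options["spark.driver.extraJavaOptions"])
#          ...)
```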

All in all, I faced at least 5-6 different issues which were extremely hard 
to address. I hope others need less time to solve them.
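For reference, the connection pieces from the quoted report below can be combined like this. All host, database, and keytab values are placeholders, and note that the MSSQL driver property is spelled "authenticationScheme" (the report used "authenticationSchema"):

```python
def mssql_kerberos_jdbc(host, port, database, principal, keytab):
    """Build the JDBC URL and reader options for Kerberos/keytab login.

    All argument values used here are placeholders. Note the driver
    property is "authenticationScheme", not "authenticationSchema".
    """
    url = (
        f"jdbc:sqlserver://{host}:{port};databaseName={database};"
        "integratedSecurity=true;authenticationScheme=JavaKerberos"
    )
    options = {"principal": principal, "keytab": keytab}
    return url, options

url, options = mssql_kerberos_jdbc(
    "dbhost.example.com", 1433, "mydb",
    "spark_user@EXAMPLE.COM", "/etc/security/keytabs/spark_user.keytab",
)
# spark.read.format("jdbc").option("url", url) \
#      .option("dbtable", "dbo.some_table").options(**options).load()
```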


> Spark Support for MS Sql JDBC connector with Kerberos/Keytab
> ------------------------------------------------------------
>
>                 Key: SPARK-36765
>                 URL: https://issues.apache.org/jira/browse/SPARK-36765
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.1.2
>         Environment: Unix Redhat Environment
>            Reporter: Dilip Thallam Sridhar
>            Priority: Major
>             Fix For: 3.1.2
>
>
> Hi Team,
>  
> We are using Spark-3.0.2 to connect to MS SQL Server with the following 
> instructions.  
> Also tried with the Spark-3.1.2 version.
>  
>  1) download mssql-jdbc-9.4.0.jre8.jar
>  2) Generated Keytab using kinit
>  3) Validate Keytab using klist
>  4) Run the spark job with jdbc_library, principal and keytabs passed
> .config("spark.driver.extraClassPath", spark_jar_lib) \
> .config("spark.executor.extraClassPath", spark_jar_lib) \
>  5) connection_url = 
> "jdbc:sqlserver://{}:{};databaseName={};integratedSecurity=true;authenticationSchema=JavaKerberos"\
>  .format(jdbc_host_name, jdbc_port, jdbc_database_name)
> Note: without integratedSecurity=true;authenticationSchema=JavaKerberos the 
> driver falls back to the usual username/password authentication
> 6) passing the following options during spark read.
>  .option("principal", database_principal) \
>  .option("files", database_keytab) \
>  .option("keytab", database_keytab) \
>   
>  tried with files and keytab, just files, and with all above 3 parameters
>   
>  We are unable to connect to SQL Server from Spark and get the error shown 
> below. 
>   
>  A) Wanted to know if anybody has successfully connected Spark to SQL 
> Server? (as I see the previous Jiras have been closed)
>  https://issues.apache.org/jira/browse/SPARK-12312
>  https://issues.apache.org/jira/browse/SPARK-31337
>   
>  B) If yes, could you let us know if there are any additional configs needed 
> for Spark to connect to SQL Server?
>  We would appreciate any inputs to resolve this error.
>   
>   
>  Full Stack Trace
> {code}
> Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: This driver is not configured for integrated authentication.
>     at com.microsoft.sqlserver.jdbc.SQLServerConnection.terminate(SQLServerConnection.java:1352)
>     at com.microsoft.sqlserver.jdbc.SQLServerConnection.sendLogon(SQLServerConnection.java:2329)
>     at com.microsoft.sqlserver.jdbc.SQLServerConnection.logon(SQLServerConnection.java:1905)
>     at com.microsoft.sqlserver.jdbc.SQLServerConnection.access$000(SQLServerConnection.java:41)
>     at com.microsoft.sqlserver.jdbc.SQLServerConnection$LogonCommand.doExecute(SQLServerConnection.java:1893)
>     at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
>     at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
>     at com.microsoft.sqlserver.jdbc.SQLServerConnection.connectHelper(SQLServerConnection.java:1045)
>     at com.microsoft.sqlserver.jdbc.SQLServerConnection.login(SQLServerConnection.java:817)
>     at com.microsoft.sqlserver.jdbc.SQLServerConnection.connect(SQLServerConnection.java:700)
>     at com.microsoft.sqlserver.jdbc.SQLServerDriver.connect(SQLServerDriver.java:842)
>     at org.apache.spark.sql.execution.datasources.jdbc.connection.BasicConnectionProvider.getConnection(BasicConnectionProvider.scala:49)
>     at org.apache.spark.sql.execution.datasources.jdbc.connection.SecureConnectionProvider.getConnection(SecureConnectionProvider.scala:44)
>     at org.apache.spark.sql.execution.datasources.jdbc.connection.MSSQLConnectionProvider.org$apache$spark$sql$execution$datasources$jdbc$connection$MSSQLConnectionProvider$$super$getConnection(MSSQLConnectionProvider.scala:69)
>     at org.apache.spark.sql.execution.datasources.jdbc.connection.MSSQLConnectionProvider$$anon$1.run(MSSQLConnectionProvider.scala:69)
>     at org.apache.spark.sql.execution.datasources.jdbc.connection.MSSQLConnectionProvider$$anon$1.run(MSSQLConnectionProvider.scala:67)
>     at java.base/java.security.AccessController.doPrivileged(Native Method)
>     at java.base/javax.security.auth.Subject.doAs(Subject.java:423)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
>     ... 23 more
> Caused by: java.lang.UnsatisfiedLinkError: no sqljdbc_auth in java.library.path: [/usr/java/packages/lib, /usr/lib64, /lib64, /lib, /usr/lib]
>     at java.base/java.lang.ClassLoader.loadLibrary(ClassLoader.java:2660)
>     at java.base/java.lang.Runtime.loadLibrary0(Runtime.java:827)
>     at java.base/java.lang.System.loadLibrary(System.java:1871)
>     at com.microsoft.sqlserver.jdbc.AuthenticationJNI.<clinit>(AuthenticationJNI.java:32)
>     at com.microsoft.sqlserver.jdbc.SQLServerConnection.logon(SQLServerConnection.java:1902)
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
