gaborgsomogyi commented on a change in pull request #28368:
URL: https://github.com/apache/spark/pull/28368#discussion_r416571234



##########
File path: 
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/connection/DB2ConnectionProvider.scala
##########
@@ -48,7 +48,7 @@ private[sql] class DB2ConnectionProvider(driver: Driver, 
options: JDBCOptions)
     result
   }
 
-  override def setAuthenticationConfigIfNeeded(): Unit = {
+  override def setAuthenticationConfigIfNeeded(): Unit = 
SecurityConfigurationLock.synchronized {

Review comment:
      Valid use-case to consider, let me explain what happens in this case.
   
   Let's assume:
   * database-1 = postgres
   * database-2 = db2
   
   Execution path:
   * t => security config state: `postgres` -> `empty` (JVM default)
   * t+1 => security config state: `db2` -> `postgres` -> `empty`
   * t+2 => thread-1 tries to reach the `postgres` entry but the system config is `db2`. Since the name at the `db2` entry doesn't match, it forwards the request to its parent, `postgres`. The entry matches there, so it will be used for authentication.
   * t+3 => thread-2 tries to reach the `db2` entry and the system config is `db2`. The entry matches here, so it will be used for authentication.
   
   That said, the newly added security configuration is stacked on top of its parent: it forwards requests to the parent when the application name doesn't match.
   
   Writes between the threads are synchronised on the new lock object added in this PR.
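   
   The stacking-plus-delegation pattern above can be sketched roughly like this. Note this is a minimal illustration, assuming JAAS `javax.security.auth.login.Configuration`; `StackedConfiguration`, `StackedConfigDemo`, and the `SecurityConfigurationLock` object here are illustrative stand-ins, not the actual Spark classes:
   
   ```scala
   import javax.security.auth.login.{AppConfigurationEntry, Configuration}
   
   // Illustrative lock object; writes to the JVM-global configuration are
   // synchronised on it, mirroring the pattern described above.
   object SecurityConfigurationLock
   
   // A configuration that answers for exactly one application name and
   // forwards every other lookup to its parent (the previous config).
   class StackedConfiguration(appName: String, parent: Configuration)
       extends Configuration {
     override def getAppConfigurationEntry(
         name: String): Array[AppConfigurationEntry] = {
       if (name == appName) {
         // Hypothetical entry; real providers build database-specific entries.
         Array(new AppConfigurationEntry(
           "com.sun.security.auth.module.Krb5LoginModule",
           AppConfigurationEntry.LoginModuleControlFlag.REQUIRED,
           java.util.Collections.emptyMap[String, Object]()))
       } else {
         // Name doesn't match: forward the request to the parent config.
         parent.getAppConfigurationEntry(name)
       }
     }
   }
   
   object StackedConfigDemo {
     // Installing "postgres" then "db2" yields the stack
     // `db2` -> `postgres` -> previous config, as in the timeline above.
     def install(appName: String): Unit = SecurityConfigurationLock.synchronized {
       val parent = Configuration.getConfiguration
       Configuration.setConfiguration(new StackedConfiguration(appName, parent))
     }
   }
   ```
   
   With this structure a lookup for `postgres` still succeeds after `db2` was installed on top, because the `db2` layer delegates the non-matching name to its parent.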
   




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


