Mich,
That's right, referring to you guys.
Cheers, Kuassi
On 8/27/20 9:27 AM, Mich Talebzadeh wrote:
Thanks Kuassi,
I presume you mean Spark DEV team by "they are using ... "
cheers,
Mich
LinkedIn
https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
According to our dev team.
From the error it is evident that they are using a JDBC jar which does
not support setting tns_admin in the URL.
They might have some old jar on the classpath which is being used
instead of the 18.3 jar.
You can ask them to use either the full URL or the TNS-alias format URL
with the 18.3 jar.
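To make the two options concrete, here is a minimal sketch, not taken from the thread: the service alias, host, and wallet path are made-up placeholders, but the two URL shapes are the ones the 18.3+ thin driver accepts (a TNS alias resolved through the wallet's tnsnames.ora, or the full connect descriptor inlined).

```scala
// Sketch with hypothetical values: "db1_high", the host, and the wallet
// directory are placeholders, not values from this thread.
object UrlForms {
  val walletDir = "/home/user/wallet_DB1"

  // TNS-alias form: the alias is looked up in tnsnames.ora under TNS_ADMIN,
  // passed here as a URL property (supported from the 18.3 thin driver on).
  val tnsAliasUrl = s"jdbc:oracle:thin:@db1_high?TNS_ADMIN=$walletDir"

  // Full-descriptor form: the whole connect descriptor is inlined,
  // so no tnsnames.ora lookup is needed.
  val fullUrl =
    "jdbc:oracle:thin:@(description=(address=(protocol=tcps)(port=1522)" +
      "(host=adb.example.oraclecloud.com))" +
      "(connect_data=(service_name=db1_high.adb.oraclecloud.com)))"
}
```

With an older jar on the classpath the `?TNS_ADMIN=` property is silently unsupported, which matches the error described above.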
FWIW, here are our write-ups on Java connectivity to Database Cloud
Services:
https://www.oracle.com/database/technologies/appdev/jdbc-db-cloud.html
Kuassi
On 8/26/20 1:50 PM, Mich Talebzadeh wrote:
Thanks Jorn,
Only running in the REPL in local mode.
This works fine connecting with ojdbc6.jar.
Mich,
All looks fine.
Perhaps some special chars in username or password?
It is recommended not to use characters such as '@' or '.' in your
password.
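One way to sidestep that issue, sketched here with made-up credentials: pass user and password as JDBC connection properties instead of embedding them in the URL, so an '@' in the password can never be mistaken for the URL's host delimiter.

```scala
import java.util.Properties

// Sketch with made-up credentials: keep user/password out of the URL string
// so special characters like '@' or '.' are never parsed as URL syntax.
object Creds {
  val props = new Properties()
  props.setProperty("user", "scott")
  props.setProperty("password", "p@ss.w0rd") // safe here: properties are not URL-parsed
}
```

The same `Properties` object can then be handed to `DriverManager.getConnection(url, props)` or to Spark's JDBC reader as options.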
Best, Kuassi
On 8/26/20 12:52 PM, Mich Talebzadeh wrote:
Thanks Kuassi.
This is the version of the jar file that works OK with the JDBC connection.
Hi,
Which release is the ojdbc8.jar from: 12c, 18c, or 19c? I'd
recommend ojdbc8.jar from the latest release.
One more thing to pay attention to is the content of the
ojdbc.properties file (part of the unzipped wallet).
Make sure that the ojdbc.properties file has been configured to use
Oracle Wallet authentication.
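For reference, in the cloud wallets I have seen, the unzipped ojdbc.properties enables wallet authentication with a line like the following (the exact content may differ by release):

```
oracle.net.wallet_location=(SOURCE=(METHOD=FILE)(METHOD_DATA=(DIRECTORY=${TNS_ADMIN})))
```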
Apologies in advance for injecting an Oracle product into this
discussion, but I thought it might help address the requirements (as far
as I understood them).
We are looking into furnishing a new connector for Spark, similar to
the Oracle Datasource for Hadoop.