[ https://issues.apache.org/jira/browse/SPARK-12403?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15082687#comment-15082687 ]

Lunen commented on SPARK-12403:
-------------------------------

I've managed to get in contact with the people who develop the Spark ODBC 
drivers. They told me that they OEM the driver to Databricks and that they 
don't understand why Databricks would not make the latest driver available. 
I've also tested a trial version of the developer's latest driver and it works 
perfectly fine.

I've asked on Databricks' forum and sent emails to their sales and info 
departments explaining the situation. Hopefully someone can help.

> "Simba Spark ODBC Driver 1.0" not working with 1.5.2 anymore
> ------------------------------------------------------------
>
>                 Key: SPARK-12403
>                 URL: https://issues.apache.org/jira/browse/SPARK-12403
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.5.0, 1.5.1, 1.5.2
>         Environment: ODBC connector query 
>            Reporter: Lunen
>
> We are unable to query the Spark tables using the ODBC driver from Simba 
> Spark (Databricks - "Simba Spark ODBC Driver 1.0"). We are able to do a show 
> databases and a show tables, but not any other queries, e.g.
> Working:
> Select * from openquery(SPARK,'SHOW DATABASES')
> Select * from openquery(SPARK,'SHOW TABLES')
> Not working:
> Select * from openquery(SPARK,'Select * from lunentest')
> The error I get is:
> OLE DB provider "MSDASQL" for linked server "SPARK" returned message 
> "[Simba][SQLEngine] (31740) Table or view not found: spark..lunentest".
> Msg 7321, Level 16, State 2, Line 2
> An error occurred while preparing the query "Select * from lunentest" for 
> execution against OLE DB provider "MSDASQL" for linked server "SPARK"



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
