Hi,

The link deals with the JDBC data source.
So it is only SQL; it lacks support for stored procedures that return a result set. This is against an Oracle database:

scala> var _ORACLEserver = "jdbc:oracle:thin:@rhes564:1521:mydb12"
_ORACLEserver: String = jdbc:oracle:thin:@rhes564:1521:mydb12

scala> var _username = "scratchpad"
_username: String = scratchpad

scala> var _password = "xxxxxxx"
_password: String = xxxxxxx

scala> val s = HiveContext.read.format("jdbc").options(
     | Map("url" -> _ORACLEserver,
     | "dbtable" -> "exec weights_sp",
     | "user" -> _username,
     | "password" -> _password)).load
java.sql.SQLSyntaxErrorException: ORA-00942: table or view does not exist

and that stored procedure exists in Oracle:

scratch...@mydb12.mich.LOCAL> desc weights_sp
PROCEDURE weights_sp

HTH

Dr Mich Talebzadeh

LinkedIn
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

http://talebzadehmich.wordpress.com

Disclaimer: Use it at your own risk. Any and all responsibility for any loss, damage or destruction of data or any other property which may arise from relying on this email's technical content is explicitly disclaimed. The author will in no case be liable for any monetary damages arising from such loss, damage or destruction.


On 14 August 2016 at 17:42, Michael Armbrust <mich...@databricks.com> wrote:

> As described here
> <http://spark.apache.org/docs/latest/sql-programming-guide.html#jdbc-to-other-databases>,
> you can use the DataSource API to connect to an external database using
> JDBC. While the dbtable option is usually just a table name, it can also
> be any valid SQL command that returns a table when enclosed in
> (parentheses). I'm not certain, but I'd expect you could use this feature
> to invoke a stored procedure and return the results as a DataFrame.
>
> On Sat, Aug 13, 2016 at 10:40 AM, sujeet jog <sujeet....@gmail.com> wrote:
>
>> Hi,
>>
>> Is there a way to call a stored procedure using spark ?
>>
>> thanks,
>> Sujeet
>
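
A follow-up on Michael's point about the dbtable option: it has to be something Oracle can treat as a table expression, i.e. a plain SELECT wrapped in parentheses. The table name emp below is only an illustration:

val df = HiveContext.read.format("jdbc").options(
  Map("url" -> _ORACLEserver,
      "dbtable" -> "(select * from emp) t",
      "user" -> _username,
      "password" -> _password)).load

"exec weights_sp" is not a query, hence the ORA-00942 above. If the procedure exposed its rows through a SYS_REFCURSOR OUT parameter (which weights_sp, as described above, does not appear to), a fallback would be plain JDBC with a CallableStatement, building the DataFrame on the driver afterwards. A rough, untested sketch, using the same HiveContext and sc as in the session above; the procedure weights_sp_rc and its (item, weight) columns are hypothetical:

import java.sql.{DriverManager, ResultSet}
import oracle.jdbc.OracleTypes
import org.apache.spark.sql.Row
import org.apache.spark.sql.types.{StructType, StructField, StringType, DoubleType}
import scala.collection.mutable.ArrayBuffer

// open a plain JDBC connection and call the (hypothetical) procedure
val conn = DriverManager.getConnection(_ORACLEserver, _username, _password)
val cs = conn.prepareCall("{ call weights_sp_rc(?) }")
cs.registerOutParameter(1, OracleTypes.CURSOR)   // the ref cursor OUT parameter
cs.execute()
val rs = cs.getObject(1).asInstanceOf[ResultSet]

// drain the cursor on the driver
val rows = ArrayBuffer[Row]()
while (rs.next()) {
  rows += Row(rs.getString(1), rs.getDouble(2))
}
rs.close(); cs.close(); conn.close()

// build a DataFrame from the collected rows
val schema = StructType(Seq(
  StructField("item", StringType),
  StructField("weight", DoubleType)))
val df2 = HiveContext.createDataFrame(sc.parallelize(rows), schema)

This pulls everything through the driver, so it is only sensible for modest result sets; for anything large, exposing the data through a view or a table function and reading it with the JDBC data source would be the better route.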