Re: Spark SQL JDBC Connectivity and more

2014-06-09 Thread Michael Armbrust
> [Venkat] Are you saying: pull the SharkServer2 code into my standalone Spark application (as part of the standalone application process), pass the SparkContext of the standalone app to SharkServer2's SparkContext at startup, and voilà, we get a SQL/JDBC interface for the RDDs of t…
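For reference, a minimal sketch of the pattern Michael describes, written against the later Spark 1.3+ API, where this embedded-server approach shipped as HiveThriftServer2.startWithContext; the app name, the Record case class, and the "records" table name below are illustrative assumptions, not anything from the thread:

// Embed the SQL/JDBC (Thrift) server in the standalone application so it
// shares the application's own SparkContext and can see its RDDs.
// Requires the spark-hive and spark-hive-thriftserver artifacts.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.sql.hive.thriftserver.HiveThriftServer2

object EmbeddedSqlServer {
  case class Record(key: Int, value: String)

  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("embedded-jdbc").setMaster("local[*]"))
    val hiveContext = new HiveContext(sc)
    import hiveContext.implicits._

    // An RDD built by the standalone application itself.
    val rdd = sc.parallelize(1 to 100).map(i => Record(i, s"val_$i"))

    // Register it under a table name visible to SQL/JDBC clients.
    rdd.toDF().registerTempTable("records")

    // Start the Thrift/JDBC server against this application's context.
    // Its server threads keep the driver JVM alive, so external clients
    // can query "records" while the app keeps running.
    HiveThriftServer2.startWithContext(hiveContext)
  }
}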

Re: Spark SQL JDBC Connectivity and more

2014-06-09 Thread Venkat Subramanian
1) If I have a standalone Spark application that has already built an RDD, how can SharkServer2, or for that matter Shark, access 'that' RDD and run queries on it? In all the examples I have seen for Shark, the RDDs (tables) are created within Shark's SparkContext and processed. This is not possible out…
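Once the server is embedded as in the sketch above, 'that' RDD is just a named table to any JDBC client. A client-side sketch, assuming the Hive JDBC driver is on the classpath, the Thrift server is listening on its default port 10000, and the hypothetical "records" table from the previous sketch is registered:

// Query the RDD-backed table over plain JDBC, from any JVM process.
import java.sql.DriverManager

object JdbcClientSketch {
  def main(args: Array[String]): Unit = {
    Class.forName("org.apache.hive.jdbc.HiveDriver")
    val conn = DriverManager.getConnection(
      "jdbc:hive2://localhost:10000/default", "", "")
    try {
      val stmt = conn.createStatement()
      val rs = stmt.executeQuery("SELECT key, value FROM records WHERE key < 10")
      while (rs.next()) {
        println(s"${rs.getInt("key")} -> ${rs.getString("value")}")
      }
    } finally {
      conn.close()
    }
  }
}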

Re: Spark SQL JDBC Connectivity and more

2014-05-29 Thread Michael Armbrust
On Thu, May 29, 2014 at 3:26 PM, Venkat Subramanian wrote: > 1) If I have a standalone Spark application that has already built an RDD, how can SharkServer2, or for that matter Shark, access 'that' RDD and run queries on it? In all the examples I have seen for Shark, the RDDs (tables) are created…

Re: Spark SQL JDBC Connectivity and more

2014-05-29 Thread Venkat Subramanian
Thanks Michael. OK, will try SharkServer2. But I have some basic questions in a related area: 1) If I have a standalone Spark application that has already built an RDD, how can SharkServer2, or for that matter Shark, access 'that' RDD and run queries on it? In all the examples I have seen for Shark, the…