You don't need multiple contexts to do this:
http://spark.apache.org/docs/latest/sql-programming-guide.html#jdbc-to-other-databases
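To illustrate the point about not needing multiple contexts, here is a minimal sketch of reading from two different databases with a single SQLContext via the JDBC data source described in the linked guide. The connection URLs, table names, and column names are hypothetical placeholders.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// One SparkContext and one SQLContext for the whole application.
val sc = new SparkContext(new SparkConf().setAppName("multi-source").setMaster("local[*]"))
val sqlContext = new SQLContext(sc)

// First source: a (hypothetical) PostgreSQL database.
val ordersDF = sqlContext.read.format("jdbc")
  .option("url", "jdbc:postgresql://host-a:5432/sales")
  .option("dbtable", "orders")
  .load()

// Second source: a (hypothetical) MySQL database, same context.
val usersDF = sqlContext.read.format("jdbc")
  .option("url", "jdbc:mysql://host-b:3306/accounts")
  .option("dbtable", "users")
  .load()

// Because both DataFrames belong to the same context, they can be joined directly.
val joined = ordersDF.join(usersDF, ordersDF("user_id") === usersDF("id"))
```

The appropriate JDBC driver jars must be on the classpath (e.g. via `--jars`) for the reads to succeed.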
On Tue, Apr 12, 2016 at 4:05 PM, Michael Segel wrote:
> Reading from multiple sources within the same application?
>
> How would you
You can, but I'm not sure why you would want to. If you want to isolate
different users just use hiveContext.newSession().
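A short sketch of the `newSession()` approach mentioned above, for Spark 1.x. Each session gets its own SQLConf and temporary tables while sharing the underlying SparkContext and cached data; the SparkContext `sc` is assumed to already exist.

```scala
import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)  // assumes an existing SparkContext `sc`

// Two isolated sessions sharing the same SparkContext.
val sessionA = hiveContext.newSession()
val sessionB = hiveContext.newSession()

// A conf change or temp table in one session is not visible in the other.
sessionA.sql("SET spark.sql.shuffle.partitions=10")
sessionA.range(100).registerTempTable("nums")  // visible only in sessionA
```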
On Tue, Apr 12, 2016 at 1:48 AM, Natu Lauchande wrote:
> Hi,
>
> Is it possible to have both a sqlContext and a hiveContext in the same
> application?
val ALLOW_MULTIPLE_CONTEXTS = booleanConf("spark.sql.allowMultipleContexts",
  defaultValue = Some(true),
  doc = "When set to true, creating multiple SQLContexts/HiveContexts is allowed." +
    "When set to false, only one SQLContext/HiveContext is allowed to be created " +
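Since `spark.sql.allowMultipleContexts` defaults to `true`, both contexts can in fact coexist on one SparkContext in Spark 1.x, as the original question asks. A minimal sketch (assuming an existing SparkContext `sc`); note that HiveContext extends SQLContext, so a single HiveContext usually covers both use cases anyway.

```scala
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.hive.HiveContext

// Allowed by default because spark.sql.allowMultipleContexts = true.
val sqlContext  = new SQLContext(sc)
val hiveContext = new HiveContext(sc)
```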