val ALLOW_MULTIPLE_CONTEXTS = booleanConf("spark.sql.allowMultipleContexts",
    defaultValue = Some(true),
    doc = "When set to true, creating multiple SQLContexts/HiveContexts is allowed. " +
      "When set to false, only one SQLContext/HiveContext is allowed to be created " +
      "through the constructor (new SQLContexts/HiveContexts created through the newSession " +
      "method are allowed). Please note that this conf needs to be set in SparkConf. Once " +
      "a SQLContext/HiveContext has been created, changing the value of this conf will " +
      "have no effect.",
    isPublic = true)

I don’t think there are any performance penalties in doing so.
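To illustrate, here is a minimal sketch (assuming the Spark 1.6.x API, where `HiveContext` extends `SQLContext`; the app name and master are placeholders) of setting this conf and creating both contexts in one application:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.hive.HiveContext

object TwoContextsExample {
  def main(args: Array[String]): Unit = {
    // The conf must be set before any SQLContext/HiveContext is created;
    // changing it afterwards has no effect.
    val conf = new SparkConf()
      .setAppName("two-contexts")
      .setMaster("local[*]")
      .set("spark.sql.allowMultipleContexts", "true")
    val sc = new SparkContext(conf)

    // With the conf set to true, both contexts can be constructed
    // against the same SparkContext.
    val sqlContext = new SQLContext(sc)
    val hiveContext = new HiveContext(sc)

    // newSession gives an isolated session without constructing a new
    // context, and is allowed even when allowMultipleContexts is false.
    val session = hiveContext.newSession()

    sc.stop()
  }
}
```

Since `HiveContext` extends `SQLContext`, many applications simply create one `HiveContext` and use it for both plain SQL and Hive queries, avoiding the second context entirely.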
From: Natu Lauchande [mailto:nlaucha...@gmail.com]
Sent: Tuesday, April 12, 2016 4:49 PM
To: user@spark.apache.org
Subject: Can i have a hive context and sql context in the same app ?

Hi,
Is it possible to have both a sqlContext and a hiveContext in the same application?
If yes, would there be any performance penalties in doing so?

Regards,
Natu
