[jira] [Commented] (SPARK-9280) New HiveContext object unexpectedly loads configuration settings from history

2015-10-12 Thread Tien-Dung LE (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-9280?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14953158#comment-14953158
 ] 

Tien-Dung LE commented on SPARK-9280:
-

[~davies] I have just checked out the latest Spark master code. SQLContext 
(instead of HiveContext) now returns the correct number of partitions. Many 
thanks for this fix.

> New HiveContext object unexpectedly loads configuration settings from history 
> --
>
> Key: SPARK-9280
> URL: https://issues.apache.org/jira/browse/SPARK-9280
> Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>Affects Versions: 1.3.1, 1.4.1
>Reporter: Tien-Dung LE
>
> In a Spark session, stopping the Spark context and creating a new Spark 
> context and Hive context does not clean the Spark SQL configuration. More 
> precisely, the new Hive context still keeps the previous configuration 
> settings. It would be great if someone could let us know how to avoid this 
> situation.
> {code:title=New hive context should not load the configurations from history}
> sqlContext.setConf("spark.sql.shuffle.partitions", "10")
> sc.stop()
> val sparkConf2 = new org.apache.spark.SparkConf()
> val sc2 = new org.apache.spark.SparkContext(sparkConf2)
> val sqlContext2 = new org.apache.spark.sql.hive.HiveContext(sc2)
> sqlContext2.getConf("spark.sql.shuffle.partitions", "20")
> // got 20 as expected
> sqlContext2.setConf("foo", "foo")
> sqlContext2.getConf("spark.sql.shuffle.partitions", "30")
> // expected 30 but got 10
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-9280) New HiveContext object unexpectedly loads configuration settings from history

2015-10-08 Thread Davies Liu (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-9280?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14949706#comment-14949706
 ] 

Davies Liu commented on SPARK-9280:
---

[~tien-dung.le] I checked this with the latest master (using SQLContext, 
because I can't create multiple active HiveContexts with Derby); it works as 
expected. Could you help to double-check?




[jira] [Commented] (SPARK-9280) New HiveContext object unexpectedly loads configuration settings from history

2015-07-28 Thread Michael Armbrust (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-9280?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14644824#comment-14644824
 ] 

Michael Armbrust commented on SPARK-9280:
-

Oh, hmm... I see what the issue is, that does seem like something is messed up 
with session initialization.




[jira] [Commented] (SPARK-9280) New HiveContext object unexpectedly loads configuration settings from history

2015-07-28 Thread Pierre Borckmans (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-9280?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14644063#comment-14644063
 ] 

Pierre Borckmans commented on SPARK-9280:
-

[~marmbrus] Is it intended behaviour that we need to set a configuration 
(even a dummy one like sqlContext2.setConf("foo", "foo")) before we can 
actually read one of the predefined configurations like 
spark.sql.shuffle.partitions?

Shouldn't the lazy val storing these configs be initialized on the first 
sqlContext.getConf call?
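
The stale-state pattern being discussed can be sketched in plain Scala. This is a hypothetical toy, NOT Spark's actual implementation: a config lazily bound to a thread-local "session" that survives across contexts, plus a detachSession analogue. (It models only the leaked-setting part of the report, not the getConf-before-setConf behaviour.)

```scala
import scala.collection.mutable

object ToySessionState {
  private val tl = new ThreadLocal[mutable.Map[String, String]]
  def get(): mutable.Map[String, String] = {
    if (tl.get() == null) tl.set(mutable.Map.empty) // attach a session on demand
    tl.get()
  }
  def detachSession(): Unit = tl.remove()           // analogue of the Hive workaround
}

class ToyHiveContext {
  // Lazily binds to whatever session state is attached to the current thread.
  private lazy val conf: mutable.Map[String, String] = ToySessionState.get()
  def setConf(k: String, v: String): Unit = conf(k) = v
  def getConf(k: String, default: String): String = conf.getOrElse(k, default)
}

val ctx1 = new ToyHiveContext
ctx1.setConf("spark.sql.shuffle.partitions", "10")

val ctx2 = new ToyHiveContext // a "new" context on the same thread
println(ctx2.getConf("spark.sql.shuffle.partitions", "30")) // "10": stale state leaked

ToySessionState.detachSession() // detaching first gives a fresh session
val ctx3 = new ToyHiveContext
println(ctx3.getConf("spark.sql.shuffle.partitions", "30")) // "30": default, as expected
```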




[jira] [Commented] (SPARK-9280) New HiveContext object unexpectedly loads configuration settings from history

2015-07-27 Thread Tien-Dung LE (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-9280?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14642840#comment-14642840
 ] 

Tien-Dung LE commented on SPARK-9280:
-

Thanks [~pborck]. Detaching the Hive session state via 
org.apache.hadoop.hive.ql.session.SessionState.detachSession() helps to avoid 
the issue.
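
For reference, applying this workaround to the repro from the issue description would look roughly like the sketch below (assumes a Spark shell with Hive support on the classpath; not verified here):

```scala
// Workaround sketch: detach the leftover Hive session before building the
// new contexts, so the new HiveContext starts from a clean configuration.
sc.stop()
org.apache.hadoop.hive.ql.session.SessionState.detachSession()

val sparkConf2 = new org.apache.spark.SparkConf()
val sc2 = new org.apache.spark.SparkContext(sparkConf2)
val sqlContext2 = new org.apache.spark.sql.hive.HiveContext(sc2)
sqlContext2.getConf("spark.sql.shuffle.partitions", "30") // default, no stale "10"
```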
