Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/22180#discussion_r212790276
--- Diff:
resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/config.scala
---
@@ -192,6 +192,12 @@ package object config
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/22180#discussion_r212161599
--- Diff:
resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala
---
@@ -368,7 +369,11 @@ private[spark] class
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/22180
cc @gatorsmile @vanzin
---
-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
GitHub user yaooqinn opened a pull request:
https://github.com/apache/spark/pull/22180
[SPARK-25174][YARN]Limit the size of the diagnostic message for the AM to
unregister itself from the RM
## What changes were proposed in this pull request?
When using older versions of spark
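The description above is truncated, but the mechanism the title names can be sketched as follows. This is a hedged Python illustration only: the function name, the 64 KB limit, and the truncation marker are assumptions, not Spark's actual configuration.

```python
def truncate_diagnostics(msg: str, limit: int = 64 * 1024) -> str:
    """Cap an over-long diagnostic message (e.g. a giant stack trace) so the
    AM can still unregister from the RM; keep a marker showing truncation."""
    marker = "... [truncated]"
    if len(msg) <= limit:
        return msg
    return msg[: limit - len(marker)] + marker
```

The key point is that the cap is applied before the unregister call, so an unbounded error message can never make the request itself fail.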
Github user yaooqinn closed the pull request at:
https://github.com/apache/spark/pull/15071
---
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19840
@vanzin I am not very familiar with the Python part
[context.py#L191](https://github.com/yaooqinn/spark/blob/8ff5663fe9a32eae79c8ee6bc310409170a8da64/python/pyspark/context.py#L191),
so handle
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/21290#discussion_r187851318
--- Diff:
core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala ---
@@ -76,6 +75,7 @@ private[deploy] class SparkSubmitArguments(args
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/21290
cc @srowen
The last code change seems to be related to you; please help review, thanks
---
GitHub user yaooqinn opened a pull request:
https://github.com/apache/spark/pull/21290
[SPARK-24241][Submit]Do not fail fast when dynamic resource allocation
enabled with 0 executor
## What changes were proposed in this pull request?
```
~/spark-2.3.0-bin-hadoop2.7$ bin
```
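The check the PR title describes can be sketched in Python. This is an illustrative sketch, not Spark's actual validation code; the function name and error message are assumptions.

```python
def validate_num_executors(num_executors: int, dynamic_allocation: bool) -> None:
    # With dynamic allocation enabled, 0 initial executors is legal:
    # executors are requested on demand at runtime, so the submit path
    # should not fail fast on a zero count.
    if num_executors <= 0 and not dynamic_allocation:
        raise ValueError("Number of executors must be a positive number")
```

With static allocation the old fail-fast behavior is kept; only the dynamic-allocation case is relaxed.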
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/20784#discussion_r177610275
--- Diff:
sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIDriver.scala
---
@@ -121,6 +123,11 @@ private[hive
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/20784#discussion_r177610190
--- Diff:
sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIDriver.scala
---
@@ -121,6 +123,11 @@ private[hive
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/20784
retest this please
---
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/20784
retest this please
---
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/20784#discussion_r177288126
--- Diff:
sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIDriver.scala
---
@@ -121,6 +134,25 @@ private[hive
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/20898
Proxy or not, I only found such an issue with a proxy:
https://github.com/apache/spark/pull/20784
---
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/18666
@liufengdb yes, you are right. What's more, Hive's `SessionState` is too
heavyweight for Spark to create a Hive client; we may only need an
`IMetaStoreClient
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/18666
@liufengdb it's necessary to create these for the `add jar` cmd
---
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/18666
retest this please
---
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/18666
@gatorsmile would you please take a look at this?
This PR mainly wants to close HiveSessionState explicitly to delete
`hive.downloaded.resources.dir`, which points to `"${system:java.io.t
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/18666
@samartinucci thanks for the reminder; I have fixed the conflicts.
---
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/20784
yarn @vanzin
---
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/20784
cc @cloud-fan
---
GitHub user yaooqinn opened a pull request:
https://github.com/apache/spark/pull/20784
[SPARK-23639][SQL]Obtain token before init metastore client in SparkSQL CLI
## What changes were proposed in this pull request?
In SparkSQLCLI, SessionState generates before SparkContext
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/20571
yes, a single dash means the options that follow are all Maven options; they
will be handled later by the mvn command. We have not parsed each individual
option in this while loop from then till now
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/20571
OK, I will push a commit very soon
---
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/20571
sorry for not replying in time. The logic here is that, first, unrecognized
`--` options show a warning message and usage information; second, the `-`
options end the while loop and treat the current
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/20571#discussion_r167440294
--- Diff: dev/make-distribution.sh ---
@@ -72,8 +76,15 @@ while (( "$#" )); do
--help)
exit_
GitHub user yaooqinn opened a pull request:
https://github.com/apache/spark/pull/20571
[SPARK-23383][Build][Minor]Make a distribution should exit with usage while
detecting wrong options
## What changes were proposed in this pull request?
```shell
./dev/make-distribution.sh
```
GitHub user yaooqinn opened a pull request:
https://github.com/apache/spark/pull/20469
[SPARK-23295][Build][Minor]Exclude warning messages when generating versions
in make-distribution.sh
## What changes were proposed in this pull request?
When we specified a wrong profile
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/20422
@squito added a test for the index file. Please check it again, thanks.
---
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/20422
retest this please
---
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/20422
Thanks, everyone, for reviewing. Yes, this is just a minor improvement; the
code here seemed not very logical to me when I was trying to do some
optimizations for a customer's heavy shuffle case
GitHub user yaooqinn opened a pull request:
https://github.com/apache/spark/pull/20422
[SPARK-23253][Core][Shuffle]Only write shuffle temporary index file when
there is not an existing one
## What changes were proposed in this pull request?
Shuffle index temporary file
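The write-once behavior the PR title describes can be sketched in Python. This is a simplified illustration, not Spark's `IndexShuffleBlockResolver` code; the text-based index format here is an assumption for readability.

```python
import os
import tempfile

def write_index_if_absent(index_path: str, offsets: list) -> bool:
    """Only create the temporary index file when no committed index exists;
    returns True if a new index was written."""
    if os.path.exists(index_path):
        return False  # a prior attempt already committed an index: skip I/O
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(index_path) or ".")
    with os.fdopen(fd, "w") as f:
        f.write("\n".join(str(o) for o in offsets))
    os.replace(tmp, index_path)  # atomic publish of the finished file
    return True
```

Checking for an existing index first avoids creating and then discarding a temporary file on every re-attempt of the same map task.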
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/18983
Besides, redirecting output here is only needed when there is an instance
of CliSessionState; otherwise, it is done while the SessionState is
initialized in HiveClientImpl
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/20145
retest this please
---
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/20145#discussion_r159587396
--- Diff:
sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveUtilsSuite.scala ---
@@ -42,4 +47,41 @@ class HiveUtilsSuite extends QueryTest
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/20145#discussion_r159585279
--- Diff:
sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveUtilsSuite.scala ---
@@ -42,4 +47,29 @@ class HiveUtilsSuite extends QueryTest
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/20145#discussion_r159585110
--- Diff:
sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveUtilsSuite.scala ---
@@ -42,4 +47,29 @@ class HiveUtilsSuite extends QueryTest
GitHub user yaooqinn opened a pull request:
https://github.com/apache/spark/pull/20145
[SPARK-22950]Handle ChildFirstURLClassLoader's parent
## What changes were proposed in this pull request?
ChildFirstURLClassLoader's parent is set to null, so we can't get jars from
its
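The null-parent problem can be illustrated with a toy loader chain in Python. `UrlLoader` and `collect_jars` are stand-ins invented for this sketch, not real JVM classloader APIs.

```python
class UrlLoader:
    """Toy stand-in for a URLClassLoader: a list of jar URLs plus a parent."""
    def __init__(self, urls, parent=None):
        self.urls, self.parent = urls, parent

def collect_jars(loader):
    # Walk up the loader chain; a child-first loader may deliberately report
    # parent=None, so the walk must tolerate a missing parent rather than
    # fail or silently return nothing.
    jars = []
    while loader is not None:
        jars.extend(loader.urls)
        loader = loader.parent
    return jars
```

The fix the PR title suggests is of this shape: code that enumerates jars must handle the child-first loader's special parent rather than assume a normal delegation chain.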
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19840
@vanzin
according to @ueshin 's explanation, `PYSPARK_DRIVER_PYTHON` is only for
driver, if executor follows the order of
[SparkSubmitCommandBuilder.java#L304](https://github.com/apache/spark
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19840
@ueshin
[context.py#L191](https://github.com/yaooqinn/spark/blob/8ff5663fe9a32eae79c8ee6bc310409170a8da64/python/pyspark/context.py#L191)
set for both driver and executor
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19840
@ueshin i see.
---
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19840
I can see `spark.executorEnv.PYSPARK_PYTHON` in `sparkConf` on the executor
side, because it is set at
[context.py#L156](https://github.com/yaooqinn/spark/blob/8ff5663fe9a32eae79c8ee6bc310409170a8da64
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19840
@ueshin case 8 should be client deploy mode; excuse the copy mistake,
fixed
---
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19840
use spark-2.2.0-bin-hadoop2.7 numpy
examples/src/main/python/mllib/correlations_example.py
### case 1
|key|value|
|---|---|
|**PYSPARK_DRIVER_PYTHON**|~/anaconda3/envs/py3/bin
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19840
@vanzin PYSPARK_DRIVER_PYTHON won't work because
[context.py#L191](https://github.com/yaooqinn/spark/blob/8ff5663fe9a32eae79c8ee6bc310409170a8da64/python/pyspark/context.py#L191)
doesn't deal
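The precedence being debated in this thread can be sketched in Python. The ordering shown is an assumption for illustration only, not Spark's documented resolution order.

```python
def choose_python_exec(spark_conf: dict, env: dict) -> str:
    # Assumed precedence for illustration: the per-app executor setting wins,
    # then the inherited environment variable, then a plain "python" fallback.
    return (spark_conf.get("spark.executorEnv.PYSPARK_PYTHON")
            or env.get("PYSPARK_PYTHON")
            or "python")
```

The bug class discussed above arises when one side (driver vs. executor) resolves this chain differently from the other, so the two ends run different Python binaries.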
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19840
Yes, you are right; we should use the same Python executables. But **same**
might mean binary-identical, not just the same path
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19840
https://user-images.githubusercontent.com/8326978/33471349-e570953e-d6a7-11e7-9fec-74963efe37d2.png
@jerryshao ENVs are specified OK by YARN, but the `pythonExec` is genera
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19840
I guess specifying `PYSPARK_PYTHON=~/anaconda3/envs/py3/bin/python`
overwrites spark.executorEnv.PYSPARK_PYTHON via
[context.py#L156](https://github.com/yaooqinn/spark/blob
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19840
@ueshin cluster mode is working, client mode is not
---
GitHub user yaooqinn opened a pull request:
https://github.com/apache/spark/pull/19840
[SPARK-22640][PYSPARK][YARN]switch python exec in executor side
## What changes were proposed in this pull request?
```
PYSPARK_PYTHON=~/anaconda3/envs/py3/bin/python \
bin/spark
```
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19719
LGTM
---
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/19719#discussion_r150368818
--- Diff:
sql/hive-thriftserver/src/test/scala/org/apache/spark/sql/hive/thriftserver/HiveThriftServer2Suites.scala
---
@@ -521,7 +521,20 @@ class
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/19719#discussion_r150368744
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveUtils.scala
---
@@ -66,6 +66,12 @@ private[spark] object HiveUtils extends Logging
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/19719#discussion_r150368620
--- Diff:
sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLEnv.scala
---
@@ -55,6 +55,7 @@ private[hive] object
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/19719#discussion_r150368111
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveUtils.scala
---
@@ -66,6 +66,12 @@ private[spark] object HiveUtils extends Logging
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/19719#discussion_r150366200
--- Diff:
sql/hive-thriftserver/src/test/scala/org/apache/spark/sql/hive/thriftserver/HiveThriftServer2Suites.scala
---
@@ -521,7 +521,20 @@ class
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19719
Making it an AlternativeConfig may be better
---
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/19719#discussion_r150364666
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveUtils.scala
---
@@ -66,6 +66,12 @@ private[spark] object HiveUtils extends Logging
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/19719#discussion_r150364544
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveUtils.scala
---
@@ -66,6 +66,12 @@ private[spark] object HiveUtils extends Logging
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19712
@cloud-fan referenced
---
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/19712#discussion_r150161651
--- Diff:
sql/hive-thriftserver/src/test/scala/org/apache/spark/sql/hive/thriftserver/HiveThriftServer2Suites.scala
---
@@ -521,20 +521,7 @@ class
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/19712#discussion_r150161137
--- Diff:
sql/hive-thriftserver/src/test/scala/org/apache/spark/sql/hive/thriftserver/HiveThriftServer2Suites.scala
---
@@ -521,20 +521,7 @@ class
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19712
cc again @gatorsmile. Would you mind adding me to the Jenkins whitelist?
Thanks, hoping not to bother you
GitHub user yaooqinn opened a pull request:
https://github.com/apache/spark/pull/19712
[SPARK-22487][SQL][Hive]Remove the unused HIVE_EXECUTION_VERSION property
## What changes were proposed in this pull request?
Actually, there is no Hive client for executions in Spark now
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/19663#discussion_r149849250
--- Diff:
resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala
---
@@ -687,6 +687,20 @@ private[spark] class Client
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19688
thanks @HyukjinKwon. fixed.
---
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19688
I am not familiar with DOS cmd; please review again @jiangxb1987 @srowen, thanks
---
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19688
@jerryshao PR description added. I notice that this is small, but they seem
to be different issues.
---
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/19663#discussion_r149561925
--- Diff:
resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala
---
@@ -687,6 +687,20 @@ private[spark] class Client
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/19663#discussion_r149561877
--- Diff:
resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala
---
@@ -687,6 +687,20 @@ private[spark] class Client
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/19663#discussion_r149561888
--- Diff:
resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala
---
@@ -687,6 +687,20 @@ private[spark] class Client
GitHub user yaooqinn opened a pull request:
https://github.com/apache/spark/pull/19688
[SPARK-22466][Spark Submit]export SPARK_CONF_DIR while conf is default
## What changes were proposed in this pull request?
### Before
```
Kent@KentsMacBookPro ~/Documents
```
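The default the PR title refers to can be sketched in Python. The function name is an invention for this sketch; only the `SPARK_CONF_DIR` / `$SPARK_HOME/conf` convention comes from the PR title.

```python
import os

def resolve_conf_dir(env: dict, spark_home: str) -> str:
    # When the user has not set SPARK_CONF_DIR, fall back to the default
    # $SPARK_HOME/conf; the change additionally exports that resolved value
    # so child processes observe the same directory.
    return env.get("SPARK_CONF_DIR") or os.path.join(spark_home, "conf")
```

Exporting the resolved default matters because scripts launched later cannot otherwise tell which conf directory the parent launcher actually used.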
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/19663#discussion_r149272398
--- Diff:
resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala
---
@@ -705,6 +705,19 @@ private[spark] class Client
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/19663#discussion_r149017871
--- Diff:
resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala
---
@@ -705,6 +705,19 @@ private[spark] class Client
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/19663#discussion_r149016913
--- Diff:
resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala
---
@@ -705,6 +705,19 @@ private[spark] class Client
GitHub user yaooqinn opened a pull request:
https://github.com/apache/spark/pull/19663
[SPARK-21888][Hive]add hadoop/hive/hdfs configuration files in
SPARK_CONF_DIR to distribute archive
## What changes were proposed in this pull request?
When I ran self-contained SQL apps
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/18666#discussion_r147642263
--- Diff:
sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala
---
@@ -201,6 +201,16 @@ private[hive] class HiveClientImpl
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19363
Hi @cloud-fan, Jenkins is OK
---
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19363
retest this please
---
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19363
cc @viirya @cloud-fan @gatorsmile
---
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/19363#discussion_r143904084
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/KeyValueGroupedDataset.scala ---
@@ -564,4 +565,30 @@ class KeyValueGroupedDataset[K, V] private
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19363
OK, I will add a JIRA and fix RelationalGroupedDataset together
---
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19363
cc again @cloud-fan @gatorsmile
---
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/19363#discussion_r141509523
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/KeyValueGroupedDataset.scala ---
@@ -54,6 +55,14 @@ class KeyValueGroupedDataset[K, V] private[sql
GitHub user yaooqinn opened a pull request:
https://github.com/apache/spark/pull/19363
[Minor]Override toString of KeyValueGroupedDataset
## What changes were proposed in this pull request?
before
```scala
scala> val words = spark.read.textFile("fR
```
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19068
retest this please
---
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19068
@cloud-fan I met a linkage error before, and now I have simplified the
logic; could you trigger Jenkins before reverting
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19273
OK to me; more tests are needed on #18648
---
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/19068#discussion_r138625628
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveUtils.scala
---
@@ -232,6 +232,54 @@ private[spark] object HiveUtils extends Logging
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/19068#discussion_r138625678
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveUtils.scala
---
@@ -232,6 +232,54 @@ private[spark] object HiveUtils extends Logging
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19068
jenkins unreachable cc @cloud-fan
---
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/19068#discussion_r138557791
--- Diff:
sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala
---
@@ -132,43 +134,26 @@ private[hive] class HiveClientImpl
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/19068#discussion_r138564306
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveUtils.scala
---
@@ -231,6 +231,42 @@ private[spark] object HiveUtils extends Logging
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/19068#discussion_r138557772
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveUtils.scala
---
@@ -231,6 +231,42 @@ private[spark] object HiveUtils extends Logging
Github user yaooqinn commented on a diff in the pull request:
https://github.com/apache/spark/pull/19068#discussion_r138556873
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveUtils.scala
---
@@ -231,6 +231,42 @@ private[spark] object HiveUtils extends Logging
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19068
@cloud-fan The cliSessionState is meant to be reused but is discarded,
because the isolated Hive client classloader couldn't get it through
`SessionState.get()`, so the Hive client will generate a `SessionState
Github user yaooqinn closed the pull request at:
https://github.com/apache/spark/pull/17387
---
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19068
cc @cloud-fan @jiangxb1987 @gatorsmile pr title and descriptions updated
---
Github user yaooqinn commented on the issue:
https://github.com/apache/spark/pull/19068
@dilipbiswal This is because the CliSessionState instance initialized in
SparkSQLCLIDriver points to a dummy metastore and is reused later in the
Hive metastore client