[
https://issues.apache.org/jira/browse/SPARK-43865?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17728234#comment-17728234
]
pin_zhang commented on SPARK-43865:
---
It's not convenient to upload the jar to all worker nodes.
[
https://issues.apache.org/jira/browse/SPARK-43865?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
pin_zhang updated SPARK-43865:
--
Description:
1. Test with JDK 11 + Spark 3.4.0
object BugHS {
def main(args: Array[String]): Unit = {
pin_zhang created SPARK-43865:
-
Summary: spark cluster deploy mode cannot initialize metastore
java.sql.SQLException: No suitable driver found for jdbc:mysql
Key: SPARK-43865
URL: https://issues.apache.org/jira/browse/SPARK-43865
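The BugHS snippet above is cut off by the digest; the following is a minimal sketch of the kind of driver program the description appears to describe, assuming a SparkSession with Hive support and a MySQL-backed metastore (only the object name BugHS comes from the ticket; everything else is illustrative):

import org.apache.spark.sql.SparkSession

object BugHS {
  def main(args: Array[String]): Unit = {
    // Hive support needs the metastore JDBC driver (e.g. mysql-connector-java) on the
    // driver classpath; in cluster deploy mode the driver runs on a worker node, which
    // is where the "No suitable driver found for jdbc:mysql" error would be raised.
    val spark = SparkSession.builder()
      .appName("BugHS")
      .enableHiveSupport()
      .getOrCreate()

    spark.sql("show databases").show()
    spark.stop()
  }
}

spark-submit --jars and spark.driver.extraClassPath are the usual knobs for supplying the connector; the comment above notes that pre-placing the jar on every worker node is inconvenient.
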
[
https://issues.apache.org/jira/browse/SPARK-41168?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
pin_zhang updated SPARK-41168:
--
Description:
Spark master + 1 Spark worker (one hits "No space left on device")
1. Submit an app with 2 instances
[
https://issues.apache.org/jira/browse/SPARK-41168?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
pin_zhang updated SPARK-41168:
--
Description:
Spark master + 1 Spark worker (one is )
Submit an app with 2 instances,
1. Spark worker S
[
https://issues.apache.org/jira/browse/SPARK-41168?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
pin_zhang updated SPARK-41168:
--
Description:
1. Spark worker
Caused by a large number of executors kept in Spark master memory
2022-11
pin_zhang created SPARK-41168:
-
Summary: Spark Master OOM when Worker No space left on device
Key: SPARK-41168
URL: https://issues.apache.org/jira/browse/SPARK-41168
Project: Spark
Issue Type: Bug
[
https://issues.apache.org/jira/browse/SPARK-33946?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
pin_zhang updated SPARK-33946:
--
Description:
Test with Hive settings:
* hive.server2.idle.session.timeout=6
* hive.server2.session.
pin_zhang created SPARK-33946:
-
Summary: Cannot connect to spark hive after session timeout
Key: SPARK-33946
URL: https://issues.apache.org/jira/browse/SPARK-33946
Project: Spark
Issue Type: Bug
[
https://issues.apache.org/jira/browse/SPARK-25804?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17070938#comment-17070938
]
pin_zhang commented on SPARK-25804:
---
Any comments on this issue?
> JDOPersistenceMana
[
https://issues.apache.org/jira/browse/SPARK-29423?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16949187#comment-16949187
]
pin_zhang commented on SPARK-29423:
---
The same result on Spark 2.4.3.
> leak on org.a
pin_zhang created SPARK-29423:
-
Summary: leak on
org.apache.spark.sql.execution.streaming.StreamingQueryListenerBus
Key: SPARK-29423
URL: https://issues.apache.org/jira/browse/SPARK-29423
Project: Spark
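The ticket title points at the internal StreamingQueryListenerBus; below is a sketch of the public API that feeds it, registering and removing a StreamingQueryListener, assuming this is the code path involved (the repro itself is not visible in the digest):

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.streaming.StreamingQueryListener
import org.apache.spark.sql.streaming.StreamingQueryListener._

object ListenerSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("ListenerSketch").master("local[2]").getOrCreate()

    val listener = new StreamingQueryListener {
      override def onQueryStarted(event: QueryStartedEvent): Unit = ()
      override def onQueryProgress(event: QueryProgressEvent): Unit = ()
      override def onQueryTerminated(event: QueryTerminatedEvent): Unit = ()
    }

    // Listeners registered here are held by the session's listener bus; anything
    // not removed stays referenced for the lifetime of the SparkContext.
    spark.streams.addListener(listener)
    spark.streams.removeListener(listener)
    spark.stop()
  }
}
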
[
https://issues.apache.org/jira/browse/SPARK-21067?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16863796#comment-16863796
]
pin_zhang commented on SPARK-21067:
---
We also encountered this issue. Any plan to fix this?
[
https://issues.apache.org/jira/browse/SPARK-27600?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
pin_zhang reopened SPARK-27600:
---
The issue is not resolved
> Unable to start Spark Hive Thrift Server when multiple hive servers share the same metastore
[
https://issues.apache.org/jira/browse/SPARK-27600?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16835437#comment-16835437
]
pin_zhang edited comment on SPARK-27600 at 5/8/19 9:02 AM:
---
[~
[
https://issues.apache.org/jira/browse/SPARK-27600?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16835437#comment-16835437
]
pin_zhang commented on SPARK-27600:
---
I think this is related to a Hive bug
https://iss
[
https://issues.apache.org/jira/browse/SPARK-27553?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16833044#comment-16833044
]
pin_zhang commented on SPARK-27553:
---
The operation log is not closed when closing the session.
pin_zhang created SPARK-27600:
-
Summary: Unable to start Spark Hive Thrift Server when multiple
hive servers share the same metastore
Key: SPARK-27600
URL: https://issues.apache.org/jira/browse/SPARK-27600
pin_zhang created SPARK-27553:
-
Summary: Operation log is not closed when closing a session
Key: SPARK-27553
URL: https://issues.apache.org/jira/browse/SPARK-27553
Project: Spark
Issue Type: Bug
[
https://issues.apache.org/jira/browse/SPARK-25804?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
pin_zhang updated SPARK-25804:
--
Description:
1. start-thriftserver.sh under Spark 2.3.1
2. Create a table and insert values
create
pin_zhang created SPARK-25804:
-
Summary: JDOPersistenceManager leak when query via JDBC
Key: SPARK-25804
URL: https://issues.apache.org/jira/browse/SPARK-25804
Project: Spark
Issue Type: Bug
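The numbered steps above stop at the CREATE statement; below is a sketch of the kind of repeated JDBC querying against the Thrift server that would exercise a per-query metastore-object leak, assuming the Hive JDBC driver on the classpath (the host, port, table name test_leak, and loop count are illustrative, not from the ticket):

import java.sql.DriverManager

object JdbcQueryLoop {
  def main(args: Array[String]): Unit = {
    val url = "jdbc:hive2://localhost:10000/default"
    // Open a connection, run one query, and close it, many times over; the ticket
    // reports JDOPersistenceManager instances accumulating on the server side.
    for (_ <- 1 to 1000) {
      val conn = DriverManager.getConnection(url, "user", "")
      try {
        val rs = conn.createStatement().executeQuery("select count(*) from test_leak")
        while (rs.next()) rs.getLong(1)
      } finally {
        conn.close()
      }
    }
  }
}
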
[
https://issues.apache.org/jira/browse/SPARK-25169?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
pin_zhang updated SPARK-25169:
--
Component/s: (was: Spark Core)
SQL
> Multiple DataFrames cannot write to the same
pin_zhang created SPARK-25169:
-
Summary: Multiple DataFrames cannot write to the same folder
concurrently
Key: SPARK-25169
URL: https://issues.apache.org/jira/browse/SPARK-25169
Project: Spark
I
pin_zhang created SPARK-24749:
-
Summary: Cannot filter array with named_struct
Key: SPARK-24749
URL: https://issues.apache.org/jira/browse/SPARK-24749
Project: Spark
Issue Type: Bug
Com
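The entry above is only a title; here is a sketch of the kind of query it names, filtering an array column against a named_struct literal, on an illustrative temp view (table, column, and field names are assumptions):

import org.apache.spark.sql.SparkSession

object NamedStructFilter {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("NamedStructFilter").master("local[2]").getOrCreate()
    import spark.implicits._

    // An array-of-struct column; tuple fields become struct fields _1 and _2.
    Seq((1, Seq((1, "a"), (2, "b")))).toDF("id", "arr").createOrReplaceTempView("t")

    // Filter the array against a named_struct literal; the ticket title says this
    // kind of predicate cannot be used in the affected versions.
    spark.sql(
      "select * from t where array_contains(arr, named_struct('_1', 1, '_2', 'a'))"
    ).show()

    spark.stop()
  }
}
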
[
https://issues.apache.org/jira/browse/SPARK-23371?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16373862#comment-16373862
]
pin_zhang commented on SPARK-23371:
---
# It's Spark that bundles two versions (1.6 and
pin_zhang created SPARK-23371:
-
Summary: Parquet Footer data is wrong on Windows in Parquet format
partitioned table
Key: SPARK-23371
URL: https://issues.apache.org/jira/browse/SPARK-23371
Project: Spark
[
https://issues.apache.org/jira/browse/SPARK-23086?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
pin_zhang updated SPARK-23086:
--
Description:
* Hive metastore is MySQL
* Set hive.server2.thrift.max.worker.threads=500
create tabl
pin_zhang created SPARK-23086:
-
Summary: Spark SQL cannot support high concurrency due to a lock in
HiveMetastoreCatalog
Key: SPARK-23086
URL: https://issues.apache.org/jira/browse/SPARK-23086
Project: Spark
pin_zhang created SPARK-22420:
-
Summary: Spark SQL returns an invalid JSON string for a struct with a
date/datetime field
Key: SPARK-22420
URL: https://issues.apache.org/jira/browse/SPARK-22420
Project: Spark
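Again only the title survives; below is a sketch of one way to turn a struct with date/timestamp fields into a JSON string, using to_json, assuming (but not knowing from the digest) that this is the conversion the ticket is about:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{current_date, current_timestamp, struct, to_json}

object StructToJson {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("StructToJson").master("local[2]").getOrCreate()
    import spark.implicits._

    // Render a struct containing date and timestamp fields as a JSON string.
    Seq(1).toDF("id")
      .select(to_json(struct(current_date().as("d"), current_timestamp().as("ts"))).as("js"))
      .show(false)

    spark.stop()
  }
}
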
[
https://issues.apache.org/jira/browse/SPARK-21437?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16091100#comment-16091100
]
pin_zhang commented on SPARK-21437:
---
Hive doesn't have such a limitation; we can create
pin_zhang created SPARK-21437:
-
Summary: Java Keyword cannot be used in table schema
Key: SPARK-21437
URL: https://issues.apache.org/jira/browse/SPARK-21437
Project: Spark
Issue Type: Bug
pin_zhang created SPARK-21105:
-
Summary: Useless empty files in hive table
Key: SPARK-21105
URL: https://issues.apache.org/jira/browse/SPARK-21105
Project: Spark
Issue Type: Improvement
pin_zhang created SPARK-18536:
-
Summary: Failed to save to hive table when case class with empty
field
Key: SPARK-18536
URL: https://issues.apache.org/jira/browse/SPARK-18536
Project: Spark
Issu
[
https://issues.apache.org/jira/browse/SPARK-17398?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
pin_zhang closed SPARK-17398.
-
Resolution: Fixed
Fix Version/s: 2.0.1
> Failed to query on external JSon Partitioned table
>
[
https://issues.apache.org/jira/browse/SPARK-17932?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
pin_zhang updated SPARK-17932:
--
Description:
SQL "show table extended like table_name " doesn't work in spark 2.0.0
that works in spar
pin_zhang created SPARK-17932:
-
Summary: Failed to run SQL "show table extended like table_name"
in Spark 2.0.0
Key: SPARK-17932
URL: https://issues.apache.org/jira/browse/SPARK-17932
Project: Spark
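A sketch of the statement named in the title, issued through a SparkSession (the table-name pattern is illustrative); in the release named above the ticket says this fails:

import org.apache.spark.sql.SparkSession

object ShowTableExtended {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("ShowTableExtended").master("local[2]").getOrCreate()

    // The SHOW TABLE EXTENDED form reported as broken in the 2.0.0 entry above.
    spark.sql("show table extended like 'table_name'").show(false)

    spark.stop()
  }
}
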
[
https://issues.apache.org/jira/browse/SPARK-12008?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15482828#comment-15482828
]
pin_zhang commented on SPARK-12008:
---
Does Spark SQL have any plan to support authorization?
[
https://issues.apache.org/jira/browse/SPARK-17396?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15466180#comment-15466180
]
pin_zhang commented on SPARK-17396:
---
"Thread-1902" daemon prio=6 tid=0x14078800
[
https://issues.apache.org/jira/browse/SPARK-17396?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15464602#comment-15464602
]
pin_zhang commented on SPARK-17396:
---
1. Thousands of threads are created that look like
ForkJoinPool
pin_zhang created SPARK-17398:
-
Summary: Failed to query on external JSon Partitioned table
Key: SPARK-17398
URL: https://issues.apache.org/jira/browse/SPARK-17398
Project: Spark
Issue Type: Bug
pin_zhang created SPARK-17396:
-
Summary: Thread count keeps increasing when querying an external CSV
partitioned table
Key: SPARK-17396
URL: https://issues.apache.org/jira/browse/SPARK-17396
Project: Spark
pin_zhang created SPARK-17395:
-
Summary: Queries on CSV partition table result in frequent GC
Key: SPARK-17395
URL: https://issues.apache.org/jira/browse/SPARK-17395
Project: Spark
Issue Type: B
[
https://issues.apache.org/jira/browse/SPARK-9686?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15358916#comment-15358916
]
pin_zhang commented on SPARK-9686:
--
Any plan to fix this bug?
> Spark Thrift server does
pin_zhang created SPARK-12262:
-
Summary: describe extended doesn't return detail info on table
stored as PARQUET format
Key: SPARK-12262
URL: https://issues.apache.org/jira/browse/SPARK-12262
Proje
[
https://issues.apache.org/jira/browse/SPARK-10290?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
pin_zhang closed SPARK-10290.
-
not a bug
> Spark can register temp table and hive table with the same table name
>
[
https://issues.apache.org/jira/browse/SPARK-12008?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15035159#comment-15035159
]
pin_zhang commented on SPARK-12008:
---
Any comments?
> Spark hive security authorization
pin_zhang created SPARK-12008:
-
Summary: Spark hive security authorization doesn't work as Apache
hive's
Key: SPARK-12008
URL: https://issues.apache.org/jira/browse/SPARK-12008
Project: Spark
Is
[
https://issues.apache.org/jira/browse/SPARK-11748?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15013030#comment-15013030
]
pin_zhang commented on SPARK-11748:
---
Apache Hive 0.14 has added support for Parquet column rename
pin_zhang created SPARK-11748:
-
Summary: Result is null after alter column name of table stored as
Parquet
Key: SPARK-11748
URL: https://issues.apache.org/jira/browse/SPARK-11748
Project: Spark
pin_zhang created SPARK-10290:
-
Summary: Spark can register temp table and hive table with the
same table name
Key: SPARK-10290
URL: https://issues.apache.org/jira/browse/SPARK-10290
Project: Spark
[
https://issues.apache.org/jira/browse/SPARK-9686?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14704374#comment-14704374
]
pin_zhang commented on SPARK-9686:
--
What's the status of this bug? Will it be fixed in 1.
[
https://issues.apache.org/jira/browse/SPARK-9686?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
pin_zhang updated SPARK-9686:
-
Summary: Spark hive jdbc client cannot get table from metadata store (was:
Spark hive jdbc client cannot
pin_zhang created SPARK-9686:
Summary: Spark hive jdbc client cannot get table from metadata
Key: SPARK-9686
URL: https://issues.apache.org/jira/browse/SPARK-9686
Project: Spark
Issue Type: Bug
pin_zhang created SPARK-7480:
Summary: Get exception when DataFrame saveAsTable and run sql on
the same table at the same time
Key: SPARK-7480
URL: https://issues.apache.org/jira/browse/SPARK-7480
Project
[
https://issues.apache.org/jira/browse/SPARK-6923?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
pin_zhang updated SPARK-6923:
-
Comment: was deleted
(was: Hi, Cheng Hao
Thanks for your reply!
Do you mean if we provide a wrapper for
[
https://issues.apache.org/jira/browse/SPARK-6923?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14521278#comment-14521278
]
pin_zhang commented on SPARK-6923:
--
Hi, Cheng Hao
Thanks for your reply!
Do you mea
[
https://issues.apache.org/jira/browse/SPARK-6923?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14521280#comment-14521280
]
pin_zhang commented on SPARK-6923:
--
Hi, Cheng Hao
Thanks for your reply!
Do you mea
[
https://issues.apache.org/jira/browse/SPARK-6923?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14521277#comment-14521277
]
pin_zhang commented on SPARK-6923:
--
Hi, Cheng Hao
Thanks for your reply!
Do you mea
[
https://issues.apache.org/jira/browse/SPARK-6923?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14510381#comment-14510381
]
pin_zhang edited comment on SPARK-6923 at 4/27/15 9:43 AM:
---
Hi,
[
https://issues.apache.org/jira/browse/SPARK-6923?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14510381#comment-14510381
]
pin_zhang commented on SPARK-6923:
--
Hi, Michael
Can this CLI bug be fixed in Spark 1.3?
[
https://issues.apache.org/jira/browse/SPARK-6923?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14507182#comment-14507182
]
pin_zhang commented on SPARK-6923:
--
Hi, Michael
Can you help to comment? We have such a u
[
https://issues.apache.org/jira/browse/SPARK-6923?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14504409#comment-14504409
]
pin_zhang commented on SPARK-6923:
--
Hi, Michael
We run a Spark app on Spark 1.3, and use
[
https://issues.apache.org/jira/browse/SPARK-6923?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14502748#comment-14502748
]
pin_zhang commented on SPARK-6923:
--
Hi, Michael
Can you help to comment on this issue?
>
[
https://issues.apache.org/jira/browse/SPARK-6923?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14499141#comment-14499141
]
pin_zhang commented on SPARK-6923:
--
Do you mean if we save a DataFrame to a table that uses
[
https://issues.apache.org/jira/browse/SPARK-6923?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14497701#comment-14497701
]
pin_zhang commented on SPARK-6923:
--
In Spark 1.1.0, the client uses the JDBC API to get the table
pin_zhang created SPARK-6923:
Summary: Get invalid Hive table columns after saving a DataFrame to a
Hive table
Key: SPARK-6923
URL: https://issues.apache.org/jira/browse/SPARK-6923
Project: Spark
Issu
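The SPARK-6923 thread above is against the Spark 1.x HiveContext API; the following sketch restates the scenario in today's API for illustration only: save a DataFrame as a Hive table, then inspect the columns the catalog reports (the ticket reports that a Hive JDBC client then gets invalid columns):

import org.apache.spark.sql.SparkSession

object SaveAsTableSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("SaveAsTableSketch")
      .enableHiveSupport()
      .getOrCreate()
    import spark.implicits._

    // Save a DataFrame as a Hive table, then print the schema the catalog reports.
    Seq((1, "a"), (2, "b")).toDF("id", "name")
      .write.mode("overwrite").saveAsTable("t_6923")

    spark.table("t_6923").printSchema()
    spark.stop()
  }
}
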