[GitHub] [spark] shahidki31 commented on a change in pull request #26241: [SPARK-29585][WEBUI] Fix stagePage duration
shahidki31 commented on a change in pull request #26241: [SPARK-29585][WEBUI] Fix stagePage duration URL: https://github.com/apache/spark/pull/26241#discussion_r339914769

## File path: core/src/main/scala/org/apache/spark/ui/jobs/StageTable.scala

@@ -417,19 +417,8 @@ private[ui] class StageDataSource(
       case Some(t) => UIUtils.formatDate(t)
       case None => "Unknown"
     }
-    val finishTime = stageData.completionTime.map(_.getTime()).getOrElse(currentTime)
-
-    // The submission time for a stage is misleading because it counts the time
-    // the stage waits to be launched. (SPARK-10930)
-    val duration = stageData.firstTaskLaunchedTime.map { date =>
-      val time = date.getTime()
-      if (finishTime > time) {
-        finishTime - time
-      } else {
-        None
-        currentTime - time
-      }
-    }
+
+    val duration = Some(stageData.executorRunTime)

Review comment: Yes. `completed time - launched time = executorRunTime + ser/deser time + gettingResultTime`, right? Also, I think these three columns are already displayed in the task table, right? If you really want, you can add a new column named `totalTime` that displays the sum of the three. But I am still not sure it will match the stage duration, as a stage's duration may include other components.

This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
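The identity the reviewer states can be checked with a small numeric sketch (illustrative Python, not Spark code; the metric names and millisecond values are invented for the example):

```python
# Hypothetical per-task metrics, in milliseconds. The names mirror the
# discussion, not an actual Spark API.
executor_run_time = 900
ser_deser_time = 60        # serialization + deserialization time
getting_result_time = 40

launched_time = 1_000
completed_time = 2_000

# Wall-clock duration of the task.
wall_clock = completed_time - launched_time

# The reviewer's proposed `totalTime`: the sum of the three metrics.
total_time = executor_run_time + ser_deser_time + getting_result_time

# The identity only holds when the task spends no time on anything else
# (scheduler delay, shuffle wait, ...), which is why the reviewer doubts
# the sum will always match the stage duration.
print(wall_clock == total_time)  # True for these made-up numbers
```

With real tasks the two sides can diverge, which is the crux of the disagreement in this thread.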
[GitHub] [spark] AmplabJenkins commented on issue #26280: [SPARK-14922][SPARK-17732][SPARK-23866][SQL] Support partition filter in ALTER TABLE DROP PARTITION and batch dropping PARTITIONS
AmplabJenkins commented on issue #26280: [SPARK-14922][SPARK-17732][SPARK-23866][SQL] Support partition filter in ALTER TABLE DROP PARTITION and batch dropping PARTITIONS URL: https://github.com/apache/spark/pull/26280#issuecomment-547281415 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder-K8s/17765/ Test PASSed.
[GitHub] [spark] AmplabJenkins commented on issue #26280: [SPARK-14922][SPARK-17732][SPARK-23866][SQL] Support partition filter in ALTER TABLE DROP PARTITION and batch dropping PARTITIONS
AmplabJenkins commented on issue #26280: [SPARK-14922][SPARK-17732][SPARK-23866][SQL] Support partition filter in ALTER TABLE DROP PARTITION and batch dropping PARTITIONS URL: https://github.com/apache/spark/pull/26280#issuecomment-547281408 Merged build finished. Test PASSed.
[GitHub] [spark] AmplabJenkins removed a comment on issue #26280: [SPARK-14922][SPARK-17732][SPARK-23866][SQL] Support partition filter in ALTER TABLE DROP PARTITION and batch dropping PARTITIONS
AmplabJenkins removed a comment on issue #26280: [SPARK-14922][SPARK-17732][SPARK-23866][SQL] Support partition filter in ALTER TABLE DROP PARTITION and batch dropping PARTITIONS URL: https://github.com/apache/spark/pull/26280#issuecomment-546862339 Can one of the admins verify this patch?
[GitHub] [spark] SparkQA commented on issue #26290: [SPARK-29120][SQL][TESTS] Port create_view.sql
SparkQA commented on issue #26290: [SPARK-29120][SQL][TESTS] Port create_view.sql URL: https://github.com/apache/spark/pull/26290#issuecomment-547281049 **[Test build #112829 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/112829/testReport)** for PR 26290 at commit [`fce4819`](https://github.com/apache/spark/commit/fce481975c2bce0ab484276a5505cf47de4fa8b2).
[GitHub] [spark] SparkQA commented on issue #26280: [SPARK-14922][SPARK-17732][SPARK-23866][SQL] Support partition filter in ALTER TABLE DROP PARTITION and batch dropping PARTITIONS
SparkQA commented on issue #26280: [SPARK-14922][SPARK-17732][SPARK-23866][SQL] Support partition filter in ALTER TABLE DROP PARTITION and batch dropping PARTITIONS URL: https://github.com/apache/spark/pull/26280#issuecomment-547281050 **[Test build #112830 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/112830/testReport)** for PR 26280 at commit [`ce453ed`](https://github.com/apache/spark/commit/ce453ed06150a4b93136ae9f43e1273a406a6d9e).
[GitHub] [spark] SparkQA commented on issue #26291: [SPARK-29629][SQL] Support typed integer literal expression
SparkQA commented on issue #26291: [SPARK-29629][SQL] Support typed integer literal expression URL: https://github.com/apache/spark/pull/26291#issuecomment-547281068 **[Test build #112828 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/112828/testReport)** for PR 26291 at commit [`b453a2e`](https://github.com/apache/spark/commit/b453a2e5b6b3f3f25937a97cfa59ca1c42c032ce).
[GitHub] [spark] maropu commented on issue #26280: [SPARK-14922][SPARK-17732][SPARK-23866][SQL] Support partition filter in ALTER TABLE DROP PARTITION and batch dropping PARTITIONS
maropu commented on issue #26280: [SPARK-14922][SPARK-17732][SPARK-23866][SQL] Support partition filter in ALTER TABLE DROP PARTITION and batch dropping PARTITIONS URL: https://github.com/apache/spark/pull/26280#issuecomment-547280817 ok to test
[GitHub] [spark] AmplabJenkins commented on issue #26290: [WIP][SPARK-29120][SQL][TESTS] Port create_view.sql
AmplabJenkins commented on issue #26290: [WIP][SPARK-29120][SQL][TESTS] Port create_view.sql URL: https://github.com/apache/spark/pull/26290#issuecomment-547279243 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder-K8s/17764/ Test PASSed.
[GitHub] [spark] AmplabJenkins commented on issue #26291: [SPARK-29629][SQL] Support typed integer literal expression
AmplabJenkins commented on issue #26291: [SPARK-29629][SQL] Support typed integer literal expression URL: https://github.com/apache/spark/pull/26291#issuecomment-547279267 Merged build finished. Test PASSed.
[GitHub] [spark] AmplabJenkins commented on issue #26291: [SPARK-29629][SQL] Support typed integer literal expression
AmplabJenkins commented on issue #26291: [SPARK-29629][SQL] Support typed integer literal expression URL: https://github.com/apache/spark/pull/26291#issuecomment-547279279 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder-K8s/17763/ Test PASSed.
[GitHub] [spark] AmplabJenkins commented on issue #26290: [WIP][SPARK-29120][SQL][TESTS] Port create_view.sql
AmplabJenkins commented on issue #26290: [WIP][SPARK-29120][SQL][TESTS] Port create_view.sql URL: https://github.com/apache/spark/pull/26290#issuecomment-547279237 Merged build finished. Test PASSed.
[GitHub] [spark] yaooqinn opened a new pull request #26291: [SPARK-29629][SQL] Support typed integer literal expression
yaooqinn opened a new pull request #26291: [SPARK-29629][SQL] Support typed integer literal expression URL: https://github.com/apache/spark/pull/26291

### What changes were proposed in this pull request?

```
postgres=# select date '2001-09-28' + integer '7';
  ?column?
------------
 2001-10-05
(1 row)

postgres=# select integer '7';
 int4
------
    7
(1 row)
```

Add support for the typed integer literal expression from PostgreSQL.

### Why are the changes needed?
SPARK-27764 Feature Parity between PostgreSQL and Spark

### Does this PR introduce any user-facing change?
Support for typed integer literals in SQL.

### How was this patch tested?
Added unit tests.
[GitHub] [spark] teeyog commented on a change in pull request #26241: [SPARK-29585][WEBUI] Fix stagePage duration
teeyog commented on a change in pull request #26241: [SPARK-29585][WEBUI] Fix stagePage duration URL: https://github.com/apache/spark/pull/26241#discussion_r339911079

## File path: core/src/main/scala/org/apache/spark/ui/jobs/StageTable.scala

@@ -417,19 +417,8 @@ private[ui] class StageDataSource(
       case Some(t) => UIUtils.formatDate(t)
       case None => "Unknown"
     }
-    val finishTime = stageData.completionTime.map(_.getTime()).getOrElse(currentTime)
-
-    // The submission time for a stage is misleading because it counts the time
-    // the stage waits to be launched. (SPARK-10930)
-    val duration = stageData.firstTaskLaunchedTime.map { date =>
-      val time = date.getTime()
-      if (finishTime > time) {
-        finishTime - time
-      } else {
-        None
-        currentTime - time
-      }
-    }
+
+    val duration = Some(stageData.executorRunTime)

Review comment: @shahidki31 I think the task duration should be calculated as `completed time - launched time` rather than from the executorRunTime. That way, when a stage has only one task, the stage duration and the task duration are the same.
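The stage-duration logic removed by the diff under discussion can be paraphrased in Python (a loose sketch of the Scala code in StageTable.scala, with shortened names; not Spark code):

```python
def stage_duration(first_task_launched_time, completion_time, current_time):
    """Duration measured from first task launch (per SPARK-10930, the
    submission time is misleading because it includes scheduling wait).
    A finished stage uses its completion time; a still-running stage
    falls back to the current time. Returns None if no task launched."""
    if first_task_launched_time is None:
        return None
    finish = completion_time if completion_time is not None else current_time
    if finish > first_task_launched_time:
        return finish - first_task_launched_time
    # Defensive fallback mirroring the original else-branch.
    return current_time - first_task_launched_time

# Finished stage: launched at t=100, completed at t=400.
print(stage_duration(100, 400, current_time=500))   # 300
# Running stage: no completion time yet, measured against "now".
print(stage_duration(100, None, current_time=500))  # 400
```

This is the `completed time - launched time` measure teeyog argues for, as opposed to the `executorRunTime` the diff substitutes.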
[GitHub] [spark] turboFei edited a comment on issue #26086: [SPARK-29302] Make the file name of a task for dynamic partition overwrite be unique
turboFei edited a comment on issue #26086: [SPARK-29302] Make the file name of a task for dynamic partition overwrite be unique URL: https://github.com/apache/spark/pull/26086#issuecomment-547277051

> Is this different to #24142?

@viirya Thanks for your reply. There is a small difference. I have read the comments on #24142, and it seems there is a risk that, for a non-FileOutputCommitter, if a task fails and also fails to clean up its output, Spark would produce duplicate results. In this PR, I only name a task file with the taskId and attemptId for dynamic partition overwrite; because dynamic partition overwrite keeps a `filesToMove` map, there is no risk of duplicate results. PS: I think the non-FileOutputCommitter case is rarely encountered.
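The naming scheme described above (file names made unique per task attempt so a retried task cannot collide with a failed attempt's leftovers) might look like this sketch; the exact format string is an assumption for illustration, not Spark's actual implementation:

```python
def task_file_name(task_id: int, attempt_id: int, ext: str = "parquet") -> str:
    # Embedding both the task id and the attempt id makes the name unique
    # across retries of the same task, so a failed attempt's leftover file
    # can never be confused with, or overwritten by, the successful
    # attempt's output. (Hypothetical format for illustration.)
    return f"part-{task_id:05d}-attempt-{attempt_id}.{ext}"

first_try = task_file_name(42, 0)
retry = task_file_name(42, 1)
print(first_try)            # part-00042-attempt-0.parquet
print(retry)                # part-00042-attempt-1.parquet
print(first_try != retry)   # True: no collision between attempts
```

Because dynamic partition overwrite tracks exactly which files to move into place, only the committed attempt's file is ever published, regardless of what a failed attempt left behind.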
[GitHub] [spark] viirya commented on issue #26087: [SPARK-29427][SQL] Create KeyValueGroupedDataset from existing columns in DataFrame
viirya commented on issue #26087: [SPARK-29427][SQL] Create KeyValueGroupedDataset from existing columns in DataFrame URL: https://github.com/apache/spark/pull/26087#issuecomment-547275254 thanks @HyukjinKwon
[GitHub] [spark] viirya closed pull request #26265: [SPARK-29565][ML][PYTHON] OneHotEncoder should support single-column input/output
viirya closed pull request #26265: [SPARK-29565][ML][PYTHON] OneHotEncoder should support single-column input/output URL: https://github.com/apache/spark/pull/26265
[GitHub] [spark] viirya commented on issue #26265: [SPARK-29565][ML][PYTHON] OneHotEncoder should support single-column input/output
viirya commented on issue #26265: [SPARK-29565][ML][PYTHON] OneHotEncoder should support single-column input/output URL: https://github.com/apache/spark/pull/26265#issuecomment-547274264 Thanks! Merging to master.
[GitHub] [spark] maropu commented on a change in pull request #26290: [WIP][SPARK-29120][SQL][TESTS] Port create_view.sql
maropu commented on a change in pull request #26290: [WIP][SPARK-29120][SQL][TESTS] Port create_view.sql URL: https://github.com/apache/spark/pull/26290#discussion_r339906396 ## File path: sql/core/src/test/resources/sql-tests/inputs/postgreSQL/create_view.sql ## @@ -0,0 +1,793 @@ +-- Portions Copyright (c) 1996-2019, PostgreSQL Global Development Group +-- +-- CREATE VIEW +-- https://github.com/postgres/postgres/blob/REL_12_STABLE/src/test/regress/sql/create_view.sql + +-- Spark doesn't support geometric types +-- CREATE VIEW street AS +--SELECT r.name, r.thepath, c.cname AS cname +--FROM ONLY road r, real_city c +--WHERE c.outline ## r.thepath; + +-- Spark doesn't support geometric types +-- CREATE VIEW iexit AS +--SELECT ih.name, ih.thepath, +-- interpt_pp(ih.thepath, r.thepath) AS exit +--FROM ihighway ih, ramp r +--WHERE ih.thepath ## r.thepath; + +CREATE TABLE emp ( + name string, + age int, + -- Spark doesn't support a geometric type `point` + -- location point + salary int, + manager string +) USING parquet; + +CREATE VIEW toyemp AS + SELECT name, age, /* location ,*/ 12*salary AS annualsal + FROM emp; + +-- Spark doesn't support the COMMENT clause that is not defined in the SQL standard +-- Test comments +-- COMMENT ON VIEW noview IS 'no view'; +-- COMMENT ON VIEW toyemp IS 'is a view'; +-- COMMENT ON VIEW toyemp IS NULL; + +DROP VIEW toyemp; + +-- These views are left around mainly to exercise special cases in pg_dump. 
+ +-- [SPARK-19842] Informational Referential Integrity Constraints Support in Spark +-- CREATE TABLE view_base_table (key int PRIMARY KEY, data varchar(20)); +-- +-- CREATE VIEW key_dependent_view AS +--SELECT * FROM view_base_table GROUP BY key; +-- +-- ALTER TABLE view_base_table DROP CONSTRAINT view_base_table_pkey; -- fails + +-- CREATE VIEW key_dependent_view_no_cols AS +--SELECT FROM view_base_table GROUP BY key HAVING length(data) > 0; + +-- +-- CREATE OR REPLACE VIEW +-- + +CREATE TABLE viewtest_tbl (a int, b int) using parquet; +-- [SPARK-29386] Copy data between a file and a table +-- COPY viewtest_tbl FROM stdin; +-- 5 10 +-- 10 15 +-- 15 20 +-- 20 25 +-- \. +INSERT INTO viewtest_tbl VALUES (5, 10), (10, 15), (15, 20), (20, 25); + +CREATE OR REPLACE VIEW viewtest AS + SELECT * FROM viewtest_tbl; + +CREATE OR REPLACE VIEW viewtest AS + SELECT * FROM viewtest_tbl WHERE a > 10; + +SELECT * FROM viewtest; + +CREATE OR REPLACE VIEW viewtest AS + SELECT a, b FROM viewtest_tbl WHERE a > 5 ORDER BY b DESC; + +SELECT * FROM viewtest; + +-- should fail +-- Spark can accept the DDL query below +CREATE OR REPLACE VIEW viewtest AS + SELECT a FROM viewtest_tbl WHERE a <> 20; + +-- should fail +-- Spark can accept the DDL query below +CREATE OR REPLACE VIEW viewtest AS + SELECT 1, * FROM viewtest_tbl; + +-- should fail +-- Spark can accept the DDL query below +CREATE OR REPLACE VIEW viewtest AS + SELECT a, decimal(b) FROM viewtest_tbl; + +-- should work +CREATE OR REPLACE VIEW viewtest AS + SELECT a, b, 0 AS c FROM viewtest_tbl; + +DROP VIEW viewtest; +DROP TABLE viewtest_tbl; + +-- tests for temporary views + +-- Spark doesn't support the cascaded syntax below in `CREATE SCHEMA` +-- CREATE SCHEMA temp_view_test +-- CREATE TABLE base_table (a int, id int) using parquet +-- CREATE TABLE base_table2 (a int, id int) using parquet; +CREATE SCHEMA temp_view_test; +CREATE TABLE temp_view_test.base_table (a int, id int) using parquet; +CREATE TABLE temp_view_test.base_table2 
(a int, id int) using parquet; + +-- Replace SET with USE +-- SET search_path TO temp_view_test, public; +USE temp_view_test; + +-- Since Spark doesn't support CREATE TEMPORARY TABLE, we used CREATE TEMPORARY VIEW instead +-- CREATE TEMPORARY TABLE temp_table (a int, id int); +CREATE TEMPORARY VIEW temp_table AS SELECT * FROM VALUES + (1, 1) as temp_table(a, id); + +-- should be created in temp_view_test schema +CREATE VIEW v1 AS SELECT * FROM base_table; +DESC TABLE EXTENDED v1; +-- should be created in temp object schema +-- [SPARK-X] Forcibly create a temporary view in CREATE VIEW if referencing a temporary view +CREATE VIEW v1_temp AS SELECT * FROM temp_table; Review comment: https://issues.apache.org/jira/browse/SPARK-29628
[GitHub] [spark] maropu commented on a change in pull request #26290: [WIP][SPARK-29120][SQL][TESTS] Port create_view.sql
maropu commented on a change in pull request #26290: [WIP][SPARK-29120][SQL][TESTS] Port create_view.sql URL: https://github.com/apache/spark/pull/26290#discussion_r339906306 ## File path: sql/core/src/test/resources/sql-tests/inputs/postgreSQL/create_view.sql ## @@ -0,0 +1,793 @@ +-- Portions Copyright (c) 1996-2019, PostgreSQL Global Development Group +-- +-- CREATE VIEW +-- https://github.com/postgres/postgres/blob/REL_12_STABLE/src/test/regress/sql/create_view.sql + +-- Spark doesn't support geometric types +-- CREATE VIEW street AS +--SELECT r.name, r.thepath, c.cname AS cname +--FROM ONLY road r, real_city c +--WHERE c.outline ## r.thepath; + +-- Spark doesn't support geometric types +-- CREATE VIEW iexit AS +--SELECT ih.name, ih.thepath, +-- interpt_pp(ih.thepath, r.thepath) AS exit +--FROM ihighway ih, ramp r +--WHERE ih.thepath ## r.thepath; + +CREATE TABLE emp ( + name string, + age int, + -- Spark doesn't support a geometric type `point` + -- location point + salary int, + manager string +) USING parquet; + +CREATE VIEW toyemp AS + SELECT name, age, /* location ,*/ 12*salary AS annualsal + FROM emp; + +-- Spark doesn't support the COMMENT clause that is not defined in the SQL standard +-- Test comments +-- COMMENT ON VIEW noview IS 'no view'; +-- COMMENT ON VIEW toyemp IS 'is a view'; +-- COMMENT ON VIEW toyemp IS NULL; + +DROP VIEW toyemp; + +-- These views are left around mainly to exercise special cases in pg_dump. 
+ +-- [SPARK-19842] Informational Referential Integrity Constraints Support in Spark +-- CREATE TABLE view_base_table (key int PRIMARY KEY, data varchar(20)); +-- +-- CREATE VIEW key_dependent_view AS +--SELECT * FROM view_base_table GROUP BY key; +-- +-- ALTER TABLE view_base_table DROP CONSTRAINT view_base_table_pkey; -- fails + +-- CREATE VIEW key_dependent_view_no_cols AS +--SELECT FROM view_base_table GROUP BY key HAVING length(data) > 0; + +-- +-- CREATE OR REPLACE VIEW +-- + +CREATE TABLE viewtest_tbl (a int, b int) using parquet; +-- [SPARK-29386] Copy data between a file and a table +-- COPY viewtest_tbl FROM stdin; +-- 5 10 +-- 10 15 +-- 15 20 +-- 20 25 +-- \. +INSERT INTO viewtest_tbl VALUES (5, 10), (10, 15), (15, 20), (20, 25); + +CREATE OR REPLACE VIEW viewtest AS + SELECT * FROM viewtest_tbl; + +CREATE OR REPLACE VIEW viewtest AS + SELECT * FROM viewtest_tbl WHERE a > 10; + +SELECT * FROM viewtest; + +CREATE OR REPLACE VIEW viewtest AS + SELECT a, b FROM viewtest_tbl WHERE a > 5 ORDER BY b DESC; + +SELECT * FROM viewtest; + +-- should fail +-- Spark can accept the DDL query below +CREATE OR REPLACE VIEW viewtest AS + SELECT a FROM viewtest_tbl WHERE a <> 20; + +-- should fail +-- Spark can accept the DDL query below +CREATE OR REPLACE VIEW viewtest AS + SELECT 1, * FROM viewtest_tbl; + +-- should fail +-- Spark can accept the DDL query below +CREATE OR REPLACE VIEW viewtest AS + SELECT a, decimal(b) FROM viewtest_tbl; + +-- should work +CREATE OR REPLACE VIEW viewtest AS + SELECT a, b, 0 AS c FROM viewtest_tbl; + +DROP VIEW viewtest; +DROP TABLE viewtest_tbl; + +-- tests for temporary views + +-- Spark doesn't support the cascaded syntax below in `CREATE SCHEMA` +-- CREATE SCHEMA temp_view_test +-- CREATE TABLE base_table (a int, id int) using parquet +-- CREATE TABLE base_table2 (a int, id int) using parquet; +CREATE SCHEMA temp_view_test; +CREATE TABLE temp_view_test.base_table (a int, id int) using parquet; +CREATE TABLE temp_view_test.base_table2 
(a int, id int) using parquet; + +-- Replace SET with USE +-- SET search_path TO temp_view_test, public; +USE temp_view_test; + +-- Since Spark doesn't support CREATE TEMPORARY TABLE, we used CREATE TEMPORARY VIEW instead +-- CREATE TEMPORARY TABLE temp_table (a int, id int); +CREATE TEMPORARY VIEW temp_table AS SELECT * FROM VALUES + (1, 1) as temp_table(a, id); + +-- should be created in temp_view_test schema +CREATE VIEW v1 AS SELECT * FROM base_table; +DESC TABLE EXTENDED v1; +-- should be created in temp object schema +-- [SPARK-X] Forcibly create a temporary view in CREATE VIEW if referencing a temporary view +CREATE VIEW v1_temp AS SELECT * FROM temp_table; +-- should be created in temp object schema +CREATE TEMPORARY VIEW v2_temp AS SELECT * FROM base_table; +DESC TABLE EXTENDED v2_temp; +-- should be created in temp_views schema +CREATE VIEW temp_view_test.v2 AS SELECT * FROM base_table; +DESC TABLE EXTENDED temp_view_test.v2; +-- should fail +-- [SPARK-X] Forcibly create a temporary view in CREATE VIEW if referencing a temporary view +CREATE VIEW temp_view_test.v3_temp AS SELECT * FROM temp_table; +-- should fail +-- Spark doesn't support the cascaded syntax below in `CREATE SCHEMA` +-- CREATE SCHEMA test_view_schema +-- CREATE TEMP VIEW testview AS SELECT 1; + +-- joins: if any of the join relations are temporary, the view +-- should also be temporary + +-- should
[GitHub] [spark] AmplabJenkins commented on issue #26177: [SPARK-29520][SS] Fix checks of negative intervals
AmplabJenkins commented on issue #26177: [SPARK-29520][SS] Fix checks of negative intervals URL: https://github.com/apache/spark/pull/26177#issuecomment-547271897 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder-K8s/17762/ Test PASSed.
[GitHub] [spark] AmplabJenkins removed a comment on issue #26261: [SPARK-29607][SQL] Move static methods from CalendarInterval to IntervalUtils
AmplabJenkins removed a comment on issue #26261: [SPARK-29607][SQL] Move static methods from CalendarInterval to IntervalUtils URL: https://github.com/apache/spark/pull/26261#issuecomment-547271850 Merged build finished. Test PASSed.
[GitHub] [spark] AmplabJenkins removed a comment on issue #26261: [SPARK-29607][SQL] Move static methods from CalendarInterval to IntervalUtils
AmplabJenkins removed a comment on issue #26261: [SPARK-29607][SQL] Move static methods from CalendarInterval to IntervalUtils URL: https://github.com/apache/spark/pull/26261#issuecomment-547271856 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder-K8s/17761/ Test PASSed.
[GitHub] [spark] AmplabJenkins commented on issue #26261: [SPARK-29607][SQL] Move static methods from CalendarInterval to IntervalUtils
AmplabJenkins commented on issue #26261: [SPARK-29607][SQL] Move static methods from CalendarInterval to IntervalUtils URL: https://github.com/apache/spark/pull/26261#issuecomment-547271856 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder-K8s/17761/ Test PASSed.
[GitHub] [spark] AmplabJenkins commented on issue #26261: [SPARK-29607][SQL] Move static methods from CalendarInterval to IntervalUtils
AmplabJenkins commented on issue #26261: [SPARK-29607][SQL] Move static methods from CalendarInterval to IntervalUtils URL: https://github.com/apache/spark/pull/26261#issuecomment-547271850 Merged build finished. Test PASSed.
[GitHub] [spark] AmplabJenkins commented on issue #26177: [SPARK-29520][SS] Fix checks of negative intervals
AmplabJenkins commented on issue #26177: [SPARK-29520][SS] Fix checks of negative intervals URL: https://github.com/apache/spark/pull/26177#issuecomment-547271892 Merged build finished. Test PASSed.
[GitHub] [spark] AmplabJenkins removed a comment on issue #26177: [SPARK-29520][SS] Fix checks of negative intervals
AmplabJenkins removed a comment on issue #26177: [SPARK-29520][SS] Fix checks of negative intervals URL: https://github.com/apache/spark/pull/26177#issuecomment-547271897 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder-K8s/17762/ Test PASSed.
[GitHub] [spark] AmplabJenkins removed a comment on issue #26177: [SPARK-29520][SS] Fix checks of negative intervals
AmplabJenkins removed a comment on issue #26177: [SPARK-29520][SS] Fix checks of negative intervals URL: https://github.com/apache/spark/pull/26177#issuecomment-547271892 Merged build finished. Test PASSed.
[GitHub] [spark] SparkQA commented on issue #26261: [SPARK-29607][SQL] Move static methods from CalendarInterval to IntervalUtils
SparkQA commented on issue #26261: [SPARK-29607][SQL] Move static methods from CalendarInterval to IntervalUtils URL: https://github.com/apache/spark/pull/26261#issuecomment-547271480 **[Test build #112826 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/112826/testReport)** for PR 26261 at commit [`38f7c78`](https://github.com/apache/spark/commit/38f7c78e51ed4a19a24aee6aca08fc58dd101f20).
[GitHub] [spark] SparkQA commented on issue #26177: [SPARK-29520][SS] Fix checks of negative intervals
SparkQA commented on issue #26177: [SPARK-29520][SS] Fix checks of negative intervals URL: https://github.com/apache/spark/pull/26177#issuecomment-547271483 **[Test build #112827 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/112827/testReport)** for PR 26177 at commit [`b212976`](https://github.com/apache/spark/commit/b21297666bb838c3f53193dc9583b60350c09fd8).
[GitHub] [spark] cloud-fan commented on issue #26261: [SPARK-29607][SQL] Move static methods from CalendarInterval to IntervalUtils
cloud-fan commented on issue #26261: [SPARK-29607][SQL] Move static methods from CalendarInterval to IntervalUtils URL: https://github.com/apache/spark/pull/26261#issuecomment-547271108 retest this please
[GitHub] [spark] cloud-fan commented on issue #26177: [SPARK-29520][SS] Fix checks of negative intervals
cloud-fan commented on issue #26177: [SPARK-29520][SS] Fix checks of negative intervals URL: https://github.com/apache/spark/pull/26177#issuecomment-547270604 retest this please
[GitHub] [spark] huaxingao commented on issue #26269: [SPARK-29612][SQL] ALTER TABLE (RECOVER PARTITIONS) should look up catalog/table like v2 commands
huaxingao commented on issue #26269: [SPARK-29612][SQL] ALTER TABLE (RECOVER PARTITIONS) should look up catalog/table like v2 commands URL: https://github.com/apache/spark/pull/26269#issuecomment-547269991 Thanks! @cloud-fan @viirya
[GitHub] [spark] cloud-fan commented on issue #26281: [SPARK-29618] remove V1_BATCH_WRITE table capability
cloud-fan commented on issue #26281: [SPARK-29618] remove V1_BATCH_WRITE table capability URL: https://github.com/apache/spark/pull/26281#issuecomment-547269808 I don't get your point. The only change for data source developers is that they no longer need to specify the `V1_BATCH_WRITE` table capability when they implement the v1 fallback write (`V1WriteBuilder`). This is a pure simplification for DS v2 implementations that need the v1 fallback write. How is it related to "Table creation through V2 in DataFrameWriter.save"?
[GitHub] [spark] AmplabJenkins removed a comment on issue #26265: [SPARK-29565][ML][PYTHON] OneHotEncoder should support single-column input/output
AmplabJenkins removed a comment on issue #26265: [SPARK-29565][ML][PYTHON] OneHotEncoder should support single-column input/output URL: https://github.com/apache/spark/pull/26265#issuecomment-547269282 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/112821/ Test PASSed.
[GitHub] [spark] AmplabJenkins removed a comment on issue #26265: [SPARK-29565][ML][PYTHON] OneHotEncoder should support single-column input/output
AmplabJenkins removed a comment on issue #26265: [SPARK-29565][ML][PYTHON] OneHotEncoder should support single-column input/output URL: https://github.com/apache/spark/pull/26265#issuecomment-547269273 Merged build finished. Test PASSed.
[GitHub] [spark] AmplabJenkins commented on issue #26265: [SPARK-29565][ML][PYTHON] OneHotEncoder should support single-column input/output
AmplabJenkins commented on issue #26265: [SPARK-29565][ML][PYTHON] OneHotEncoder should support single-column input/output URL: https://github.com/apache/spark/pull/26265#issuecomment-547269282 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/112821/ Test PASSed.
[GitHub] [spark] AmplabJenkins commented on issue #26265: [SPARK-29565][ML][PYTHON] OneHotEncoder should support single-column input/output
AmplabJenkins commented on issue #26265: [SPARK-29565][ML][PYTHON] OneHotEncoder should support single-column input/output URL: https://github.com/apache/spark/pull/26265#issuecomment-547269273 Merged build finished. Test PASSed.
[GitHub] [spark] SparkQA removed a comment on issue #26265: [SPARK-29565][ML][PYTHON] OneHotEncoder should support single-column input/output
SparkQA removed a comment on issue #26265: [SPARK-29565][ML][PYTHON] OneHotEncoder should support single-column input/output URL: https://github.com/apache/spark/pull/26265#issuecomment-547254754 **[Test build #112821 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/112821/testReport)** for PR 26265 at commit [`9d5d6ff`](https://github.com/apache/spark/commit/9d5d6ff3c5e8794ba14f2af2b274ed166d8350d2).
[GitHub] [spark] SparkQA commented on issue #26265: [SPARK-29565][ML][PYTHON] OneHotEncoder should support single-column input/output
SparkQA commented on issue #26265: [SPARK-29565][ML][PYTHON] OneHotEncoder should support single-column input/output URL: https://github.com/apache/spark/pull/26265#issuecomment-547269089 **[Test build #112821 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/112821/testReport)** for PR 26265 at commit [`9d5d6ff`](https://github.com/apache/spark/commit/9d5d6ff3c5e8794ba14f2af2b274ed166d8350d2). * This patch passes all tests. * This patch merges cleanly. * This patch adds no public classes.
[GitHub] [spark] maropu commented on a change in pull request #26290: [WIP][SPARK-29120][SQL][TESTS] Port create_view.sql
maropu commented on a change in pull request #26290: [WIP][SPARK-29120][SQL][TESTS] Port create_view.sql URL: https://github.com/apache/spark/pull/26290#discussion_r339903249 ## File path: sql/core/src/test/resources/sql-tests/inputs/postgreSQL/create_view.sql ## @@ -0,0 +1,793 @@ +-- Portions Copyright (c) 1996-2019, PostgreSQL Global Development Group +-- +-- CREATE VIEW +-- https://github.com/postgres/postgres/blob/REL_12_STABLE/src/test/regress/sql/create_view.sql + +-- Spark doesn't support geometric types +-- CREATE VIEW street AS +--SELECT r.name, r.thepath, c.cname AS cname +--FROM ONLY road r, real_city c +--WHERE c.outline ## r.thepath; + +-- Spark doesn't support geometric types +-- CREATE VIEW iexit AS +--SELECT ih.name, ih.thepath, +-- interpt_pp(ih.thepath, r.thepath) AS exit +--FROM ihighway ih, ramp r +--WHERE ih.thepath ## r.thepath; + +CREATE TABLE emp ( + name string, + age int, + -- Spark doesn't support a geometric type `point` + -- location point + salary int, + manager string +) USING parquet; + +CREATE VIEW toyemp AS + SELECT name, age, /* location ,*/ 12*salary AS annualsal + FROM emp; + +-- Spark doesn't support the COMMENT clause that is not defined in the SQL standard +-- Test comments +-- COMMENT ON VIEW noview IS 'no view'; +-- COMMENT ON VIEW toyemp IS 'is a view'; +-- COMMENT ON VIEW toyemp IS NULL; + +DROP VIEW toyemp; + +-- These views are left around mainly to exercise special cases in pg_dump. 
+ +-- [SPARK-19842] Informational Referential Integrity Constraints Support in Spark +-- CREATE TABLE view_base_table (key int PRIMARY KEY, data varchar(20)); +-- +-- CREATE VIEW key_dependent_view AS +--SELECT * FROM view_base_table GROUP BY key; +-- +-- ALTER TABLE view_base_table DROP CONSTRAINT view_base_table_pkey; -- fails + +-- CREATE VIEW key_dependent_view_no_cols AS +--SELECT FROM view_base_table GROUP BY key HAVING length(data) > 0; + +-- +-- CREATE OR REPLACE VIEW +-- + +CREATE TABLE viewtest_tbl (a int, b int) using parquet; +-- [SPARK-29386] Copy data between a file and a table +-- COPY viewtest_tbl FROM stdin; +-- 5 10 +-- 10 15 +-- 15 20 +-- 20 25 +-- \. +INSERT INTO viewtest_tbl VALUES (5, 10), (10, 15), (15, 20), (20, 25); + +CREATE OR REPLACE VIEW viewtest AS + SELECT * FROM viewtest_tbl; + +CREATE OR REPLACE VIEW viewtest AS + SELECT * FROM viewtest_tbl WHERE a > 10; + +SELECT * FROM viewtest; + +CREATE OR REPLACE VIEW viewtest AS + SELECT a, b FROM viewtest_tbl WHERE a > 5 ORDER BY b DESC; + +SELECT * FROM viewtest; + +-- should fail +-- Spark can accept the DDL query below +CREATE OR REPLACE VIEW viewtest AS + SELECT a FROM viewtest_tbl WHERE a <> 20; + +-- should fail +-- Spark can accept the DDL query below +CREATE OR REPLACE VIEW viewtest AS + SELECT 1, * FROM viewtest_tbl; + +-- should fail +-- Spark can accept the DDL query below +CREATE OR REPLACE VIEW viewtest AS + SELECT a, decimal(b) FROM viewtest_tbl; + +-- should work +CREATE OR REPLACE VIEW viewtest AS + SELECT a, b, 0 AS c FROM viewtest_tbl; + +DROP VIEW viewtest; +DROP TABLE viewtest_tbl; + +-- tests for temporary views + +-- Spark doesn't support the cascaded syntax below in `CREATE SCHEMA` +-- CREATE SCHEMA temp_view_test +-- CREATE TABLE base_table (a int, id int) using parquet +-- CREATE TABLE base_table2 (a int, id int) using parquet; +CREATE SCHEMA temp_view_test; +CREATE TABLE temp_view_test.base_table (a int, id int) using parquet; +CREATE TABLE temp_view_test.base_table2 
(a int, id int) using parquet; + +-- Replace SET with USE +-- SET search_path TO temp_view_test, public; +USE temp_view_test; + +-- Since Spark doesn't support CREATE TEMPORARY TABLE, we used CREATE TEMPORARY VIEW instead +-- CREATE TEMPORARY TABLE temp_table (a int, id int); +CREATE TEMPORARY VIEW temp_table AS SELECT * FROM VALUES + (1, 1) as temp_table(a, id); + +-- should be created in temp_view_test schema +CREATE VIEW v1 AS SELECT * FROM base_table; +DESC TABLE EXTENDED v1; +-- should be created in temp object schema +-- [SPARK-X] Forcibly create a temporary view in CREATE VIEW if referencing a temporary view +CREATE VIEW v1_temp AS SELECT * FROM temp_table; +-- should be created in temp object schema +CREATE TEMPORARY VIEW v2_temp AS SELECT * FROM base_table; +DESC TABLE EXTENDED v2_temp; +-- should be created in temp_views schema +CREATE VIEW temp_view_test.v2 AS SELECT * FROM base_table; +DESC TABLE EXTENDED temp_view_test.v2; +-- should fail +-- [SPARK-X] Forcibly create a temporary view in CREATE VIEW if referencing a temporary view +CREATE VIEW temp_view_test.v3_temp AS SELECT * FROM temp_table; +-- should fail +-- Spark doesn't support the cascaded syntax below in `CREATE SCHEMA` +-- CREATE SCHEMA test_view_schema +-- CREATE TEMP VIEW testview AS SELECT 1; + +-- joins: if any of the join relations are temporary, the view +-- should also be temporary + +-- should
[GitHub] [spark] cloud-fan closed pull request #26269: [SPARK-29612][SQL] ALTER TABLE (RECOVER PARTITIONS) should look up catalog/table like v2 commands
cloud-fan closed pull request #26269: [SPARK-29612][SQL] ALTER TABLE (RECOVER PARTITIONS) should look up catalog/table like v2 commands URL: https://github.com/apache/spark/pull/26269
[GitHub] [spark] maropu commented on a change in pull request #26290: [WIP][SPARK-29120][SQL][TESTS] Port create_view.sql
maropu commented on a change in pull request #26290: [WIP][SPARK-29120][SQL][TESTS] Port create_view.sql URL: https://github.com/apache/spark/pull/26290#discussion_r339903229 ## File path: sql/core/src/test/resources/sql-tests/inputs/postgreSQL/create_view.sql ## @@ -0,0 +1,793 @@ +-- Portions Copyright (c) 1996-2019, PostgreSQL Global Development Group +-- +-- CREATE VIEW +-- https://github.com/postgres/postgres/blob/REL_12_STABLE/src/test/regress/sql/create_view.sql + +-- Spark doesn't support geometric types +-- CREATE VIEW street AS +--SELECT r.name, r.thepath, c.cname AS cname +--FROM ONLY road r, real_city c +--WHERE c.outline ## r.thepath; + +-- Spark doesn't support geometric types +-- CREATE VIEW iexit AS +--SELECT ih.name, ih.thepath, +-- interpt_pp(ih.thepath, r.thepath) AS exit +--FROM ihighway ih, ramp r +--WHERE ih.thepath ## r.thepath; + +CREATE TABLE emp ( + name string, + age int, + -- Spark doesn't support a geometric type `point` + -- location point + salary int, + manager string +) USING parquet; + +CREATE VIEW toyemp AS + SELECT name, age, /* location ,*/ 12*salary AS annualsal + FROM emp; + +-- Spark doesn't support the COMMENT clause that is not defined in the SQL standard +-- Test comments +-- COMMENT ON VIEW noview IS 'no view'; +-- COMMENT ON VIEW toyemp IS 'is a view'; +-- COMMENT ON VIEW toyemp IS NULL; + +DROP VIEW toyemp; + +-- These views are left around mainly to exercise special cases in pg_dump. 
+ +-- [SPARK-19842] Informational Referential Integrity Constraints Support in Spark +-- CREATE TABLE view_base_table (key int PRIMARY KEY, data varchar(20)); +-- +-- CREATE VIEW key_dependent_view AS +--SELECT * FROM view_base_table GROUP BY key; +-- +-- ALTER TABLE view_base_table DROP CONSTRAINT view_base_table_pkey; -- fails + +-- CREATE VIEW key_dependent_view_no_cols AS +--SELECT FROM view_base_table GROUP BY key HAVING length(data) > 0; + +-- +-- CREATE OR REPLACE VIEW +-- + +CREATE TABLE viewtest_tbl (a int, b int) using parquet; +-- [SPARK-29386] Copy data between a file and a table +-- COPY viewtest_tbl FROM stdin; +-- 5 10 +-- 10 15 +-- 15 20 +-- 20 25 +-- \. +INSERT INTO viewtest_tbl VALUES (5, 10), (10, 15), (15, 20), (20, 25); + +CREATE OR REPLACE VIEW viewtest AS + SELECT * FROM viewtest_tbl; + +CREATE OR REPLACE VIEW viewtest AS + SELECT * FROM viewtest_tbl WHERE a > 10; + +SELECT * FROM viewtest; + +CREATE OR REPLACE VIEW viewtest AS + SELECT a, b FROM viewtest_tbl WHERE a > 5 ORDER BY b DESC; + +SELECT * FROM viewtest; + +-- should fail +-- Spark can accept the DDL query below +CREATE OR REPLACE VIEW viewtest AS + SELECT a FROM viewtest_tbl WHERE a <> 20; + +-- should fail +-- Spark can accept the DDL query below +CREATE OR REPLACE VIEW viewtest AS + SELECT 1, * FROM viewtest_tbl; + +-- should fail +-- Spark can accept the DDL query below +CREATE OR REPLACE VIEW viewtest AS + SELECT a, decimal(b) FROM viewtest_tbl; + +-- should work +CREATE OR REPLACE VIEW viewtest AS + SELECT a, b, 0 AS c FROM viewtest_tbl; + +DROP VIEW viewtest; +DROP TABLE viewtest_tbl; + +-- tests for temporary views + +-- Spark doesn't support the cascaded syntax below in `CREATE SCHEMA` +-- CREATE SCHEMA temp_view_test +-- CREATE TABLE base_table (a int, id int) using parquet +-- CREATE TABLE base_table2 (a int, id int) using parquet; +CREATE SCHEMA temp_view_test; +CREATE TABLE temp_view_test.base_table (a int, id int) using parquet; +CREATE TABLE temp_view_test.base_table2 
(a int, id int) using parquet; + +-- Replace SET with USE +-- SET search_path TO temp_view_test, public; +USE temp_view_test; + +-- Since Spark doesn't support CREATE TEMPORARY TABLE, we used CREATE TEMPORARY VIEW instead +-- CREATE TEMPORARY TABLE temp_table (a int, id int); +CREATE TEMPORARY VIEW temp_table AS SELECT * FROM VALUES + (1, 1) as temp_table(a, id); + +-- should be created in temp_view_test schema +CREATE VIEW v1 AS SELECT * FROM base_table; +DESC TABLE EXTENDED v1; +-- should be created in temp object schema +-- [SPARK-X] Forcibly create a temporary view in CREATE VIEW if referencing a temporary view +CREATE VIEW v1_temp AS SELECT * FROM temp_table; +-- should be created in temp object schema +CREATE TEMPORARY VIEW v2_temp AS SELECT * FROM base_table; +DESC TABLE EXTENDED v2_temp; +-- should be created in temp_views schema +CREATE VIEW temp_view_test.v2 AS SELECT * FROM base_table; +DESC TABLE EXTENDED temp_view_test.v2; +-- should fail +-- [SPARK-X] Forcibly create a temporary view in CREATE VIEW if referencing a temporary view +CREATE VIEW temp_view_test.v3_temp AS SELECT * FROM temp_table; +-- should fail +-- Spark doesn't support the cascaded syntax below in `CREATE SCHEMA` +-- CREATE SCHEMA test_view_schema +-- CREATE TEMP VIEW testview AS SELECT 1; + +-- joins: if any of the join relations are temporary, the view +-- should also be temporary + +-- should
[GitHub] [spark] viirya commented on issue #26086: [SPARK-29302] Make the file name of a task for dynamic partition overwrite be unique
viirya commented on issue #26086: [SPARK-29302] Make the file name of a task for dynamic partition overwrite be unique URL: https://github.com/apache/spark/pull/26086#issuecomment-547268384 Is this different from #24142?
[GitHub] [spark] AmplabJenkins removed a comment on issue #26134: [SPARK-29486][SQL] CalendarInterval should have 3 fields: months, days and microseconds
AmplabJenkins removed a comment on issue #26134: [SPARK-29486][SQL] CalendarInterval should have 3 fields: months, days and microseconds URL: https://github.com/apache/spark/pull/26134#issuecomment-547267869 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/112814/ Test PASSed.
[GitHub] [spark] AmplabJenkins removed a comment on issue #26134: [SPARK-29486][SQL] CalendarInterval should have 3 fields: months, days and microseconds
AmplabJenkins removed a comment on issue #26134: [SPARK-29486][SQL] CalendarInterval should have 3 fields: months, days and microseconds URL: https://github.com/apache/spark/pull/26134#issuecomment-547267866 Merged build finished. Test PASSed.
[GitHub] [spark] AmplabJenkins commented on issue #26134: [SPARK-29486][SQL] CalendarInterval should have 3 fields: months, days and microseconds
AmplabJenkins commented on issue #26134: [SPARK-29486][SQL] CalendarInterval should have 3 fields: months, days and microseconds URL: https://github.com/apache/spark/pull/26134#issuecomment-547267869 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/112814/ Test PASSed.
[GitHub] [spark] AmplabJenkins commented on issue #26134: [SPARK-29486][SQL] CalendarInterval should have 3 fields: months, days and microseconds
AmplabJenkins commented on issue #26134: [SPARK-29486][SQL] CalendarInterval should have 3 fields: months, days and microseconds URL: https://github.com/apache/spark/pull/26134#issuecomment-547267866 Merged build finished. Test PASSed.
[GitHub] [spark] viirya closed pull request #26087: [SPARK-29427][SQL] Create KeyValueGroupedDataset from existing columns in DataFrame
viirya closed pull request #26087: [SPARK-29427][SQL] Create KeyValueGroupedDataset from existing columns in DataFrame URL: https://github.com/apache/spark/pull/26087
[GitHub] [spark] viirya commented on issue #26087: [SPARK-29427][SQL] Create KeyValueGroupedDataset from existing columns in DataFrame
viirya commented on issue #26087: [SPARK-29427][SQL] Create KeyValueGroupedDataset from existing columns in DataFrame URL: https://github.com/apache/spark/pull/26087#issuecomment-547267636 If so, let me close this for now.
[GitHub] [spark] maropu commented on a change in pull request #26290: [WIP][SPARK-29120][SQL][TESTS] Port create_view.sql
maropu commented on a change in pull request #26290: [WIP][SPARK-29120][SQL][TESTS] Port create_view.sql URL: https://github.com/apache/spark/pull/26290#discussion_r339902655 ## File path: sql/core/src/test/resources/sql-tests/inputs/postgreSQL/create_view.sql ## @@ -0,0 +1,793 @@ +-- Portions Copyright (c) 1996-2019, PostgreSQL Global Development Group +-- +-- CREATE VIEW +-- https://github.com/postgres/postgres/blob/REL_12_STABLE/src/test/regress/sql/create_view.sql + +-- Spark doesn't support geometric types +-- CREATE VIEW street AS +--SELECT r.name, r.thepath, c.cname AS cname +--FROM ONLY road r, real_city c +--WHERE c.outline ## r.thepath; + +-- Spark doesn't support geometric types +-- CREATE VIEW iexit AS +--SELECT ih.name, ih.thepath, +-- interpt_pp(ih.thepath, r.thepath) AS exit +--FROM ihighway ih, ramp r +--WHERE ih.thepath ## r.thepath; + +CREATE TABLE emp ( + name string, + age int, + -- Spark doesn't support a geometric type `point` + -- location point + salary int, + manager string +) USING parquet; + +CREATE VIEW toyemp AS + SELECT name, age, /* location ,*/ 12*salary AS annualsal + FROM emp; + +-- Spark doesn't support the COMMENT clause that is not defined in the SQL standard +-- Test comments +-- COMMENT ON VIEW noview IS 'no view'; +-- COMMENT ON VIEW toyemp IS 'is a view'; +-- COMMENT ON VIEW toyemp IS NULL; + +DROP VIEW toyemp; + +-- These views are left around mainly to exercise special cases in pg_dump. 
+ +-- [SPARK-19842] Informational Referential Integrity Constraints Support in Spark +-- CREATE TABLE view_base_table (key int PRIMARY KEY, data varchar(20)); +-- +-- CREATE VIEW key_dependent_view AS +--SELECT * FROM view_base_table GROUP BY key; +-- +-- ALTER TABLE view_base_table DROP CONSTRAINT view_base_table_pkey; -- fails + +-- CREATE VIEW key_dependent_view_no_cols AS +--SELECT FROM view_base_table GROUP BY key HAVING length(data) > 0; + +-- +-- CREATE OR REPLACE VIEW +-- + +CREATE TABLE viewtest_tbl (a int, b int) using parquet; +-- [SPARK-29386] Copy data between a file and a table +-- COPY viewtest_tbl FROM stdin; +-- 5 10 +-- 10 15 +-- 15 20 +-- 20 25 +-- \. +INSERT INTO viewtest_tbl VALUES (5, 10), (10, 15), (15, 20), (20, 25); + +CREATE OR REPLACE VIEW viewtest AS + SELECT * FROM viewtest_tbl; + +CREATE OR REPLACE VIEW viewtest AS + SELECT * FROM viewtest_tbl WHERE a > 10; + +SELECT * FROM viewtest; + +CREATE OR REPLACE VIEW viewtest AS + SELECT a, b FROM viewtest_tbl WHERE a > 5 ORDER BY b DESC; + +SELECT * FROM viewtest; + +-- should fail +-- Spark can accept the DDL query below +CREATE OR REPLACE VIEW viewtest AS + SELECT a FROM viewtest_tbl WHERE a <> 20; + +-- should fail +-- Spark can accept the DDL query below +CREATE OR REPLACE VIEW viewtest AS + SELECT 1, * FROM viewtest_tbl; + +-- should fail +-- Spark can accept the DDL query below +CREATE OR REPLACE VIEW viewtest AS + SELECT a, decimal(b) FROM viewtest_tbl; + +-- should work +CREATE OR REPLACE VIEW viewtest AS + SELECT a, b, 0 AS c FROM viewtest_tbl; + +DROP VIEW viewtest; +DROP TABLE viewtest_tbl; + +-- tests for temporary views + +-- Spark doesn't support the cascaded syntax below in `CREATE SCHEMA` +-- CREATE SCHEMA temp_view_test +-- CREATE TABLE base_table (a int, id int) using parquet +-- CREATE TABLE base_table2 (a int, id int) using parquet; +CREATE SCHEMA temp_view_test; +CREATE TABLE temp_view_test.base_table (a int, id int) using parquet; +CREATE TABLE temp_view_test.base_table2 
(a int, id int) using parquet; + +-- Replace SET with USE +-- SET search_path TO temp_view_test, public; +USE temp_view_test; + +-- Since Spark doesn't support CREATE TEMPORARY TABLE, we used CREATE TEMPORARY VIEW instead +-- CREATE TEMPORARY TABLE temp_table (a int, id int); +CREATE TEMPORARY VIEW temp_table AS SELECT * FROM VALUES + (1, 1) as temp_table(a, id); + +-- should be created in temp_view_test schema +CREATE VIEW v1 AS SELECT * FROM base_table; +DESC TABLE EXTENDED v1; +-- should be created in temp object schema +-- [SPARK-X] Forcibly create a temporary view in CREATE VIEW if referencing a temporary view +CREATE VIEW v1_temp AS SELECT * FROM temp_table; Review comment: ``` // In Spark org.apache.spark.sql.AnalysisException Not allowed to create a permanent view `v1_temp` by referencing a temporary view `temp_table`; // In PostgreSQL NOTICE: view "v1_temp" will be a temporary view ``` This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services - To unsubscribe, e-mail: reviews-unsubscr...
[GitHub] [spark] SparkQA removed a comment on issue #26134: [SPARK-29486][SQL] CalendarInterval should have 3 fields: months, days and microseconds
SparkQA removed a comment on issue #26134: [SPARK-29486][SQL] CalendarInterval should have 3 fields: months, days and microseconds URL: https://github.com/apache/spark/pull/26134#issuecomment-547237650 **[Test build #112814 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/112814/testReport)** for PR 26134 at commit [`4e97bc0`](https://github.com/apache/spark/commit/4e97bc05f35fc9ddb05e945d1fe1a83f66abd0c7).
[GitHub] [spark] AmplabJenkins removed a comment on issue #26261: [SPARK-29607][SQL] Move static methods from CalendarInterval to IntervalUtils
AmplabJenkins removed a comment on issue #26261: [SPARK-29607][SQL] Move static methods from CalendarInterval to IntervalUtils URL: https://github.com/apache/spark/pull/26261#issuecomment-547266937 Test FAILed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/112817/ Test FAILed.
[GitHub] [spark] SparkQA commented on issue #26134: [SPARK-29486][SQL] CalendarInterval should have 3 fields: months, days and microseconds
SparkQA commented on issue #26134: [SPARK-29486][SQL] CalendarInterval should have 3 fields: months, days and microseconds URL: https://github.com/apache/spark/pull/26134#issuecomment-547267376 **[Test build #112814 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/112814/testReport)** for PR 26134 at commit [`4e97bc0`](https://github.com/apache/spark/commit/4e97bc05f35fc9ddb05e945d1fe1a83f66abd0c7). * This patch passes all tests. * This patch merges cleanly. * This patch adds no public classes.
[GitHub] [spark] AmplabJenkins removed a comment on issue #26261: [SPARK-29607][SQL] Move static methods from CalendarInterval to IntervalUtils
AmplabJenkins removed a comment on issue #26261: [SPARK-29607][SQL] Move static methods from CalendarInterval to IntervalUtils URL: https://github.com/apache/spark/pull/26261#issuecomment-547266933 Merged build finished. Test FAILed.
[GitHub] [spark] AmplabJenkins commented on issue #26261: [SPARK-29607][SQL] Move static methods from CalendarInterval to IntervalUtils
AmplabJenkins commented on issue #26261: [SPARK-29607][SQL] Move static methods from CalendarInterval to IntervalUtils URL: https://github.com/apache/spark/pull/26261#issuecomment-547266933 Merged build finished. Test FAILed.
[GitHub] [spark] SparkQA removed a comment on issue #26261: [SPARK-29607][SQL] Move static methods from CalendarInterval to IntervalUtils
SparkQA removed a comment on issue #26261: [SPARK-29607][SQL] Move static methods from CalendarInterval to IntervalUtils URL: https://github.com/apache/spark/pull/26261#issuecomment-547247780 **[Test build #112817 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/112817/testReport)** for PR 26261 at commit [`38f7c78`](https://github.com/apache/spark/commit/38f7c78e51ed4a19a24aee6aca08fc58dd101f20).
[GitHub] [spark] AmplabJenkins commented on issue #26261: [SPARK-29607][SQL] Move static methods from CalendarInterval to IntervalUtils
AmplabJenkins commented on issue #26261: [SPARK-29607][SQL] Move static methods from CalendarInterval to IntervalUtils URL: https://github.com/apache/spark/pull/26261#issuecomment-547266937 Test FAILed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/112817/ Test FAILed.
[GitHub] [spark] shahidki31 commented on a change in pull request #26241: [SPARK-29585][WEBUI] Fix stagePage duration
shahidki31 commented on a change in pull request #26241: [SPARK-29585][WEBUI] Fix stagePage duration URL: https://github.com/apache/spark/pull/26241#discussion_r339902038 ## File path: core/src/main/scala/org/apache/spark/ui/jobs/StageTable.scala ## @@ -417,19 +417,8 @@ private[ui] class StageDataSource( case Some(t) => UIUtils.formatDate(t) case None => "Unknown" } -val finishTime = stageData.completionTime.map(_.getTime()).getOrElse(currentTime) - -// The submission time for a stage is misleading because it counts the time -// the stage waits to be launched. (SPARK-10930) -val duration = stageData.firstTaskLaunchedTime.map { date => - val time = date.getTime() - if (finishTime > time) { -finishTime - time - } else { -currentTime - time - } -} + +val duration = Some(stageData.executorRunTime) Review comment: Thanks @teeyog. I think this was the behavior in previous versions of Spark, and I am not sure we need to change it. What you are proposing is `duration = executorRunTime + ser/deser time + gettingResultTime`. But these 3 columns are already displayed in the task table, right? If we unify them all, would we still need to display the individual columns?
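For reference, the pre-change wall-clock logic being discussed (duration measured from the first task launch to completion, falling back to the current time for a still-running stage) can be sketched as a standalone helper. This is a hypothetical, simplified sketch — `wallClockDuration` and its parameters are illustrative names, not the actual `StageDataSource` code:

```scala
// Hypothetical sketch of the stage "duration" computation under discussion;
// times are epoch milliseconds, None means the event has not happened yet.
object StageDurationSketch {
  def wallClockDuration(
      firstTaskLaunchedTime: Option[Long],
      completionTime: Option[Long],
      currentTime: Long): Option[Long] = {
    // No duration until at least one task has launched (SPARK-10930:
    // submission time would also count scheduler wait time).
    firstTaskLaunchedTime.map { launched =>
      // A running stage has no completion time; measure up to "now".
      val finish = completionTime.getOrElse(currentTime)
      if (finish > launched) finish - launched else currentTime - launched
    }
  }

  def main(args: Array[String]): Unit = {
    // Completed stage: launched at 1000 ms, finished at 5000 ms.
    assert(wallClockDuration(Some(1000L), Some(5000L), 9000L) == Some(4000L))
    // Running stage: measured against currentTime.
    assert(wallClockDuration(Some(1000L), None, 6000L) == Some(5000L))
    // Stage with no launched task yet: no duration to show.
    assert(wallClockDuration(None, None, 6000L).isEmpty)
    println("ok")
  }
}
```

The thread's point is that this wall-clock value differs from `executorRunTime`, which excludes scheduler delay, task serialization/deserialization, and result-fetch time.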
[GitHub] [spark] SparkQA commented on issue #26261: [SPARK-29607][SQL] Move static methods from CalendarInterval to IntervalUtils
SparkQA commented on issue #26261: [SPARK-29607][SQL] Move static methods from CalendarInterval to IntervalUtils URL: https://github.com/apache/spark/pull/26261#issuecomment-547266799 **[Test build #112817 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/112817/testReport)** for PR 26261 at commit [`38f7c78`](https://github.com/apache/spark/commit/38f7c78e51ed4a19a24aee6aca08fc58dd101f20). * This patch **fails Spark unit tests**. * This patch merges cleanly. * This patch adds no public classes.
[GitHub] [spark] AmplabJenkins removed a comment on issue #26290: [WIP][SPARK-29120][SQL][TESTS] Port create_view.sql
AmplabJenkins removed a comment on issue #26290: [WIP][SPARK-29120][SQL][TESTS] Port create_view.sql URL: https://github.com/apache/spark/pull/26290#issuecomment-547266292 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder-K8s/17760/ Test PASSed.
[GitHub] [spark] maropu commented on a change in pull request #26238: [SPARK-29110][SQL][TESTS] Port window.sql (Part 4)
maropu commented on a change in pull request #26238: [SPARK-29110][SQL][TESTS] Port window.sql (Part 4) URL: https://github.com/apache/spark/pull/26238#discussion_r339901831 ## File path: sql/core/src/test/resources/sql-tests/results/postgreSQL/window_part4.sql.out ## @@ -0,0 +1,507 @@ +-- Automatically generated by SQLQueryTestSuite +-- Number of queries: 39 + + +-- !query 0 +SELECT i,AVG(v) OVER (ORDER BY i ROWS BETWEEN CURRENT ROW AND UNBOUNDED FOLLOWING) + FROM (VALUES(1,1),(2,2),(3,NULL),(4,NULL)) t(i,v) +-- !query 0 schema +struct +-- !query 0 output +1 1.5 +2 2.0 +3 NULL +4 NULL + + +-- !query 1 +SELECT i,AVG(v) OVER (ORDER BY i ROWS BETWEEN CURRENT ROW AND UNBOUNDED FOLLOWING) + FROM (VALUES(1,1),(2,2),(3,NULL),(4,NULL)) t(i,v) +-- !query 1 schema +struct +-- !query 1 output +1 1.5 +2 2.0 +3 NULL +4 NULL + + +-- !query 2 +SELECT i,AVG(v) OVER (ORDER BY i ROWS BETWEEN CURRENT ROW AND UNBOUNDED FOLLOWING) + FROM (VALUES(1,1),(2,2),(3,NULL),(4,NULL)) t(i,v) +-- !query 2 schema +struct +-- !query 2 output +1 1.5 +2 2.0 +3 NULL +4 NULL + + +-- !query 3 +SELECT i,AVG(v) OVER (ORDER BY i ROWS BETWEEN CURRENT ROW AND UNBOUNDED FOLLOWING) + FROM (VALUES(1,1.5),(2,2.5),(3,NULL),(4,NULL)) t(i,v) +-- !query 3 schema +struct +-- !query 3 output +1 2 +2 2.5 +3 NULL +4 NULL + + +-- !query 4 +SELECT i,SUM(v) OVER (ORDER BY i ROWS BETWEEN CURRENT ROW AND UNBOUNDED FOLLOWING) + FROM (VALUES(1,1),(2,2),(3,NULL),(4,NULL)) t(i,v) +-- !query 4 schema +struct +-- !query 4 output +1 3 +2 2 +3 NULL +4 NULL + + +-- !query 5 +SELECT i,SUM(v) OVER (ORDER BY i ROWS BETWEEN CURRENT ROW AND UNBOUNDED FOLLOWING) + FROM (VALUES(1,1),(2,2),(3,NULL),(4,NULL)) t(i,v) +-- !query 5 schema +struct +-- !query 5 output +1 3 +2 2 +3 NULL +4 NULL + + +-- !query 6 +SELECT i,SUM(v) OVER (ORDER BY i ROWS BETWEEN CURRENT ROW AND UNBOUNDED FOLLOWING) + FROM (VALUES(1,1),(2,2),(3,NULL),(4,NULL)) t(i,v) +-- !query 6 schema +struct +-- !query 6 output +1 3 +2 2 +3 NULL +4 NULL + + +-- !query 7 +SELECT 
i,SUM(v) OVER (ORDER BY i ROWS BETWEEN CURRENT ROW AND UNBOUNDED FOLLOWING) + FROM (VALUES(1,1.1),(2,2.2),(3,NULL),(4,NULL)) t(i,v) +-- !query 7 schema +struct +-- !query 7 output +1 3.3 +2 2.2 +3 NULL +4 NULL + + +-- !query 8 +SELECT SUM(n) OVER (ORDER BY i ROWS BETWEEN CURRENT ROW AND UNBOUNDED FOLLOWING) + FROM (VALUES(1,1.01),(2,2),(3,3)) v(i,n) +-- !query 8 schema +struct +-- !query 8 output +3 +5 +6.01 + + +-- !query 9 +SELECT i,COUNT(v) OVER (ORDER BY i ROWS BETWEEN CURRENT ROW AND UNBOUNDED FOLLOWING) + FROM (VALUES(1,1),(2,2),(3,NULL),(4,NULL)) t(i,v) +-- !query 9 schema +struct +-- !query 9 output +1 2 +2 1 +3 0 +4 0 + + +-- !query 10 +SELECT i,COUNT(*) OVER (ORDER BY i ROWS BETWEEN CURRENT ROW AND UNBOUNDED FOLLOWING) + FROM (VALUES(1,1),(2,2),(3,NULL),(4,NULL)) t(i,v) +-- !query 10 schema +struct +-- !query 10 output +1 4 +2 3 +3 2 +4 1 + + +-- !query 11 +SELECT VAR_POP(n) OVER (ORDER BY i ROWS BETWEEN CURRENT ROW AND UNBOUNDED FOLLOWING) + FROM (VALUES(1,600),(2,470),(3,170),(4,430),(5,300)) r(i,n) +-- !query 11 schema +struct +-- !query 11 output +0.0 +11266. +13868.7502 +21703.9996 +4225.0 + + +-- !query 12 +SELECT VAR_POP(n) OVER (ORDER BY i ROWS BETWEEN CURRENT ROW AND UNBOUNDED FOLLOWING) + FROM (VALUES(1,600),(2,470),(3,170),(4,430),(5,300)) r(i,n) +-- !query 12 schema +struct +-- !query 12 output +0.0 +11266. +13868.7502 +21703.9996 +4225.0 + + +-- !query 13 +SELECT VAR_POP(n) OVER (ORDER BY i ROWS BETWEEN CURRENT ROW AND UNBOUNDED FOLLOWING) + FROM (VALUES(1,600),(2,470),(3,170),(4,430),(5,300)) r(i,n) +-- !query 13 schema +struct +-- !query 13 output +0.0 +11266. +13868.7502 +21703.9996 +4225.0 + + +-- !query 14 +SELECT VAR_POP(n) OVER (ORDER BY i ROWS BETWEEN CURRENT ROW AND UNBOUNDED FOLLOWING) + FROM (VALUES(1,600),(2,470),(3,170),(4,430),(5,300)) r(i,n) +-- !query 14 schema +struct +-- !query 14 output +0.0 +11266. 
+13868.7502 +21703.9996 +4225.0 + + +-- !query 15 +SELECT VAR_SAMP(n) OVER (ORDER BY i ROWS BETWEEN CURRENT ROW AND UNBOUNDED FOLLOWING) + FROM (VALUES(1,600),(2,470),(3,170),(4,430),(5,300)) r(i,n) +-- !query 15 schema +struct +-- !query 15 output +16900.0 +18491.6668 +27129.9996 +8450.0 +NaN + + +-- !query 16 +SELECT VAR_SAMP(n) OVER (ORDER BY i ROWS BETWEEN CURRENT ROW AND UNBOUNDED FOLLOWING) + FROM (VALUES(1,600),(2,470),(3,170),(4,430),(5,300)) r(i,n) +-- !query 16 schema +struct +-- !query 16 output +16900.0 +18491.6668 +27129.9996 +8450.0 +NaN + + +-- !query 17 +SELECT VAR_SAMP(n) OVER (ORDER BY i ROWS BETWEEN CURRENT ROW AND UNBOUNDED FOLLOWING) + FROM (VALUES(1,600),(2,470),(3,170),(4,430),(5,300)) r(i,n)
[GitHub] [spark] AmplabJenkins removed a comment on issue #26290: [WIP][SPARK-29120][SQL][TESTS] Port create_view.sql
AmplabJenkins removed a comment on issue #26290: [WIP][SPARK-29120][SQL][TESTS] Port create_view.sql URL: https://github.com/apache/spark/pull/26290#issuecomment-547266287 Merged build finished. Test PASSed.
[GitHub] [spark] AmplabJenkins commented on issue #26290: [WIP][SPARK-29120][SQL][TESTS] Port create_view.sql
AmplabJenkins commented on issue #26290: [WIP][SPARK-29120][SQL][TESTS] Port create_view.sql URL: https://github.com/apache/spark/pull/26290#issuecomment-547266292 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder-K8s/17760/ Test PASSed.
[GitHub] [spark] AmplabJenkins commented on issue #26290: [WIP][SPARK-29120][SQL][TESTS] Port create_view.sql
AmplabJenkins commented on issue #26290: [WIP][SPARK-29120][SQL][TESTS] Port create_view.sql URL: https://github.com/apache/spark/pull/26290#issuecomment-547266287 Merged build finished. Test PASSed.
[GitHub] [spark] weixiuli commented on issue #26280: [SPARK-14922][SPARK-17732][SPARK-23866][SQL] Support partition filter in ALTER TABLE DROP PARTITION and batch dropping PARTITIONS
weixiuli commented on issue #26280: [SPARK-14922][SPARK-17732][SPARK-23866][SQL] Support partition filter in ALTER TABLE DROP PARTITION and batch dropping PARTITIONS URL: https://github.com/apache/spark/pull/26280#issuecomment-547265979 Ping @cloud-fan @mgaido91 @maropu @DazhuangSu Kindly review, thanks.
[GitHub] [spark] SparkQA commented on issue #26290: [WIP][SPARK-29120][SQL][TESTS] Port create_view.sql
SparkQA commented on issue #26290: [WIP][SPARK-29120][SQL][TESTS] Port create_view.sql URL: https://github.com/apache/spark/pull/26290#issuecomment-547265974 **[Test build #112825 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/112825/testReport)** for PR 26290 at commit [`9339a07`](https://github.com/apache/spark/commit/9339a07283e38c8c5c7a145f55250be3e88988e6).
[GitHub] [spark] turboFei commented on issue #26159: [SPARK-29506][SQL] Use dynamicPartitionOverwrite in FileCommitProtocol when insert into hive table
turboFei commented on issue #26159: [SPARK-29506][SQL] Use dynamicPartitionOverwrite in FileCommitProtocol when insert into hive table URL: https://github.com/apache/spark/pull/26159#issuecomment-547265868 @viirya Hi, for the issue mentioned by @rezasafi, I have created a ticket to fix it. Could you help take a look? Thanks in advance!
[GitHub] [spark] maropu commented on issue #26290: [WIP][SPARK-29120][SQL][TESTS] Port create_view.sql
maropu commented on issue #26290: [WIP][SPARK-29120][SQL][TESTS] Port create_view.sql URL: https://github.com/apache/spark/pull/26290#issuecomment-547265805 Since I haven't filed the corresponding issues in JIRA yet, I've marked this `WIP` for now.
[GitHub] [spark] maropu opened a new pull request #26290: [WIP][SPARK-29120][SQL][TESTS] Port create_view.sql
maropu opened a new pull request #26290: [WIP][SPARK-29120][SQL][TESTS] Port create_view.sql URL: https://github.com/apache/spark/pull/26290 ### What changes were proposed in this pull request? This PR ports create_view.sql from the PostgreSQL regression tests: https://github.com/postgres/postgres/blob/REL_12_STABLE/src/test/regress/sql/create_view.sql The expected results can be found at: https://github.com/postgres/postgres/blob/REL_12_STABLE/src/test/regress/expected/create_view.out ### Why are the changes needed? To check behaviour differences between Spark and PostgreSQL. ### Does this PR introduce any user-facing change? No. ### How was this patch tested? Pass the Jenkins tests and compare the output with the PostgreSQL results.
[GitHub] [spark] cloud-fan commented on a change in pull request #26182: [SPARK-29523][SQL] SHOW COLUMNS should do multi-catalog resolution.
cloud-fan commented on a change in pull request #26182: [SPARK-29523][SQL] SHOW COLUMNS should do multi-catalog resolution. URL: https://github.com/apache/spark/pull/26182#discussion_r339900738 ## File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/parser/AstBuilder.scala ## @@ -2911,4 +2911,28 @@ class AstBuilder(conf: SQLConf) extends SqlBaseBaseVisitor[AnyRef] with Logging override def visitRefreshTable(ctx: RefreshTableContext): LogicalPlan = withOrigin(ctx) { RefreshTableStatement(visitMultipartIdentifier(ctx.multipartIdentifier())) } + + /** + * A command for users to list the column names for a table. + * This function creates a [[ShowColumnsStatement]] logical plan. + * + * The syntax of using this command in SQL is: + * {{{ + * SHOW COLUMNS (FROM | IN) tableName=multipartIdentifier + *((FROM | IN) namespace=multipartIdentifier)? + * }}} + */ + override def visitShowColumns(ctx: ShowColumnsContext): LogicalPlan = withOrigin(ctx) { +import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._ + +val table = visitMultipartIdentifier(ctx.table) +val namespace = Option(ctx.namespace).map(visitMultipartIdentifier) +if (namespace.isDefined && namespace.get.length > 1 && table.length > 1) { Review comment: Since we don't have a v2 SHOW COLUMNS yet, in `ResolveSessionCatalog` we should simply check that: 1. the table name has <= 2 parts, and 2. the namespace name has only one part, if specified.
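The validation cloud-fan describes can be sketched as a small predicate. This is a hypothetical standalone helper illustrating the rule, not the actual `ResolveSessionCatalog` code — `isValidV1ShowColumns` is an invented name:

```scala
// Hypothetical sketch of the v1 SHOW COLUMNS name-part check described above.
// A multi-part identifier is modeled as a Seq of name parts, e.g.
// Seq("db", "tbl") for `db.tbl`.
object ShowColumnsCheckSketch {
  def isValidV1ShowColumns(
      table: Seq[String],
      namespace: Option[Seq[String]]): Boolean = {
    // 1. the table name has at most 2 parts (db.tbl or tbl);
    // 2. the namespace, if specified, has exactly one part.
    table.length <= 2 && namespace.forall(_.length == 1)
  }

  def main(args: Array[String]): Unit = {
    // `SHOW COLUMNS IN db.tbl IN db` -> OK for the v1 session catalog.
    assert(isValidV1ShowColumns(Seq("db", "tbl"), Some(Seq("db"))))
    // Three-part table name implies a v2 catalog -> reject.
    assert(!isValidV1ShowColumns(Seq("cat", "db", "tbl"), None))
    // Multi-part namespace -> reject.
    assert(!isValidV1ShowColumns(Seq("tbl"), Some(Seq("cat", "db"))))
    println("ok")
  }
}
```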
[GitHub] [spark] JkSelf commented on issue #26289: [SPARK-28560][SQL][followup] support the build side to local shuffle reader as far as possible in BroadcastHashJoin
JkSelf commented on issue #26289: [SPARK-28560][SQL][followup] support the build side to local shuffle reader as far as possible in BroadcastHashJoin URL: https://github.com/apache/spark/pull/26289#issuecomment-547264699 The failed test may not be related.
[GitHub] [spark] SparkQA commented on issue #22952: [SPARK-20568][SS] Provide option to clean up completed files in streaming query
SparkQA commented on issue #22952: [SPARK-20568][SS] Provide option to clean up completed files in streaming query URL: https://github.com/apache/spark/pull/22952#issuecomment-547264254 **[Test build #112824 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/112824/testReport)** for PR 22952 at commit [`dd9d4ad`](https://github.com/apache/spark/commit/dd9d4ad14272801e1ce618851d1093884cbfc217).
[GitHub] [spark] AmplabJenkins removed a comment on issue #26289: [SPARK-28560][SQL][followup] support the build side to local shuffle reader as far as possible in BroadcastHashJoin
AmplabJenkins removed a comment on issue #26289: [SPARK-28560][SQL][followup] support the build side to local shuffle reader as far as possible in BroadcastHashJoin URL: https://github.com/apache/spark/pull/26289#issuecomment-547263994 Test FAILed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/112816/ Test FAILed.
[GitHub] [spark] HyukjinKwon commented on issue #26087: [SPARK-29427][SQL] Create KeyValueGroupedDataset from existing columns in DataFrame
HyukjinKwon commented on issue #26087: [SPARK-29427][SQL] Create KeyValueGroupedDataset from existing columns in DataFrame URL: https://github.com/apache/spark/pull/26087#issuecomment-547264380 To me, adding a new API seems a bit of an overkill to handle a rather corner case, TBH.
[GitHub] [spark] SparkQA commented on issue #26258: [SPARK-29604][SQL] Force initialize SessionState before initializing HiveClient in SparkSQLEnv
SparkQA commented on issue #26258: [SPARK-29604][SQL] Force initialize SessionState before initializing HiveClient in SparkSQLEnv URL: https://github.com/apache/spark/pull/26258#issuecomment-547264226 **[Test build #112823 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/112823/testReport)** for PR 26258 at commit [`23c197b`](https://github.com/apache/spark/commit/23c197b741cc4e1fa129d27e34c2011f13ae94b9).
[GitHub] [spark] AmplabJenkins removed a comment on issue #26289: [SPARK-28560][SQL][followup] support the build side to local shuffle reader as far as possible in BroadcastHashJoin
AmplabJenkins removed a comment on issue #26289: [SPARK-28560][SQL][followup] support the build side to local shuffle reader as far as possible in BroadcastHashJoin URL: https://github.com/apache/spark/pull/26289#issuecomment-547263987 Merged build finished. Test FAILed.
[GitHub] [spark] AmplabJenkins commented on issue #26289: [SPARK-28560][SQL][followup] support the build side to local shuffle reader as far as possible in BroadcastHashJoin
AmplabJenkins commented on issue #26289: [SPARK-28560][SQL][followup] support the build side to local shuffle reader as far as possible in BroadcastHashJoin URL: https://github.com/apache/spark/pull/26289#issuecomment-547263987 Merged build finished. Test FAILed.
[GitHub] [spark] AmplabJenkins commented on issue #26289: [SPARK-28560][SQL][followup] support the build side to local shuffle reader as far as possible in BroadcastHashJoin
AmplabJenkins commented on issue #26289: [SPARK-28560][SQL][followup] support the build side to local shuffle reader as far as possible in BroadcastHashJoin URL: https://github.com/apache/spark/pull/26289#issuecomment-547263994 Test FAILed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/112816/ Test FAILed.
[GitHub] [spark] SparkQA removed a comment on issue #26289: [SPARK-28560][SQL][followup] support the build side to local shuffle reader as far as possible in BroadcastHashJoin
SparkQA removed a comment on issue #26289: [SPARK-28560][SQL][followup] support the build side to local shuffle reader as far as possible in BroadcastHashJoin URL: https://github.com/apache/spark/pull/26289#issuecomment-547244971 **[Test build #112816 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/112816/testReport)** for PR 26289 at commit [`5b7ff2d`](https://github.com/apache/spark/commit/5b7ff2dc680e37032cee048f987e3809d6ebfd94).
[GitHub] [spark] HeartSaVioR commented on issue #26258: [SPARK-29604][SQL] Force initialize SessionState before initializing HiveClient in SparkSQLEnv
HeartSaVioR commented on issue #26258: [SPARK-29604][SQL] Force initialize SessionState before initializing HiveClient in SparkSQLEnv URL: https://github.com/apache/spark/pull/26258#issuecomment-547263860 retest this, please
[GitHub] [spark] cloud-fan commented on issue #26269: [SPARK-29612][SQL] ALTER TABLE (RECOVER PARTITIONS) should look up catalog/table like v2 commands
cloud-fan commented on issue #26269: [SPARK-29612][SQL] ALTER TABLE (RECOVER PARTITIONS) should look up catalog/table like v2 commands URL: https://github.com/apache/spark/pull/26269#issuecomment-547263870 thanks, merging to master!
[GitHub] [spark] SparkQA commented on issue #26289: [SPARK-28560][SQL][followup] support the build side to local shuffle reader as far as possible in BroadcastHashJoin
SparkQA commented on issue #26289: [SPARK-28560][SQL][followup] support the build side to local shuffle reader as far as possible in BroadcastHashJoin URL: https://github.com/apache/spark/pull/26289#issuecomment-547263888 **[Test build #112816 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/112816/testReport)** for PR 26289 at commit [`5b7ff2d`](https://github.com/apache/spark/commit/5b7ff2dc680e37032cee048f987e3809d6ebfd94). * This patch **fails Spark unit tests**. * This patch merges cleanly. * This patch adds the following public classes _(experimental)_: * `case class OptimizeLocalShuffleReader(`
[GitHub] [spark] HeartSaVioR commented on issue #22952: [SPARK-20568][SS] Provide option to clean up completed files in streaming query
HeartSaVioR commented on issue #22952: [SPARK-20568][SS] Provide option to clean up completed files in streaming query URL: https://github.com/apache/spark/pull/22952#issuecomment-547263889 retest this, please
[GitHub] [spark] HeartSaVioR commented on issue #26287: [SPARK-28158][SQL][FOLLOWUP] HiveUserDefinedTypeSuite: don't use RandomDataGenerator to create row for UDT backed by ArrayType
HeartSaVioR commented on issue #26287: [SPARK-28158][SQL][FOLLOWUP] HiveUserDefinedTypeSuite: don't use RandomDataGenerator to create row for UDT backed by ArrayType URL: https://github.com/apache/spark/pull/26287#issuecomment-547263642 Thanks all for reviewing and merging!
[GitHub] [spark] jiangxb1987 commented on issue #26243: Prepare Spark release v3.0.0-preview-rc1
jiangxb1987 commented on issue #26243: Prepare Spark release v3.0.0-preview-rc1 URL: https://github.com/apache/spark/pull/26243#issuecomment-547263736 FYI, this PR has been reverted by https://github.com/apache/spark/commit/b33a58c0c6a40aa312cbd72f763d5e519182c3d1
[GitHub] [spark] jiangxb1987 closed pull request #26243: Prepare Spark release v3.0.0-preview-rc1
jiangxb1987 closed pull request #26243: Prepare Spark release v3.0.0-preview-rc1 URL: https://github.com/apache/spark/pull/26243
[GitHub] [spark] jiangxb1987 commented on issue #26243: Prepare Spark release v3.0.0-preview-rc1
jiangxb1987 commented on issue #26243: Prepare Spark release v3.0.0-preview-rc1 URL: https://github.com/apache/spark/pull/26243#issuecomment-547262976 Thanks, merging to master!
[GitHub] [spark] AmplabJenkins removed a comment on issue #26134: [SPARK-29486][SQL] CalendarInterval should have 3 fields: months, days and microseconds
AmplabJenkins removed a comment on issue #26134: [SPARK-29486][SQL] CalendarInterval should have 3 fields: months, days and microseconds URL: https://github.com/apache/spark/pull/26134#issuecomment-547260952 Merged build finished. Test PASSed.
[GitHub] [spark] AmplabJenkins removed a comment on issue #26134: [SPARK-29486][SQL] CalendarInterval should have 3 fields: months, days and microseconds
AmplabJenkins removed a comment on issue #26134: [SPARK-29486][SQL] CalendarInterval should have 3 fields: months, days and microseconds URL: https://github.com/apache/spark/pull/26134#issuecomment-547260958 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/112813/ Test PASSed.
[GitHub] [spark] AmplabJenkins commented on issue #26134: [SPARK-29486][SQL] CalendarInterval should have 3 fields: months, days and microseconds
AmplabJenkins commented on issue #26134: [SPARK-29486][SQL] CalendarInterval should have 3 fields: months, days and microseconds URL: https://github.com/apache/spark/pull/26134#issuecomment-547260952 Merged build finished. Test PASSed.
[GitHub] [spark] AmplabJenkins commented on issue #26134: [SPARK-29486][SQL] CalendarInterval should have 3 fields: months, days and microseconds
AmplabJenkins commented on issue #26134: [SPARK-29486][SQL] CalendarInterval should have 3 fields: months, days and microseconds URL: https://github.com/apache/spark/pull/26134#issuecomment-547260958 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/112813/ Test PASSed.