Re: Spark with GPU

2023-02-07 Thread Alessandro Bellina
For Apache Spark, a standalone worker can manage all the resources of the box, including all GPUs. So a Spark worker could be set up to manage N GPUs in the box via spark.worker.resource.gpu.amount, and then spark.executor.resource.gpu.amount, as provided on app submit, assigns GPU
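A minimal standalone-mode configuration sketch of the settings named above (the amounts and the discovery-script path are illustrative assumptions, not values from the thread):

```properties
# spark-defaults.conf (illustrative values)
# The worker advertises 4 GPUs, located via a discovery script (path is a placeholder)
spark.worker.resource.gpu.amount           4
spark.worker.resource.gpu.discoveryScript  /opt/spark/bin/getGpusResources.sh
# Each executor requests 1 GPU; each task needs 1 GPU
spark.executor.resource.gpu.amount         1
spark.task.resource.gpu.amount             1
```

With this setup the worker hands out GPUs to executors on app submit, up to the worker-level limit.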

Re: How to upgrade a spark structure streaming application

2023-02-07 Thread Mich Talebzadeh
Hi, Check the thread on graceful shutdown. That might help. On Tue, 7 Feb 2023 at 12:47, Yoel Benharrous wrote:
> Hi all,
> I would like to ask how you perform a Spark Streaming application upgrade?
> I didn't find any builtin solution.
> I found some people writing a marker on file system

Fwd: Graceful shutdown SPARK Structured Streaming

2023-02-07 Thread Mich Talebzadeh
---------- Forwarded message ----------
From: Mich Talebzadeh
Date: Thu, 6 May 2021 at 20:07
Subject: Re: Graceful shutdown SPARK Structured Streaming
To: ayan guha
Cc: Gourav Sengupta, user@spark.apache.org

That is a valid question and I am not aware of any new addition to

[Spark SQL] : Delete is only supported on V2 tables.

2023-02-07 Thread Jeevan Chhajed
Hi, How do we create V2 tables? I tried a couple of things using SQL but was unable to do so. Can you share links/content? It will be of much help. Is delete support on V2 tables still under dev? Thanks, Jeevan
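One way to get a V2 table is to use a data source that implements the DataSourceV2 API, such as Apache Iceberg or Delta Lake, registered under a V2 catalog. A sketch, assuming the Iceberg runtime jar is on the classpath and a catalog named "local" is configured (the catalog, database, and table names here are hypothetical):

```sql
-- Create a table through a V2 catalog; DELETE is then pushed to the source
CREATE TABLE local.db.events (id BIGINT, ts TIMESTAMP) USING iceberg;
DELETE FROM local.db.events WHERE id = 42;
```

Tables created through the default built-in file sources are V1, which is why DELETE fails on them with that error.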

How to upgrade a spark structure streaming application

2023-02-07 Thread Yoel Benharrous
Hi all, I would like to ask how you perform a Spark Streaming application upgrade? I didn't find any built-in solution. I found some people writing a marker on the file system and polling periodically to stop the running query. Thanks, Yoel
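The marker-file approach mentioned above can be sketched as follows. This is a minimal sketch, not code from the thread: the function name, marker path, and poll interval are hypothetical, and the query object is assumed to behave like PySpark's StreamingQuery (an `isActive` property and a `stop()` method).

```python
import os
import time

def run_until_marker(query, marker_path, poll_seconds=10):
    """Poll the file system; stop the streaming query once the marker appears.

    `query` is expected to expose `isActive` and `stop()` like
    pyspark.sql.streaming.StreamingQuery.
    """
    while query.isActive:
        if os.path.exists(marker_path):
            # Marker found: stop the query so the driver can exit
            # and the upgraded application can be started.
            query.stop()
            break
        time.sleep(poll_seconds)
```

To upgrade, deploy the new application version, then create the marker file; the polling loop stops the running query and the old driver exits.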

SQL GROUP BY alias with dots, was: Spark SQL question

2023-02-07 Thread Enrico Minack
Hi, you are right, that is an interesting question. Looks like GROUP BY is doing something funny / magic here (spark-shell 3.3.1 and 3.5.0-SNAPSHOT): With an alias, it behaves as you have pointed out:
spark.range(3).createTempView("ids_without_dots")
spark.sql("SELECT * FROM
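A minimal reproduction of the kind of query under discussion (the view name is from the thread; the dotted alias itself is a hypothetical illustration):

```sql
-- After spark.range(3).createTempView("ids_without_dots"):
SELECT id AS `id.alias` FROM ids_without_dots GROUP BY `id.alias`;
```

Whether GROUP BY resolves the backquoted dotted name as the output alias or tries to treat it as a struct field access is exactly the behavior being examined here.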
