Code fails when AQE enabled in Spark 3.1

2022-01-19 Thread Gaspar Muñoz
Hi guys,

hundreds of Spark jobs run at my company every day. We are running Spark
3.1.2 and we want to enable Adaptive Query Execution (AQE) for all of them.
We can't upgrade to 3.2 right now, so we want to enable it explicitly with
the appropriate conf at spark-submit time.

Some of them fail when AQE is enabled, but I can't figure out why. To give
you more information, I prepared a small spark-shell snippet that fails in
Spark 3.1 when AQE is enabled and works when it is disabled. It also works
in 3.2, but I think this may be a bug that could be fixed for 3.1.3.

The code and explanation can be found here:
https://issues.apache.org/jira/browse/SPARK-37898
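For anyone reproducing this, AQE can be enabled explicitly at submit time; a minimal sketch (the conf key is the standard spark.sql.adaptive one; the class and jar names are placeholders):

```shell
# Enable Adaptive Query Execution explicitly on Spark 3.1.x,
# where it is off by default (it became the default only in 3.2).
spark-submit \
  --conf spark.sql.adaptive.enabled=true \
  --class com.example.MyJob \
  my-job.jar

# The same conf works for spark-shell when reproducing the snippet:
spark-shell --conf spark.sql.adaptive.enabled=true
```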

Regards
-- 
Gaspar Muñoz Soria


RE: Does Spark 3.1.2/3.2 support log4j 2.17.1+, and how? your target release day for Spark3.3?

2022-01-19 Thread Bode, Meikel, NM-X-DS
Hi,

New releases are announced via mailing lists 
user@spark.apache.org & 
d...@spark.apache.org.

Best,
Meikel

From: Theodore J Griesenbrock 
Sent: Mittwoch, 19. Januar 2022 18:50
To: sro...@gmail.com
Cc: Juan Liu ; user@spark.apache.org
Subject: RE: Does Spark 3.1.2/3.2 support log4j 2.17.1+, and how? your target 
release day for Spark3.3?

Again, sorry to bother you.

What is the best option available to ensure we get notified when a new version 
is released for Apache Spark?  I do not see any RSS feeds, nor do I see any 
e-mail subscription option for this page:  
https://spark.apache.org/news/index.html

Please let me know what we can do to ensure we stay up to date with the news.

Thanks!

-T.J.


T.J. Griesenbrock
Technical Release Manager
Watson Health
He/Him/His

+1 (602) 377-7673 (Text only)
t...@ibm.com

IBM


- Original message -
From: "Sean Owen" <sro...@gmail.com>
To: "Juan Liu" <liuj...@cn.ibm.com>
Cc: "Theodore J Griesenbrock" <t...@ibm.com>, "User" <user@spark.apache.org>
Subject: [EXTERNAL] Re: Does Spark 3.1.2/3.2 support log4j 2.17.1+, and how?
your target release day for Spark3.3?
Date: Thu, Jan 13, 2022 08:05

Yes, Spark does not use the SocketServer mentioned in CVE-2019-17571, so it
is not affected.
3.3.0 would probably be out in a couple months.

On Thu, Jan 13, 2022 at 3:14 AM Juan Liu <liuj...@cn.ibm.com> wrote:
We are informed that CVE-2021-4104 is not the only problem with Log4j 1.x. There
is one more, CVE-2019-17571, and since Apache announced its EOL in 2015, Spark
3.3.0 is very much anticipated. Do you think mid-2022 is a reasonable time for
the Spark 3.3.0 release?

Juan Liu (刘娟) PMP®




Release Management, Watson Health, China Development Lab
Email: liuj...@cn.ibm.com
Phone: 86-10-82452506

-
To unsubscribe e-mail: user-unsubscr...@spark.apache.org


RE: Regarding spark-3.2.0 decommission features.

2022-01-19 Thread Patidar, Mohanlal (Nokia - IN/Bangalore)
Gentle reminder!!!

Br,
-Mohan Patidar



From: Patidar, Mohanlal (Nokia - IN/Bangalore)
Sent: Tuesday, January 18, 2022 2:02 PM
To: user@spark.apache.org
Cc: Rao, Abhishek (Nokia - IN/Bangalore) ; Gowda Tp, 
Thimme (Nokia - IN/Bangalore) ; Sharma, Prakash 
(Nokia - IN/Bangalore) ; Tarun, N (Nokia - 
IN/Bangalore) ; Badagandi, Srinivas B. (Nokia - 
IN/Bangalore) 
Subject: Regarding spark-3.2.0 decommission features.

Hi,
 We're using Spark 3.2.0 and we have enabled the spark decommission 
feature. As part of validating this feature, we wanted to check if the rdd 
blocks and shuffle blocks from the decommissioned executors are migrated to 
other executors.
However, we could not see this happening. Below is the configuration we used.

  1.  Spark Configuration used:
 spark.local.dir /mnt/spark-ldir
 spark.decommission.enabled true
 spark.storage.decommission.enabled true
 spark.storage.decommission.rddBlocks.enabled true
 spark.storage.decommission.shuffleBlocks.enabled true
 spark.dynamicAllocation.enabled true
  2.  Brought up spark-driver and executors on the different nodes.

NAME                                           READY  STATUS   NODE
decommission-driver                            1/1    Running  Node1
gzip-compression-test-ae0b0b7e4d7fbe40-exec-1  1/1    Running  Node1
gzip-compression-test-ae0b0b7e4d7fbe40-exec-2  1/1    Running  Node2
gzip-compression-test-ae0b0b7e4d7fbe40-exec-3  1/1    Running  Node1
gzip-compression-test-ae0b0b7e4d7fbe40-exec-4  1/1    Running  Node2
gzip-compression-test-ae0b0b7e4d7fbe40-exec-5  1/1    Running  Node1
  3.  Brought down Node2, so the status of the pods is as follows.

NAME                                           READY  STATUS       NODE
decommission-driver                            1/1    Running      Node1
gzip-compression-test-ae0b0b7e4d7fbe40-exec-1  1/1    Running      Node1
gzip-compression-test-ae0b0b7e4d7fbe40-exec-2  1/1    Terminating  Node2
gzip-compression-test-ae0b0b7e4d7fbe40-exec-3  1/1    Running      Node1
gzip-compression-test-ae0b0b7e4d7fbe40-exec-4  1/1    Terminating  Node2
gzip-compression-test-ae0b0b7e4d7fbe40-exec-5  1/1    Running      Node1
  4.  Driver logs:
{"type":"log", "level":"INFO", "time":"2022-01-12T08:55:28.296Z", 
"timezone":"UTC", "log":"Adding decommission script to lifecycle"}
{"type":"log", "level":"INFO", "time":"2022-01-12T08:55:28.459Z", 
"timezone":"UTC", "log":"Adding decommission script to lifecycle"}
{"type":"log", "level":"INFO", "time":"2022-01-12T08:55:28.564Z", 
"timezone":"UTC", "log":"Adding decommission script to lifecycle"}
{"type":"log", "level":"INFO", "time":"2022-01-12T08:55:28.601Z", 
"timezone":"UTC", "log":"Adding decommission script to lifecycle"}
{"type":"log", "level":"INFO", "time":"2022-01-12T08:55:28.667Z", 
"timezone":"UTC", "log":"Adding decommission script to lifecycle"}
{"type":"log", "level":"INFO", "time":"2022-01-12T08:58:21.885Z", 
"timezone":"UTC", "log":"Notify executor 5 to decommissioning."}
{"type":"log", "level":"INFO", "time":"2022-01-12T08:58:21.887Z", 
"timezone":"UTC", "log":"Notify executor 1 to decommissioning."}
{"type":"log", "level":"INFO", "time":"2022-01-12T08:58:21.887Z", 
"timezone":"UTC", "log":"Notify executor 3 to decommissioning."}
{"type":"log", "level":"INFO", "time":"2022-01-12T08:58:21.887Z", 
"timezone":"UTC", "log":"Mark BlockManagers (BlockManagerId(5, X.X.X.X, 33359, 
None), BlockManagerId(1, X.X.X.X, 38655, None), BlockManagerId(3, X.X.X.X, 
35797, None)) as being decommissioning."}
{"type":"log", "level":"INFO", "time":"2022-01-12T08:59:24.426Z", 
"timezone":"UTC", "log":"Executor 2 is removed. Remove reason statistics: 
(gracefully decommissioned: 0, decommision unfinished: 0, driver killed: 0, 
unexpectedly exited: 1)."}
{"type":"log", "level":"INFO", "time":"2022-01-12T08:59:24.426Z", 
"timezone":"UTC", "log":"Executor 4 is removed. Remove reason statistics: 
(gracefully decommissioned: 0, decommision unfinished: 0, driver killed: 0, 
unexpectedly exited: 2)."}
  5.  Exec'd into all live executors (1, 3, 5) and checked the location
(/mnt/spark-ldir/): only one blockManager id is present there; we do not see
any other blockManager id copied to this location.
Example:
$ kubectl exec -it gzip-compression-test-ae0b0b7e4d7fbe40-exec-1 -n test bash
$ cd /mnt/spark-ldir/
$ ls
blockmgr-60872c99-e7d6-43ba-a43e-a97fc9f619ca
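For completeness, the configuration from step 1 can equivalently be passed as spark-submit flags (same keys as listed above; the class and jar names below are placeholders):

```shell
# Same settings as the configuration listing in step 1,
# expressed as spark-submit flags.
spark-submit \
  --conf spark.local.dir=/mnt/spark-ldir \
  --conf spark.decommission.enabled=true \
  --conf spark.storage.decommission.enabled=true \
  --conf spark.storage.decommission.rddBlocks.enabled=true \
  --conf spark.storage.decommission.shuffleBlocks.enabled=true \
  --conf spark.dynamicAllocation.enabled=true \
  --class com.example.GzipCompressionTest \
  my-job.jar
```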

Since the migration was not 

Re: Profiling spark application

2022-01-19 Thread Wes Peng

Give a look at this:
https://github.com/LucaCanali/sparkMeasure
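For interactive use, sparkMeasure can typically be attached without any code changes; a sketch (the package coordinates and version follow the project README, so double-check the Scala build and latest release):

```shell
# Pull sparkMeasure onto the classpath at launch time.
spark-shell --packages ch.cern.sparkmeasure:spark-measure_2.12:0.17
```

Inside the shell, wrapping an action with `ch.cern.sparkmeasure.StageMetrics(spark).runAndMeasure { ... }` then reports aggregated stage and task timings.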

On 2022/1/20 1:18, Prasad Bhalerao wrote:
Is there any way we can profile spark applications which will show no. 
of invocations of spark api and their execution time etc etc just the 
way jprofiler shows all the details?





Re: Profiling spark application

2022-01-19 Thread Deepak Sharma
You can take a look at jvm profiler that was open sourced by uber:
https://github.com/uber-common/jvm-profiler
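The usual way to hook it up is as a Java agent, with no application code changes; a sketch (the jar path and version are illustrative, and the exact reporter arguments are documented in the project README):

```shell
# Attach the profiler agent to both the driver and the executors.
spark-submit \
  --jars ./jvm-profiler-1.0.0.jar \
  --conf spark.driver.extraJavaOptions=-javaagent:jvm-profiler-1.0.0.jar \
  --conf spark.executor.extraJavaOptions=-javaagent:jvm-profiler-1.0.0.jar \
  --class com.example.MyJob \
  my-job.jar
```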



On Thu, Jan 20, 2022 at 11:20 AM Prasad Bhalerao <
prasadbhalerao1...@gmail.com> wrote:

> Hi,
>
> It will require code changes and I am looking at some third party code , I
> am looking for something which I can just hook to jvm and get the stats..
>
> Thanks,
> Prasad
>
> On Thu, Jan 20, 2022 at 11:00 AM Sonal Goyal 
> wrote:
>
>> Hi Prasad,
>>
>> Have you checked the SparkListener -
>> https://mallikarjuna_g.gitbooks.io/spark/content/spark-SparkListener.html
>> ?
>>
>> Cheers,
>> Sonal
>> https://github.com/zinggAI/zingg
>>
>>
>>
>> On Thu, Jan 20, 2022 at 10:49 AM Prasad Bhalerao <
>> prasadbhalerao1...@gmail.com> wrote:
>>
>>> Hello,
>>>
>>> Is there any way we can profile spark applications which will show no.
>>> of invocations of spark api and their execution time etc etc just the way
>>> jprofiler shows all the details?
>>>
>>>
>>> Thanks,
>>> Prasad
>>>
>>

-- 
Thanks
Deepak
www.bigdatabig.com
www.keosha.net


Re: Profiling spark application

2022-01-19 Thread Prasad Bhalerao
Hi,

That would require code changes, and I am looking at some third-party code. I
am looking for something I can just hook into the JVM to get the stats.

Thanks,
Prasad

On Thu, Jan 20, 2022 at 11:00 AM Sonal Goyal  wrote:

> Hi Prasad,
>
> Have you checked the SparkListener -
> https://mallikarjuna_g.gitbooks.io/spark/content/spark-SparkListener.html
> ?
>
> Cheers,
> Sonal
> https://github.com/zinggAI/zingg
>
>
>
> On Thu, Jan 20, 2022 at 10:49 AM Prasad Bhalerao <
> prasadbhalerao1...@gmail.com> wrote:
>
>> Hello,
>>
>> Is there any way we can profile spark applications which will show no. of
>> invocations of spark api and their execution time etc etc just the way
>> jprofiler shows all the details?
>>
>>
>> Thanks,
>> Prasad
>>
>


Re: Profiling spark application

2022-01-19 Thread Sonal Goyal
Hi Prasad,

Have you checked the SparkListener -
https://mallikarjuna_g.gitbooks.io/spark/content/spark-SparkListener.html ?
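As a zero-code starting point, Spark also ships a built-in listener that logs per-stage task runtime statistics; it can be attached purely through configuration, and a custom SparkListener subclass can be wired up the same way (class and jar names below are placeholders):

```shell
# Attach Spark's built-in StatsReportListener; it logs percentiles of
# task duration, GC time, and shuffle metrics for each completed stage.
spark-submit \
  --conf spark.extraListeners=org.apache.spark.scheduler.StatsReportListener \
  --class com.example.MyJob \
  my-job.jar
```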

Cheers,
Sonal
https://github.com/zinggAI/zingg



On Thu, Jan 20, 2022 at 10:49 AM Prasad Bhalerao <
prasadbhalerao1...@gmail.com> wrote:

> Hello,
>
> Is there any way we can profile spark applications which will show no. of
> invocations of spark api and their execution time etc etc just the way
> jprofiler shows all the details?
>
>
> Thanks,
> Prasad
>


Profiling spark application

2022-01-19 Thread Prasad Bhalerao
Hello,

Is there any way to profile Spark applications that shows the number of
invocations of Spark APIs, their execution times, and so on, just the way
JProfiler shows all the details?


Thanks,
Prasad


RE: Does Spark 3.1.2/3.2 support log4j 2.17.1+, and how? your target release day for Spark3.3?

2022-01-19 Thread Theodore J Griesenbrock
Again, sorry to bother you.
 
What is the best option available to ensure we get notified when a new version is released for Apache Spark?  I do not see any RSS feeds, nor do I see any e-mail subscription option for this page:  https://spark.apache.org/news/index.html
 
Please let me know what we can do to ensure we stay up to date with the news.
 
Thanks!
 
-T.J.
 
 
T.J. Griesenbrock
Technical Release Manager
Watson Health
He/Him/His
 
+1 (602) 377-7673 (Text only)
t...@ibm.com
IBM
 
 
- Original message -
From: "Sean Owen"
To: "Juan Liu"
Cc: "Theodore J Griesenbrock", "User"
Subject: [EXTERNAL] Re: Does Spark 3.1.2/3.2 support log4j 2.17.1+, and how? your target release day for Spark3.3?
Date: Thu, Jan 13, 2022 08:05
Yes, Spark does not use the SocketServer mentioned in CVE-2019-17571, so it is not affected.
3.3.0 would probably be out in a couple months. 

On Thu, Jan 13, 2022 at 3:14 AM Juan Liu  wrote:
We are informed that CVE-2021-4104 is not the only problem with Log4j 1.x. There is one more, CVE-2019-17571, and since Apache announced its EOL in 2015, Spark 3.3.0 is very much anticipated. Do you think mid-2022 is a reasonable time for the Spark 3.3.0 release?
Juan Liu (刘娟) PMP®
Release Management, Watson Health, China Development Lab
Email: liuj...@cn.ibm.com
Phone: 86-10-82452506
 





Re: Does Spark 3.1.2/3.2 support log4j 2.17.1+, and how? your target release day for Spark3.3?

2022-01-19 Thread Sean Owen
This very user@ list -- announcements will go to all the lists.

On Wed, Jan 19, 2022 at 11:50 AM Theodore J Griesenbrock 
wrote:

> Again, sorry to bother you.
>
> What is the best option available to ensure we get notified when a new
> version is released for Apache Spark?  I do not see any RSS feeds, nor do I
> see any e-mail subscription option for this page:
> https://spark.apache.org/news/index.html
>
>


Re: Issue: Spring-Boot vs. Apache Spark Dependencies

2022-01-19 Thread Sean Owen
The usual answer is to shade dependencies like this in your application, so
that both can coexist.
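A minimal maven-shade-plugin sketch of that approach (the plugin version and the `myapp.shaded` relocation prefix are illustrative; relocate whichever janino copy you want to keep private to your application):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <!-- Move the application's janino copy to a private package
               so it cannot clash with the version Spark brings in. -->
          <relocation>
            <pattern>org.codehaus.janino</pattern>
            <shadedPattern>myapp.shaded.org.codehaus.janino</shadedPattern>
          </relocation>
          <relocation>
            <pattern>org.codehaus.commons</pattern>
            <shadedPattern>myapp.shaded.org.codehaus.commons</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```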

On Wed, Jan 19, 2022 at 8:42 AM Heyde, Andreas 
wrote:

> I had only one wish: that someone have a look at this issue/topic.
>
> The problem is that janino 3.0.16 and 3.1.4 have completely different
> package/file structures, and so they are incompatible.
>
> The problem is that this was not done in a major version change; it's a
> minor change.
>
> So I don't know which version to use:
> the newer one from Spring Boot or the older one from Spark.
>
> I have to exclude one of them explicitly, but I am not the expert on what
> the best approach is.
>
> With "mvn dependency:tree" you see that the managed version is
> overwritten, but not the package from which it comes.
>
> As I mentioned, there are lots of articles on the internet discussing
> workarounds for the problem.
>
>
>
>
>
> This is the Spark version I am using:
>
> <dependency>
>   <groupId>org.apache.spark</groupId>
>   <artifactId>spark-sql_2.13</artifactId>
>   <version>3.2.0</version>
>   <exclusions>
>     <exclusion>
>       <groupId>org.codehaus.janino</groupId>
>       <artifactId>commons-compiler</artifactId>
>     </exclusion>
>     <exclusion>
>       <groupId>org.codehaus.janino</groupId>
>       <artifactId>janino</artifactId>
>     </exclusion>
>   </exclusions>
> </dependency>
>
>
>
> Regards
>
>
>
> *Von:* Sean Owen 
> *Gesendet:* Mittwoch, 19. Januar 2022 14:25
> *An:* Heyde, Andreas 
> *Cc:* user@spark.apache.org
> *Betreff:* Re: Issue: Spring-Boot vs. Apache Spark Dependencies
>
>
>
> I did not see your message in the moderation queue, not sure why. Indeed
> you should send to user@, not to individuals. Your message is here now.
>
> Do you mean Spark 3.2.0? yes it is on Janino 3.0.16. You can use things
> like "mvn dependency:tree" on the project build to see dependencies, by the
> way.
>
>
>
> That does not seem too old, but there are newer versions. Janino is
> intended as an internal dependency for Spark, and I suppose there has not
> yet been a reason to update it.
>
> You are welcome to open a pull request that tries to update to the latest
> version, to see if it works.
>
> You didn't say what the problem is though.
>
>
>
>
>
> On Wed, Jan 19, 2022 at 6:14 AM Heyde, Andreas 
> wrote:
>
> After sending mail to Matei and yesterday subscribing
> user@spark.apache.org
>
> I cannot see my problem in
>
> iss...@spark.apache.org, past month - Apache Mail Archives
>
>
>
>
>
> Is there something wrong or where I have to write my issue new  or send a
> change request
>
> Regards
>
> /Andreas
>
> *Von:* Heyde, Andreas
> *Gesendet:* Dienstag, 18. Januar 2022 08:23
> *An:* 'user-subscr...@spark.apache.org' 
> *Betreff:* Spring-Boot vs. Apache Spark Dependencies
>
>
>
>
>
> Hi Andreas,
>
>
>
> I think it shouldn’t be hard to update it, but please just email the
> mailing list (https://spark.apache.org/community.html) or open an issue
> on JIRA to get the developers who work on that part of the code to look at
> it. I don’t think there will be major incompatibilities between those two
> versions. It may also be possible to configure your project’s build to use
> Janino 3.1 and still link against Spark (overriding its version) by
> packaging Janino 3.1 with your Spark job.
>
>
>
> Matei
>
>
>
> On Jan 17, 2022, at 12:17 AM, Heyde, Andreas 
> wrote:
>
>
>
> Hi Matei,
>
> I hope to have found a good contact person with you.
>
> We are using one of the latest spring-boot versions (2.5.5) with latest
> spark 2.13.
>
>
>
> These spring framework has a hard dependency to janino 3.1.6.
>
> Spark is using 3.0.16.
>
> It was difficult to see. (Step by Step adding dependencies to a dummy
> project)
>
> Some articles on the Internet discussing bypasses the problem.
>
>
>
> Is there a reason, why you using such a older version?!
>
> Is there a way to add your janino dependency only  to logback or using the 
> newest version of janino or specifying a version range [3.0.0,)
>
>
>
>
>
> Regards
>
> /Andreas
>
>
>
>
>
>
>
> Mit freundlichen Grüßen/Kind regards
>
> Andreas Heyde
>
> *DZ BANK AG*
> IT
> Entwicklung Bewertung
> F/ITEB
> Platz der Republik
> 60325 Frankfurt am Main
> Postanschrift
> 60265 Frankfurt am Main
>
> T +49 69 7447 3403
> F +49 69 7447 3477
> mailto:andreas.he...@dzbank.de 
>
>
>
>
>
> *DZ BANK AG*
> Deutsche Zentral-Genossenschaftsbank, Frankfurt am Main
> Platz der Republik, 60325 Frankfurt am Main
> Deutschland / Germany
>
> https://www.dzbank.de
> mailto:m...@dzbank.de , T +49 69 7447 01, F +49 69 7447
> 1685
>
> https://twitter.com/dzbank 
>
> Vorstand/Board of Directors: Uwe Fröhlich
> (Co-Vorstandsvorsitzender/Co-Chief Executive Officer),
> Dr. Cornelius Riese (Co-Vorstandsvorsitzender/Co-Chief Executive Officer),
> Uwe Berghaus,
> Dr. Christian Brauckmann, Ulrike Brouzi, Wolfgang Köhler, Michael Speth,
> Thomas Ullrich
>
> Aufsichtsratsvorsitzender/Chairman of the Supervisory Board: Henning
> Deneke-Jöhrens
> Sitz/Registered Office: Stadt Frankfurt am Main, Amtsgericht Frankfurt am
> Main
> Handelsregister/Register of Companies: HRB 45651.
>
> 

AW: Issue: Spring-Boot vs. Apache Spark Dependencies

2022-01-19 Thread Heyde, Andreas
I had only one wish: that someone have a look at this issue/topic.
The problem is that janino 3.0.16 and 3.1.4 have completely different
package/file structures, and so they are incompatible.
The problem is that this was not done in a major version change; it's a minor
change.

So I don't know which version to use:
the newer one from Spring Boot or the older one from Spark.

I have to exclude one of them explicitly, but I am not the expert on what the
best approach is.

With "mvn dependency:tree" you see that the managed version is overwritten,
but not the package from which it comes.

As I mentioned, there are lots of articles on the internet discussing
workarounds for the problem.



This is the Spark version I am using:


<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.13</artifactId>
  <version>3.2.0</version>
  <exclusions>
    <exclusion>
      <groupId>org.codehaus.janino</groupId>
      <artifactId>commons-compiler</artifactId>
    </exclusion>
    <exclusion>
      <groupId>org.codehaus.janino</groupId>
      <artifactId>janino</artifactId>
    </exclusion>
  </exclusions>
</dependency>


Regards

Von: Sean Owen 
Gesendet: Mittwoch, 19. Januar 2022 14:25
An: Heyde, Andreas 
Cc: user@spark.apache.org
Betreff: Re: Issue: Spring-Boot vs. Apache Spark Dependencies

I did not see your message in the moderation queue, not sure why. Indeed you 
should send to user@, not to individuals. Your message is here now.
Do you mean Spark 3.2.0? Yes, it is on Janino 3.0.16. You can use things like
"mvn dependency:tree" on the project build to see dependencies, by the way.

That does not seem too old, but there are newer versions. Janino is intended as 
an internal dependency for Spark, and I suppose there has not yet been a reason 
to update it.
You are welcome to open a pull request that tries to update to the latest 
version, to see if it works.
You didn't say what the problem is though.


On Wed, Jan 19, 2022 at 6:14 AM Heyde, Andreas <andreas.he...@dzbank.de> wrote:
After sending mail to Matei and yesterday subscribing 
user@spark.apache.org
I cannot see my problem in
iss...@spark.apache.org, past month - Apache Mail 
Archives


Is there something wrong or where I have to write my issue new  or send a 
change request
Regards
/Andreas
Von: Heyde, Andreas
Gesendet: Dienstag, 18. Januar 2022 08:23
An: 'user-subscr...@spark.apache.org'
Betreff: Spring-Boot vs. Apache Spark Dependencies


Hi Andreas,

I think it shouldn’t be hard to update it, but please just email the mailing 
list (https://spark.apache.org/community.html) or open an issue on JIRA to get 
the developers who work on that part of the code to look at it. I don’t think 
there will be major incompatibilities between those two versions. It may also 
be possible to configure your project’s build to use Janino 3.1 and still link 
against Spark (overriding its version) by packaging Janino 3.1 with your Spark 
job.

Matei

On Jan 17, 2022, at 12:17 AM, Heyde, Andreas <andreas.he...@dzbank.de> wrote:

Hi Matei,
I hope I have found the right contact person in you.
We are using one of the latest Spring Boot versions (2.5.5) with the latest
Spark build for Scala 2.13.

The Spring framework has a hard dependency on janino 3.1.6.
Spark is using 3.0.16.
It was difficult to see (step by step adding dependencies to a dummy project).

Some articles on the internet discuss workarounds for the problem.



Is there a reason why you are using such an old version?

Is there a way to add your janino dependency only to logback, or to use the
newest version of janino, or

Re: Issue: Spring-Boot vs. Apache Spark Dependencies

2022-01-19 Thread Sean Owen
I did not see your message in the moderation queue, not sure why. Indeed
you should send to user@, not to individuals. Your message is here now.
Do you mean Spark 3.2.0? Yes, it is on Janino 3.0.16. You can use things
like "mvn dependency:tree" on the project build to see dependencies, by the
way.

That does not seem too old, but there are newer versions. Janino is
intended as an internal dependency for Spark, and I suppose there has not
yet been a reason to update it.
You are welcome to open a pull request that tries to update to the latest
version, to see if it works.
You didn't say what the problem is though.


On Wed, Jan 19, 2022 at 6:14 AM Heyde, Andreas 
wrote:

> After sending mail to Matei and yesterday subscribing
> user@spark.apache.org
>
> I cannot see my problem in
>
> iss...@spark.apache.org, past month - Apache Mail Archives
>
>
>
>
>
> Is there something wrong or where I have to write my issue new  or send a
> change request
>
> Regards
>
> /Andreas
>
> *Von:* Heyde, Andreas
> *Gesendet:* Dienstag, 18. Januar 2022 08:23
> *An:* 'user-subscr...@spark.apache.org' 
> *Betreff:* Spring-Boot vs. Apache Spark Dependencies
>
>
>
>
>
> Hi Andreas,
>
>
>
> I think it shouldn’t be hard to update it, but please just email the
> mailing list (https://spark.apache.org/community.html) or open an issue
> on JIRA to get the developers who work on that part of the code to look at
> it. I don’t think there will be major incompatibilities between those two
> versions. It may also be possible to configure your project’s build to use
> Janino 3.1 and still link against Spark (overriding its version) by
> packaging Janino 3.1 with your Spark job.
>
>
>
> Matei
>
>
>
> On Jan 17, 2022, at 12:17 AM, Heyde, Andreas 
> wrote:
>
>
>
> Hi Matei,
>
> I hope to have found a good contact person with you.
>
> We are using one of the latest spring-boot versions (2.5.5) with latest
> spark 2.13.
>
>
>
> These spring framework has a hard dependency to janino 3.1.6.
>
> Spark is using 3.0.16.
>
> It was difficult to see. (Step by Step adding dependencies to a dummy
> project)
>
> Some articles on the Internet discussing bypasses the problem.
>
>
>
> Is there a reason, why you using such a older version?!
>
> Is there a way to add your janino dependency only  to logback or using the 
> newest version of janino or specifying a version range [3.0.0,)
>
>
>
>
>
> Regards
>
> /Andreas
>
>
>
>
>
>
>
> Mit freundlichen Grüßen/Kind regards
>
> Andreas Heyde
>
> *DZ BANK AG*
> IT
> Entwicklung Bewertung
> F/ITEB
> Platz der Republik
> 60325 Frankfurt am Main
> Postanschrift
> 60265 Frankfurt am Main
>
> T +49 69 7447 3403
> F +49 69 7447 3477
> mailto:andreas.he...@dzbank.de 
>
>
>
>
>
> *DZ BANK AG*
> Deutsche Zentral-Genossenschaftsbank, Frankfurt am Main
> Platz der Republik, 60325 Frankfurt am Main
> Deutschland / Germany
>
> https://www.dzbank.de
> mailto:m...@dzbank.de , T +49 69 7447 01, F +49 69 7447
> 1685
>
> https://twitter.com/dzbank 
>
> Vorstand/Board of Directors: Uwe Fröhlich
> (Co-Vorstandsvorsitzender/Co-Chief Executive Officer),
> Dr. Cornelius Riese (Co-Vorstandsvorsitzender/Co-Chief Executive Officer),
> Uwe Berghaus,
> Dr. Christian Brauckmann, Ulrike Brouzi, Wolfgang Köhler, Michael Speth,
> Thomas Ullrich
>
> Aufsichtsratsvorsitzender/Chairman of the Supervisory Board: Henning
> Deneke-Jöhrens
> Sitz/Registered Office: Stadt Frankfurt am Main, Amtsgericht Frankfurt am
> Main
> Handelsregister/Register of Companies: HRB 45651.
>
> ___
>
> Die mit dieser E-Mail-Kommunikation erhobenen personenbezogenen Daten
> werden ausschließlich zu diesem Zweck bzw. zur Bearbeitung Ihres Anliegens
> weiterverarbeitet.
> Weitere Informationen zum Datenschutz finden Sie unter
> https://www.dzbank.de/datenschutzhinweise
>
> The personal data collected by this e-mail communication
> will be processed exclusively for this purpose or to process your request.
> Further information with regards to your rights under data protection law
> can be found on
> our website at https://www.dzbank.com/dataprotection
>


WG: Issue: Spring-Boot vs. Apache Spark Dependencies

2022-01-19 Thread Heyde, Andreas
see below, got mailing error.

Von: Heyde, Andreas 
Gesendet: Mittwoch, 19. Januar 2022 13:14
An: user@spark.apache.org; user-requ...@spark.apache.org
Betreff: Issue: Spring-Boot vs. Apache Spark Dependencies

After sending mail to Matei and subscribing to user@spark.apache.org
yesterday,
I cannot see my problem in
iss...@spark.apache.org, past month - Apache Mail Archives.


Is there something wrong, or where do I have to post my issue anew or send a
change request?
Regards
/Andreas
Von: Heyde, Andreas
Gesendet: Dienstag, 18. Januar 2022 08:23
An: 'user-subscr...@spark.apache.org'
Betreff: Spring-Boot vs. Apache Spark Dependencies


Hi Andreas,

I think it shouldn’t be hard to update it, but please just email the mailing 
list (https://spark.apache.org/community.html) or open an issue on JIRA to get 
the developers who work on that part of the code to look at it. I don’t think 
there will be major incompatibilities between those two versions. It may also 
be possible to configure your project’s build to use Janino 3.1 and still link 
against Spark (overriding its version) by packaging Janino 3.1 with your Spark 
job.

Matei

On Jan 17, 2022, at 12:17 AM, Heyde, Andreas <andreas.he...@dzbank.de> wrote:

Hi Matei,
I hope I have found the right contact person in you.
We are using one of the latest Spring Boot versions (2.5.5) with the latest
Spark build for Scala 2.13.

The Spring framework has a hard dependency on janino 3.1.6.
Spark is using 3.0.16.
It was difficult to see (step by step adding dependencies to a dummy project).

Some articles on the internet discuss workarounds for the problem.



Is there a reason why you are using such an old version?

Is there a way to add your janino dependency only to logback, to use the
newest version of janino, or to specify a version range [3.0.0,)?


Regards
/Andreas



Mit freundlichen Grüßen/Kind regards

Andreas Heyde

DZ BANK AG
IT
Entwicklung Bewertung
F/ITEB
Platz der Republik
60325 Frankfurt am Main
Postanschrift
60265 Frankfurt am Main

T +49 69 7447 3403
F +49 69 7447 3477
mailto:andreas.he...@dzbank.de




DZ BANK AG
Deutsche Zentral-Genossenschaftsbank, Frankfurt am Main
Platz der Republik, 60325 Frankfurt am Main
Deutschland / Germany

https://www.dzbank.de
mailto:m...@dzbank.de, T +49 69 7447 01, F +49 69 7447 1685

https://twitter.com/dzbank

Vorstand/Board of Directors: Uwe Fröhlich (Co-Vorstandsvorsitzender/Co-Chief 
Executive Officer),
Dr. Cornelius Riese (Co-Vorstandsvorsitzender/Co-Chief Executive Officer), Uwe 
Berghaus,
Dr. Christian Brauckmann, Ulrike Brouzi, Wolfgang Köhler, Michael Speth, Thomas 
Ullrich

Aufsichtsratsvorsitzender/Chairman of the Supervisory Board: Henning 
Deneke-Jöhrens
Sitz/Registered Office: Stadt Frankfurt am Main, Amtsgericht Frankfurt am Main
Handelsregister/Register of Companies: HRB 45651.

___

Die mit dieser E-Mail-Kommunikation erhobenen personenbezogenen Daten
werden ausschließlich zu diesem Zweck bzw. zur Bearbeitung Ihres Anliegens 
weiterverarbeitet.
Weitere Informationen zum Datenschutz finden Sie unter 
https://www.dzbank.de/datenschutzhinweise

The personal data collected by this e-mail communication
will be processed exclusively for this purpose or to process your request.
Further information with regards to your rights under data protection law can 
be found on
our website at https://www.dzbank.com/dataprotection


Issue: Spring-Boot vs. Apache Spark Dependencies

2022-01-19 Thread Heyde, Andreas
After sending mail to Matei and subscribing to user@spark.apache.org
yesterday,
I cannot see my problem in
iss...@spark.apache.org, past month - Apache Mail Archives.


Is there something wrong, or where do I have to post my issue anew or send a
change request?
Regards
/Andreas
Von: Heyde, Andreas
Gesendet: Dienstag, 18. Januar 2022 08:23
An: 'user-subscr...@spark.apache.org' 
Betreff: Spring-Boot vs. Apache Spark Dependencies


Hi Andreas,

I think it shouldn’t be hard to update it, but please just email the mailing 
list (https://spark.apache.org/community.html) or open an issue on JIRA to get 
the developers who work on that part of the code to look at it. I don’t think 
there will be major incompatibilities between those two versions. It may also 
be possible to configure your project’s build to use Janino 3.1 and still link 
against Spark (overriding its version) by packaging Janino 3.1 with your Spark 
job.

Matei

On Jan 17, 2022, at 12:17 AM, Heyde, Andreas <andreas.he...@dzbank.de> wrote:

Hi Matei,
I hope I have found the right contact person in you.
We are using one of the latest Spring Boot versions (2.5.5) with the latest
Spark build for Scala 2.13.

The Spring framework has a hard dependency on janino 3.1.6.
Spark is using 3.0.16.
It was difficult to see (step by step adding dependencies to a dummy project).

Some articles on the internet discuss workarounds for the problem.



Is there a reason why you are using such an old version?

Is there a way to add your janino dependency only to logback, to use the
newest version of janino, or to specify a version range [3.0.0,)?


Regards
/Andreas



Mit freundlichen Grüßen/Kind regards

Andreas Heyde

DZ BANK AG
IT
Entwicklung Bewertung
F/ITEB
Platz der Republik
60325 Frankfurt am Main
Postanschrift
60265 Frankfurt am Main

T +49 69 7447 3403
F +49 69 7447 3477
mailto:andreas.he...@dzbank.de


___

DZ BANK AG
Deutsche Zentral-Genossenschaftsbank, Frankfurt am Main
Platz der Republik, 60325 Frankfurt am Main
Deutschland/Germany

https://www.dzbank.de
mailto:m...@dzbank.de, T +49 69 7447 01, F +49 69 7447 1685

https://twitter.com/dzbank

Vorstand/Board of Directors: Uwe Fröhlich (Co-Vorstandsvorsitzender/Co-Chief 
Executive Officer),
Dr. Cornelius Riese (Co-Vorstandsvorsitzender/Co-Chief Executive Officer), Uwe 
Berghaus,
Dr. Christian Brauckmann, Ulrike Brouzi, Wolfgang Köhler, Michael Speth, Thomas 
Ullrich

Aufsichtsratsvorsitzender/Chairman of the Supervisory Board: Henning 
Deneke-Jöhrens
Sitz/Registered Office: Stadt Frankfurt am Main, Amtsgericht Frankfurt am Main
Handelsregister/Register of Companies: HRB 45651.

___

Die mit dieser E-Mail-Kommunikation erhobenen personenbezogenen Daten
werden ausschließlich zu diesem Zweck bzw. zur Bearbeitung Ihres Anliegens 
weiterverarbeitet.
Weitere Informationen zum Datenschutz finden Sie unter 
https://www.dzbank.de/datenschutzhinweise

The personal data collected by this e-mail communication
will be processed exclusively for this purpose or to process your request.
Further information with regards to your rights under data protection law can 
be found on
our website at https://www.dzbank.com/dataprotection


Re: Self contained Spark application with local master without spark-submit

2022-01-19 Thread Паша
Hi Colin,

Yes, you can. You only need to set the master to "local" and build the jar
with the main class defined in the manifest.
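A sketch of the uber-jar route, assuming an sbt project with the sbt-assembly plugin (the jar path and project name are placeholders; note the Spark dependency must not be marked `provided` for this to work):

```shell
# Build a single fat jar; sbt-assembly writes the Main-Class from
# `assembly / mainClass` into the manifest.
sbt assembly

# Run it directly with plain java, no spark-submit involved; the app
# itself sets .master("local[*]") on its SparkSession builder.
java -jar target/scala-2.12/myapp-assembly-0.1.0.jar
```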




Pasha Finkelshteyn

Developer Advocate for Data Engineering

JetBrains



asm0...@jetbrains.com
https://linktr.ee/asm0dey




On Wed, Jan 19, 2022 at 12:07, Colin Williams <
colin.williams.seat...@gmail.com> wrote:

> Hello,
>
> I noticed I can run spark applications with a local master via sbt run
> and also via the IDE. I'd like to run a single threaded worker
> application as a self contained jar.
>
> What does sbt run employ that allows it to run a local master?
>
> Can I build an uber jar and run without spark-submit?
>
> -
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>
>


Self contained Spark application with local master without spark-submit

2022-01-19 Thread Colin Williams
Hello,

I noticed I can run spark applications with a local master via sbt run
and also via the IDE. I'd like to run a single threaded worker
application as a self contained jar.

What does sbt run employ that allows it to run a local master?

Can I build an uber jar and run without spark-submit?
