Spark 3.4.1 with Java 11 performance on k8s serverless/autopilot

2023-08-07 Thread Mich Talebzadeh
and CSP takes care of the rest. FYI, I am running Java 11 and Spark 3.4.1 on the host submitting spark-submit. The Docker file is also built on Java 11, Spark 3.4.1 and PySpark. The tag explains it: spark-py:3.4.1-scala_2.12-11-jre-slim-buster-java11PlusPackages The problem I notice is that cluster
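A hedged sketch of the spark-submit invocation this setup implies, using the image tag quoted above. The master URL, namespace, and application path are placeholders, not taken from the thread. The command is printed rather than executed so it can be reviewed and adapted first.

```shell
# Image tag comes from the message; everything else is a placeholder.
IMAGE="spark-py:3.4.1-scala_2.12-11-jre-slim-buster-java11PlusPackages"
K8S_MASTER="k8s://https://kubernetes.default.svc:443"   # placeholder API server

# Assemble the command as a string so it can be inspected before running.
CMD="spark-submit --master ${K8S_MASTER} --deploy-mode cluster \
--conf spark.kubernetes.container.image=${IMAGE} \
--conf spark.kubernetes.namespace=spark \
local:///opt/spark/work-dir/app.py"
printf '%s\n' "$CMD"
```

The `local://` scheme tells Spark the application file is already inside the container image, which is the usual pattern when the image is custom-built as above.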

Re: Docker images for Spark 3.1.1 and Spark 3.1.2 with Java 11 and Java 8 from docker hub

2022-02-20 Thread Mich Talebzadeh
13:50, Mich Talebzadeh wrote: > I have loaded docker files into my docker repository on docker hub and it > is public. > > > These are built on Spark 3.1.2 OR 3.1.1, with Scala 2.12 and with Java 11 > OR Java 8 on OS jre-slim-buster. The ones built on 3.1.1 with Java

Docker images for Spark 3.1.1 and Spark 3.1.2 with Java 11 and Java 8 from docker hub

2022-02-20 Thread Mich Talebzadeh
I have loaded docker files into my docker repository on docker hub and it is public. These are built on Spark 3.1.2 OR 3.1.1, with Scala 2.12 and with Java 11 OR Java 8 on OS jre-slim-buster. The ones built on 3.1.1 with Java 8 should work with GCP No additional packages are added to PySpark
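The image tags in this repository encode the build matrix (Spark version, Scala version, Java version, base OS). A small illustrative sketch that picks apart a tag of that shape, assuming the same naming scheme as the 3.4.1 image mentioned earlier in this archive; the sample tag here is constructed for illustration, not copied from Docker Hub.

```shell
# Hypothetical tag following the scheme spark_ver-scala_ver-java-...
TAG="3.1.2-scala_2.12-8-jre-slim-buster-java8PlusPackages"

SPARK_VER=${TAG%%-*}     # strip everything after the first dash -> 3.1.2
REST=${TAG#*-}           # drop the Spark version prefix
SCALA_VER=${REST%%-*}    # next dash-delimited field -> scala_2.12

echo "Spark ${SPARK_VER}, ${SCALA_VER}"
# To verify the JVM inside a pulled image (not run here):
#   docker run --rm <repo>:${TAG} java -version
```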

Re: Submitting insert query from beeline failing on executor server with java 11

2021-03-17 Thread kaki mahesh raja
Hi Jungtaek Lim, Thanks for the response, so we have no option but to wait till Hadoop officially supports Java 11. Thanks and regards, kaki mahesh raja -- Sent from: http://apache-spark-user-list.1001560.n3.nabble.com

Re: Submitting insert query from beeline failing on executor server with java 11

2021-03-16 Thread Jungtaek Lim
Hmm... I read the page again, and it looks like we are in a gray area. Hadoop community supports JDK 11 starting from Hadoop 3.3, while we haven't reached adding Hadoop 3.3 as a dependency. It may not make a real issue on runtime with Hadoop 3.x as Spark is using a part of Hadoop (client layer

Re: Submitting insert query from beeline failing on executor server with java 11

2021-03-16 Thread Jungtaek Lim
Hadoop 2.x doesn't support JDK 11. See Hadoop Java version compatibility with JDK: https://cwiki.apache.org/confluence/display/HADOOP/Hadoop+Java+Versions That said, you'll need to use Spark 3.x with Hadoop 3.1 profile to make Spark work with JDK 11. On Tue, Mar 16, 2021 at 10:06 PM Sean Owen
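Following the advice above, a hedged sketch of building Spark from source under JDK 11 with a Hadoop 3 profile. The profile name and paths are assumptions: profile names vary by branch (the Spark 3.0/3.1 branches define `hadoop-3.2`, for example), so check the `pom.xml` of your checkout. The command is printed, not executed, so it can be adapted first.

```shell
JAVA_HOME=/usr/lib/jvm/java-11-openjdk   # placeholder JDK 11 install path
HADOOP_PROFILE="hadoop-3.2"              # assumption; confirm against pom.xml

CMD="JAVA_HOME=${JAVA_HOME} ./build/mvn -DskipTests -P${HADOOP_PROFILE} clean package"
printf '%s\n' "$CMD"
```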

Re: Submitting insert query from beeline failing on executor server with java 11

2021-03-16 Thread Sean Owen
That looks like you didn't compile with Java 11 actually. How did you try to do so? On Tue, Mar 16, 2021, 7:50 AM kaki mahesh raja wrote: > HI All, > > We have compiled spark with java 11 ("11.0.9.1") and when testing the > thrift > server we are seeing that insert

Submitting insert query from beeline failing on executor server with java 11

2021-03-16 Thread kaki mahesh raja
HI All, We have compiled spark with java 11 ("11.0.9.1") and when testing the thrift server we are seeing that insert query from operator using beeline failing with the below error. {"type":"log", "level":"ERROR", "time":"2021-
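Symptoms like this often trace back to class files built by a different JDK than the one at runtime (as the reply above suggests). Bytes 6-7 of a `.class` file hold the class-file major version (52 = Java 8, 55 = Java 11), so you can check what a jar's classes were actually compiled for. The header below is synthetic, written out just to demonstrate the read; point `od` at a real extracted `.class` file in practice.

```shell
# Write a minimal synthetic class-file header: magic CAFEBABE,
# minor version 0, major version 0x37 (= 55, i.e. Java 11).
printf '\312\376\272\276\000\000\000\067' > /tmp/sample_class_header

# Read the two major-version bytes at offset 6 as unsigned decimals.
MAJOR=$(od -An -j6 -N2 -t u1 /tmp/sample_class_header | awk '{print $1*256+$2}')
echo "class file major version: ${MAJOR}"   # prints 55 -> compiled for Java 11
```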

Re: Spark 3.0.1 giving warning while running with Java 11

2021-01-15 Thread Sachit Murarka
Sure Sean. Thanks for confirmation. On Fri, 15 Jan 2021, 10:57 Sean Owen, wrote: > You can ignore that. Spark 3.x works with Java 11 but it will generate > some warnings that are safe to disregard. > > On Thu, Jan 14, 2021 at 11:26 PM Sachit Murarka > wrote: > >> Hi Al

Re: Spark 3.0.1 giving warning while running with Java 11

2021-01-14 Thread Sean Owen
You can ignore that. Spark 3.x works with Java 11 but it will generate some warnings that are safe to disregard. On Thu, Jan 14, 2021 at 11:26 PM Sachit Murarka wrote: > Hi All, > > Getting warning while running spark3.0.1 with Java11 . > > > WARNING: An illegal reflective ac

Spark 3.0.1 giving warning while running with Java 11

2021-01-14 Thread Sachit Murarka
Hi All, Getting a warning while running Spark 3.0.1 with Java 11. WARNING: An illegal reflective access operation has occurred WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform ( file:/opt/spark/jars/spark-unsafe_2.12-3.0.1.jar) to constructor
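Per the replies above, this warning is benign on Java 11, but it can also be silenced by opening the offending modules to unnamed modules. A hedged sketch: the exact `--add-opens` list needed varies by Spark version and workload, so treat these two entries as a starting point rather than the definitive set. The command is printed, not executed.

```shell
# java.base/java.nio covers the DirectByteBuffer constructor named in the
# warning; sun.nio.ch is another commonly needed package (assumption).
OPTS="--add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED"

CMD="spark-submit \
--conf spark.driver.extraJavaOptions='${OPTS}' \
--conf spark.executor.extraJavaOptions='${OPTS}' \
app.py"
printf '%s\n' "$CMD"
```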

11

2020-12-16 Thread 张洪斌
Sent from NetEase Mail Master

Re: Spark Compatibility with Java 11

2020-07-14 Thread Ankur Mittal
Thanks a lot. On Tue, Jul 14, 2020 at 12:51 PM Prashant Sharma wrote: > Hi Ankur, > > Java 11 support was added in Spark 3.0. > https://issues.apache.org/jira/browse/SPARK-24417 > > Thanks, > > > On Tue, Jul 14, 2020 at 6:12 PM Ankur Mittal > wrote: > >>

Re: Spark Compatibility with Java 11

2020-07-14 Thread Prashant Sharma
Hi Ankur, Java 11 support was added in Spark 3.0. https://issues.apache.org/jira/browse/SPARK-24417 Thanks, On Tue, Jul 14, 2020 at 6:12 PM Ankur Mittal wrote: > Hi, > > I am using Spark 2.X and need to execute Java 11 .Its not able to execute > Java 11 using Spark 2.X. > >
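Since Java 11 needs Spark 3.0+ per the reply above, a small guard that parses a `java -version`-style string and flags old JVMs can be useful in launch scripts. The version line is hard-coded here for illustration; in practice capture it with `java -version 2>&1 | head -n1`. Note the simple `sed` pattern assumes the modern `"11.x"` format and would report pre-9 JVMs (`"1.8.0_..."`) as major version 1, which still correctly fails the `>= 11` check.

```shell
VERSION_LINE='openjdk version "11.0.9.1" 2020-11-04'   # sample input

# Extract the first number inside the quotes: the JVM major version.
MAJOR=$(echo "$VERSION_LINE" | sed -E 's/.*"([0-9]+)\..*/\1/')

if [ "$MAJOR" -ge 11 ]; then
    echo "JVM ${MAJOR}: needs Spark 3.x"
else
    echo "JVM ${MAJOR}: too old for Java 11 features"
fi
```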

Spark Compatibility with Java 11

2020-07-14 Thread Ankur Mittal
Hi, I am using Spark 2.X and need to execute Java 11. It's not able to execute Java 11 using Spark 2.X. Is there any way we can use Java 11 with Spark 2.X? Has this issue been resolved in Spark 3.0? -- Regards Ankur Mittal

Re: Java 11 support in Spark 2.5

2020-01-02 Thread Jatin Puri
From this (http://apache-spark-developers-list.1001551.n3.nabble.com/DISCUSS-Spark-2-5-release-td27963.html#a27966), it looks like there is no confirmation yet if at all Spark 2.5 would have JDK 11 support. Spark 3 would most likely be out soon (tentatively this quarter as per m

Java 11 support in Spark 2.5

2020-01-01 Thread Sinha, Breeta (Nokia - IN/Bangalore)
Hi All, Wanted to know if Java 11 support is added in Spark 2.5. If so, what is the expected timeline for Spark 2.5 release? Kind Regards, Breeta Sinha

Re: can Spark 2.4 work on JDK 11?

2018-09-29 Thread Felix Cheung
Not officially. We have seen problem with JDK 10 as well. It will be great if you or someone would like to contribute to get it to work.. From: kant kodali Sent: Tuesday, September 25, 2018 2:31 PM To: user @spark Subject: can Spark 2.4 work on JDK 11? Hi All

can Spark 2.4 work on JDK 11?

2018-09-25 Thread kant kodali
Hi All, can Spark 2.4 work on JDK 11? I feel like there are a lot of features added in JDK 9, 10, 11 that can make the deployment process a whole lot better and of course some more syntax sugar similar to Scala. Thanks!

ApacheCon CFP closing soon (11 February)

2017-01-18 Thread Rich Bowen
Hello, fellow Apache enthusiast. Thanks for your participation, and interest in, the projects of the Apache Software Foundation. I wanted to remind you that the Call For Papers (CFP) for ApacheCon North America, and Apache: Big Data North America, closes in less than a month. If you've been

Re: Schedule lunchtime today for a free webinar IoT data ingestion in Spark Streaming using Kaa 11 a.m. PDT (2 p.m. EDT)

2015-08-04 Thread orozvadovskyy
and Spark. Best wishes, Oleh Rozvadovskyy CyberVision Inc - Original message - From: Oleh Rozvadovskyy orozvadovs...@cybervisiontech.com To: user@spark.apache.org Sent: Thursday, 23 July 2015, 17:48:11 Subject: Schedule lunchtime today for a free webinar IoT data

Schedule lunchtime today for a free webinar IoT data ingestion in Spark Streaming using Kaa 11 a.m. PDT (2 p.m. EDT)

2015-07-23 Thread Oleh Rozvadovskyy
Hi there! Only couple of hours left to our first webinar on* IoT data ingestion in Spark Streaming using Kaa*. During the webinar we will build a solution that ingests real-time data from Intel Edison into Apache Spark for stream processing. This solution includes a client, middleware, and

Re: Guava 11 dependency issue in Spark 1.2.0

2015-01-19 Thread Romi Kuntsman
I have recently encountered a similar problem with Guava version collision with Hadoop. Isn't it more correct to upgrade Hadoop to use the latest Guava? Why are they staying in version 11, does anyone know? *Romi Kuntsman*, *Big Data Engineer* http://www.totango.com On Wed, Jan 7, 2015 at 7:59

Re: Guava 11 dependency issue in Spark 1.2.0

2015-01-19 Thread Romi Kuntsman
On Mon, Jan 19, 2015 at 4:03 PM, Romi Kuntsman r...@totango.com wrote: I have recently encountered a similar problem with Guava version collision with Hadoop. Isn't it more correct to upgrade Hadoop to use the latest Guava? Why are they staying in version 11, does anyone know? *Romi Kuntsman

Re: Guava 11 dependency issue in Spark 1.2.0

2015-01-19 Thread Ted Yu
. Isn't it more correct to upgrade Hadoop to use the latest Guava? Why are they staying in version 11, does anyone know? Romi Kuntsman, Big Data Engineer http://www.totango.com On Wed, Jan 7, 2015 at 7:59 AM, Niranda Perera niranda.per...@gmail.com wrote: Hi Sean, I removed

Re: Guava 11 dependency issue in Spark 1.2.0

2015-01-06 Thread Sean Owen
-dev Guava was not downgraded to 11. That PR was not merged. It was part of a discussion about, indeed, what to do about potential Guava version conflicts. Spark uses Guava, but so does Hadoop, and so do user programs. Spark uses 14.0.1 in fact: https://github.com/apache/spark/blob/master

Re: Guava 11 dependency issue in Spark 1.2.0

2015-01-06 Thread Niranda Perera
Hi Sean, My mistake, Guava 11 dependency came from the hadoop-commons indeed. I'm running the following simple app in spark 1.2.0 standalone local cluster (2 workers) with Hadoop 1.2.1 public class AvroSparkTest { public static void main(String[] args) throws Exception { SparkConf

Re: Guava 11 dependency issue in Spark 1.2.0

2015-01-06 Thread Sean Owen
niranda.per...@gmail.com wrote: Hi Sean, My mistake, Guava 11 dependency came from the hadoop-commons indeed. I'm running the following simple app in spark 1.2.0 standalone local cluster (2 workers) with Hadoop 1.2.1 public class AvroSparkTest { public static void main(String[] args) throws

Guava 11 dependency issue in Spark 1.2.0

2015-01-06 Thread Niranda Perera
:114) While looking into this I found out that Guava was downgraded to version 11 in this PR. https://github.com/apache/spark/pull/1610 In this PR OpenHashSet.scala:261 line hashInt has been changed to hashLong. But when I actually run my app, java.lang.NoSuchMethodError
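For Guava-style clashes like the `NoSuchMethodError` above, one common escape hatch is telling Spark to prefer the user jar's classes over its own. A hedged sketch: the property names changed across releases (Spark 1.2 only had `spark.files.userClassPathFirst` for executors; the driver/executor variants shown here came later), so check the configuration docs for your version. The command is printed, not executed. The class name `AvroSparkTest` is from the thread; the jar name is a placeholder.

```shell
CMD="spark-submit \
--conf spark.driver.userClassPathFirst=true \
--conf spark.executor.userClassPathFirst=true \
--class AvroSparkTest app.jar"
printf '%s\n' "$CMD"
```

The other standard fix, shading the conflicting Guava package into a private namespace at build time, avoids the classloader ordering question entirely and is generally the more robust option.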

Hive 11 / CDH 4.6 / Spark 0.9.1 dilemma

2014-08-06 Thread Anurag Tangri
I posted this in cdh-user mailing list yesterday and think this should have been the right audience for this: = Hi All, Not sure if anyone else faced this same issue or not. We installed CDH 4.6 that uses Hive 0.10. And we have Spark 0.9.1 that comes with Hive 11. Now our hive jobs

Re: Hive 11 / CDH 4.6 / Spark 0.9.1 dilemma

2014-08-06 Thread Sean Owen
this same issue or not. We installed CDH 4.6 that uses Hive 0.10. And we have Spark 0.9.1 that comes with Hive 11. Now our hive jobs that work on CDH, fail in Shark. Anyone else facing same issues and any work-arounds ? Can we re-compile shark 0.9.1 with hive 10 or compile hive 11 on CDH 4.6