nd CSP takes care of the
rest. FYI, I am running Java 11 and Spark 3.4.1 on the host submitting
spark-submit. The Docker image is also built on Java 11, Spark 3.4.1 and
PySpark. The tag explains it:
spark-py:3.4.1-scala_2.12-11-jre-slim-buster-java11PlusPackages
The problem I notice is that cluster
13:50, Mich Talebzadeh wrote:
I have loaded Docker files into my Docker repository on Docker Hub and it
is public.
These are built on Spark 3.1.2 OR 3.1.1, with Scala 2.12 and with Java 11
OR Java 8 on OS jre-slim-buster. The ones built on 3.1.1 with Java 8
should work with GCP.
No additional packages are added to PySpark.
Hi Jungtaek Lim,
Thanks for the response. So we have no option but to wait till Hadoop
officially supports Java 11.
Thanks and regards,
kaki mahesh raja
--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com
Hmm... I read the page again, and it looks like we are in a gray area.
The Hadoop community supports JDK 11 starting from Hadoop 3.3, while we
haven't yet added Hadoop 3.3 as a dependency. It may not be a real issue at
runtime with Hadoop 3.x, as Spark is using a part of Hadoop (client layer
Hadoop 2.x doesn't support JDK 11. See Hadoop Java version compatibility
with JDK:
https://cwiki.apache.org/confluence/display/HADOOP/Hadoop+Java+Versions
That said, you'll need to use Spark 3.x with the Hadoop 3.1 profile to make
Spark work with JDK 11.
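(For concreteness, building Spark from source against the Hadoop 3 line is done with a Maven profile. The profile name and Hadoop version below are illustrative rather than exact; Spark 3.0's build documentation lists -Phadoop-3.2, and names vary across releases, so check the build docs for your version.)

```shell
# Sketch: building Spark 3.x from source against a Hadoop 3 profile, which
# is what JDK 11 support requires. The profile name (-Phadoop-3.2 here)
# differs across Spark releases.
./build/mvn -Pyarn -Phadoop-3.2 -Dhadoop.version=3.2.0 -DskipTests clean package
```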
On Tue, Mar 16, 2021 at 10:06 PM Sean Owen wrote:
That looks like you didn't compile with Java 11 actually. How did you try
to do so?
On Tue, Mar 16, 2021, 7:50 AM kaki mahesh raja wrote:
Hi All,
We have compiled Spark with Java 11 ("11.0.9.1") and when testing the Thrift
server we are seeing that an insert query from an operator using Beeline
fails with the below error.
{"type":"log", "level":"ERROR", "time":"2021-
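As an aside, version strings like "11.0.9.1" above and pre-JDK 9 strings like "1.8.0_292" parse differently, which matters when gating JDK-specific behavior. A small self-contained sketch (class and method names are my own, not Spark's):

```java
// Sketch: deriving the JVM major version from a java.version string.
// Pre-JDK 9 strings look like "1.8.0_292"; JDK 9+ look like "11.0.9.1".
public class JvmMajorVersion {
    static int majorOf(String version) {
        String[] parts = version.split("[._]");
        int first = Integer.parseInt(parts[0]);
        // "1.8.0_292" -> second component; "11.0.9.1" -> first component
        return first == 1 ? Integer.parseInt(parts[1]) : first;
    }

    public static void main(String[] args) {
        System.out.println(majorOf(System.getProperty("java.version")));
    }
}
```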
Sure Sean. Thanks for confirmation.
On Fri, 15 Jan 2021, 10:57 Sean Owen wrote:
You can ignore that. Spark 3.x works with Java 11 but it will generate some
warnings that are safe to disregard.
On Thu, Jan 14, 2021 at 11:26 PM Sachit Murarka wrote:
Hi All,
Getting a warning while running Spark 3.0.1 with Java 11:
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (
file:/opt/spark/jars/spark-unsafe_2.12-3.0.1.jar) to constructor
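For what it's worth, on JDK 9+ these warnings can be silenced by explicitly opening the packages Spark reaches into reflectively. A sketch of a spark-defaults.conf fragment; the exact --add-opens list needed varies by workload, so treat the package names below as illustrative:

```
# Illustrative spark-defaults.conf fragment: open the java.base packages
# that Spark's Platform class accesses reflectively, so the JDK 11
# warnings go away. Extend the --add-opens list as needed.
spark.driver.extraJavaOptions   --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED
spark.executor.extraJavaOptions --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED
```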
Sent from NetEase Mail Master
Thanks a lot.
On Tue, Jul 14, 2020 at 12:51 PM Prashant Sharma wrote:
Hi Ankur,
Java 11 support was added in Spark 3.0.
https://issues.apache.org/jira/browse/SPARK-24417
Thanks,
On Tue, Jul 14, 2020 at 6:12 PM Ankur Mittal wrote:
Hi,
I am using Spark 2.x and need to run on Java 11. It is not able to run
Java 11 using Spark 2.x.
Is there any way we can use Java 11 with Spark 2.x?
Has this issue been resolved in Spark 3.0?
--
Regards
Ankur Mittal
From this
(http://apache-spark-developers-list.1001551.n3.nabble.com/DISCUSS-Spark-2-5-release-td27963.html#a27966),
it looks like there is no confirmation yet on whether Spark 2.5 would have
JDK 11 support at all.
Spark 3 would most likely be out soon (tentatively this quarter as per m
Hi All,
Wanted to know if Java 11 support is added in Spark 2.5.
If so, what is the expected timeline for Spark 2.5 release?
Kind Regards,
Breeta Sinha
Not officially. We have seen problems with JDK 10 as well. It would be great
if you or someone would like to contribute to get it to work.
From: kant kodali
Sent: Tuesday, September 25, 2018 2:31 PM
To: user @spark
Subject: can Spark 2.4 work on JDK 11?
Hi All,
Can Spark 2.4 work on JDK 11? I feel like there are a lot of features
added in JDK 9, 10, and 11 that can make the deployment process a whole lot
better, and of course some more syntactic sugar similar to Scala.
Thanks!
Hello, fellow Apache enthusiast. Thanks for your participation, and
interest in, the projects of the Apache Software Foundation.
I wanted to remind you that the Call For Papers (CFP) for ApacheCon
North America, and Apache: Big Data North America, closes in less than a
month. If you've been
Best wishes,
Oleh Rozvadovskyy
CyberVision Inc
- Original message -
From: Oleh Rozvadovskyy orozvadovs...@cybervisiontech.com
To: user@spark.apache.org
Sent: Thursday, 23 July 2015, 17:48:11
Subject: Schedule lunchtime today for a free webinar IoT data
Hi there!
Only a couple of hours left until our first webinar on *IoT data ingestion
in Spark Streaming using Kaa*.
During the webinar we will build a solution that ingests real-time data
from Intel Edison into Apache Spark for stream processing. This solution
includes a client, middleware, and
I have recently encountered a similar problem with a Guava version collision
with Hadoop.
Isn't it more correct to upgrade Hadoop to use the latest Guava? Why are
they staying on version 11, does anyone know?
*Romi Kuntsman*, *Big Data Engineer*
http://www.totango.com
On Wed, Jan 7, 2015 at 7:59 AM, Niranda Perera niranda.per...@gmail.com wrote:
Hi Sean,
I removed
Guava was not downgraded to 11. That PR was not merged. It was part of a
discussion about, indeed, what to do about potential Guava version
conflicts. Spark uses Guava, but so does Hadoop, and so do user programs.
Spark uses 14.0.1 in fact:
https://github.com/apache/spark/blob/master
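When a user program's own Guava has to coexist with the Guava 11 that Hadoop drags in, one common workaround is shading. A hedged sketch of a maven-shade-plugin relocation (the coordinates and the shaded package prefix are illustrative, not from this thread):

```xml
<!-- Illustrative maven-shade-plugin fragment: relocate the app's Guava so
     it cannot collide with the Guava version Hadoop puts on the classpath. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <relocations>
      <relocation>
        <pattern>com.google.common</pattern>
        <shadedPattern>myapp.shaded.com.google.common</shadedPattern>
      </relocation>
    </relocations>
  </configuration>
</plugin>
```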
Hi Sean,
My mistake, the Guava 11 dependency came from hadoop-commons indeed.
I'm running the following simple app on a Spark 1.2.0 standalone local
cluster (2 workers) with Hadoop 1.2.1:
public class AvroSparkTest {
    public static void main(String[] args) throws Exception {
        SparkConf
While looking into this I found out that Guava was downgraded to version 11
in this PR:
https://github.com/apache/spark/pull/1610
In this PR, the hashInt call at OpenHashSet.scala:261 was changed to hashLong.
But when I actually run my app, java.lang.NoSuchMethodError
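That NoSuchMethodError is the classic symptom of compiling against one library version and running against another: the class that resolves on the runtime classpath lacks a method the code was compiled against. A self-contained sketch (names are my own, nothing here is from Spark or Guava) of probing for a method reflectively before relying on it:

```java
import java.lang.reflect.Method;

// Sketch: probe whether the class loaded at runtime actually has the
// method we were compiled against, instead of failing mid-job with
// java.lang.NoSuchMethodError.
public class MethodProbe {
    static boolean hasMethod(Class<?> cls, String name, Class<?>... paramTypes) {
        try {
            Method m = cls.getMethod(name, paramTypes);
            return m != null;
        } catch (NoSuchMethodException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // e.g. check which signature the class on the classpath really exposes
        System.out.println(hasMethod(String.class, "length"));
    }
}
```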
I posted this in the cdh-user mailing list yesterday and think this should
have been the right audience for this:
=
Hi All,
Not sure if anyone else faced this same issue or not.
We installed CDH 4.6 that uses Hive 0.10.
And we have Spark 0.9.1 that comes with Hive 11.
Now our Hive jobs that work on CDH fail in Shark.
Anyone else facing the same issues, and any workarounds?
Can we re-compile Shark 0.9.1 with Hive 10, or compile Hive 11 on CDH 4.6?