Sure, let me check Jira.
Regards,
Vaquar khan
On Thu, Jun 21, 2018, 4:42 PM Takeshi Yamamuro wrote:
> In this ticket, SPARK-24201, the ambiguous statement in the doc has been
> pointed out.
> Can you make a PR for that?
>
> On Fri, Jun 22, 2018 at 6:17 AM, vaquar khan wrote:
>
>> https://spark.apache.org/docs/2.3.0/
>>
>> To avoid confusion we need to update the doc: the supported-Java-version
>> wording "Java 8+" is confusing for users.
>>
>> "Spark runs on Java 8+, Python 2.7+/3.4+ and R 3.1+. For the Scala API,
>> Spark 2.3.0 uses Scala 2.11. You will need to use a compatible Scala
>> version."
Hi Rahul,

This will work only with Java 8. The installation does not work with either
version 9 or 10.

Thanks,
Christopher
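For anyone hitting this: a minimal workaround sketch, assuming Ubuntu's OpenJDK 8 package is installed and lives at the usual path (the path below is an example, adjust for your system), is to point JAVA_HOME at a Java 8 install before launching spark-shell:

```shell
#!/bin/sh
# Helper: extract the Java major version from a version string,
# e.g. "1.8.0_171" -> 8, "10.0.1" -> 10.
java_major() {
  case "$1" in
    1.*) echo "$1" | cut -d. -f2 ;;
    *)   echo "$1" | cut -d. -f1 ;;
  esac
}

# Point this shell session at a Java 8 install (example Ubuntu path).
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export PATH="$JAVA_HOME/bin:$PATH"

# java -version   # should now report 1.8.x
# spark-shell     # launches under Java 8
```

This only changes the current session; a permanent switch on Ubuntu would typically go through `update-alternatives` instead.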
--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
I'm not sure we have completed support for Java 10
From: Rahul Agrawal
Sent: Thursday, June 21, 2018 7:22:42 AM
To: user@spark.apache.org
Subject: Spark 2.3.1 not working on Java 10
Dear Team,

I have installed Java 10, Scala 2.12.6 and Spark 2.3.1 on my desktop running
Ubuntu 16.04. I am getting an error opening spark-shell:

Failed to initialize compiler: object java.lang.Object in compiler mirror
not found.

Please let me know if there is any way to run Spark on Java 10.
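In case it helps with diagnosis, here is a quick sketch for checking whether the `java` on your PATH is a Java 8 runtime (the `is_java8` helper is just an illustration, not part of Spark):

```shell
#!/bin/sh
# Return "yes" if the first line of `java -version` output indicates a
# Java 8 runtime (the only major version Spark 2.3.x supports), else "no".
is_java8() {
  case "$1" in
    *'"1.8'*) echo yes ;;   # e.g. java version "1.8.0_171"
    *)        echo no  ;;   # e.g. java version "10.0.1"
  esac
}

# Typical usage (uncomment where a JDK is installed):
# is_java8 "$(java -version 2>&1 | head -n 1)"
```

If this prints "no", the compiler-mirror error above is expected, since Java 9+ removed the rt.jar layout that Scala 2.11's compiler looks for.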