Re: Data analysis issues

2023-11-02 Thread Mich Talebzadeh
Hi, your mileage may vary, so to speak. Whether the data you analyze in Spark through RStudio will be seen by Spark's back end depends on how you deploy Spark and RStudio. If you are deploying Spark and RStudio on your own premises or in a private cloud environment, then the data you

Re: Spark / Scala conflict

2023-11-02 Thread Harry Jamison
Thanks Alonso, I think this gives me some ideas. My code is written in Python, and I use spark-submit to submit it. I am not sure what code is written in Scala. Maybe the Phoenix driver, based on the stack trace? How do I tell which version of Scala it was compiled against? Is there a jar
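For cross-published Scala libraries, the Scala binary version is usually encoded in the artifact file name (e.g. `phoenix-spark_2.11-<version>.jar`). A minimal sketch of reading it from a jar name, assuming the jar follows that `_2.xx` naming convention (the helper name is hypothetical, not from this thread):

```python
import re

def scala_binary_version(jar_name):
    # Cross-built Scala artifacts encode the Scala binary version in the
    # file name, e.g. "phoenix-spark_2.11-4.7.0.jar" -> "2.11".
    # Returns None for jars that do not follow the convention.
    m = re.search(r"_(2\.\d{2}|3)(?=[.-])", jar_name)
    return m.group(1) if m else None
```

Jars that are not cross-built (pure Java libraries, or the `scala-library` jar itself) carry no suffix, so a `None` result only means the name is inconclusive.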

RE: jackson-databind version mismatch

2023-11-02 Thread moshik.vitas
Thanks for replying. The issue was the import of spring-boot-dependencies in the dependencyManagement section of my pom, which forced an invalid jar version. I removed this section and got valid Spark dependencies. Regards, Moshik Vitas From: Bjørn Jørgensen Sent: Thursday, 2 November 2023 10:40 To:

Data analysis issues

2023-11-02 Thread Jauru Lin
Hello all, I have a question about Apache Spark. If I use RStudio to connect to Spark to analyze data, will the data I use be seen by Spark's back-end personnel? I hope someone can answer my question. Thanks!

Re: Re: jackson-databind version mismatch

2023-11-02 Thread eab...@163.com
Hi, but in fact, it does have those packages:
D:\02_bigdata\spark-3.5.0-bin-hadoop3\jars
2023/09/09  10:08     75,567 jackson-annotations-2.15.2.jar
2023/09/09  10:08    549,207 jackson-core-2.15.2.jar
2023/09/09  10:08
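To see at a glance which Jackson artifacts a Spark distribution ships, the `jars` directory can be scanned directly. A minimal sketch (the function name is an assumption for illustration, not part of the original thread):

```python
import re
from pathlib import Path

def jackson_jars(jars_dir):
    # Map each jackson-* jar in a Spark "jars" directory to its version,
    # e.g. {"jackson-core": "2.15.2", "jackson-annotations": "2.15.2"}.
    versions = {}
    for jar in sorted(Path(jars_dir).glob("jackson-*.jar")):
        m = re.match(r"(jackson-[a-z.-]+?)-(\d[\w.]*)\.jar$", jar.name)
        if m:
            versions[m.group(1)] = m.group(2)
    return versions
```

Comparing the versions this reports against the Jackson version your application pulls in is one quick way to confirm or rule out a databind mismatch.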

Re: jackson-databind version mismatch

2023-11-02 Thread Bjørn Jørgensen
[SPARK-43225][BUILD][SQL] Remove jackson-core-asl and jackson-mapper-asl from pre-built distribution. On Thu, 2 Nov 2023 at 09:15, Bjørn Jørgensen wrote: > In Spark 3.5.0, jackson-core-asl and jackson-mapper-asl were removed; those > are with groupid

Re: Spark / Scala conflict

2023-11-02 Thread Aironman DirtDiver
The error message Caused by: java.lang.ClassNotFoundException: scala.Product$class indicates that the Spark job is trying to load a class that is not available in the classpath. This can happen if the Spark job is compiled with a different version of Scala than the version of Scala that is used to
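One quick way to spot such a conflict is to check whether the jars on the application's classpath were built against more than one Scala binary version. A minimal sketch using the `_2.xx` artifact-name convention (the directory argument and helper name are assumptions for illustration):

```python
import re
from pathlib import Path

def scala_versions_in(jars_dir):
    # Collect the set of Scala binary versions referenced by cross-built
    # jar names (e.g. "_2.11", "_2.12") in a directory. More than one
    # entry in the result suggests a Scala version conflict.
    versions = set()
    for jar in Path(jars_dir).glob("*.jar"):
        m = re.search(r"_(2\.\d{2}|3)(?=[.-])", jar.name)
        if m:
            versions.add(m.group(1))
    return versions
```

In this thread's case, `scala.Product$class` is the kind of class that exists under Scala 2.11's trait encoding but not under 2.12+, so a mixed result set here would fit the reported stack trace.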

Re: jackson-databind version mismatch

2023-11-02 Thread Bjørn Jørgensen
In Spark 3.5.0, jackson-core-asl and jackson-mapper-asl were removed; those have groupid org.codehaus.jackson. The other jackson-* jars have groupid com.fasterxml.jackson.core. tor. 2.