Re: Migration to Spark 3.2

2022-01-31 Thread Aurélien Mazoyer
Hi Stephen, I managed to solve my issue: I had a conflicting version of jackson-databind that came in from the parent pom. Thank you, Aurelien

On Sun, 30 Jan 2022 at 23:28, Aurélien Mazoyer wrote:
> Hi Stephen,
>
> Thank you for your answer. Yes, I changed the scope to "provided" but got
> the …
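
For anyone landing here with the same symptom: when a parent pom pins a Jackson version that clashes with Spark's, one common fix is overriding it in the child pom's dependencyManagement, since the child's entry takes precedence over the inherited one. A minimal sketch, assuming Spark 3.2's Jackson line (2.12.3 at the 3.2.0 release; verify against the spark-core pom your build actually resolves):

    <dependencyManagement>
      <dependencies>
        <!-- Pin jackson-databind to the version Spark 3.2 ships
             (2.12.3 here is an assumption; check your resolved
             spark-core pom). This overrides the parent pom's pin. -->
        <dependency>
          <groupId>com.fasterxml.jackson.core</groupId>
          <artifactId>jackson-databind</artifactId>
          <version>2.12.3</version>
        </dependency>
      </dependencies>
    </dependencyManagement>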

Re: Migration to Spark 3.2

2022-01-30 Thread Aurélien Mazoyer
Hi Stephen, Thank you for your answer. Yes, I changed the scope to "provided" but got the same error :-( FYI, I am getting this error while running tests. Regards, Aurelien

On Thu, 27 Jan 2022 at 23:57, Stephen Coy wrote:
> Hi Aurélien,
>
> Your Jackson versions look fine.
>
> What …

Re: Migration to Spark 3.2

2022-01-27 Thread Stephen Coy
Hi Aurélien,

Your Jackson versions look fine. What happens if you change the scope of your Jackson dependencies to “provided”? This should result in your application using the versions provided by Spark and avoid this potential collision.

Cheers,
Steve C

On 27 Jan 2022, at 9:48 pm, Aurélien …
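
A minimal sketch of that scope change for one of the usual Jackson artifacts; the ${jackson.version} property is hypothetical, standing in for whatever version the build already declares:

    <!-- "provided" scope: compiled against, but not packaged, so the
         Jackson classes shipped with Spark win at run time. -->
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-databind</artifactId>
      <version>${jackson.version}</version>
      <scope>provided</scope>
    </dependency>

Note that provided-scope artifacts remain on the unit-test classpath, so if the declared version itself is incompatible, tests will still fail the same way.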

Re: Migration to Spark 3.2

2022-01-27 Thread Aurélien Mazoyer
Hi Stephen, Thank you for your answer! Here it is; it seems that the Jackson dependencies are correct, no? Thanks,

[INFO] com.krrier:spark-lib-full:jar:0.0.1-SNAPSHOT
[INFO] +- com.krrier:backend:jar:0.0.1-SNAPSHOT:compile
[INFO] |  \- com.krrier:data:jar:0.0.1-SNAPSHOT:compile
[INFO] +- …

Re: Migration to Spark 3.2

2022-01-26 Thread Stephen Coy
Hi Aurélien!

Please run mvn dependency:tree and check it for Jackson dependencies. Feel free to respond with the output if you have any questions about it.

Cheers,
Steve C

> On 22 Jan 2022, at 10:49 am, Aurélien Mazoyer wrote:
>
> Hello,
>
> I migrated my code to Spark 3.2 and I am …
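
The full tree can be long; filtering it to Jackson makes version mismatches easier to spot. A usage sketch (the -Dincludes pattern is groupId:artifactId and accepts * wildcards):

    mvn dependency:tree -Dincludes='com.fasterxml.jackson.*'

If the versions listed differ from those in Spark's own pom, that is the collision to chase.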

Re: Migration to Spark 3.2

2022-01-25 Thread Aurélien Mazoyer
Hello, Sorry for asking twice, but does anyone have any idea what issue I could be facing with this dependency problem :-/? Thank you, Aurelien

On Sat, 22 Jan 2022 at 00:49, Aurélien Mazoyer wrote:
> Hello,
>
> I migrated my code to Spark 3.2 and I am facing some issues. When I run my
> unit …

Migration to Spark 3.2

2022-01-21 Thread Aurélien Mazoyer
Hello, I migrated my code to Spark 3.2 and I am facing some issues. When I run my unit tests via Maven, I get this error:

java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.rdd.RDDOperationScope$

which is not super nice. However, when I run my tests via IntelliJ, I get …
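
A note on why this error tends to implicate Jackson rather than Spark itself: "Could not initialize class X$" means the Scala object's static initializer has already thrown once. In the Spark 3.x source, RDDOperationScope's companion object builds a Jackson ObjectMapper (registering the Jackson Scala module) during initialization, so a binary-incompatible Jackson on the test classpath fails right there, and every later touch of the class surfaces as NoClassDefFoundError. One way to surface the competing versions is the verbose dependency tree, which also prints the candidates Maven omitted as conflicts (note that not every 3.x version of the dependency plugin honors the flag):

    mvn dependency:tree -Dverbose -Dincludes='com.fasterxml.jackson.*'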