orbigtuna commented on issue #8322:
URL: https://github.com/apache/incubator-gluten/issues/8322#issuecomment-2560895212

        24/12/24 17:26:36 INFO SparkContext: Running Spark version 3.5.2
        24/12/24 17:26:36 INFO SparkContext: OS info Linux, 3.10.0-1160.el7.x86_64, amd64
        24/12/24 17:26:36 INFO SparkContext: Java version 1.8.0_321
        24/12/24 17:26:36 INFO ResourceUtils: ==============================================================
        24/12/24 17:26:36 INFO ResourceUtils: No custom resources configured for spark.driver.
        24/12/24 17:26:36 INFO ResourceUtils: ==============================================================
        24/12/24 17:26:36 INFO SparkContext: Submitted application: query2.sql
        24/12/24 17:26:36 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 4, script: , vendor: , memory -> name: memory, amount: 2048, script: , vendor: , offHeap -> name: offHeap, amount: 6144, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
        24/12/24 17:26:36 INFO ResourceProfile: Limiting resource is cpus at 4 tasks per executor
        24/12/24 17:26:36 INFO ResourceProfileManager: Added ResourceProfile id: 0
        24/12/24 17:26:36 INFO SecurityManager: Changing view acls groups to:
        24/12/24 17:26:36 INFO SecurityManager: Changing modify acls groups to:
        24/12/24 17:26:37 INFO Utils: Successfully started service 'sparkDriver' on port 38781.
        24/12/24 17:26:37 INFO SparkEnv: Registering MapOutputTracker
        24/12/24 17:26:37 ERROR SparkContext: Error initializing SparkContext.
        java.lang.reflect.InvocationTargetException
                at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
                at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
                at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
                at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
                at org.apache.spark.util.Utils$.instantiateSerializerOrShuffleManager(Utils.scala:2558)
                at org.apache.spark.SparkEnv$.create(SparkEnv.scala:318)
                at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:194)
                at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
                at org.apache.spark.SparkContext.<init>(SparkContext.scala:478)
                at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2883)
                at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:1099)
                at scala.Option.getOrElse(Option.scala:189)
                at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:1093)
                at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
                at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
                at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
                at java.lang.reflect.Method.invoke(Method.java:498)
                at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
                at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:1029)
                at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:194)
                at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:217)
                at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91)
                at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1120)
                at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1129)
                at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
        Caused by: java.lang.NoSuchMethodError: org.apache.spark.shuffle.IndexShuffleBlockResolver.<init>(Lorg/apache/spark/SparkConf;Lorg/apache/spark/storage/BlockManager;)V
                at org.apache.spark.shuffle.sort.ColumnarShuffleManager.<init>(ColumnarShuffleManager.scala:38)
                ... 29 more
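A `NoSuchMethodError` on a constructor like this generally means the jar was compiled against a class whose signature differs from the one actually on the runtime classpath (here, `ColumnarShuffleManager` expects a two-argument `IndexShuffleBlockResolver` constructor that the running Spark build does not provide). One hedged way to confirm such a mismatch is to list the constructors the runtime class really exposes and compare them with the signature in the error message. The sketch below does exactly that via reflection; it probes a JDK class by default because Spark is not on this snippet's classpath, and the Spark class name in the comment is only the one taken from the stack trace above:

```java
import java.lang.reflect.Constructor;

public class ConstructorProbe {
    public static void main(String[] args) throws Exception {
        // Pass the class you suspect is mismatched, e.g.
        // "org.apache.spark.shuffle.IndexShuffleBlockResolver" (from the
        // NoSuchMethodError above), with the relevant jars on the classpath.
        String className = args.length > 0
                ? args[0]
                : "java.lang.StringBuilder"; // JDK stand-in for the demo
        Class<?> cls = Class.forName(className);
        // Print every declared constructor the runtime actually has;
        // compare these against the <init> descriptor in the error message.
        for (Constructor<?> c : cls.getDeclaredConstructors()) {
            System.out.println(c);
        }
    }
}
```

If the two-argument constructor is missing from the output, the Gluten jar and the Spark runtime were built against different Spark versions and need to be aligned.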


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
