b7wch opened a new issue, #12832:
URL: https://github.com/apache/hudi/issues/12832

   I think I have found a bug in `HoodieCompactor`: two of its options are registered with the same short flag `-sc`.
   Could anyone confirm this?
   
   ## Source Code
   ```java
       // Both of these options register the short flag "-sc", so JCommander
       // rejects the parameter definitions at startup:
       @Parameter(names = {"--skip-clean", "-sc"}, description = "do not trigger clean after compaction", required = false)
       public Boolean skipClean = true;
       @Parameter(names = {"--schedule", "-sc"}, description = "Schedule compaction", required = false)
       public Boolean runSchedule = false;
       @Parameter(names = {"--mode", "-m"}, description = "Set job mode: Set \"schedule\" means make a compact plan; "
           + "Set \"execute\" means execute a compact plan at given instant which means --instant-time is needed here; "
           + "Set \"scheduleAndExecute\" means make a compact plan first and execute that plan immediately", required = false)
   ```
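   The same class of failure can be reproduced with any argument parser that rejects duplicate flags. A minimal sketch using Python's `argparse` (an analogy only, not Hudi or JCommander code) shows why registering `-sc` twice must fail at parser-construction time, before any arguments are even read:

   ```python
   # Minimal analogy (not Hudi code): argparse, like JCommander, refuses to
   # register the same short flag for two different options.
   import argparse

   parser = argparse.ArgumentParser()
   parser.add_argument("--skip-clean", "-sc", action="store_true",
                       help="do not trigger clean after compaction")
   try:
       # Registering "-sc" a second time conflicts with the first definition,
       # mirroring "Found the option -sc multiple times" in the output below.
       parser.add_argument("--schedule", "-sc", action="store_true",
                           help="Schedule compaction")
   except argparse.ArgumentError as e:
       print(e)  # reports a conflicting option string for -sc
   ```

   Note that the error is raised while the parser is being built, which matches the stack trace below: the exception comes from the `JCommander` constructor, not from parsing the user's command line.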
   
   
   ## Command
   ```shell
   spark-submit --class org.apache.hudi.utilities.HoodieCompactor \
     ${HUDI_REPO}hudi/packaging/hudi-utilities-bundle/target/hudi-utilities-bundle_2.12-1.1.0-SNAPSHOT.jar \
   --jars ${HUDI_JARS}/hudi-spark3.4-bundle_2.12-1.0.0.jar,${HUDI_JARS}/hadoop-aws-3.3.4.jar,${HUDI_JARS}//aws-java-sdk-bundle-1.12.367.jar \
   --conf spark.hadoop.fs.s3a.endpoint=http://127.0.0.1:19000 \
   --conf spark.hadoop.fs.s3a.access.key=xxxxx \
   --conf spark.hadoop.fs.s3a.secret.key=xxxxx \
   --conf spark.hadoop.fs.s3a.path.style.access=true \
   --conf spark.hadoop.fs.s3a.connection.ssl.enable=false \
   --conf spark.hadoop.fs.s3a.impl=org.apache.hadoop.fs.s3a.S3AFileSystem \
   --base-path s3a://test/hudi-test-topic \
   --table-name hudi-test-topic \
   --schema-file s3a://test/schema.avsc \
   --instant-time 20250211112324137 \
   --parallelism 2 \
   --spark-memory 1g
   ```
   ## Output
   ```
   Exception in thread "main" org.apache.hudi.com.beust.jcommander.ParameterException: Found the option -sc multiple times
        at org.apache.hudi.com.beust.jcommander.JCommander.addDescription(JCommander.java:627)
        at org.apache.hudi.com.beust.jcommander.JCommander.createDescriptions(JCommander.java:594)
        at org.apache.hudi.com.beust.jcommander.JCommander.<init>(JCommander.java:249)
        at org.apache.hudi.utilities.HoodieCompactor.main(HoodieCompactor.java:173)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.base/java.lang.reflect.Method.invoke(Method.java:568)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:1020)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:192)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:215)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1111)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1120)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   25/02/12 17:06:22 INFO ShutdownHookManager: Shutdown hook called
   25/02/12 17:06:22 INFO ShutdownHookManager: Deleting directory /private/var/folders/vy/55g0v9hj7k30l58rx3crpgj40000gn/T/spark-91863f32-7ad6-47bc-9277-cc5eae852871
   ```


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]