tedshim opened a new issue, #3775:
URL: https://github.com/apache/incubator-seatunnel/issues/3775

   ### Search before asking
   
   - [X] I had searched in the 
[issues](https://github.com/apache/incubator-seatunnel/issues?q=is%3Aissue+label%3A%22bug%22)
 and found no similar issues.
   
   
   ### What happened
   
   The S3A connection does not work correctly.
   
   I confirmed that the storage server itself operates normally with separate test code.
   
   The S3-compatible storage runs on our company's own server and is reached through a VPN, so we connect to it by IP address rather than by hostname.
   
   It seems that only AWS-hosted resources were considered; we use our own S3A-compatible server.
   
   
   ### SeaTunnel Version
   
   2.3.0-release
   
   
   ### SeaTunnel Config
   
   ```conf
   source {
       S3File {
           path = "/sample"
           bucket = "s3a://ne-anonymous-sampledata"
           access_key = "XXXXXXX"
           secret_key = "XXXXXXX"
           type = "text"
           hadoop_s3_properties {
               "fs.s3a.aws.credentials.provider" = 
"org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider"
               "fs.s3a.endpoint" = "http://192.168.80.5";
           }
           schema {
               fields {
                   name = string
                   phonenumber = string
                   housing = string
                   code = string
                   city = string
                   borough = string
                   town = string
                   landnumber = string
                   buildingname = string
                   lumpsumdeposit = string
                   deposit = string
                   lease = string
                   exp_priarea = string
                   sup_area = string
               }
           }
           result_table_name = "tmp_table"
       }
   }
   ```
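
   For reference, a minimal sketch of the `hadoop_s3_properties` block for a self-hosted, S3-compatible endpoint. This is an assumption, not a confirmed fix: `fs.s3a.path.style.access` is a standard Hadoop S3A property (it may not be available on older Hadoop 2.7 builds), and the endpoint value is illustrative:

   ```conf
   hadoop_s3_properties {
       "fs.s3a.aws.credentials.provider" = "org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider"
       "fs.s3a.endpoint" = "http://192.168.80.5"
       # Force path-style requests (http://endpoint/bucket/key) instead of
       # virtual-hosted-style (http://bucket.endpoint/key), which an
       # IP-addressed endpoint cannot serve and which often surfaces as 403.
       "fs.s3a.path.style.access" = "true"
   }
   ```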
   
   
   ### Running Command
   
   ```java
   package org.apache.seatunnel.example.spark.v2;
   
   import org.apache.seatunnel.core.starter.exception.CommandException;
   
   import java.io.FileNotFoundException;
   import java.net.URISyntaxException;
   
   public class SeaTunnelApiExample {
   
       public static void main(String[] args) throws FileNotFoundException, URISyntaxException, CommandException {
           // Use the config path passed on the command line, or fall back to the bundled example config.
           String configurePath = args.length > 0 ? args[0] : "/examples/spark.batch.conf";
           ExampleUtils.builder(configurePath);
       }
   }
   ```
   
   
   ### Error Exception
   
   ```log
   xception in thread "main" 
org.apache.seatunnel.core.starter.exception.CommandExecuteException: Status 
Code: 403, AWS Service: Amazon S3, AWS Request ID: RZZS6WCYP4YZ4M9Z, AWS Error 
Code: null, AWS Error Message: Forbidden
        at 
org.apache.seatunnel.core.starter.spark.command.SparkApiTaskExecuteCommand.execute(SparkApiTaskExecuteCommand.java:58)
        at org.apache.seatunnel.core.starter.Seatunnel.run(Seatunnel.java:39)
        at 
org.apache.seatunnel.example.spark.v2.ExampleUtils.builder(ExampleUtils.java:43)
        at 
org.apache.seatunnel.example.spark.v2.SeaTunnelApiExample.main(SeaTunnelApiExample.java:29)
   ```
   
   
   ### Flink or Spark Version
   
   Spark-2.4.8 (with hadoop 2.7)
   
   ### Java or Scala Version
   
   Java 1.8
   
   ### Screenshots
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://www.apache.org/foundation/policies/conduct)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.