conect opened a new issue, #807:
URL: https://github.com/apache/sedona/issues/807

   ## Expected behavior
   A GeoJSON file is read and then written as a DataFrame to a PostgreSQL database.
   
   ## Actual behavior
   The following error occurs:
   ```
   java.lang.IllegalArgumentException: Can't get JDBC type for array<tinyint>
        at org.apache.spark.sql.errors.QueryExecutionErrors$.cannotGetJdbcTypeError(QueryExecutionErrors.scala:674)
        at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.$anonfun$getJdbcType$2(JdbcUtils.scala:181)
        at scala.Option.getOrElse(Option.scala:189)
        at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.getJdbcType(JdbcUtils.scala:181)
        at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.$anonfun$savePartition$5(JdbcUtils.scala:705)
        at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.$anonfun$savePartition$5$adapted(JdbcUtils.scala:705)
   ```
   ## Steps to reproduce the problem
   ```java
   import org.apache.sedona.core.formatMapper.GeoJsonReader;
   import org.apache.sedona.core.serde.SedonaKryoRegistrator;
   import org.apache.sedona.core.spatialRDD.SpatialRDD;
   import org.apache.sedona.sql.utils.Adapter;
   import org.apache.sedona.sql.utils.SedonaSQLRegistrator;
   import org.apache.sedona.viz.sql.utils.SedonaVizRegistrator;
   import org.apache.spark.api.java.JavaSparkContext;
   import org.apache.spark.serializer.KryoSerializer;
   import org.apache.spark.sql.Dataset;
   import org.apache.spark.sql.Row;
   import org.apache.spark.sql.SaveMode;
   import org.apache.spark.sql.SparkSession;

   SparkSession spark = SparkSession.builder()
           .master("local[*]") // delete this if run in cluster mode
           .appName("readTestScala") // change this to a proper name
           // Enable the Sedona custom Kryo serializer
           .config("spark.serializer", KryoSerializer.class.getName())
           .config("spark.kryo.registrator", SedonaKryoRegistrator.class.getName())
           .getOrCreate();
   SedonaSQLRegistrator.registerAll(spark);
   SedonaVizRegistrator.registerAll(spark);

   SpatialRDD spatialRDD1 = GeoJsonReader.readToGeometryRDD(
           JavaSparkContext.fromSparkContext(spark.sparkContext()),
           "./src/main/resources/features.json", false, false);
   spatialRDD1.CRSTransform("epsg:4326", "epsg:3857");
   Dataset<Row> d2 = Adapter.toDf(spatialRDD1, spark);
   d2.show();
   System.out.println(d2.schema());
   d2.write()
           .format("jdbc")
           .mode(SaveMode.Append)
           .option("url", "jdbc:postgresql://somedb")
           .option("dbtable", "sometable")
           .option("user", "someuser")
           .option("password", "somepw")
           .save();
   ```
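   For context, the `array<tinyint>` column in the error is the physical storage of the Sedona geometry type, which Spark's JDBC writer has no type mapping for. One possible workaround (a sketch, not verified against 1.3.1, and assuming the geometry column produced by `Adapter.toDf` is named `geometry`) is to serialize the geometry to WKT with Sedona's `ST_AsText` before the JDBC write, so the writer only sees a string column:

   ```java
   import org.apache.spark.sql.Dataset;
   import org.apache.spark.sql.Row;
   import org.apache.spark.sql.functions;

   // Hypothetical workaround: replace the binary geometry column with its WKT
   // representation before handing the DataFrame to the JDBC writer.
   // ST_AsText is available because SedonaSQLRegistrator.registerAll(spark)
   // was called above.
   Dataset<Row> d2Wkt = d2.withColumn("geometry",
           functions.expr("ST_AsText(geometry)"));
   // d2Wkt can then be written via .format("jdbc") as in the snippet above,
   // and the column cast back to a PostGIS geometry on the database side,
   // e.g. with ST_GeomFromText.
   ```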
   ## Settings
   
   Sedona version = 
    <dependency>
               <groupId>org.apache.sedona</groupId>
               <artifactId>sedona-python-adapter-3.0_2.12</artifactId>
               <version>1.3.1-incubating</version>
           </dependency>
           <dependency>
               <groupId>org.apache.sedona</groupId>
               <artifactId>sedona-viz-3.0_2.12</artifactId>
               <version>1.3.1-incubating</version>
           </dependency>
   
   Apache Spark version = 
     <dependency>
               <groupId>org.apache.spark</groupId>
               <artifactId>spark-sql_2.12</artifactId>
               <version>3.2.1</version>
           </dependency>
   
   Apache Flink version = ?
   
   API type =  Java
   
   JRE version = 1.8
   
   Environment = Standalone


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: dev-unsubscr...@sedona.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
