[ https://issues.apache.org/jira/browse/SPARK-36513?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Xiansen Chen updated SPARK-36513:
---------------------------------
Description:

Hi, I'm trying to use openGauss as a Spark Datasource and am implementing some of the datasource interfaces.

When I try to read a column of VARCHAR type, I get an error like this:

!image-2021-08-14-22-47-56-474.png!

When I use getInt() or getDouble() for columns of other types, the code works fine. Here is some of the code:

{code:java}
object OpenGaussTable {
  /* Table: products */
  val schema: StructType = new StructType().add("name", StringType)
}

...

class OpenGaussPartitionReader(connectionProperties: ConnectionProperties)
  extends PartitionReader[InternalRow] {

  private val connection = DriverManager.getConnection(
    connectionProperties.url,
    connectionProperties.user,
    connectionProperties.password
  )
  private val statement = connection.createStatement()
  private val resultSet =
    statement.executeQuery(s"select * from ${connectionProperties.tableName}")

  override def next(): Boolean = resultSet.next()

  override def get(): InternalRow = InternalRow(resultSet.getString(1))

  override def close(): Unit = connection.close()
}
{code}

The complete code:
[https://pastebin.ubuntu.com/p/vCBssjhc2r/]
[https://pastebin.ubuntu.com/p/jCcDbb79NG/]

I would like to know if there is any solution. Thank you!
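If the error in the screenshot is the usual java.lang.String cannot be cast to org.apache.spark.unsafe.types.UTF8String ClassCastException, the likely cause is that Catalyst's internal row format does not store java.lang.String for StringType columns; it expects UTF8String, which is why getInt() and getDouble() work while getString() does not. A minimal sketch of the change to get() in the reader above, assuming the single StringType column from OpenGaussTable.schema:

{code:java}
import org.apache.spark.sql.catalyst.InternalRow
import org.apache.spark.unsafe.types.UTF8String

// Sketch only: wrap the JDBC String in Catalyst's UTF8String before putting it
// into the InternalRow; a StringType column must not hold a java.lang.String.
override def get(): InternalRow =
  InternalRow(UTF8String.fromString(resultSet.getString(1)))
{code}

The same applies to any other column whose JDBC and Catalyst representations differ, e.g. DECIMAL values need org.apache.spark.sql.types.Decimal and DATE values need the internal Int day count rather than java.math.BigDecimal or java.sql.Date.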
> Using openGauss (like postgres) as Spark Datasource, Spark throws cast errors
> ------------------------------------------------------------------------------
>
>                 Key: SPARK-36513
>                 URL: https://issues.apache.org/jira/browse/SPARK-36513
>             Project: Spark
>          Issue Type: Bug
>          Components: Java API, SQL
>    Affects Versions: 3.3.0
>         Environment: Spark 3.3.0
>                      Scala 2.12
>            Reporter: Xiansen Chen
>            Priority: Major
>         Attachments: image-2021-08-14-22-47-56-474.png
>

--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org