[ https://issues.apache.org/jira/browse/SPARK-36996?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17428105#comment-17428105 ]
Senthil Kumar commented on SPARK-36996:
---------------------------------------

I'm working on this.

> Fix "SQL column nullable setting not retained as part of spark read" issue
> --------------------------------------------------------------------------
>
>                 Key: SPARK-36996
>                 URL: https://issues.apache.org/jira/browse/SPARK-36996
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.0.0, 3.1.0, 3.1.1, 3.1.2
>            Reporter: Senthil Kumar
>            Priority: Major
>
> Columns declared NOT NULL in SQL do not retain their non-nullable type when the table is read through Spark's JDBC data source.
>
> SQL:
> ------------
>
> mysql> CREATE TABLE Persons(Id int NOT NULL, FirstName varchar(255), LastName varchar(255), Age int);
>
> mysql> desc Persons;
> +-----------+--------------+------+-----+---------+-------+
> | Field     | Type         | Null | Key | Default | Extra |
> +-----------+--------------+------+-----+---------+-------+
> | Id        | int          | NO   |     | NULL    |       |
> | FirstName | varchar(255) | YES  |     | NULL    |       |
> | LastName  | varchar(255) | YES  |     | NULL    |       |
> | Age       | int          | YES  |     | NULL    |       |
> +-----------+--------------+------+-----+---------+-------+
>
> But in Spark, every column comes back as nullable:
> =============
> scala> val df = spark.read.format("jdbc")
>          .option("database", "Test_DB")
>          .option("user", "root")
>          .option("password", "")
>          .option("driver", "com.mysql.cj.jdbc.Driver")
>          .option("url", "jdbc:mysql://localhost:3306/Test_DB")
>          .option("dbtable", "Persons")
>          .load()
> df: org.apache.spark.sql.DataFrame = [Id: int, FirstName: string ... 2 more fields]
>
> scala> df.printSchema()
> root
>  |-- Id: integer (nullable = true)
>  |-- FirstName: string (nullable = true)
>  |-- LastName: string (nullable = true)
>  |-- Age: integer (nullable = true)
> =============

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
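Until the JDBC reader propagates nullability, one possible client-side workaround (a sketch only, not the fix this ticket proposes; the column name `Id` is taken from the example table above, and `df` is assumed to be the DataFrame from the JDBC read shown in the issue) is to rebuild the DataFrame with an explicitly corrected schema:

```scala
// Sketch of a workaround: copy the JDBC DataFrame's schema, flipping
// nullable = false on the column(s) the database declares NOT NULL,
// then re-wrap the same rows under the corrected schema.
import org.apache.spark.sql.types.StructType

// Assumes `df` is the DataFrame from the JDBC read above.
val fixedSchema = StructType(df.schema.map { f =>
  if (f.name == "Id") f.copy(nullable = false) else f  // mirror `desc Persons`
})

// createDataFrame(RDD[Row], StructType) applies the schema as given,
// so the Id column is now reported as non-nullable.
val dfFixed = spark.createDataFrame(df.rdd, fixedSchema)
dfFixed.printSchema()
```

Note that this only relabels the schema; Spark does not re-validate the data against it, so it should only be applied when the source column is genuinely NOT NULL.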