[ https://issues.apache.org/jira/browse/SPARK-10855?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Reynold Xin resolved SPARK-10855.
---------------------------------
    Resolution: Fixed
      Assignee: Rick Hillegas
 Fix Version/s: 1.6.0

> Add a JDBC dialect for Apache Derby
> ------------------------------------
>
>                 Key: SPARK-10855
>                 URL: https://issues.apache.org/jira/browse/SPARK-10855
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 1.5.0
>            Reporter: Rick Hillegas
>            Assignee: Rick Hillegas
>            Priority: Minor
>             Fix For: 1.6.0
>
>
> In particular, it would be good if the dialect could handle Derby's
> user-defined types. The following script fails:
> {noformat}
> import org.apache.spark.sql._
> import org.apache.spark.sql.types._
>
> // The following script was used to create a Derby table
> // which has a column of user-defined type:
> //
> //   create type properties external name 'java.util.Properties' language java;
> //
> //   create function systemProperties() returns properties
> //   language java parameter style java no sql
> //   external name 'java.lang.System.getProperties';
> //
> //   create table propertiesTable( props properties );
> //
> //   insert into propertiesTable values ( null ), ( systemProperties() );
> //
> //   select * from propertiesTable;
>
> // Spark cannot handle a table which has a column of type
> // java.sql.Types.JAVA_OBJECT:
> //
> //   java.sql.SQLException: Unsupported type 2000
> val df = sqlContext.read.format("jdbc").options(
>   Map("url" -> "jdbc:derby:/Users/rhillegas/derby/databases/derby1",
>       "dbtable" -> "app.propertiesTable")).load()
>
> // shut down the Derby engine
> val shutdown = sqlContext.read.format("jdbc").options(
>   Map("url" -> "jdbc:derby:;shutdown=true",
>       "dbtable" -> "")).load()
> exit()
> {noformat}
> The inability to handle user-defined types probably affects other databases
> besides Derby.
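For context, Spark's JDBC data source picks a per-database {{JdbcDialect}} by URL prefix, and a dialect can map vendor-specific JDBC type codes (such as 2000, {{java.sql.Types.JAVA_OBJECT}}) to Catalyst types instead of throwing "Unsupported type". The sketch below is a self-contained, simplified stand-in for that machinery, not the actual patch committed for this issue; the {{Dialect}}/{{Dialects}} names are hypothetical stand-ins for Spark's {{org.apache.spark.sql.jdbc.JdbcDialect}} and {{JdbcDialects}}, and mapping {{JAVA_OBJECT}} to a binary type is one plausible choice, not necessarily what the fix does:

```scala
import java.sql.Types

// Hypothetical, simplified stand-ins for Spark's JdbcDialect machinery.
// The real classes live in org.apache.spark.sql.jdbc.
trait Dialect {
  // Does this dialect apply to the given JDBC URL?
  def canHandle(url: String): Boolean
  // Some(catalystTypeName) if the dialect maps this JDBC type code, else None
  // (None means "fall back to the generic mapping", which is what fails today).
  def getCatalystType(sqlType: Int): Option[String]
}

object DerbyDialect extends Dialect {
  override def canHandle(url: String): Boolean = url.startsWith("jdbc:derby")
  override def getCatalystType(sqlType: Int): Option[String] = sqlType match {
    // Types.JAVA_OBJECT == 2000, the code in the "Unsupported type 2000" error.
    // Surfacing the UDT as raw serialized bytes is one possible mapping.
    case Types.JAVA_OBJECT => Some("BinaryType")
    case _                 => None
  }
}

object Dialects {
  private val registered = Seq[Dialect](DerbyDialect)
  // First dialect that claims the URL wins, mirroring JdbcDialects.get.
  def get(url: String): Option[Dialect] = registered.find(_.canHandle(url))
}

object Demo extends App {
  val dialect = Dialects.get("jdbc:derby:/tmp/db").get
  println(dialect.getCatalystType(Types.JAVA_OBJECT)) // Some(BinaryType)
  println(dialect.getCatalystType(Types.INTEGER))     // None
}
```

In real Spark code one would extend {{JdbcDialect}} and call {{JdbcDialects.registerDialect(...)}} so the JDBC reader consults the custom mapping before giving up on the type code.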
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org