[ https://issues.apache.org/jira/browse/SPARK-16528?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15378078#comment-15378078 ]
Sean Owen commented on SPARK-16528:
-----------------------------------

No, but the latest RC could theoretically be the released version, in which case this would not be in for 2.0.0. If there's another RC, it would become part of it. 2.0.1 is the most correct fix version at the moment.

> HiveClientImpl throws NPE when reading database from a custom metastore
> -----------------------------------------------------------------------
>
>                 Key: SPARK-16528
>                 URL: https://issues.apache.org/jira/browse/SPARK-16528
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Jacek Lewandowski
>            Assignee: Jacek Lewandowski
>             Fix For: 2.0.1, 2.1.0
>
> In _HiveClientImpl_ there is a method to create a database:
> {code}
> override def createDatabase(
>     database: CatalogDatabase,
>     ignoreIfExists: Boolean): Unit = withHiveState {
>   client.createDatabase(
>     new HiveDatabase(
>       database.name,
>       database.description,
>       database.locationUri,
>       database.properties.asJava),
>     ignoreIfExists)
> }
> {code}
> The problem is that it assumes {{database.properties}} is non-null, which is not always the case. In fact, when the default database is created, in _HiveMetaStore_ we have:
> {code}
> private void createDefaultDB_core(RawStore ms)
>     throws MetaException, InvalidObjectException {
>   try {
>     ms.getDatabase(DEFAULT_DATABASE_NAME);
>   } catch (NoSuchObjectException e) {
>     Database db = new Database(DEFAULT_DATABASE_NAME, DEFAULT_DATABASE_COMMENT,
>         wh.getDefaultDatabasePath(DEFAULT_DATABASE_NAME).toString(), null);
>     db.setOwnerName(PUBLIC);
>     db.setOwnerType(PrincipalType.ROLE);
>     ms.createDatabase(db);
>   }
> }
> {code}
> As you can see, the parameters field is set to {{null}}.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
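A minimal sketch of the kind of guard such a fix amounts to, in plain Java (class and method names here are hypothetical, not Spark's actual patch): treat a null properties map as empty before converting or iterating it.

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

public class NullSafeProperties {
    // Hypothetical helper: guard against a null parameters map (as
    // HiveMetaStore produces for the default database) by substituting
    // an empty map before any conversion or iteration.
    static Map<String, String> propsOrEmpty(Map<String, String> props) {
        return props == null ? Collections.emptyMap() : props;
    }

    public static void main(String[] args) {
        // The default database created by HiveMetaStore has parameters == null;
        // without the guard, calling size() here would throw NullPointerException.
        Map<String, String> fromMetastore = null;
        System.out.println(propsOrEmpty(fromMetastore).size()); // prints 0

        Map<String, String> withProps = new HashMap<>();
        withProps.put("owner", "spark");
        System.out.println(propsOrEmpty(withProps).size()); // prints 1
    }
}
```

In HiveClientImpl the analogous guard would sit where {{database.properties.asJava}} is evaluated, so a metastore that stores null parameters no longer triggers an NPE.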