[ https://issues.apache.org/jira/browse/HAWQ-302?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Ruilong Huo updated HAWQ-302:
-----------------------------
    Affects Version/s: 2.0.0-beta-incubating

> hawq_toolkit.hawq_size_of_database generate warning "get hdfsFileInfo
> numEntries invalid in gpfs_hdfs_freefileinfo" on large cluster
> ------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: HAWQ-302
>                 URL: https://issues.apache.org/jira/browse/HAWQ-302
>             Project: Apache HAWQ
>          Issue Type: Bug
>          Components: Storage
>    Affects Versions: 2.0.0-beta-incubating
>            Reporter: Ruilong Huo
>            Assignee: Ruilong Huo
>            Priority: Minor
>
> When trying to get the database size using hawq_toolkit.hawq_size_of_database, it
> generates the warning "get hdfsFileInfo numEntries invalid in
> gpfs_hdfs_freefileinfo" on a large cluster.
> {noformat}
> gptest=# select * from hawq_toolkit.hawq_size_of_database ;
> WARNING:  get hdfsFileInfo numEntries invalid in gpfs_hdfs_freefileinfo
> WARNING:  get hdfsFileInfo numEntries invalid in gpfs_hdfs_freefileinfo
> WARNING:  get hdfsFileInfo numEntries invalid in gpfs_hdfs_freefileinfo
> ...
> WARNING:  get hdfsFileInfo numEntries invalid in gpfs_hdfs_freefileinfo
> WARNING:  get hdfsFileInfo numEntries invalid in gpfs_hdfs_freefileinfo
> WARNING:  get hdfsFileInfo numEntries invalid in gpfs_hdfs_freefileinfo
>           sodddatname          |  sodddatsize
> -------------------------------+----------------
>  gptest                        |      181469192
> (1 rows)
> {noformat}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
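The warning text suggests that HAWQ's HDFS cleanup path (`gpfs_hdfs_freefileinfo`) validates the `numEntries` count it is handed before freeing the `hdfsFileInfo` array returned by libhdfs's `hdfsListDirectory`, and warns instead of freeing when the count is invalid. As a rough illustration only, the following Python sketch simulates that guard; the function name is borrowed from the warning, but the logic and signature are assumptions, not HAWQ's actual implementation:

```python
def gpfs_hdfs_freefileinfo(entries, num_entries):
    """Simulated guard (assumption, not HAWQ source code).

    `entries` stands in for the hdfsFileInfo array returned by
    hdfsListDirectory(); `num_entries` is the count reported alongside it.
    If the count is invalid (missing array, negative, or inconsistent with
    the array), warn and skip the free, mirroring the observed message.
    Returns True when the simulated free happens, False when it is skipped.
    """
    if entries is None or num_entries < 0 or num_entries != len(entries):
        print("WARNING:  get hdfsFileInfo numEntries invalid "
              "in gpfs_hdfs_freefileinfo")
        return False
    entries.clear()  # stands in for hdfsFreeFileInfo(info, numEntries)
    return True


# A stale or negative count trips the warning; a consistent count frees cleanly.
listing = [{"name": "/hawq/16385/1", "size": 181469192}]
gpfs_hdfs_freefileinfo(listing, -1)          # invalid: warns, nothing freed
gpfs_hdfs_freefileinfo(listing, len(listing))  # valid: frees silently
```

On a large cluster this guard would fire once per directory whose reported entry count is bad, which matches the repeated warnings in the transcript above while the query itself still returns a result.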