Sergio Lob created PHOENIX-2370:
-----------------------------------
Summary: ResultSetMetaData.getColumnDisplaySize() returns bad values
for varchar and varbinary columns
Key: PHOENIX-2370
URL: https://issues.apache.org/jira/browse/PHOENIX-2370
Project: Phoenix
Issue Type: Bug
Affects Versions: 4.5.0
Environment: Linux lnxx64r6 2.6.32-131.0.15.el6.x86_64 #1 SMP Tue May
10 15:42:40 EDT 2011 x86_64 x86_64 x86_64 GNU/Linux
Reporter: Sergio Lob
ResultSetMetaData.getColumnDisplaySize() returns bad values for varchar and
varbinary columns. Specifically, for the following table:
CREATE TABLE SERGIO (I INTEGER, V10 VARCHAR(10), VHUGE VARCHAR(2147483647),
V VARCHAR, VB10 VARBINARY(10), VBHUGE VARBINARY(2147483647), VB VARBINARY);
1. getColumnDisplaySize() returns 20 for all varbinary columns, regardless of the
declared size. It should return the maximum possible size of the column, so:
getColumnDisplaySize() should return 10 for column VB10,
getColumnDisplaySize() should return 2147483647 for column VBHUGE,
getColumnDisplaySize() should return 2147483647 for column VB, assuming that a
column declared with no size defaults to the maximum size.
2. getColumnDisplaySize() returns 40 for all varchar columns that are declared
without a size, as with column V in the CREATE TABLE above. I would expect a
VARCHAR column declared with no size parameter to default to the maximum
possible size, not to an arbitrary value such as 40.
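A minimal JDBC sketch that reproduces the reported values (the ZooKeeper quorum
in the connection URL is a placeholder, and I is made the PRIMARY KEY here only
because Phoenix requires one on CREATE TABLE; the DDL otherwise mirrors the
table above):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.Statement;

public class DisplaySizeRepro {
    public static void main(String[] args) throws Exception {
        // Placeholder quorum; point this at your own cluster.
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:localhost");
             Statement stmt = conn.createStatement()) {
            stmt.execute("CREATE TABLE SERGIO (I INTEGER PRIMARY KEY, V10 VARCHAR(10), "
                + "VHUGE VARCHAR(2147483647), V VARCHAR, VB10 VARBINARY(10), "
                + "VBHUGE VARBINARY(2147483647), VB VARBINARY)");
            try (ResultSet rs = stmt.executeQuery("SELECT * FROM SERGIO")) {
                ResultSetMetaData md = rs.getMetaData();
                for (int i = 1; i <= md.getColumnCount(); i++) {
                    // Expected: the declared maximum length (2147483647 when unsized);
                    // observed: 20 for every varbinary column and 40 for unsized varchar.
                    System.out.println(md.getColumnName(i) + " -> "
                        + md.getColumnDisplaySize(i));
                }
            }
        }
    }
}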
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)