Github user DaveBirdsall commented on a diff in the pull request:

    https://github.com/apache/trafodion/pull/1634#discussion_r201148299
  
    --- Diff: docs/sql_reference/src/asciidoc/_chapters/sql_language_elements.adoc ---
    @@ -396,21 +396,22 @@ length of 8 bytes.
     * _precision_ specifies the allowed number of decimal digits.
     
     
    -1.  The size of a column that allows null values is 2 bytes larger than the size for the defined data type.
    -2.  The maximum row size is 32708 bytes, but the actual row size is less than that because of bytes used by
    -null indicators, varchar column length indicators, and actual data encoding.
    -3.  Storage size is the same as that required by CHAR data type but store only half as many characters depending
    -on character set selection.
    -4.  Storage size is reduced by 4 bytes for storage of the varying character length.
    -5.  The maximum number of digits in an INTERVAL value is 18, including the digits in all INTERVAL fields of the value.
    +1. The size of a column that allows null values is 2 bytes larger than the size for the defined data type.
    +2. Storage size is the same as that required by CHAR data type but store only half.
    --- End diff --
    
    Perhaps what you mean in footnote 2 is: "The maximum number of characters 
depends on the character set. For 8-bit character sets such as ISO88591, the 
maximum number of characters is equal to the maximum storage size in bytes. For 
16-bit character sets such as UCS2, the maximum number of characters is half 
the maximum storage size in bytes. For character sets requiring up to 32 bits 
per character such as UTF8, the maximum number of characters is one fourth the 
maximum storage size in bytes."
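    The rule in that suggested wording can be sketched as a small illustration (hypothetical helper code, not anything from the Trafodion codebase; the byte widths per character set follow the footnote text above):
    
    ```python
    # Illustration of the suggested footnote: the maximum number of characters
    # a column can hold is its maximum storage size in bytes divided by the
    # maximum bytes needed per character for its character set.
    MAX_BYTES_PER_CHAR = {
        "ISO88591": 1,  # 8-bit character set
        "UCS2": 2,      # 16-bit character set
        "UTF8": 4,      # up to 32 bits per character
    }
    
    def max_characters(storage_bytes: int, charset: str) -> int:
        """Maximum character count for a given storage size and character set."""
        return storage_bytes // MAX_BYTES_PER_CHAR[charset]
    
    if __name__ == "__main__":
        for cs in ("ISO88591", "UCS2", "UTF8"):
            print(cs, max_characters(32000, cs))
    ```
    
    So for the same declared storage, UCS2 holds half as many characters as ISO88591, and UTF8 one fourth, which is the distinction the footnote should spell out.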

