The intent of the code is absolutely to use the two-digit year start to handle two-digit entries. If that isn't happening, it's a pretty bad bug.
-- Adam

On 3/8/07, Yee-wah Lee <[EMAIL PROTECTED]> wrote:
Hi all,

I see the following code in the (server-side) DateTimeConverter and think it may be problematic.

1) The default style for dates is "shortish", which forces the year pattern to be at least 4 digits (http://incubator.apache.org/adffaces/trinidad-api/apidocs/org/apache/myfaces/trinidad/convert/DateTimeConverter.html):

   "A new dateStyle, shortish, has been introduced. Shortish is identical to short but forces the year to be a full four digits. If dateStyle is not set, then dateStyle defaults to shortish."

2) Accordingly, the converter rewrites the pattern on its DateFormat to use at least 4 year digits whenever 'y' appears at all. The method is _get4YearFormat().

3) Now, the Javadoc for SimpleDateFormat states (http://java.sun.com/j2se/1.5.0/docs/api/java/text/SimpleDateFormat.html):

   "For parsing, if the number of pattern letters is more than 2, the year is interpreted literally, regardless of the number of digits. So using the pattern "MM/dd/yyyy", "01/11/12" parses to Jan 11, 12 A.D."

Why I think this is a problem:

* From the user's perspective, if they enter a date like '1/31/07', it becomes January 31st, 7 A.D. rather than the (more likely intended) January 31st, 2007.
* The code also seems to intend otherwise, because it calls DateFormat's set2DigitYearStart(), which only applies when parsing 2-digit years.

Does anyone have more background on this? I was able to reproduce the behavior using an inputText bound to a backing Date value, with a DateTimeConverter attached to it.

Thanks,
Yee-Wah
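For anyone who wants to see the two behaviors side by side, here is a minimal standalone sketch (class name is mine; the behavior follows the SimpleDateFormat Javadoc quoted above). It shows that a "yyyy" pattern parses "07" literally as year 7 A.D., while set2DigitYearStart() only kicks in for a two-letter "yy" pattern:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.Date;

public class YearParseDemo {
    public static void main(String[] args) throws ParseException {
        Calendar cal = Calendar.getInstance();

        // Four-letter year pattern: the year is interpreted literally,
        // so "07" becomes the year 7 A.D., not 2007.
        SimpleDateFormat fourDigit = new SimpleDateFormat("MM/dd/yyyy");
        Date literal = fourDigit.parse("01/31/07");
        cal.setTime(literal);
        System.out.println("yyyy pattern -> year " + cal.get(Calendar.YEAR)); // 7

        // Two-letter year pattern: set2DigitYearStart() defines the
        // 100-year window used to interpret two-digit years.
        SimpleDateFormat twoDigit = new SimpleDateFormat("MM/dd/yy");
        Calendar windowStart = Calendar.getInstance();
        windowStart.set(1950, Calendar.JANUARY, 1);
        twoDigit.set2DigitYearStart(windowStart.getTime());
        Date windowed = twoDigit.parse("01/31/07");
        cal.setTime(windowed);
        System.out.println("yy pattern   -> year " + cal.get(Calendar.YEAR)); // 2007
    }
}
```

So once the converter forces the pattern to "yyyy" via _get4YearFormat(), the set2DigitYearStart() call it makes has no effect on parsing, which matches the behavior Yee-Wah observed.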