risdenk commented on a change in pull request #553: SOLR-9515: Update to Hadoop 3
URL: https://github.com/apache/lucene-solr/pull/553#discussion_r253095505
 
 

 ##########
 File path: solr/core/src/test/org/apache/solr/cloud/hdfs/HdfsTestUtil.java
 ##########
 @@ -62,73 +63,88 @@
   public static MiniDFSCluster setupClass(String dir) throws Exception {
     return setupClass(dir, true, true);
   }
-  
+
   public static MiniDFSCluster setupClass(String dir, boolean haTesting) throws Exception {
     return setupClass(dir, haTesting, true);
   }
-  
+
+  /**
+   * Checks that commons-lang3 FastDateFormat works with configured locale
+   */
+  @SuppressForbidden(reason="Call FastDateFormat.format same way Hadoop calls it")
+  private static void checkFastDateFormat() {
+    try {
+      FastDateFormat.getInstance().format(System.currentTimeMillis());
+    } catch (ArrayIndexOutOfBoundsException e) {
+      LuceneTestCase.assumeNoException("commons-lang3 FastDateFormat doesn't work with " +
+          Locale.getDefault().toLanguageTag(), e);
+    }
+  }
+
+  /**
+   * Hadoop fails to generate locale-agnostic ids; checks that the generated string matches
+   */
+  private static void checkGeneratedIdMatches() {
 
 Review comment:
  Hadoop generates filename ids with `String.format` without setting a `Locale`, then uses a regex to match the result. This fails under locales such as `th-TH-u-nu-thai-x-lvariant-TH` and `hi-IN`, where number formatting can produce non-ASCII digits. This check performs the same formatting Hadoop does, to make sure the generated id actually matches.
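
  The mismatch this check guards against can be reproduced in isolation. Below is a minimal standalone sketch (the `block-%d` id pattern is illustrative, not Hadoop's actual format string) of how `String.format` under a non-Latin-digit locale can defeat an ASCII-digit regex, and how pinning the locale avoids it:

  ```java
  import java.util.Locale;

  public class LocaleIdDemo {
      public static void main(String[] args) {
          // Explicit Thai-numeral locale: %d may render with Thai digits here,
          // so a regex written for ASCII digits will not match the result.
          Locale thai = Locale.forLanguageTag("th-TH-u-nu-thai");
          String localeId = String.format(thai, "block-%d", 12345);
          System.out.println(localeId + " -> matches [0-9]+: "
              + localeId.matches("block-[0-9]+"));

          // Formatting with Locale.ROOT pins the output to ASCII digits, so
          // the same regex matches regardless of the JVM's default locale.
          String rootId = String.format(Locale.ROOT, "block-%d", 12345);
          System.out.println(rootId + " -> matches [0-9]+: "
              + rootId.matches("block-[0-9]+"));
      }
  }
  ```

  Formatting with an explicit `Locale.ROOT` (or matching against the same locale used to format) is the usual way to keep generated ids and the regexes that parse them in agreement.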

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
