dsmiley commented on a change in pull request #324:
URL: https://github.com/apache/solr/pull/324#discussion_r794173884



##########
File path: solr/core/src/java/org/apache/solr/update/UpdateHandler.java
##########
@@ -115,12 +115,12 @@ public UpdateHandler(SolrCore core, UpdateLog updateLog) {
     boolean skipUpdateLog = core.getCoreDescriptor().getCloudDescriptor() != null && !core.getCoreDescriptor().getCloudDescriptor().requiresTransactionLog();
     if (updateLog == null && ulogPluginInfo != null && ulogPluginInfo.isEnabled() && !skipUpdateLog) {
       DirectoryFactory dirFactory = core.getDirectoryFactory();
-      if (dirFactory instanceof HdfsDirectoryFactory) {
-        ulog = new HdfsUpdateLog(((HdfsDirectoryFactory)dirFactory).getConfDir());
-      } else {
-        ulog = ulogPluginInfo.className == null ? new UpdateLog():
-                core.getResourceLoader().newInstance(ulogPluginInfo, UpdateLog.class, true);
-      }
+
+      // if the update log class is not defined in the plugin info / solrconfig.xml
+      // (like <updateLog class="${solr.ulog:solr.UpdateLog}"> )
+      // we fall back to using the one which is the default for the given directory factory
+      ulog = ulogPluginInfo.className == null ? dirFactory.newDefaultUpdateLog() :

Review comment:
But then `ulogPluginInfo`, which might have settings to be applied, would be ignored. Either `newDefaultUpdateLog` should take the pluginInfo as a parameter, or it should just return the class (or class name) to be instantiated (and maybe tweak the name to `getDefaultUpdateLogClassName`). I think the latter is cleaner.
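
For illustration, the second option might look roughly like this. This is only a sketch: `getDefaultUpdateLogClassName` is the name proposed above, and the exact `SolrResourceLoader.newInstance` overload used is an assumption.

```java
// Hypothetical sketch of the "return the class name" variant, not the actual patch.

// In DirectoryFactory (HdfsDirectoryFactory would override this):
public String getDefaultUpdateLogClassName() {
  return UpdateLog.class.getName();
}

// In UpdateHandler: only the class name comes from the directory factory;
// instantiation stays here, so settings carried by ulogPluginInfo can
// still be applied as before.
String ulogClassName = ulogPluginInfo.className != null
    ? ulogPluginInfo.className
    : dirFactory.getDefaultUpdateLogClassName();
ulog = core.getResourceLoader().newInstance(ulogClassName, UpdateLog.class);
```

This keeps the HDFS-specific choice inside the factory while leaving all the plugin wiring in `UpdateHandler`.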

##########
File path: solr/modules/hdfs/README.md
##########
@@ -0,0 +1,31 @@
+Apache Solr HDFS Module
+===============================
+
+Introduction
+------------
+This module implements support for the Hadoop Distributed File System (HDFS) in Apache Solr.
+
+Building
+--------
+The HDFS module uses the same Gradle build as the core Solr components. 
+
+To build the module, you can use:
+
+```
+./gradlew :solr:modules:hdfs:assemble
+```
+
+The resulting module JAR will be placed in the `build/libs` directory, for example:
+`solr/modules/hdfs/build/libs/solr-hdfs-9.0.0-SNAPSHOT.jar`
+
+To execute the module tests:
+
+```
+./gradlew :solr:modules:hdfs:test
+```
+
+Usage
+-----
+To use the module, the JAR needs to be placed on the Solr web application classpath (for example, by symlinking it into the WEB-INF/lib directory).

Review comment:
IMO, unless there are unique constraints about HDFS's JARs, we can simply refer to the ref guide (`solr-plugins.adoc`) on how plugins are installed. There are a number of options.

##########
File path: 
solr/modules/hdfs/src/test/org/apache/solr/store/blockcache/BlockCacheTest.java
##########
@@ -257,7 +259,7 @@ public void testCacheConcurrent() throws Exception {
           return;
         }
       }
-      assertEquals("cache key differs from value's key", k, (Long) v.key);
+      Assert.assertEquals("cache key differs from value's key", k, (Long) v.key);

Review comment:
minor: you could just add static imports for this stuff. I'm not sure why these changes are happening.
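
For example, a static import would keep the existing call sites untouched (a minimal sketch using the assertion quoted above):

```java
import static org.junit.Assert.assertEquals;

// ...the call site then stays exactly as it was:
assertEquals("cache key differs from value's key", k, (Long) v.key);
```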

##########
File path: solr/solr-ref-guide/src/solr-on-hdfs.adoc
##########
@@ -16,11 +16,12 @@
 // specific language governing permissions and limitations
 // under the License.
 
-WARNING: Storing indexes in HDFS is deprecated and may be be removed in 9.0.
-This functionality may be moved to a 3rd-party plugin in the future.
 
 Solr has support for writing and reading its index and transaction log files to the HDFS distributed filesystem.
 
+Starting from 9.0, this functionality has been moved from Solr Core to the HDFS contrib module.

Review comment:
       Not "contrib"

##########
File path: gradle/validation/validate-source-patterns.gradle
##########
@@ -108,6 +108,11 @@ allprojects {
 configure(project(':solr:core')) {
   project.tasks.withType(ValidateSourcePatternsTask) {
     sourceFiles.exclude 'src/**/CheckLoggingConfiguration.java'
+  }
+}
+
+configure(project(':solr:modules:hdfs')) {
+  project.tasks.withType(ValidateSourcePatternsTask) {

Review comment:
Just an observation: this will go away with the Hadoop upgrade that will follow, if I recall correctly.



