shubhamvishu commented on code in PR #12799:
URL: https://github.com/apache/lucene/pull/12799#discussion_r1400409812
##########
lucene/core/src/java/org/apache/lucene/codecs/lucene99/Lucene99HnswVectorsFormat.java:
##########
@@ -160,12 +160,12 @@ public Lucene99HnswVectorsFormat(int maxConn, int beamWidth) {
   * @param maxConn the maximum number of connections to a node in the HNSW graph
   * @param beamWidth the size of the queue maintained during graph construction.
   * @param numMergeWorkers number of workers (threads) that will be used when doing merge. If
-  *     larger than 1, a non-null {@link ExecutorService} must be passed as mergeExec
-  * @param mergeExec the {@link ExecutorService} that will be used by ALL vector writers that are
+  *     larger than 1, a non-null {@link TaskExecutor} must be passed as mergeExec
+  * @param mergeExec the {@link TaskExecutor} that will be used by ALL vector writers that are
   *     generated by this format to do the merge
   */
  public Lucene99HnswVectorsFormat(
-      int maxConn, int beamWidth, int numMergeWorkers, ExecutorService mergeExec) {
+      int maxConn, int beamWidth, int numMergeWorkers, TaskExecutor mergeExec) {
Review Comment:
Sure, I'll change it. To be honest, though, I think it may have been better to keep the ExecutorService itself rather than storing the TaskExecutor reference: ExecutorService supports capabilities that TaskExecutor does not yet, e.g. timeouts. I understand there is no such use case on the HNSW indexing side currently, but if one arises in the future, keeping the ExecutorService would make this a bit more extensible. Just wanted to note that point here.
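
To make the timeout point concrete, here is a minimal, hedged sketch using only the JDK (no Lucene types): `ExecutorService.invokeAll` accepts a deadline and cancels any task that has not finished by then, a capability a thinner task-executor abstraction would not necessarily expose. The class and method names below are illustrative, not part of the PR.

```java
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;

public class TimeoutDemo {

  // Returns true when the slow task is cancelled by the timed invokeAll,
  // demonstrating timeout support that is specific to ExecutorService.
  static boolean slowTaskTimedOut() throws InterruptedException {
    ExecutorService pool = Executors.newFixedThreadPool(2);
    try {
      List<Callable<String>> tasks =
          List.of(
              () -> "fast", // completes well within the deadline
              () -> {       // sleeps past the deadline, so it gets cancelled
                Thread.sleep(5_000);
                return "slow";
              });
      // invokeAll with a deadline: futures of unfinished tasks are cancelled.
      List<Future<String>> futures = pool.invokeAll(tasks, 200, TimeUnit.MILLISECONDS);
      return futures.get(1).isCancelled();
    } finally {
      pool.shutdownNow();
    }
  }

  public static void main(String[] args) throws Exception {
    System.out.println("slow task cancelled: " + slowTaskTimedOut());
  }
}
```
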
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]