[
https://issues.apache.org/jira/browse/SOLR-18150?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=18069406#comment-18069406
]
ASF subversion and git services commented on SOLR-18150:
--------------------------------------------------------
Commit af21fef102425e2c1b8240634f4eec13eae1e627 in solr's branch
refs/heads/main from David Smiley
[ https://gitbox.apache.org/repos/asf?p=solr.git;h=af21fef1024 ]
SOLR-18150: Add SolrBackend abstraction for test/benchmark deployments (#4214)
Created SolrBackend interface to abstract over different Solr deployment
types (embedded, remote, Jetty, MiniSolrCloudCluster) with a unified API
for collection/configSet management, diagnostics, and node URL access.
JettySolrRunner now manages a HttpJettySolrClient instance, accessible via
getSolrClient().
This PR only introduces the abstraction; follow-on PRs will add callers
in benchmarks and tests.
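To make the shape of the abstraction concrete, here is a rough, hypothetical sketch of a backend interface along the lines described in the issue, together with a toy in-memory implementation. Method names, signatures, and the in-memory class are illustrative assumptions, not the committed API; the real interface also exposes SolrClient instances and diagnostics, which are omitted here to keep the sketch self-contained.

```java
import java.nio.file.Path;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

/**
 * Hypothetical sketch of the proposed backend abstraction.
 * Deliberately neutral on "collection" vs "core": createCollection
 * means whichever the underlying deployment supports.
 */
interface SolrBackend {
  /** Upload/register a configSet from a local directory. */
  void uploadConfigSet(String name, Path configDir);

  boolean configSetExists(String name);

  /**
   * Create a collection (or core) from a configSet, with name-value
   * properties available for substitution when the configs are read.
   */
  void createCollection(String name, String configSet, Map<String, String> properties);

  boolean collectionExists(String name);

  void reloadCollection(String name);
}

/** A toy in-memory implementation, only to illustrate the contract. */
class InMemorySolrBackend implements SolrBackend {
  private final Set<String> configSets = new HashSet<>();
  private final Map<String, String> collections = new HashMap<>(); // name -> configSet

  public void uploadConfigSet(String name, Path configDir) {
    configSets.add(name);
  }

  public boolean configSetExists(String name) {
    return configSets.contains(name);
  }

  public void createCollection(String name, String configSet, Map<String, String> properties) {
    // A configSet must exist before a collection can reference it.
    if (!configSets.contains(configSet)) {
      throw new IllegalStateException("unknown configSet: " + configSet);
    }
    collections.put(name, configSet);
  }

  public boolean collectionExists(String name) {
    return collections.containsKey(name);
  }

  public void reloadCollection(String name) {
    if (!collections.containsKey(name)) {
      throw new IllegalStateException("unknown collection: " + name);
    }
  }
}
```

The exists-checks are what let benchmarks reuse a collection or configSet across runs instead of recreating them.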
Co-authored-by: Claude Opus 4.6 <[email protected]>
> Introduce SolrBackend abstraction for tests & benchmarks
> --------------------------------------------------------
>
> Key: SOLR-18150
> URL: https://issues.apache.org/jira/browse/SOLR-18150
> Project: Solr
> Issue Type: Test
> Components: test-framework
> Reporter: David Smiley
> Assignee: David Smiley
> Priority: Major
> Labels: pull-request-available
> Time Spent: 1h
> Remaining Estimate: 0h
>
> I propose a basic abstraction for tests & benchmarks to interface with some
> kind of running Solr: EmbeddedSolrServer, JettySolrRunner,
> MiniSolrCloudCluster, a remote (SolrJ Http) Solr, and perhaps a variant of
> that coupled to a Docker container. Unlike SolrClientTestRule, it shouldn't
> be tied to JUnit. Having seen many tests, and solr/benchmark, the principal
> things needed are:
> * create a "collection" (or "core" if not in SolrCloud, but using the same
> API). Must specify the configSet and a map of any name-value properties that
> will be used for substitution when reading the configs. The neutrality on
> collection vs core is a key aspect of the whole abstraction.
> * indicate if a collection exists already -- esp. for benchmarks to reuse a
> collection
> * create/upload a configSet given a path to one. Obviously done before
> creating a collection. Reminder: configSets can be used in both Solr modes
> * indicate if a configSet exists already -- esp. for benchmarks to reuse a
> configSet
> * reload a collection
> * provide a SolrClient for doing the work of the benchmark/test scoped to a
> collection. Needs a means of customization (e.g. http1)
> * provide a SolrClient for doing administrative tasks -- tasks *not* in scope
> of the benchmark/test but nonetheless needed to set things up
> * and more can be added of course
> The SolrBackend wouldn't generally be used directly by a benchmark/test,
> but it'd be one layer away. Thus I propose that SolrClientTestRule delegate to
> a SolrBackend, and that SolrBenchState do likewise. Both those
> things can tend to some unique benchmark/test matters. By being a new
> abstraction that isn't tied to JUnit, it should be easy to add support for
> new/different test frameworks.
--
This message was sent by Atlassian Jira
(v8.20.10#820010)