Hi, I read through the benchmarking wiki page today. It was well written; thanks, Kevin, for reviving this effort in such a structured way!
There's renewed energy around benchmarking, and more tooling options than ever. So, to add to the mix, I'm presenting yet another one 🤣🤣: "Solr Benchmark": https://github.com/janhoy/solr-benchmark

You may quickly notice that it looks familiar. Yes, it is a fork/port of Rally / OpenSearch Benchmark, adapted to provision and benchmark Solr clusters using the same datasets and "workloads" as those tools. I first attempted a fork two or three years ago, but it stalled. I made a new attempt using LLM agents, and this first working version was put together in a few afternoons.

Even though the foundation is solid and proven over many years, this initial port is not complete; view it as an MVP and a work in progress. Only one workload/dataset has been ported so far.

Take it for a spin... I'll defer to Kevin to add it to the wiki's tool list and to continue the analysis effort with this as one of the contenders.

Jan

> On 2 Mar 2026, at 17:13, Kevin Liang (BLOOMBERG/ 919 3RD A)
> <[email protected]> wrote:
>
> Hello everyone,
>
> Given the recent interest and discussion around Solr performance
> benchmarking, I figured it would be useful to 1) centralize the discussion
> and 2) bring it to a long-lived format (that's not email). So with that, I
> have started
> https://cwiki.apache.org/confluence/display/SOLR/Solr+Performance+Benchmarking
> (I figured there's still more discussion to be had before it becomes a SIP
> with technical requirements).
>
> I encourage anyone and everyone who is interested to provide their input
> (comment or edit). This is a community initiative, and shouldn't be limited
> by me or any biases I may have. Hopefully people find this useful in moving
> the discussion forward.
>
> -Kevin
