Hi,

> We'd like to have the conbench project itself exist as an independent
> open source project so it can be easily used in other projects.

I agree that this would be a problem if we had Conbench in
apache/arrow like Archery. Can we resolve it by creating a
new repository (apache/arrow-conbench or something) like
apache/arrow-rs and apache/arrow-datafusion?

> Regarding the Arrow benchmarks, in principle having these in an Arrow
> repository (along with various other benchmarking infrastructure)
> would make sense. One of the reasons for a separate repository is to
> not bind the benchmarks to a particular git hash of the Arrow codebase
> (so data can be backfilled).

It seems that we can create apache/arrow-benchmarks for this case.

> We have the same issue with collecting
> C++ benchmark data from old versions of the project with newer
> benchmarks (at least when the API has not changed, so things would
> still compile with the old version of the codebase)

Umm, this is difficult.
It's convenient to have the C++ benchmarks in apache/arrow
when we develop C++, because we can easily keep the C++
codebase and the benchmarks in sync there. If the C++
benchmarks are in another repository, it's difficult to
update the C++ codebase and the benchmarks together.

We may be able to use the following workflow:

1. Create a new C++ benchmark in apache/arrow
2. Move the new C++ benchmark from apache/arrow to the
   benchmark repository once the new C++ benchmark is stable
   (for example, once all APIs used by the benchmark are stable)

But this would increase the maintenance cost...
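
To illustrate what "stable" could mean in step 2, here is a
minimal sketch (the benchmark name and sizes are just
illustrative, not something that exists in apache/arrow). It
only uses long-stable Arrow C++ builder APIs plus Google
Benchmark, so it should compile against both old and new
Arrow versions from a separate benchmark repository:

  // Illustrative only: a benchmark that touches nothing but
  // long-stable Arrow C++ builder APIs.
  #include <benchmark/benchmark.h>

  #include <arrow/api.h>

  #include <memory>

  static void BM_BuildInt64Array(benchmark::State& state) {
    const int64_t length = state.range(0);
    for (auto _ : state) {
      arrow::Int64Builder builder;
      for (int64_t i = 0; i < length; ++i) {
        // Append() and Finish() have kept these signatures
        // across many Arrow releases.
        if (!builder.Append(i).ok()) {
          state.SkipWithError("Append failed");
          return;
        }
      }
      std::shared_ptr<arrow::Array> array;
      if (!builder.Finish(&array).ok()) {
        state.SkipWithError("Finish failed");
        return;
      }
      benchmark::DoNotOptimize(array);
    }
    state.SetItemsProcessed(state.iterations() * length);
  }
  BENCHMARK(BM_BuildInt64Array)->Arg(1 << 16);

  BENCHMARK_MAIN();

A benchmark like this could be built in the benchmark
repository against whatever Arrow version we want to
backfill.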


Thanks,
--
kou

In <CAJPUwMAPs8rHL0AdE1-pNnMt1QU4nEttzW4MGzjQQdp0p0q=j...@mail.gmail.com>
  "Re: Announcing Conbench + Arrow" on Mon, 10 May 2021 15:50:51 -0500,
  Wes McKinney <wesmck...@gmail.com> wrote:

> We'd like to have the conbench project itself exist as an independent
> open source project so it can be easily used in other projects. If it
> makes sense to donate to the ASF at some point we are open to the
> discussion! In the meantime, we welcome pull requests.
> 
> Regarding the Arrow benchmarks, in principle having these in an Arrow
> repository (along with various other benchmarking infrastructure)
> would make sense. One of the reasons for a separate repository is to
> not bind the benchmarks to a particular git hash of the Arrow codebase
> (so data can be backfilled). We have the same issue with collecting
> C++ benchmark data from old versions of the project with newer
> benchmarks (at least when the API has not changed, so things would
> still compile with the old version of the codebase)
> 
> On Mon, May 10, 2021 at 3:31 PM Sutou Kouhei <k...@clear-code.com> wrote:
>>
>> Great!
>>
>> Do you have a plan to donate this and Arrow R/Python
>> benchmarks to Apache Arrow project? Should we keep them
>> under https://github.com/ursacomputing/ ?
>>
>> Thanks,
>> --
>> kou
>>
>> In <caezaprbt2i0iv+xabaxfwqhjyj048s16peunofbdkrhakyu...@mail.gmail.com>
>>   "Announcing Conbench + Arrow" on Mon, 10 May 2021 12:11:45 -0600,
>>   Diana Clarke <diana.joan.cla...@gmail.com> wrote:
>>
>> > Hi folks:
>> >
>> > Last week we officially announced a new benchmarking tool called
>> > Conbench with an Arrow integration.
>> >
>> >     https://ursalabs.org/blog/announcing-conbench/
>> >     https://twitter.com/wesmckinn/status/1390324198623547392
>> >
>> > Conbench is a language independent, continuous benchmarking (CB)
>> > framework, built specifically with the needs of a cross-language,
>> > platform-independent, high-performance project like Arrow in mind.
>> >
>> >     https://github.com/ursacomputing/conbench
>> >
>> > On each merge to the main Arrow branch, over 2000 C++, R, and Python
>> > benchmarks are run, and the results are posted to our publicly
>> > available Conbench server (click the following link to see all your
>> > pretty faces and avatars).
>> >
>> >     https://conbench.ursa.dev/
>> >
>> > You can also benchmark your Arrow pull requests with the GitHub
>> > comment: "@ursabot please benchmark". More information about how to
>> > benchmark your pull requests can be found in the blog post linked to
>> > above.
>> >
>> > A *huge* thanks to Elena Henderson for all the orchestration and
>> > behind the scenes system administration & services she built – I
>> > couldn't have asked for a better partner to collaborate on this
>> > project with.
>> >
>> > Thanks also to Jonathan Keane & Neal Richardson for the R benchmarks,
>> > and to Antoine Pitrou, Weston Pace, David Li, Krisztián Szucs, Wes
>> > McKinney, and many others for lending their domain expertise and
>> > support along the way.
>> >
>> > I'll circle back in a month or so with an update on other Conbench
>> > projects currently in flight. Sneak preview: better statistical
>> > analysis courtesy of Jonathan Keane, and Java benchmarks courtesy of
>> > Kazuaki Ishizaki. In the meantime, I wish you all speedy access to
>> > vaccines.
>> >
>> > --diana
