Is anyone aware of a way we could set up similar continuous benchmarks for JS? We wrote some benchmarks earlier this year but currently have no automated way of running them.
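In the interim, even something crude like a cron-driven wrapper that runs the suite and archives the JSON output would help. A rough sketch in Python (the node entry point and output layout here are hypothetical, not the actual layout of our JS benchmarks):

# Hypothetical runner for the existing JS benchmarks, assuming they
# can be invoked with node and emit JSON on stdout.
import datetime
import pathlib
import subprocess

def run_js_benchmarks():
    # Timestamped output file so successive runs can be compared later.
    stamp = datetime.datetime.utcnow().strftime("%Y-%m-%dT%H%M%SZ")
    out = pathlib.Path("results") / f"js-{stamp}.json"
    out.parent.mkdir(exist_ok=True)
    # Hypothetical entry point; replace with however the suite is run.
    with out.open("w") as f:
        subprocess.run(["node", "perf/index.js", "--json"],
                       stdout=f, check=True)

if __name__ == "__main__":
    run_js_benchmarks()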

Brian


On 05/11/2018 08:21 PM, Wes McKinney wrote:
Thanks Tom and Antoine!

Since these benchmarks are literally running on a machine in my closet
at home, there may be some downtime in the future. At some point we
should document a process of setting up a new machine from scratch to
be the nightly bare metal benchmark slave.

- Wes

On Fri, May 11, 2018 at 9:08 AM, Antoine Pitrou <solip...@pitrou.net> wrote:
Hi again,

Tom has configured the benchmarking machine to run and publish Arrow's
ASV-based benchmarks.  The latest results can now be seen at:
https://pandas.pydata.org/speed/arrow/
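(For anyone who hasn't looked at the suite: an ASV benchmark is just a
Python class whose time_* methods get timed against a fixture built in
setup(). The example below is illustrative, not an actual benchmark
from the Arrow suite.)

# Illustrative ASV-style benchmark. ASV discovers methods named
# time_*, mem_*, peakmem_*, etc. and times them repeatedly.
import numpy as np
import pyarrow as pa

class ConvertSuite:
    def setup(self):
        # Built once per run; not included in the timings.
        self.data = np.random.randn(1_000_000)

    def time_array_from_numpy(self):
        # ASV records the timing distribution of this call.
        pa.array(self.data)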

I expect these are regenerated on a regular (daily?) basis.

Thanks Tom :-)

Regards

Antoine.


On Wed, 11 Apr 2018 15:40:17 +0200
Antoine Pitrou <anto...@python.org> wrote:
Hello

With the following changes, it seems we might reach the point where
we're able to run the Python-based benchmark suite across multiple
commits (at least those no earlier than the changes themselves):
https://github.com/apache/arrow/pull/1775

To make this truly useful, we would need a dedicated host.  Ideally a
(Linux) OS running on bare metal, with SMT/HyperThreading disabled.
If running virtualized, the VM should have dedicated physical CPU cores.
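
As a sanity check, something like the following could verify the SMT
state and pin the benchmark process to one core on Linux (a rough
sketch; the sysfs path requires a reasonably recent kernel):

import os

def smt_enabled():
    # Reports "on"/"off"/"notsupported" on recent Linux kernels;
    # treat a missing file as unknown.
    try:
        with open("/sys/devices/system/cpu/smt/control") as f:
            return f.read().strip() == "on"
    except FileNotFoundError:
        return None

def pin_to_core(core=2):
    # Pin this process to a single core to reduce scheduler noise.
    os.sched_setaffinity(0, {core})

if __name__ == "__main__":
    print("SMT enabled:", smt_enabled())
    pin_to_core()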

That machine would run the benchmarks on a regular basis (perhaps once
per night) and publish the results in static HTML form somewhere.
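
For concreteness, the nightly job could be as simple as a cron entry
invoking a small driver like this (a sketch, assuming asv is already
configured for the repository; the generated html/ directory could
then be rsync'ed to whatever static host we choose):

import subprocess

def nightly():
    # Benchmark only the commits that appeared since the last run.
    subprocess.run(["asv", "run", "NEW"], check=True)
    # Render the accumulated results as a static HTML site.
    subprocess.run(["asv", "publish"], check=True)

if __name__ == "__main__":
    nightly()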

(note: access to NVIDIA hardware might be nice to have in the future,
but right now there are no CUDA benchmarks in the Python suite)

What should be the procedure here?

Regards

Antoine.

