Hi Mathieu!
Mathieu Othacehe writes:
>> How about also adding metrics per build machine? I have the impression,
>> for instance, that the aarch64 machine in my living room is not used.
>> If this is confirmed, we could take appropriate action (uncomment it in
>> /etc/machines.scm :-), compare
Hey Ludo,
> As discussed on IRC, builds per day should be compared to new
> derivations per day. For example, if on a day there’s 100 new
> derivations and we only manage to build 10 of them, we have a problem.
I added this line, and they sadly do not overlap :(
> 2020-09-14T21:16:21 Failed
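The comparison described above (new derivations per day vs. completed builds per day) could be sketched like this; the figures and the helper name are made up for illustration, not taken from Cuirass:

```python
def backlog_days(new_derivations, builds_done):
    """Given two dicts mapping day -> count, return the days on
    which fewer builds completed than new derivations appeared,
    i.e. days on which the build farm fell behind."""
    return sorted(day for day, new in new_derivations.items()
                  if builds_done.get(day, 0) < new)

# Made-up figures: on 2020-09-14 there were 100 new derivations
# but only 10 completed builds.
derivations = {"2020-09-13": 50, "2020-09-14": 100}
builds = {"2020-09-13": 60, "2020-09-14": 10}
print(backlog_days(derivations, builds))  # ['2020-09-14']
```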
Hello Andreas,
> Congratulations, that looks like a very useful start already!
> (And the number of builds has doubled since yesterday, so someone already
> put it to good use!)
Thanks for your feedback :)
> How about also adding metrics per build machine? I have the impression,
> for
Hi!
Mathieu Othacehe writes:
> I just pushed support for computing and displaying metrics in Cuirass. I
> started with two metrics:
>
> * Builds per day
> * Average evaluation speed per specification.
>
> Those metrics can now be seen at:
>
> https://ci.guix.gnu.org/metrics
>
> and are updated
Hi Mathieu,
Really cool!
On Mon, 14 Sep 2020 at 15:35, Mathieu Othacehe wrote:
> * Builds per day
> * Average evaluation speed per specification.
Something interesting could be: min and max (of 100 evaluations).
The standard deviation too, but I am not sure it is easy to
interpret with a
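Such summary statistics over the last 100 evaluation durations could look like the following sketch (plain Python with the standard `statistics` module; the duration figures are made up):

```python
import statistics

def evaluation_summary(durations):
    """Summary statistics over recent evaluation durations (in
    seconds): min, max, mean, and sample standard deviation,
    restricted to the last 100 samples as suggested above."""
    recent = durations[-100:]
    return {
        "min": min(recent),
        "max": max(recent),
        "mean": statistics.mean(recent),
        "stdev": statistics.stdev(recent) if len(recent) > 1 else 0.0,
    }

# Hypothetical durations, in seconds.
print(evaluation_summary([120, 340, 95, 400, 210]))
```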
Hello,
I just pushed support for computing and displaying metrics in Cuirass. I
started with two metrics:
* Builds per day
* Average evaluation speed per specification.
Those metrics can now be seen at:
https://ci.guix.gnu.org/metrics
and are updated every hour.
I plan to add more metrics
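A builds-per-day metric like the one announced could be computed with a simple GROUP BY over build completion times. The sketch below uses an in-memory SQLite database; the `Builds` table and `stoptime` (Unix timestamp) column are assumptions for illustration, not necessarily the actual Cuirass schema:

```python
import sqlite3

# Minimal sketch: count finished builds per day.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE Builds (id INTEGER PRIMARY KEY, stoptime INTEGER)")
# Two builds finishing on 2020-09-14 (UTC), one on 2020-09-15.
db.executemany("INSERT INTO Builds (stoptime) VALUES (?)",
               [(1600041600,), (1600045200,), (1600128000,)])

rows = db.execute(
    "SELECT date(stoptime, 'unixepoch') AS day, COUNT(*) "
    "FROM Builds WHERE stoptime > 0 GROUP BY day ORDER BY day").fetchall()
print(rows)  # [('2020-09-14', 2), ('2020-09-15', 1)]
```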
Hello,
> Agreed. We regularly push commits that are weeks or months old
> (sometimes years), so there might be too many outliers when looking at
> the commit time.
Yes, so I used checkout time instead of commit time with
af12a80599346968fb9f52edb33b48dd26852788.
I also turned Evaluation
Hi,
Christopher Baines writes:
> Mathieu Othacehe writes:
>
>> Hello,
>>
>>> As discussed earlier today on IRC with Clément, we could add performance
>>> monitoring capabilities to Cuirass. Interesting metrics would be:
>>>
>>> • time of push to time of evaluation completion;
>>>
>>> •
Hello,
> As discussed earlier today on IRC with Clément, we could add performance
> monitoring capabilities to Cuirass. Interesting metrics would be:
>
> • time of push to time of evaluation completion;
>
> • time of evaluation completion to time of build completion.
Small update on that
As discussed earlier today on IRC with Clément, we could add performance
monitoring capabilities to Cuirass. Interesting metrics would be:
• time of push to time of evaluation completion;
• time of evaluation completion to time of build completion.
We could visualize that per job over
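The two latencies proposed above could be derived from three timestamps per job, roughly as in this sketch (plain Python; the timestamps and the function name are illustrative, not part of the Cuirass API):

```python
from datetime import datetime

def latencies(push, eval_done, build_done):
    """Return the two latencies (in seconds) discussed above:
    push -> evaluation completion, and evaluation completion ->
    build completion.  Arguments are ISO-8601 timestamps."""
    t0 = datetime.fromisoformat(push)
    t1 = datetime.fromisoformat(eval_done)
    t2 = datetime.fromisoformat(build_done)
    return ((t1 - t0).total_seconds(), (t2 - t1).total_seconds())

# Made-up timestamps for a single job.
print(latencies("2020-09-14T10:00:00",
                "2020-09-14T10:12:30",
                "2020-09-14T11:00:00"))  # (750.0, 2850.0)
```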