> To avoid a DHT problem ;) it may be better to take this sorted list and
> assign tests in a cyclic fashion so that all chunks take roughly the same
> amount of time to complete, rather than being skewed by the hash?
I assume you don't really mean cyclic, because that would *guarantee* a result
Jeff, Nigel,
We do get the time a test takes to run as part of the output of the test
runs (at least on CentOS runs, as I checked here [1]).
To avoid a DHT problem ;) it may be better to take this sorted list and
assign tests in a cyclic fashion so that all chunks take roughly the same
amount of time to complete, rather than being skewed by the hash?
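For illustration, here is a rough sketch of that cyclic assignment. The
tests_by_time.txt file and the NUMBER_OF_CHUNKS / MY_CHUNK_ID variables are
made-up names; assume one "duration test-name" pair per line, sorted
longest-first:

# Sketch only: walk the duration-sorted list and hand out tests round-robin,
# so consecutive (similarly long) tests land on different chunks.
number_of_chunks=${NUMBER_OF_CHUNKS:-4}
my_chunk_id=${MY_CHUNK_ID:-0}
i=0
while read -r duration testfile; do
    # test 0 -> chunk 0, test 1 -> chunk 1, ... wrapping around
    if [ $((i % number_of_chunks)) -eq "$my_chunk_id" ]; then
        bash "$testfile"
    fi
    i=$((i + 1))
done < tests_by_time.txt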
With regard to assigning files to chunks, I suggest we start by using an
algorithm similar to the one we use in DHT.
hash=$(cat $filename | md5sum | cut -d' ' -f1)  # hex md5 of the test file
hash=$((16#${hash:0:8}))                        # convert leading hex digits to decimal
chunk=$((hash % number_of_chunks))
if [ x"$chunk" = x"$my_chunk_id" ]; then
    bash $filename # ...and so on
fi
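Wrapped in a loop over the test files, each chunk would walk the full list
and run only the files that hash to its own id. Something like the sketch
below, where the tests/*/*.t glob and the number_of_chunks / my_chunk_id
variables are only placeholders:

# Sketch: every chunk scans all test files and picks its share by hash.
number_of_chunks=4
my_chunk_id=$1
for filename in tests/*/*.t; do
    hash=$(cat "$filename" | md5sum | cut -d' ' -f1)
    hash=$((16#${hash:0:8}))
    chunk=$((hash % number_of_chunks))
    if [ x"$chunk" = x"$my_chunk_id" ]; then
        bash "$filename"
    fi
done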
Hello folks,
Last week's community meeting had a discussion about Developer Workflow Issues.
If you missed the meeting, it's worth reading up on the logs[1]. Raghavendra
Talur and I had a follow-up chat on Friday about tests. I've created a
proposal[2] with what we agreed is the best way forward.