> On Nov 10, 2019, at 7:23 PM, Raphael Bircher <[email protected]> wrote:
> 
> So only reproducible issues go into JIRA? In my experience, it makes sense in 
> some cases to write an issue for an irreproducible bug, so you can collect all 
> the data in one place. Sometimes this helps to track the bugs down.

It’s a judgement call; there’s no hard and fast rule. I don’t think there’s any 
point in raising a JIRA just saying that something failed and you cannot 
reproduce it, unless you also have some information to add that might help 
anyone who wants to work on it track it down. You’ll find a number of JIRAs 
like that. Or, if you’re going to grab one and try to track it down, go ahead 
and raise a JIRA for it where we can collaborate. Whatever makes the most sense...

And here’s another resource:

http://fucit.org/solr-jenkins-reports/

Hossman (Chris Hostetter) set up a system to report the failures on the various 
machines that run Lucene/Solr tests. I should have mentioned that I often go 
there when I see a failing test that doesn’t reproduce easily, to see whether it 
fails in other places too, especially when I see failures when testing locally 
after code changes. Each week I try to produce the “BadApple” report with a 
summary of Hoss’ results over the last 4 weeks.

Yes, this is a bit awkward. It’d be nice if all tests passed 100% of the time, 
but we’re not there yet. It’s a long story…

Erick


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
