[OMPI devel] Travis: one thing that might help

2017-02-08 Thread Jeff Squyres (jsquyres)
I noticed the other evening that we are doing two things at Travis:

1. Building pull requests
2. Building pushes

The 2nd one might well be contributing to our backlog (i.e., every time a PR is merged to the ompi repo, we Travis build again). I also confirmed with Travis that we're supposed to h
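The two triggers Jeff describes correspond to settings that can be expressed in the Travis config itself. A minimal sketch, assuming Travis's `branches` safelist behaves as documented (this fragment is hypothetical and not taken from the ompi repo's actual `.travis.yml`):

```yaml
# .travis.yml fragment -- hypothetical sketch.
# The "branches" safelist filters only *push* builds; pull request builds
# are unaffected by it. Listing branches here controls which pushes
# (e.g., PR merges landing on a branch) queue a build.
branches:
  only:
    - master              # build pushes to master
    # - /^v\d+\.\d+\.x$/  # optionally also build release branches
```

Disabling push builds entirely (leaving only PR builds) is done in the Travis repository settings UI rather than in the YAML.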

Re: [OMPI devel] Travis: one thing that might help

2017-02-08 Thread gilles
Jeff, i also noted that each time a PR is updated, a new Travis build is started. on the other hand, Jenkins is a bit smarter and does not build (or cancels) "obsolete" PR builds. i think most of us cannot manually direct Travis to cancel a given build. fwiw, building pushes is not useless. we recently h

Re: [OMPI devel] Travis: one thing that might help

2017-02-08 Thread Jeff Squyres (jsquyres)
On Feb 8, 2017, at 9:34 AM, gil...@rist.or.jp wrote:
>
> i also noted that each time a PR is updated, a new Travis build is
> started.
> on the other hand, Jenkins is a bit smarter and does not build or cancel
> "obsolete" PR.

Are you sure? Here's how I thought Jenkins worked:
- create a PR:

Re: [OMPI devel] Travis: one thing that might help

2017-02-08 Thread gilles
Jeff, iirc, i saw builds being cancelled (i was monitoring the Jenkins console) when new commits were pushed (or force pushed) to the current PR. i will make a test tomorrow. it is fair to say that using Travis for new PRs is very likely more useful than for validating all builds. Cheers, Gilles -

Re: [OMPI devel] Travis: one thing that might help

2017-02-08 Thread Jeff Squyres (jsquyres)
On Feb 8, 2017, at 10:15 AM, gil...@rist.or.jp wrote:
>
> iirc, i saw build being cancelled (i was monitoring the Jenkins console)
> when new commits were pushed (or force pushed) to the current PR
>
> i will make a test tomorrow

Oh, sweet. That would be good to know; thanks!

> it is fair that

Re: [OMPI devel] Segfault on MPI init

2017-02-08 Thread Jeff Squyres (jsquyres)
What version of Open MPI are you running? The error is indicating that Open MPI is trying to start a user-level helper daemon on the remote node, and the daemon is seg faulting (which is unusual). One thing to be aware of: https://www.open-mpi.org/faq/?category=building#install-overwrite
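The FAQ entry Jeff links warns about installing a new Open MPI over an older installation in the same prefix: stale plugins left behind by the previous version can cause exactly this kind of daemon crash. A sketch of the usual remedies (all paths here are hypothetical examples, not taken from the report):

```shell
# Hypothetical paths -- adapt to your system; this is a procedure sketch,
# not a script to run verbatim.

# Option 1: remove the old installation tree before installing the new one,
# so no stale components from the prior version remain.
rm -rf /opt/openmpi

# Option 2: give each Open MPI version its own prefix and never overwrite.
./configure --prefix=/opt/openmpi-2.0.2
make all
make install
```

Keeping each version in its own prefix also makes it easy to switch back if a new version misbehaves, by adjusting PATH and LD_LIBRARY_PATH.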