*I appreciate you trying to help! I put the Java source and its compiled
.class file on Dropbox. The directory contains the .java and .class files, as
well as a data/ directory:*

http://www.dropbox.com/sh/pds5c5wecfpb2wk/AAAcz17UTDQErmrUqp2SPjpqa?dl=0

*You can run it with and without MPI:*

>  java MPITestBroke data/
>  mpirun -np 1 java MPITestBroke data/

*Attached is a text file of what I see when I run it with mpirun and your
debug flag. Lots of debug lines.*


Nate





On Wed, Aug 12, 2015 at 11:09 AM, Howard Pritchard <hpprit...@gmail.com>
wrote:

> Hi Nate,
>
> Sorry for the delay in getting back to you.
>
> We're somewhat stuck on how to help you, but here are two suggestions.
>
> Could you add the following to your launch command line
>
> --mca odls_base_verbose 100
>
> so we can see exactly what arguments are being fed to java when launching
> your app.
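>
> For example, added to the launch line you've been using (the directory
> argument is shown as a placeholder, not your actual path):
>
>     mpirun --mca odls_base_verbose 100 -np 1 java MPITestBroke <data-dir>/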
>
> Also, if you could put your MPITestBroke.class file somewhere (like Google
> Drive) where we could get it and try to run it locally or at NERSC, that
> might help us narrow down the problem. Better yet, if you have the class or
> jar file for the entire app plus some data sets, we could try that out as
> well.
>
> All the config outputs, etc. you've sent so far indicate a correct
> installation of Open MPI.
>
> Howard
>
>
> On Aug 6, 2015 1:54 PM, "Nate Chambers" <ncham...@usna.edu> wrote:
>
>> Howard,
>>
>> I tried the nightly build openmpi-dev-2223-g731cfe3 and it still
>> segfaults as before. I must admit I am new to MPI, so is it possible I'm
>> just configuring or running incorrectly? Let me list my steps for you, and
>> maybe something will jump out? Also attached is my config.log.
>>
>>
>> CONFIGURE
>> ./configure --prefix=<install-dir> --enable-mpi-java CC=gcc
>>
>> MAKE
>> make all install
>>
>> RUN
>> <install-dir>/mpirun -np 1 java MPITestBroke twitter/
>>
>>
>> DEFAULT JAVA AND GCC
>>
>> $ java -version
>> java version "1.7.0_21"
>> Java(TM) SE Runtime Environment (build 1.7.0_21-b11)
>> Java HotSpot(TM) 64-Bit Server VM (build 23.21-b01, mixed mode)
>>
>> $ gcc -v
>> Using built-in specs.
>> Target: x86_64-redhat-linux
>> Configured with: ../configure --prefix=/usr --mandir=/usr/share/man
>> --infodir=/usr/share/info --with-bugurl=
>> http://bugzilla.redhat.com/bugzilla --enable-bootstrap --enable-shared
>> --enable-threads=posix --enable-checking=release --with-system-zlib
>> --enable-__cxa_atexit --disable-libunwind-exceptions
>> --enable-gnu-unique-object
>> --enable-languages=c,c++,objc,obj-c++,java,fortran,ada
>> --enable-java-awt=gtk --disable-dssi
>> --with-java-home=/usr/lib/jvm/java-1.5.0-gcj-1.5.0.0/jre
>> --enable-libgcj-multifile --enable-java-maintainer-mode
>> --with-ecj-jar=/usr/share/java/eclipse-ecj.jar --disable-libjava-multilib
>> --with-ppl --with-cloog --with-tune=generic --with-arch_32=i686
>> --build=x86_64-redhat-linux
>> Thread model: posix
>> gcc version 4.4.7 20120313 (Red Hat 4.4.7-4) (GCC)
>>
>>
>>
>>
>>
>> On Thu, Aug 6, 2015 at 7:58 AM, Howard Pritchard <hpprit...@gmail.com>
>> wrote:
>>
>>> Hi Nate,
>>>
>>> We're trying this out on a Mac running Mavericks and a Cray XC system.
>>> The Mac has Java 8, while the Cray XC has Java 7.
>>>
>>> We could not get the code to run just using the java launch command,
>>> although we noticed that if you add
>>>
>>>     catch(NoClassDefFoundError e) {
>>>
>>>       System.out.println("Not using MPI its out to lunch for now");
>>>
>>>     }
>>>
>>> as one of the catches after the try block that fires up MPI, you can get
>>> further.
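>>>
>>> For context, a rough sketch of where that catch would sit (the exact
>>> try/catch structure in MPITestBroke isn't reproduced here, and the
>>> MPIException catch plus the mpiJava-style name MPI.Init are assumptions):
>>>
>>>     try {
>>>       MPI.Init(args);
>>>     } catch (MPIException e) {
>>>       e.printStackTrace();
>>>     } catch (NoClassDefFoundError e) {
>>>       System.out.println("Not using MPI its out to lunch for now");
>>>     }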
>>>
>>> Instead we tried on the two systems using
>>>
>>> mpirun -np 1 java MPITestBroke tweets repeat.txt
>>>
>>> and, you guessed it, we can't reproduce the error, at least using master.
>>>
>>> Would you mind trying to get a copy of the nightly master build off of
>>>
>>> http://www.open-mpi.org/nightly/master/
>>>
>>> and installing that version and giving it a try?
>>>
>>> If that works, then I'd suggest using master (or v2.0) for now.
>>>
>>> Howard
>>>
>>>
>>>
>>>
>>> 2015-08-05 14:41 GMT-06:00 Nate Chambers <ncham...@usna.edu>:
>>>
>>>> Howard,
>>>>
>>>> Thanks for looking at all this. Adding System.gc() did not cause it to
>>>> segfault. The segfault still comes much later in the processing.
>>>>
>>>> I was able to reduce my code to a single test file without other
>>>> dependencies. It is attached. This code simply opens a text file and reads
>>>> its lines, one by one. Once finished, it closes and reopens the same file
>>>> and reads the lines again. On my system, it does this about 4 times before
>>>> the segfault fires. Obviously this code makes no sense, but it's based on
>>>> our actual code that reads millions of lines of data and does various
>>>> processing on it.
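>>>>
>>>> Roughly, the attached file looks like the following (a sketch of the
>>>> structure only, not the exact attached source; mpiJava-style names and
>>>> the loop count are assumed):
>>>>
>>>>     import java.io.*;
>>>>     import mpi.*;
>>>>
>>>>     public class MPITestBroke {
>>>>       public static void main(String[] args) throws Exception {
>>>>         MPI.Init(args);   // the only MPI call; no sends/receives follow
>>>>         File file = new File(args[0], "repeats.txt");
>>>>         for (int pass = 0; pass < 10; pass++) {
>>>>           System.out.println("opening file " + file);
>>>>           BufferedReader reader = new BufferedReader(new FileReader(file));
>>>>           while (reader.readLine() != null) {
>>>>             // read each line and throw it away
>>>>           }
>>>>           reader.close();   // close, then reopen on the next pass
>>>>         }
>>>>         MPI.Finalize();
>>>>       }
>>>>     }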
>>>>
>>>> Attached is a tweets.tgz file that you can uncompress to have an input
>>>> directory. The text file is just the same line over and over again. Run it
>>>> as:
>>>>
>>>> *java MPITestBroke tweets/*
>>>>
>>>>
>>>> Nate
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> On Wed, Aug 5, 2015 at 8:29 AM, Howard Pritchard <hpprit...@gmail.com>
>>>> wrote:
>>>>
>>>>> Hi Nate,
>>>>>
>>>>> Sorry for the delay in getting back.  Thanks for the sanity check.
>>>>> You may have a point about the args string to MPI.init -
>>>>> there's nothing Open MPI needs from it, but that is a difference from
>>>>> your use case: your app has an argument.
>>>>>
>>>>> Would you mind adding a
>>>>>
>>>>> System.gc()
>>>>>
>>>>> call immediately after the MPI.init call and seeing whether the gc blows
>>>>> up with a segfault?
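>>>>>
>>>>> I.e., roughly (a sketch only; the capitalization of Init follows the
>>>>> mpiJava-style bindings):
>>>>>
>>>>>     MPI.Init(args);
>>>>>     System.gc();   // force a collection immediately after init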
>>>>>
>>>>> Also, it may be interesting to add -verbose:jni to your command line.
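>>>>>
>>>>> For example, added to the command from your first mail (arguments shown
>>>>> as you reported them):
>>>>>
>>>>>     mpirun -np 2 java -verbose:jni -mx4g FeaturizeDay datadir/ days.txt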
>>>>>
>>>>> We'll do some experiments here with the init string arg.
>>>>>
>>>>> Is your app open source, so that we could download it and try to
>>>>> reproduce the problem locally?
>>>>>
>>>>> thanks,
>>>>>
>>>>> Howard
>>>>>
>>>>>
>>>>> 2015-08-04 18:52 GMT-06:00 Nate Chambers <ncham...@usna.edu>:
>>>>>
>>>>>> Sanity checks pass. Both Hello.java and Ring.java run correctly with
>>>>>> the expected output.
>>>>>>
>>>>>> Does MPI.init(args) expect anything from those command-line args?
>>>>>>
>>>>>>
>>>>>> Nate
>>>>>>
>>>>>>
>>>>>> On Tue, Aug 4, 2015 at 12:26 PM, Howard Pritchard <
>>>>>> hpprit...@gmail.com> wrote:
>>>>>>
>>>>>>> Hello Nate,
>>>>>>>
>>>>>>> As a sanity check of your installation, could you try compiling the
>>>>>>> examples/*.java codes using the mpijavac you've installed and see that
>>>>>>> they run correctly?
>>>>>>> I'd just be interested in Hello.java and Ring.java.
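>>>>>>>
>>>>>>> For example (a sketch, assuming mpijavac is on your PATH and you're in
>>>>>>> the Open MPI examples/ directory):
>>>>>>>
>>>>>>>     mpijavac Hello.java
>>>>>>>     mpijavac Ring.java
>>>>>>>     mpirun -np 2 java Hello
>>>>>>>     mpirun -np 2 java Ring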
>>>>>>>
>>>>>>> Howard
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> 2015-08-04 14:34 GMT-06:00 Nate Chambers <ncham...@usna.edu>:
>>>>>>>
>>>>>>>> Sure, I reran the configure with CC=gcc and then make install. I
>>>>>>>> think that's the proper way to do it. Attached is my config log. The
>>>>>>>> behavior when running our code appears to be the same. The output is 
>>>>>>>> the
>>>>>>>> same error I pasted in my email above. It occurs when calling 
>>>>>>>> MPI.init().
>>>>>>>>
>>>>>>>> I'm not great at debugging this sort of stuff, but happy to try
>>>>>>>> things out if you need me to.
>>>>>>>>
>>>>>>>> Nate
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 4, 2015 at 5:09 AM, Howard Pritchard <
>>>>>>>> hpprit...@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> Hello Nate,
>>>>>>>>>
>>>>>>>>> As a first step to addressing this, could you please try using gcc
>>>>>>>>> rather than the Intel compilers to build Open MPI?
>>>>>>>>>
>>>>>>>>> We've been doing a lot of work recently on the Java bindings, etc.,
>>>>>>>>> but have never tried using any compilers other than gcc when working
>>>>>>>>> with them.
>>>>>>>>>
>>>>>>>>> Thanks,
>>>>>>>>>
>>>>>>>>> Howard
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> 2015-08-03 17:36 GMT-06:00 Nate Chambers <ncham...@usna.edu>:
>>>>>>>>>
>>>>>>>>>> We've been struggling with this error for a while, so we're hoping
>>>>>>>>>> someone more knowledgeable can help!
>>>>>>>>>>
>>>>>>>>>> Our Java MPI code exits with a segfault during its normal
>>>>>>>>>> operation, *but the segfault occurs before our code ever uses MPI
>>>>>>>>>> functionality like sending/receiving.* We've removed all message
>>>>>>>>>> calls and any use of MPI.COMM_WORLD from the code. The segfault
>>>>>>>>>> occurs if we call MPI.init(args) in our code, and does not if we
>>>>>>>>>> comment that line out. Further vexing us, the crash doesn't happen
>>>>>>>>>> at the point of the MPI.init call, but later on in the program. I
>>>>>>>>>> don't have an easy-to-run example here because our non-MPI code is
>>>>>>>>>> so large and complicated. We have run simpler test programs with
>>>>>>>>>> MPI and the segfault does not occur.
>>>>>>>>>>
>>>>>>>>>> We have isolated the line where the segfault occurs. However, if
>>>>>>>>>> we comment that out, the program runs longer but then segfaults
>>>>>>>>>> later on in the code, at a random-looking (but deterministic)
>>>>>>>>>> point. Does anyone have tips on how to debug this? We have tried
>>>>>>>>>> several flags with mpirun, but found no good clues.
>>>>>>>>>>
>>>>>>>>>> We have also tried several MPI versions, including stable 1.8.7
>>>>>>>>>> and the most recent 1.8.8rc1.
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> ATTACHED
>>>>>>>>>> - config.log from installation
>>>>>>>>>> - output from `ompi_info -all`
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> OUTPUT FROM RUNNING
>>>>>>>>>>
>>>>>>>>>> > mpirun -np 2 java -mx4g FeaturizeDay datadir/ days.txt
>>>>>>>>>> ...
>>>>>>>>>> some normal output from our code
>>>>>>>>>> ...
>>>>>>>>>>
>>>>>>>>>> --------------------------------------------------------------------------
>>>>>>>>>> mpirun noticed that process rank 0 with PID 29646 on node r9n69
>>>>>>>>>> exited on signal 11 (Segmentation fault).
>>>>>>>>>>
>>>>>>>>>> --------------------------------------------------------------------------
[r4n52:12205] mca: base: components_register: registering framework odls 
components
[r4n52:12205] mca: base: components_register: found loaded component default
[r4n52:12205] mca: base: components_register: component default has no register 
or open function
[r4n52:12205] mca: base: components_open: opening odls components
[r4n52:12205] mca: base: components_open: found loaded component default
[r4n52:12205] mca: base: components_open: component default open function 
successful
[r4n52:12205] mca:base:select: Auto-selecting odls components
[r4n52:12205] mca:base:select:( odls) Querying component [default]
[r4n52:12205] mca:base:select:( odls) Query of component [default] set priority 
to 1
[r4n52:12205] mca:base:select:( odls) Selected component [default]
[r4n52:12205] [[20766,0],0] odls:launch: spawning child [[20766,1],0]
[r4n52:12205] 
 Data for app_context: index 0  app: /gpfs/pkgs/mhpcc/java/current/bin/java
        Num procs: 1    FirstRank: 0
        Argv[0]: java
        Argv[1]: -cp
        Argv[2]: 
/gpfs/home/nchamber/mpibroken:.:classes:/gpfs/home/nchamber/code/navynlp/classes:/gpfs/home/nchamber/Dropbox/timesieve-code/classes:/gpfs/home/nchamber/code/lib/stanford-corenlp-models.jar:/gpfs/home/nchamber/code/navynlp/lib/*:/gpfs/home/nchamber/code/javanlp/projects/core/classes:/gpfs/home/nchamber/code/javanlp/projects/more/classes:/gpfs/home/nchamber/code/javanlp/projects/mt/classes:/gpfs/home/nchamber/code/javanlp/projects/periphery/classes:/gpfs/home/nchamber/code/javanlp/projects/research/classes:/gpfs/home/nchamber/code/javanlp/projects/rte/classes::/gpfs/home/nchamber/code/javanlp/projects/core/lib/*:/gpfs/home/nchamber/code/javanlp/projects/more/lib/*:/gpfs/home/nchamber/code/javanlp/projects/mt/lib/*:/gpfs/home/nchamber/code/javanlp/projects/periphery/lib/*:/gpfs/home/nchamber/code/javanlp/projects/research/lib/*:/gpfs/home/nchamber/code/javanlp/projects/rte/lib/*::/gpfs/home/nchamber/code/openmpi/installed/openmpi-dev-2223-g731cfe3/lib/mpi.jar:/gpfs/home/nchamber/code/openmpi/installed/openmpi-dev-2223-g731cfe3/lib/shmem.jar
        Argv[3]: 
-Djava.library.path=/gpfs/home/nchamber/code/openmpi/installed/openmpi-dev-2223-g731cfe3/lib
        Argv[4]: MPITestBroke
        Argv[5]: tweets
        Env[0]: OMPI_MCA_odls_base_verbose=100
        Env[1]: OMPI_MCA_mpi_paffinity_alone=1
        Env[2]: OMPI_COMMAND=MPITestBroke
        Env[3]: OMPI_ARGV=tweets
        Env[4]: 
OMPI_MCA_orte_precondition_transports=394e96ac0a2080e2-0eac07ef24bb55f4
        Env[5]: AEROSOFT_ELMHOST=rsvc1
        Env[6]: 
MANPATH=/opt/pbs/default/man:/usr/share/man:/usr/local/share/man:/gpfs/pkgs/hpcmp/bct/man:/gpfs/pkgs/mhpcc/intel/2013/composer_xe_2013.1.117/man/en_US:/gpfs/pkgs/mhpcc/intel/2013/impi/4.1.0.024/man
        Env[7]: HOSTNAME=r4n52
        Env[8]: PROJECTS_HOME=/gpfs/projects
        Env[9]: INTEL_LICENSE_FILE=28518@rsvc1:28518@rsvc2:28518@rsvc3
        Env[10]: PBS_ACCOUNT=USNAM37050419
        Env[11]: TERM=xterm
        Env[12]: SHELL=/bin/bash
        Env[13]: CSE_HOME=/gpfs/pkgs/hpcmp/CSE/CSE
        Env[14]: HISTSIZE=1000
        Env[15]: TMPDIR=/tmp/pbs.458331.rsvc1
        Env[16]: PBS_JOBNAME=STDIN
        Env[17]: PBS_ENVIRONMENT=PBS_INTERACTIVE
        Env[18]: QTDIR=/usr/lib64/qt-3.3
        Env[19]: 
OLDPWD=/gpfs/home/nchamber/code/openmpi/installed/openmpi-dev-2223-g731cfe3/bin
        Env[20]: QTINC=/usr/lib64/qt-3.3/include
        Env[21]: KRB5_HOME=/usr/local/krb5
        Env[22]: BC_NODE_ALLOC=1
        Env[23]: 
PBS_O_WORKDIR=/gpfs/home/nchamber/code/openmpi/installed/openmpi-dev-2223-g731cfe3/bin
        Env[24]: ANT_HOME=/gpfs/home/nchamber/bin/apache-ant-1.9.4
        Env[25]: NCPUS=1
        Env[26]: ANT_OPTS=-mx1g
        Env[27]: ARCHIVE_HOME=/mnt/archive/nchamber
        Env[28]: DAAC_HOME=/gpfs/pkgs/hpcmp/daac
        Env[29]: PBS_TASKNUM=1
        Env[30]: USER=nchamber
        Env[31]: 
LD_LIBRARY_PATH=/gpfs/home/nchamber/code/openmpi/installed/openmpi-dev-2223-g731cfe3/lib:/gpfs/home/nchamber/code/openmpi/installed/openmpi-dev-2223-g731cfe3/lib:/gpfs/pkgs/mhpcc/intel/2013/impi/4.1.0.024/intel64/lib:/gpfs/pkgs/mhpcc/intel/2013/composer_xe_2013.1.117/compiler/lib/intel64
        Env[32]: 
LS_COLORS=rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:mi=01;05;37;41:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arj=01;31:*.taz=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.zip=01;31:*.z=01;31:*.Z=01;31:*.dz=01;31:*.gz=01;31:*.lz=01;31:*.xz=01;31:*.bz2=01;31:*.tbz=01;31:*.tbz2=01;31:*.bz=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.rar=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.jpg=01;35:*.jpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.ogm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.axv=01;35:*.anx=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=01;36:*.au=01;36:*.flac=01;36:*.mid=01;36:*.midi=01;36:*.mka=01;36:*.mp3=01;36:*.mpc=01;36:*.ogg=01;36:*.ra=01;36:*.wav=01;36:*.axa=01;36:*.oga=01;36:*.spx=01;36:*.xspf=01;36:
        Env[33]: PBS_O_HOME=/gpfs/home/nchamber
        Env[34]: WORKDIR=/scratch/nchamber
        Env[35]: CREATE_HOME=/gpfs/pkgs/hpcmp/create
        Env[36]: PBS_MOMPORT=15003
        Env[37]: PBS_O_QUEUE=standard
        Env[38]: MAIL=/var/spool/mail/nchamber
        Env[39]: PBS_O_LOGNAME=nchamber
        Env[40]: 
PATH=/gpfs/home/nchamber/code/openmpi/installed/openmpi-dev-2223-g731cfe3/bin:/gpfs/home/nchamber/code/openmpi/installed/openmpi-dev-2223-g731cfe3/bin:/usr/lib64/qt-3.3/bin:/gpfs/pkgs/mhpcc/intel/2013/impi/4.1.0.024/intel64/bin:/gpfs/pkgs/mhpcc/intel/wrapper:/usr/local/krb5/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/lpp/mmfs/bin:/opt/ibutils/bin:/gpfs/pkgs/mhpcc/java/current/bin:/gpfs/pkgs/mhpcc/java/current/jre/bin:/gpfs/system/bin:/gpfs/pkgs/hpcmp/pstoolkit/bin:/usr/local/bin:/gpfs/pkgs/hpcmp/bct/bin:/usr/local/mpscp/bin:/gpfs/pkgs/hpcmp/csiapps/base/linux.x86_64/bin:/gpfs/pkgs/hpcmp/bciapps/base/linux.x86_64/bin:/gpfs/pkgs/hpcmp/SLB/current:/gpfs/pkgs/mhpcc/gnu/bin:/gpfs/pkgs/mhpcc/intel/2013/composer_xe_2013.1.117/bin/intel64:/opt/pbs/default/bin:/opt/pbs/default/sbin:/gpfs/pkgs/mhpcc/java/current/jre/bin:/gpfs/home/nchamber/bin/apache-ant-1.9.4/bin
        Env[41]: PBS_O_LANG=en_US.UTF-8
        Env[42]: MPSCP_HOME=/usr/local/mpscp
        Env[43]: ARCHIVE_HOST=10.70.100.24
        Env[44]: PBS_JOBCOOKIE=000000006FDCF1740000000009CE14E5
        Env[45]: COST_MODULES_DIR=/gpfs/projects/COST/modules/pkgs
        Env[46]: F90=/gpfs/pkgs/mhpcc/intel/wrapper/ifort
        Env[47]: SAMPLES_HOME=/gpfs/home/samples
        Env[48]: PWD=/gpfs/home/nchamber/mpibroken
        Env[49]: 
_LMFILES_=/gpfs/pkgs/mhpcc/modulefiles/intel/13.0:/gpfs/pkgs/mhpcc/modulefiles/impi/4.1:/gpfs/pkgs/hpcmp/modulefiles/costinit
        Env[50]: JAVA_HOME=/gpfs/pkgs/mhpcc/java/current
        Env[51]: CENTER=/p/cwfs/nchamber
        Env[52]: BCI_HOME=/gpfs/pkgs/hpcmp/bciapps/base/linux.x86_64
        Env[53]: PBS_NODENUM=0
        Env[54]: 
MODULEPATH=/gpfs/projects/COST/modules/pkgs:/usr/share/Modules/modulefiles:/etc/modulefiles:/opt/mellanox/bupc/2.2/modules:/gpfs/pkgs/mhpcc/modulefiles:/gpfs/pkgs/hpcmp/modulefiles:/gpfs/pkgs/hpcmp/daac/modules:/gpfs/pkgs/hpcmp/create/sh/nesm/modulefiles
        Env[55]: LOADEDMODULES=intel/13.0:impi/4.1:costinit
        Env[56]: BC_HOST=riptide
        Env[57]: PBS_JOBDIR=/gpfs/home/nchamber
        Env[58]: TZ=Pacific/Honolulu
        Env[59]: MPSCP_BLOCKED_FILE=/usr/local/mpscp/etc/mpscp_blocked_ports
        Env[60]: PET_HOME=/gpfs/pkgs/hpcmp/PTOOLS
        Env[61]: BC_CORES_PER_NODE=16
        Env[62]: JAVANLP_HOME=/gpfs/home/nchamber/code/javanlp
        Env[63]: F77=/gpfs/pkgs/mhpcc/intel/wrapper/ifort
        Env[64]: COBALT_LIC=1...@rls.csi.hpc.mil
        Env[65]: PBS_O_SHELL=/bin/bash
        Env[66]: 
LM_LICENSE_FILE=:1...@license.mhpcc.hpc.mil:1...@rls.csi.hpc.mil:1...@rls.csi.hpc.mil:1...@rls.csi.hpc.mil
        Env[67]: BC_MEM_PER_NODE=29000
        Env[68]: PBS_JOBID=458331.rsvc1
        Env[69]: COST_HOME=/gpfs/projects/COST
        Env[70]: CXX=/gpfs/pkgs/mhpcc/intel/wrapper/icpc
        Env[71]: CSI_HOME=/gpfs/pkgs/hpcmp/csiapps/base/linux.x86_64
        Env[72]: HISTCONTROL=ignoredups
        Env[73]: SHLVL=1
        Env[74]: HOME=/gpfs/home/nchamber
        Env[75]: PBS_O_HOST=riptide04
        Env[76]: BC_MPI_TASKS_ALLOC=1
        Env[77]: 
PYTHONPATH=/gpfs/home/nchamber/code/twitter_nlp/python/ner/unidecode
        Env[78]: LOGNAME=nchamber
        Env[79]: QTLIB=/usr/lib64/qt-3.3/lib
        Env[80]: MPSCP_CONFIG_FILENAME=/usr/local/mpscp/etc/mpscp_config
        Env[81]: CVS_RSH=ssh
        Env[82]: 
CLASSPATH=.:classes:/gpfs/home/nchamber/code/navynlp/classes:/gpfs/home/nchamber/Dropbox/timesieve-code/classes:/gpfs/home/nchamber/code/lib/stanford-corenlp-models.jar:/gpfs/home/nchamber/code/navynlp/lib/*:/gpfs/home/nchamber/code/javanlp/projects/core/classes:/gpfs/home/nchamber/code/javanlp/projects/more/classes:/gpfs/home/nchamber/code/javanlp/projects/mt/classes:/gpfs/home/nchamber/code/javanlp/projects/periphery/classes:/gpfs/home/nchamber/code/javanlp/projects/research/classes:/gpfs/home/nchamber/code/javanlp/projects/rte/classes::/gpfs/home/nchamber/code/javanlp/projects/core/lib/*:/gpfs/home/nchamber/code/javanlp/projects/more/lib/*:/gpfs/home/nchamber/code/javanlp/projects/mt/lib/*:/gpfs/home/nchamber/code/javanlp/projects/periphery/lib/*:/gpfs/home/nchamber/code/javanlp/projects/research/lib/*:/gpfs/home/nchamber/code/javanlp/projects/rte/lib/*:
        Env[83]: PDSH_SSH_ARGS_APPEND=-q
        Env[84]: PBS_QUEUE=standard
        Env[85]: MODULESHOME=/usr/share/Modules
        Env[86]: LESSOPEN=|/usr/bin/lesspipe.sh %s
        Env[87]: OMP_NUM_THREADS=1
        Env[88]: PBS_O_MAIL=/var/spool/mail/nchamber
        Env[89]: CC=/gpfs/pkgs/mhpcc/intel/wrapper/icc
        Env[90]: PBS_O_SYSTEM=Linux
        Env[91]: G_BROKEN_FILENAMES=1
        Env[92]: PBS_NODEFILE=/var/spool/PBS/aux/458331.rsvc1
        Env[93]: I_MPI_ROOT=/gpfs/pkgs/mhpcc/intel/2013/impi/4.1.0.024
        Env[94]: 
PBS_O_PATH=/opt/xcat/bin:/opt/xcat/sbin:/usr/local/ossh/bin:/gpfs/pkgs/mhpcc/intel/2013/impi/4.1.0.024/intel64/bin:/gpfs/pkgs/mhpcc/intel/wrapper:/usr/local/krb5/bin:/usr/bin:/bin:/usr/sbin:/sbin:/usr/local/ossh-6.6p1a_x86_64/bin:/usr/local/sbin:/usr/lpp/mmfs/bin:/opt/ibutils/bin:/gpfs/pkgs/mhpcc/java/current/bin:/gpfs/pkgs/mhpcc/java/current/jre/bin:/gpfs/system/bin:/gpfs/pkgs/hpcmp/pstoolkit/bin:/usr/local/bin:/gpfs/pkgs/hpcmp/bct/bin:/usr/local/mpscp/bin:/gpfs/pkgs/hpcmp/csiapps/base/linux.x86_64/bin:/gpfs/pkgs/hpcmp/bciapps/base/linux.x86_64/bin:/gpfs/pkgs/hpcmp/SLB/current:/gpfs/pkgs/mhpcc/gnu/bin:/gpfs/pkgs/mhpcc/intel/2013/composer_xe_2013.1.117/bin/intel64:/opt/pbs/default/bin:/opt/pbs/default/sbin:/gpfs/pkgs/mhpcc/java/current/jre/bin:/gpfs/home/nchamber/bin/apache-ant-1.9.4/bin
        Env[95]: BASH_FUNC_module()=() {  eval `/usr/bin/modulecmd bash $*`
}
        Env[96]: 
_=/gpfs/home/nchamber/code/openmpi/installed/openmpi-dev-2223-g731cfe3/bin/mpirun
        Env[97]: 
PMIX_SERVER_URI=1360920576.0:/tmp/pbs.458331.rsvc1/openmpi-sessions-902206@r4n52_0/20766/0/pmix
        Env[98]: 
OMPI_MCA_orte_local_daemon_uri=1360920576.0;usock;tcp://10.60.104.52,10.70.104.52:46211;ud://335110.603.1
        Env[99]: 
OMPI_MCA_orte_hnp_uri=1360920576.0;usock;tcp://10.60.104.52,10.70.104.52:46211;ud://335110.603.1
        Env[100]: OMPI_MCA_mpi_yield_when_idle=0
        Env[101]: OMPI_MCA_orte_app_num=0
        Env[102]: OMPI_UNIVERSE_SIZE=1
        Env[103]: OMPI_MCA_orte_num_nodes=1
        Env[104]: OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
        Env[105]: OMPI_MCA_orte_bound_at_launch=1
        Env[106]: OMPI_MCA_ess=pmi
        Env[107]: OMPI_MCA_orte_ess_num_procs=1
        Env[108]: OMPI_COMM_WORLD_SIZE=1
        Env[109]: OMPI_COMM_WORLD_LOCAL_SIZE=1
        Env[110]: OMPI_MCA_orte_tmpdir_base=/tmp/pbs.458331.rsvc1
        Env[111]: OMPI_NUM_APP_CTX=1
        Env[112]: OMPI_FIRST_RANKS=0
        Env[113]: OMPI_APP_CTX_NUM_PROCS=1
        Env[114]: OMPI_MCA_initial_wdir=/gpfs/home/nchamber/mpibroken
        Env[115]: OMPI_MCA_ess_base_jobid=1360920577
        Env[116]: OMPI_MCA_ess_base_vpid=0
        Env[117]: OMPI_COMM_WORLD_RANK=0
        Env[118]: OMPI_COMM_WORLD_LOCAL_RANK=0
        Env[119]: OMPI_COMM_WORLD_NODE_RANK=0
        Env[120]: OMPI_MCA_orte_ess_node_rank=0
        Env[121]: PMIX_ID=1360920577.0
        Env[122]: 
OMPI_FILE_LOCATION=/tmp/pbs.458331.rsvc1/openmpi-sessions-902206@r4n52_0/20766/1/0
        Working dir: /gpfs/home/nchamber/mpibroken
        Prefix: 
/gpfs/home/nchamber/code/openmpi/installed/openmpi-dev-2223-g731cfe3
        Used on node: TRUE
 ORTE_ATTR: GLOBAL Data type: OPAL_STRING       Key: APP-PREFIX-DIR     Value: 
/gpfs/home/nchamber/code/openmpi/installed/openmpi-dev-2223-g731cfe3
 ORTE_ATTR: LOCAL Data type: OPAL_INT32 Key: APP-MAX-RESTARTS   Value: 0
[r4n52:12205] [[20766,0],0] STARTING /gpfs/pkgs/mhpcc/java/current/bin/java
[r4n52:12205] [[20766,0],0]     ARGV[0]: java
[r4n52:12205] [[20766,0],0]     ARGV[1]: -cp
[r4n52:12205] [[20766,0],0]     ARGV[2]: 
/gpfs/home/nchamber/mpibroken:.:classes:/gpfs/home/nchamber/code/navynlp/classes:/gpfs/home/nchamber/Dropbox/timesieve-code/classes:/gpfs/home/nchamber/code/lib/stanford-corenlp-models.jar:/gpfs/home/nchamber/code/navynlp/lib/*:/gpfs/home/nchamber/code/javanlp/projects/core/classes:/gpfs/home/nchamber/code/javanlp/projects/more/classes:/gpfs/home/nchamber/code/javanlp/projects/mt/classes:/gpfs/home/nchamber/code/javanlp/projects/periphery/classes:/gpfs/home/nchamber/code/javanlp/projects/research/classes:/gpfs/home/nchamber/code/javanlp/projects/rte/classes::/gpfs/home/nchamber/code/javanlp/projects/core/lib/*:/gpfs/home/nchamber/code/javanlp/projects/more/lib/*:/gpfs/home/nchamber/code/javanlp/projects/mt/lib/*:/gpfs/home/nchamber/code/javanlp/projects/periphery/lib/*:/gpfs/home/nchamber/code/javanlp/projects/research/lib/*:/gpfs/home/nchamber/code/javanlp/projects/rte/lib/*::/gpfs/home/nchamber/code/openmpi/installed/openmpi-dev-2223-g731cfe3/lib/mpi.jar:/gpfs/home/nchamber/code/openmpi/installed/openmpi-dev-2223-g731cfe3/lib/shmem.jar
[r4n52:12205] [[20766,0],0]     ARGV[3]: 
-Djava.library.path=/gpfs/home/nchamber/code/openmpi/installed/openmpi-dev-2223-g731cfe3/lib
[r4n52:12205] [[20766,0],0]     ARGV[4]: MPITestBroke
[r4n52:12205] [[20766,0],0]     ARGV[5]: tweets
[r4n52:12205] [[20766,0],0]     ENVIRON[0]: OMPI_MCA_odls_base_verbose=100
[r4n52:12205] [[20766,0],0]     ENVIRON[1]: OMPI_MCA_mpi_paffinity_alone=1
[r4n52:12205] [[20766,0],0]     ENVIRON[2]: OMPI_COMMAND=MPITestBroke
[r4n52:12205] [[20766,0],0]     ENVIRON[3]: OMPI_ARGV=tweets
[r4n52:12205] [[20766,0],0]     ENVIRON[4]: 
OMPI_MCA_orte_precondition_transports=394e96ac0a2080e2-0eac07ef24bb55f4
[r4n52:12205] [[20766,0],0]     ENVIRON[5]: AEROSOFT_ELMHOST=rsvc1
[r4n52:12205] [[20766,0],0]     ENVIRON[6]: 
MANPATH=/opt/pbs/default/man:/usr/share/man:/usr/local/share/man:/gpfs/pkgs/hpcmp/bct/man:/gpfs/pkgs/mhpcc/intel/2013/composer_xe_2013.1.117/man/en_US:/gpfs/pkgs/mhpcc/intel/2013/impi/4.1.0.024/man
[r4n52:12205] [[20766,0],0]     ENVIRON[7]: HOSTNAME=r4n52
[r4n52:12205] [[20766,0],0]     ENVIRON[8]: PROJECTS_HOME=/gpfs/projects
[r4n52:12205] [[20766,0],0]     ENVIRON[9]: 
INTEL_LICENSE_FILE=28518@rsvc1:28518@rsvc2:28518@rsvc3
[r4n52:12205] [[20766,0],0]     ENVIRON[10]: PBS_ACCOUNT=USNAM37050419
[r4n52:12205] [[20766,0],0]     ENVIRON[11]: TERM=xterm
[r4n52:12205] [[20766,0],0]     ENVIRON[12]: SHELL=/bin/bash
[r4n52:12205] [[20766,0],0]     ENVIRON[13]: CSE_HOME=/gpfs/pkgs/hpcmp/CSE/CSE
[r4n52:12205] [[20766,0],0]     ENVIRON[14]: HISTSIZE=1000
[r4n52:12205] [[20766,0],0]     ENVIRON[15]: TMPDIR=/tmp/pbs.458331.rsvc1
[r4n52:12205] [[20766,0],0]     ENVIRON[16]: PBS_JOBNAME=STDIN
[r4n52:12205] [[20766,0],0]     ENVIRON[17]: PBS_ENVIRONMENT=PBS_INTERACTIVE
[r4n52:12205] [[20766,0],0]     ENVIRON[18]: QTDIR=/usr/lib64/qt-3.3
[r4n52:12205] [[20766,0],0]     ENVIRON[19]: 
OLDPWD=/gpfs/home/nchamber/code/openmpi/installed/openmpi-dev-2223-g731cfe3/bin
[r4n52:12205] [[20766,0],0]     ENVIRON[20]: QTINC=/usr/lib64/qt-3.3/include
[r4n52:12205] [[20766,0],0]     ENVIRON[21]: KRB5_HOME=/usr/local/krb5
[r4n52:12205] [[20766,0],0]     ENVIRON[22]: BC_NODE_ALLOC=1
[r4n52:12205] [[20766,0],0]     ENVIRON[23]: 
PBS_O_WORKDIR=/gpfs/home/nchamber/code/openmpi/installed/openmpi-dev-2223-g731cfe3/bin
[r4n52:12205] [[20766,0],0]     ENVIRON[24]: 
ANT_HOME=/gpfs/home/nchamber/bin/apache-ant-1.9.4
[r4n52:12205] [[20766,0],0]     ENVIRON[25]: NCPUS=1
[r4n52:12205] [[20766,0],0]     ENVIRON[26]: ANT_OPTS=-mx1g
[r4n52:12205] [[20766,0],0]     ENVIRON[27]: ARCHIVE_HOME=/mnt/archive/nchamber
[r4n52:12205] [[20766,0],0]     ENVIRON[28]: DAAC_HOME=/gpfs/pkgs/hpcmp/daac
[r4n52:12205] [[20766,0],0]     ENVIRON[29]: PBS_TASKNUM=1
[r4n52:12205] [[20766,0],0]     ENVIRON[30]: USER=nchamber
[r4n52:12205] [[20766,0],0]     ENVIRON[31]: 
LD_LIBRARY_PATH=/gpfs/home/nchamber/code/openmpi/installed/openmpi-dev-2223-g731cfe3/lib:/gpfs/home/nchamber/code/openmpi/installed/openmpi-dev-2223-g731cfe3/lib:/gpfs/pkgs/mhpcc/intel/2013/impi/4.1.0.024/intel64/lib:/gpfs/pkgs/mhpcc/intel/2013/composer_xe_2013.1.117/compiler/lib/intel64
[r4n52:12205] [[20766,0],0]     ENVIRON[32]: 
LS_COLORS=rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:mi=01;05;37;41:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arj=01;31:*.taz=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.zip=01;31:*.z=01;31:*.Z=01;31:*.dz=01;31:*.gz=01;31:*.lz=01;31:*.xz=01;31:*.bz2=01;31:*.tbz=01;31:*.tbz2=01;31:*.bz=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.rar=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.jpg=01;35:*.jpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.ogm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.axv=01;35:*.anx=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=01;36:*.au=01;36:*.flac=01;36:*.mid=01;36:*.midi=01;36:*.mka=01;36:*.mp3=01;36:*.mpc=01;36:*.ogg=01;36:*.ra=01;36:*.wav=01;36:*.axa=01;36:*.oga=01;36:*.spx=01;36:*.xspf=01;36:
[r4n52:12205] [[20766,0],0]     ENVIRON[33]: PBS_O_HOME=/gpfs/home/nchamber
[r4n52:12205] [[20766,0],0]     ENVIRON[34]: WORKDIR=/scratch/nchamber
[r4n52:12205] [[20766,0],0]     ENVIRON[35]: CREATE_HOME=/gpfs/pkgs/hpcmp/create
[r4n52:12205] [[20766,0],0]     ENVIRON[36]: PBS_MOMPORT=15003
[r4n52:12205] [[20766,0],0]     ENVIRON[37]: PBS_O_QUEUE=standard
[r4n52:12205] [[20766,0],0]     ENVIRON[38]: MAIL=/var/spool/mail/nchamber
[r4n52:12205] [[20766,0],0]     ENVIRON[39]: PBS_O_LOGNAME=nchamber
[r4n52:12205] [[20766,0],0]     ENVIRON[40]: 
PATH=/gpfs/home/nchamber/code/openmpi/installed/openmpi-dev-2223-g731cfe3/bin:/gpfs/home/nchamber/code/openmpi/installed/openmpi-dev-2223-g731cfe3/bin:/usr/lib64/qt-3.3/bin:/gpfs/pkgs/mhpcc/intel/2013/impi/4.1.0.024/intel64/bin:/gpfs/pkgs/mhpcc/intel/wrapper:/usr/local/krb5/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/lpp/mmfs/bin:/opt/ibutils/bin:/gpfs/pkgs/mhpcc/java/current/bin:/gpfs/pkgs/mhpcc/java/current/jre/bin:/gpfs/system/bin:/gpfs/pkgs/hpcmp/pstoolkit/bin:/usr/local/bin:/gpfs/pkgs/hpcmp/bct/bin:/usr/local/mpscp/bin:/gpfs/pkgs/hpcmp/csiapps/base/linux.x86_64/bin:/gpfs/pkgs/hpcmp/bciapps/base/linux.x86_64/bin:/gpfs/pkgs/hpcmp/SLB/current:/gpfs/pkgs/mhpcc/gnu/bin:/gpfs/pkgs/mhpcc/intel/2013/composer_xe_2013.1.117/bin/intel64:/opt/pbs/default/bin:/opt/pbs/default/sbin:/gpfs/pkgs/mhpcc/java/current/jre/bin:/gpfs/home/nchamber/bin/apache-ant-1.9.4/bin
[r4n52:12205] [[20766,0],0]     ENVIRON[41]: PBS_O_LANG=en_US.UTF-8
[r4n52:12205] [[20766,0],0]     ENVIRON[42]: MPSCP_HOME=/usr/local/mpscp
[r4n52:12205] [[20766,0],0]     ENVIRON[43]: ARCHIVE_HOST=10.70.100.24
[r4n52:12205] [[20766,0],0]     ENVIRON[44]: 
PBS_JOBCOOKIE=000000006FDCF1740000000009CE14E5
[r4n52:12205] [[20766,0],0]     ENVIRON[45]: 
COST_MODULES_DIR=/gpfs/projects/COST/modules/pkgs
[r4n52:12205] [[20766,0],0]     ENVIRON[46]: 
F90=/gpfs/pkgs/mhpcc/intel/wrapper/ifort
[r4n52:12205] [[20766,0],0]     ENVIRON[47]: SAMPLES_HOME=/gpfs/home/samples
[r4n52:12205] [[20766,0],0]     ENVIRON[48]: PWD=/gpfs/home/nchamber/mpibroken
[r4n52:12205] [[20766,0],0]     ENVIRON[49]: 
_LMFILES_=/gpfs/pkgs/mhpcc/modulefiles/intel/13.0:/gpfs/pkgs/mhpcc/modulefiles/impi/4.1:/gpfs/pkgs/hpcmp/modulefiles/costinit
[r4n52:12205] [[20766,0],0]     ENVIRON[50]: 
JAVA_HOME=/gpfs/pkgs/mhpcc/java/current
[r4n52:12205] [[20766,0],0]     ENVIRON[51]: CENTER=/p/cwfs/nchamber
[r4n52:12205] [[20766,0],0]     ENVIRON[52]: 
BCI_HOME=/gpfs/pkgs/hpcmp/bciapps/base/linux.x86_64
[r4n52:12205] [[20766,0],0]     ENVIRON[53]: PBS_NODENUM=0
[r4n52:12205] [[20766,0],0]     ENVIRON[54]: 
MODULEPATH=/gpfs/projects/COST/modules/pkgs:/usr/share/Modules/modulefiles:/etc/modulefiles:/opt/mellanox/bupc/2.2/modules:/gpfs/pkgs/mhpcc/modulefiles:/gpfs/pkgs/hpcmp/modulefiles:/gpfs/pkgs/hpcmp/daac/modules:/gpfs/pkgs/hpcmp/create/sh/nesm/modulefiles
[r4n52:12205] [[20766,0],0]     ENVIRON[55]: 
LOADEDMODULES=intel/13.0:impi/4.1:costinit
[r4n52:12205] [[20766,0],0]     ENVIRON[56]: BC_HOST=riptide
[r4n52:12205] [[20766,0],0]     ENVIRON[57]: PBS_JOBDIR=/gpfs/home/nchamber
[r4n52:12205] [[20766,0],0]     ENVIRON[58]: TZ=Pacific/Honolulu
[r4n52:12205] [[20766,0],0]     ENVIRON[59]: 
MPSCP_BLOCKED_FILE=/usr/local/mpscp/etc/mpscp_blocked_ports
[r4n52:12205] [[20766,0],0]     ENVIRON[60]: PET_HOME=/gpfs/pkgs/hpcmp/PTOOLS
[r4n52:12205] [[20766,0],0]     ENVIRON[61]: BC_CORES_PER_NODE=16
[r4n52:12205] [[20766,0],0]     ENVIRON[62]: 
JAVANLP_HOME=/gpfs/home/nchamber/code/javanlp
[r4n52:12205] [[20766,0],0]     ENVIRON[63]: 
F77=/gpfs/pkgs/mhpcc/intel/wrapper/ifort
[r4n52:12205] [[20766,0],0]     ENVIRON[64]: COBALT_LIC=1...@rls.csi.hpc.mil
[r4n52:12205] [[20766,0],0]     ENVIRON[65]: PBS_O_SHELL=/bin/bash
[r4n52:12205] [[20766,0],0]     ENVIRON[66]: 
LM_LICENSE_FILE=:1...@license.mhpcc.hpc.mil:1...@rls.csi.hpc.mil:1...@rls.csi.hpc.mil:1...@rls.csi.hpc.mil
[r4n52:12205] [[20766,0],0]     ENVIRON[67]: BC_MEM_PER_NODE=29000
[r4n52:12205] [[20766,0],0]     ENVIRON[68]: PBS_JOBID=458331.rsvc1
[r4n52:12205] [[20766,0],0]     ENVIRON[69]: COST_HOME=/gpfs/projects/COST
[r4n52:12205] [[20766,0],0]     ENVIRON[70]: 
CXX=/gpfs/pkgs/mhpcc/intel/wrapper/icpc
[r4n52:12205] [[20766,0],0]     ENVIRON[71]: 
CSI_HOME=/gpfs/pkgs/hpcmp/csiapps/base/linux.x86_64
[r4n52:12205] [[20766,0],0]     ENVIRON[72]: HISTCONTROL=ignoredups
[r4n52:12205] [[20766,0],0]     ENVIRON[73]: SHLVL=1
[r4n52:12205] [[20766,0],0]     ENVIRON[74]: HOME=/gpfs/home/nchamber
[r4n52:12205] [[20766,0],0]     ENVIRON[75]: PBS_O_HOST=riptide04
[r4n52:12205] [[20766,0],0]     ENVIRON[76]: BC_MPI_TASKS_ALLOC=1
[r4n52:12205] [[20766,0],0]     ENVIRON[77]: 
PYTHONPATH=/gpfs/home/nchamber/code/twitter_nlp/python/ner/unidecode
[r4n52:12205] [[20766,0],0]     ENVIRON[78]: LOGNAME=nchamber
[r4n52:12205] [[20766,0],0]     ENVIRON[79]: QTLIB=/usr/lib64/qt-3.3/lib
[r4n52:12205] [[20766,0],0]     ENVIRON[80]: 
MPSCP_CONFIG_FILENAME=/usr/local/mpscp/etc/mpscp_config
[r4n52:12205] [[20766,0],0]     ENVIRON[81]: CVS_RSH=ssh
[r4n52:12205] [[20766,0],0]     ENVIRON[82]: 
CLASSPATH=.:classes:/gpfs/home/nchamber/code/navynlp/classes:/gpfs/home/nchamber/Dropbox/timesieve-code/classes:/gpfs/home/nchamber/code/lib/stanford-corenlp-models.jar:/gpfs/home/nchamber/code/navynlp/lib/*:/gpfs/home/nchamber/code/javanlp/projects/core/classes:/gpfs/home/nchamber/code/javanlp/projects/more/classes:/gpfs/home/nchamber/code/javanlp/projects/mt/classes:/gpfs/home/nchamber/code/javanlp/projects/periphery/classes:/gpfs/home/nchamber/code/javanlp/projects/research/classes:/gpfs/home/nchamber/code/javanlp/projects/rte/classes::/gpfs/home/nchamber/code/javanlp/projects/core/lib/*:/gpfs/home/nchamber/code/javanlp/projects/more/lib/*:/gpfs/home/nchamber/code/javanlp/projects/mt/lib/*:/gpfs/home/nchamber/code/javanlp/projects/periphery/lib/*:/gpfs/home/nchamber/code/javanlp/projects/research/lib/*:/gpfs/home/nchamber/code/javanlp/projects/rte/lib/*:
[r4n52:12205] [[20766,0],0]     ENVIRON[83]: PDSH_SSH_ARGS_APPEND=-q
[r4n52:12205] [[20766,0],0]     ENVIRON[84]: PBS_QUEUE=standard
[r4n52:12205] [[20766,0],0]     ENVIRON[85]: MODULESHOME=/usr/share/Modules
[r4n52:12205] [[20766,0],0]     ENVIRON[86]: LESSOPEN=|/usr/bin/lesspipe.sh %s
[r4n52:12205] [[20766,0],0]     ENVIRON[87]: OMP_NUM_THREADS=1
[r4n52:12205] [[20766,0],0]     ENVIRON[88]: PBS_O_MAIL=/var/spool/mail/nchamber
[r4n52:12205] [[20766,0],0]     ENVIRON[89]: 
CC=/gpfs/pkgs/mhpcc/intel/wrapper/icc
[r4n52:12205] [[20766,0],0]     ENVIRON[90]: PBS_O_SYSTEM=Linux
[r4n52:12205] [[20766,0],0]     ENVIRON[91]: G_BROKEN_FILENAMES=1
[r4n52:12205] [[20766,0],0]     ENVIRON[92]: 
PBS_NODEFILE=/var/spool/PBS/aux/458331.rsvc1
[r4n52:12205] [[20766,0],0]     ENVIRON[93]: 
I_MPI_ROOT=/gpfs/pkgs/mhpcc/intel/2013/impi/4.1.0.024
[r4n52:12205] [[20766,0],0]     ENVIRON[94]: 
PBS_O_PATH=/opt/xcat/bin:/opt/xcat/sbin:/usr/local/ossh/bin:/gpfs/pkgs/mhpcc/intel/2013/impi/4.1.0.024/intel64/bin:/gpfs/pkgs/mhpcc/intel/wrapper:/usr/local/krb5/bin:/usr/bin:/bin:/usr/sbin:/sbin:/usr/local/ossh-6.6p1a_x86_64/bin:/usr/local/sbin:/usr/lpp/mmfs/bin:/opt/ibutils/bin:/gpfs/pkgs/mhpcc/java/current/bin:/gpfs/pkgs/mhpcc/java/current/jre/bin:/gpfs/system/bin:/gpfs/pkgs/hpcmp/pstoolkit/bin:/usr/local/bin:/gpfs/pkgs/hpcmp/bct/bin:/usr/local/mpscp/bin:/gpfs/pkgs/hpcmp/csiapps/base/linux.x86_64/bin:/gpfs/pkgs/hpcmp/bciapps/base/linux.x86_64/bin:/gpfs/pkgs/hpcmp/SLB/current:/gpfs/pkgs/mhpcc/gnu/bin:/gpfs/pkgs/mhpcc/intel/2013/composer_xe_2013.1.117/bin/intel64:/opt/pbs/default/bin:/opt/pbs/default/sbin:/gpfs/pkgs/mhpcc/java/current/jre/bin:/gpfs/home/nchamber/bin/apache-ant-1.9.4/bin
[r4n52:12205] [[20766,0],0]     ENVIRON[95]: BASH_FUNC_module()=() {  eval 
`/usr/bin/modulecmd bash $*`
}
[r4n52:12205] [[20766,0],0]     ENVIRON[96]: 
_=/gpfs/home/nchamber/code/openmpi/installed/openmpi-dev-2223-g731cfe3/bin/mpirun
[r4n52:12205] [[20766,0],0]     ENVIRON[97]: 
PMIX_SERVER_URI=1360920576.0:/tmp/pbs.458331.rsvc1/openmpi-sessions-902206@r4n52_0/20766/0/pmix
[r4n52:12205] [[20766,0],0]     ENVIRON[98]: 
OMPI_MCA_orte_local_daemon_uri=1360920576.0;usock;tcp://10.60.104.52,10.70.104.52:46211;ud://335110.603.1
[r4n52:12205] [[20766,0],0]     ENVIRON[99]: 
OMPI_MCA_orte_hnp_uri=1360920576.0;usock;tcp://10.60.104.52,10.70.104.52:46211;ud://335110.603.1
[r4n52:12205] [[20766,0],0]     ENVIRON[100]: OMPI_MCA_mpi_yield_when_idle=0
[r4n52:12205] [[20766,0],0]     ENVIRON[101]: OMPI_MCA_orte_app_num=0
[r4n52:12205] [[20766,0],0]     ENVIRON[102]: OMPI_UNIVERSE_SIZE=1
[r4n52:12205] [[20766,0],0]     ENVIRON[103]: OMPI_MCA_orte_num_nodes=1
[r4n52:12205] [[20766,0],0]     ENVIRON[104]: 
OMPI_MCA_shmem_RUNTIME_QUERY_hint=mmap
[r4n52:12205] [[20766,0],0]     ENVIRON[105]: OMPI_MCA_orte_bound_at_launch=1
[r4n52:12205] [[20766,0],0]     ENVIRON[106]: OMPI_MCA_ess=pmi
[r4n52:12205] [[20766,0],0]     ENVIRON[107]: OMPI_MCA_orte_ess_num_procs=1
[r4n52:12205] [[20766,0],0]     ENVIRON[108]: OMPI_COMM_WORLD_SIZE=1
[r4n52:12205] [[20766,0],0]     ENVIRON[109]: OMPI_COMM_WORLD_LOCAL_SIZE=1
[r4n52:12205] [[20766,0],0]     ENVIRON[110]: 
OMPI_MCA_orte_tmpdir_base=/tmp/pbs.458331.rsvc1
[r4n52:12205] [[20766,0],0]     ENVIRON[111]: OMPI_NUM_APP_CTX=1
[r4n52:12205] [[20766,0],0]     ENVIRON[112]: OMPI_FIRST_RANKS=0
[r4n52:12205] [[20766,0],0]     ENVIRON[113]: OMPI_APP_CTX_NUM_PROCS=1
[r4n52:12205] [[20766,0],0]     ENVIRON[114]: 
OMPI_MCA_initial_wdir=/gpfs/home/nchamber/mpibroken
[r4n52:12205] [[20766,0],0]     ENVIRON[115]: OMPI_MCA_ess_base_jobid=1360920577
[r4n52:12205] [[20766,0],0]     ENVIRON[116]: OMPI_MCA_ess_base_vpid=0
[r4n52:12205] [[20766,0],0]     ENVIRON[117]: OMPI_COMM_WORLD_RANK=0
[r4n52:12205] [[20766,0],0]     ENVIRON[118]: OMPI_COMM_WORLD_LOCAL_RANK=0
[r4n52:12205] [[20766,0],0]     ENVIRON[119]: OMPI_COMM_WORLD_NODE_RANK=0
[r4n52:12205] [[20766,0],0]     ENVIRON[120]: OMPI_MCA_orte_ess_node_rank=0
[r4n52:12205] [[20766,0],0]     ENVIRON[121]: PMIX_ID=1360920577.0
[r4n52:12205] [[20766,0],0]     ENVIRON[122]: 
OMPI_FILE_LOCATION=/tmp/pbs.458331.rsvc1/openmpi-sessions-902206@r4n52_0/20766/1/0
[r4n52:12205] [[20766,0],0]     ENVIRON[123]: OPAL_OUTPUT_STDERR_FD=28

opening file tweets/repeats.txt

opening file tweets/repeats.txt

opening file tweets/repeats.txt

opening file tweets/repeats.txt

opening file tweets/repeats.txt

opening file tweets/repeats.txt
[r4n52:12205] [[20766,0],0] odls:wait_local_proc child process [[20766,1],0] 
pid 12208 terminated
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 0 on node r4n52 exited on signal 11 
(Segmentation fault).
--------------------------------------------------------------------------
[r4n52:12205] mca: base: close: component default closed
[r4n52:12205] mca: base: close: unloading component default
