Question
I have reduced my program to the following example:
#include <mpi.h>

int main(int argc, char *argv[]) {
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    MPI_Barrier(MPI_COMM_WORLD);
    MPI_Finalize();
    return 0;
}
I compile and run the code, and get the following result:
My-MacBook-Pro-2:xCode_TrapSim user$ mpicxx -g -O0 -Wall barrierTest.cpp -o barrierTestExec
My-MacBook-Pro-2:xCode_TrapSim user$ mpiexec -n 2 ./barrierTestExec
===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= PID 21633 RUNNING AT My-MacBook-Pro-2.local
= EXIT CODE: 11
= CLEANING UP REMAINING PROCESSES
= YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================
YOUR APPLICATION TERMINATED WITH THE EXIT STRING: Segmentation fault: 11 (signal 11)
This typically refers to a problem with your application.
Please see the FAQ page for debugging suggestions
If I comment out the MPI_Barrier, or run the program with only one process, the code runs fine. I am using the following compilers:
My-MacBook-Pro-2:xCode_TrapSim user$ mpiexec --version
HYDRA build details:
Version: 3.2
Release Date: Wed Nov 11 22:06:48 CST 2015
CC: clang
CXX: clang++
F77: /usr/local/bin/gfortran
F90: /usr/local/bin/gfortran
Configure options: '--disable-option-checking' '--prefix=/usr/local/Cellar/mpich/3.2_1' '--disable-dependency-tracking' '--disable-silent-rules' '--mandir=/usr/local/Cellar/mpich/3.2_1/share/man' 'CC=clang' 'CXX=clang++' 'FC=/usr/local/bin/gfortran' 'F77=/usr/local/bin/gfortran' '--cache-file=/dev/null' '--srcdir=.' 'CFLAGS= -O2' 'LDFLAGS=' 'LIBS=-lpthread ' 'CPPFLAGS= -I/private/tmp/mpich-20160606-48824-1qsaqn8/mpich-3.2/src/mpl/include -I/private/tmp/mpich-20160606-48824-1qsaqn8/mpich-3.2/src/mpl/include -I/private/tmp/mpich-20160606-48824-1qsaqn8/mpich-3.2/src/openpa/src -I/private/tmp/mpich-20160606-48824-1qsaqn8/mpich-3.2/src/openpa/src -D_REENTRANT -I/private/tmp/mpich-20160606-48824-1qsaqn8/mpich-3.2/src/mpi/romio/include'
Process Manager: pmi
Launchers available: ssh rsh fork slurm ll lsf sge manual persist
Topology libraries available: hwloc
Resource management kernels available: user slurm ll lsf sge pbs cobalt
Checkpointing libraries available:
Demux engines available: poll select
My-MacBook-Pro-2:xCode_TrapSim user$ clang --version
Apple LLVM version 7.3.0 (clang-703.0.31)
Target: x86_64-apple-darwin15.5.0
Thread model: posix
InstalledDir: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin
This seems like it should be a trivial problem, but I can't seem to figure it out. Why would MPI_Barrier cause this simple code to seg fault?
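One sanity check worth running with a setup like this: crashes in otherwise trivial MPI programs are often caused by mixing pieces of two different MPI installations, for example compiling against one implementation's headers but launching with another implementation's mpiexec. A quick way to check, assuming MPICH-style wrappers as the HYDRA build details above suggest:

which mpicxx mpiexec
mpicxx -show        # MPICH wrappers print the underlying compile/link line
mpiexec --version

If the wrapper and the launcher resolve to different installation prefixes, that mismatch is a likely culprit.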
Answer 1:
It is super hard to tell what is wrong with your installation. However, if you can use any MPI flavor, maybe you can try this one:
http://www.owsiak.org/?p=3492
All I can say is that it works with Open MPI:
~/opt/usr/local/bin/mpicxx -g -O0 -Wall barrierTestExec.cpp -o barrierTestExec
~/opt/usr/local/bin/mpiexec -n 2 ./barrierTestExec
and there is no exception in my case. It really seems to be environment specific.
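If switching MPI implementations is not an option, one way to narrow the problem down is to install MPI_ERRORS_RETURN on the communicator, so that any error the library itself detects is reported instead of aborting the job. A minimal sketch along those lines (note that a genuine segmentation fault inside the library will still kill the process; this only surfaces errors MPI can catch):

#include <mpi.h>
#include <cstdio>

int main(int argc, char *argv[]) {
    int rank = -1;
    MPI_Init(&argc, &argv);
    // Report errors instead of aborting (the default is MPI_ERRORS_ARE_FATAL).
    MPI_Comm_set_errhandler(MPI_COMM_WORLD, MPI_ERRORS_RETURN);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    int err = MPI_Barrier(MPI_COMM_WORLD);
    if (err != MPI_SUCCESS) {
        char msg[MPI_MAX_ERROR_STRING];
        int len = 0;
        MPI_Error_string(err, msg, &len);
        std::fprintf(stderr, "rank %d: MPI_Barrier failed: %s\n", rank, msg);
    }
    MPI_Finalize();
    return 0;
}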
Source: https://stackoverflow.com/questions/38086041/why-does-mpi-barrier-cause-a-segmentation-fault-in-c