Opened 14 years ago

Closed 13 years ago

#2485 closed Bugs (worksforme)

MPI broadcast issue

Reported by: bryon.robidoux@…
Owned by: Matthias Troyer
Milestone: Boost 1.38.0
Component: mpi
Version: Boost 1.36.0
Severity: Showstopper
Keywords: broadcast MPI
Cc:

Description

The broadcast function is crashing and I can't see anything that I am doing wrong. Below are the code and output of a very simple program that I used to test the Boost API.

I compiled the Boost libraries using Visual Studio 2008 SP1 and linked the MPI library against the msmpi library provided with the HPC Server 2008 SDK. I ran this on a single HPC Server 2008 box, so no cluster is involved; it is the simplest case possible.
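For reference (not part of the original report): building Boost.MPI against MS-MPI is normally done by adding an mpi entry to user-config.jam. A minimal sketch follows; the SDK paths are assumptions for a default HPC Pack 2008 SDK install and have to be adjusted to the local machine.

# user-config.jam -- sketch only, paths assumed for a default HPC Pack 2008 SDK install
using mpi : : <find-static-library>msmpi
              <include>"C:/Program Files/Microsoft HPC Pack 2008 SDK/Include"
              <library-path>"C:/Program Files/Microsoft HPC Pack 2008 SDK/Lib/amd64" ;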

Thanks for your help, Bryon

Example Program

#include <boost/mpi.hpp>
#include <mpi.h>
#include <iostream>
#include <string>
#include <cstring>

namespace mpi = boost::mpi;

int main(int argc, char* argv[])
{
    mpi::environment env(argc, argv);
    mpi::communicator world;

    // Broadcast a raw character buffer with the plain C API.
    char value[128] = {'\0'};
    if (world.rank() == 0) {
        strcpy(value, "Hello Bryon from C API");
        std::cout << "start " << value << std::endl;
    }
    MPI_Bcast(value, 12, MPI_INT, 0, MPI_COMM_WORLD);
    std::cout << world.rank() << " " << value << std::endl;

    // Broadcast a bool with the Boost.MPI API.
    bool was_successful = false;
    if (world.rank() == 0)
        was_successful = true;
    mpi::broadcast(world, was_successful, 0);
    std::cout << world.rank() << " :"
              << (was_successful ? "Success with broacasting boolean"
                                 : "Failure with broadcasting boolean")
              << std::endl;

    // Broadcast a std::string with the Boost.MPI API -- this is the broadcast
    // that never completes in the output below.
    std::string bvalue;
    if (world.rank() == 0)
        bvalue = "Hello Bryon from boost API with string.";
    mpi::broadcast(world, bvalue, 0);
    std::cout << world.rank() << " :" << bvalue << std::endl;

    return 0;
}

Output

E:\HPC Class\hpcs-day3\HPCS-Day3\demos\09 MPI-Send-Recv\TaskParallelMPI\x64\release>mpiexec -n 2 TaskParallelMPI
start Hello Bryon from C API
0 Hello Bryon from C API
0 :Success with broacasting boolean
1 Hello Bryon from C API
1 :Success with broacasting boolean

This application has requested the Runtime to terminate it in an unusual way. Please contact the application's support team for more information.

job aborted: [ranks] message

[0] terminated

[1] process exited without calling finalize


[1] on 2008ROBIDOUXBR TaskParallelMPI ended prematurely and may have crashed. exit code 255


E:\HPC Class\hpcs-day3\HPCS-Day3\demos\09 MPI-Send-Recv\TaskParallelMPI\x64\release>
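Not part of the original report, but a possible way to narrow the failure down: in the output above, the crash appears at the std::string broadcast. The sketch below broadcasts a std::string using only the MPI C API (length first, then the characters), which is roughly the work mpi::broadcast has to do for a string. If this runs where the Boost call crashes, the problem is more likely in Boost.MPI's serialization layer than in MS-MPI itself. The helper name bcast_string is made up for this example.

#include <mpi.h>
#include <algorithm>
#include <iostream>
#include <string>
#include <vector>

// Hypothetical helper: broadcast a std::string with the plain C API by
// sending its length first and then its characters.
void bcast_string(std::string& s, int root, MPI_Comm comm)
{
    int rank = 0;
    MPI_Comm_rank(comm, &rank);

    // Step 1: every rank learns how long the string is.
    int len = (rank == root) ? static_cast<int>(s.size()) : 0;
    MPI_Bcast(&len, 1, MPI_INT, root, comm);

    // Step 2: broadcast the characters themselves.
    std::vector<char> buf(len + 1, '\0');   // +1 so the buffer is never empty
    if (rank == root)
        std::copy(s.begin(), s.end(), buf.begin());
    MPI_Bcast(&buf[0], len, MPI_CHAR, root, comm);

    if (rank != root)
        s.assign(buf.begin(), buf.begin() + len);
}

int main(int argc, char* argv[])
{
    MPI_Init(&argc, &argv);
    int rank = 0;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    std::string msg;
    if (rank == 0)
        msg = "Hello Bryon from the plain C API with string.";
    bcast_string(msg, 0, MPI_COMM_WORLD);
    std::cout << rank << " :" << msg << std::endl;

    MPI_Finalize();
    return 0;
}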

Change History (3)

comment:1 by Matthias Troyer, 13 years ago

Owner: changed from Douglas Gregor to Matthias Troyer

comment:2 by Matthias Troyer, 13 years ago

Status: new → assigned

comment:3 by Matthias Troyer, 13 years ago

Resolution: worksforme
Status: assigned → closed

I tried this on the platforms available to me and it works. Since I don't have a Windows machine I cannot test it there. I will close this ticket until someone can confirm this on a Windows machine and can help me debug this.
