blacs_test, FORTRAN90 programs which test BLACS, the Basic Linear Algebra Communication Subprograms, a linear algebra oriented message passing interface that may be implemented efficiently and uniformly across a large range of distributed memory platforms.
The length of time required to implement efficient distributed memory algorithms makes it impractical to rewrite programs for every new parallel machine. The BLACS exist in order to make linear algebra applications both easier to program and more portable. It is for this reason that the BLACS are used as the communication layer of the distributed memory linear algebra package SCALAPACK, for instance.
MPI is one example of an underlying message passing system for distributed memory computing. A program written at the BLACS level can run under MPI. The same program should run correctly on systems that use other message passing libraries. The key is that, on each system, the installation of the BLACS library takes into account the interface between the standard BLACS routines and the local distributed memory system.
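As a minimal sketch of this portability (not taken from the blacs_test source), the following FORTRAN90 program uses only standard BLACS routines, BLACS_PINFO and BLACS_EXIT, and contains no MPI calls; the same source should run unchanged whether the local BLACS library was built on MPI or on another message passing system:

    program blacs_hello

    !  A minimal portability sketch: only BLACS calls, no MPI calls.

      implicit none

      integer iam
      integer nprocs

    !  BLACS_PINFO returns this process's rank and the number of processes.
      call blacs_pinfo ( iam, nprocs )

      write ( *, '(a,i4,a,i4)' ) '  Process ', iam, ' of ', nprocs

    !  BLACS_EXIT shuts down the BLACS and the underlying message system.
      call blacs_exit ( 0 )

      stop
    end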
The computer code and data files described and made available on this web page are distributed under the GNU LGPL license.
blacs_test is available in a FORTRAN90 version.
MPI, FORTRAN90 programs which demonstrate the use of MPI for parallel computing in distributed memory systems.
OPENMP, FORTRAN90 programs which illustrate the use of the OpenMP application program interface for carrying out parallel computations in a shared memory environment.
SCALAPACK, FORTRAN90 programs which demonstrate the use of SCALAPACK.
blacs_test is a simple test in which a grid is set up, and each process reports in to the master process.
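The following FORTRAN90 sketch suggests what such a test might look like. The routines it calls (BLACS_PINFO, BLACS_GET, BLACS_GRIDINIT, BLACS_GRIDINFO, IGESD2D, IGERV2D, BLACS_GRIDEXIT, BLACS_EXIT) are standard BLACS, but the 1 x NPROCS grid shape and the message format are illustrative assumptions, not the actual blacs_test source:

    program blacs_check_in

    !  A hedged sketch: set up a process grid and have every process
    !  report its grid coordinates to the master process at (0,0).

      implicit none

      integer coords(2)
      integer i
      integer iam
      integer ictxt
      integer j
      integer mycol
      integer myrow
      integer npcol
      integer nprocs
      integer nprow

      call blacs_pinfo ( iam, nprocs )

    !  Ask for the default system context.
      call blacs_get ( -1, 0, ictxt )

    !  Arrange the processes in a 1 x NPROCS row (an assumed shape).
      nprow = 1
      npcol = nprocs
      call blacs_gridinit ( ictxt, 'Row', nprow, npcol )

    !  Each process determines its own grid coordinates.
      call blacs_gridinfo ( ictxt, nprow, npcol, myrow, mycol )

      if ( myrow == 0 .and. mycol == 0 ) then

    !  The master receives a (row,column) pair from every other process.
        write ( *, '(a)' ) '  Process (0,0) is the master.'
        do j = 0, npcol - 1
          do i = 0, nprow - 1
            if ( i /= 0 .or. j /= 0 ) then
              call igerv2d ( ictxt, 2, 1, coords, 2, i, j )
              write ( *, '(a,i4,a,i4,a)' ) &
                '  Process (', coords(1), ',', coords(2), ') checked in.'
            end if
          end do
        end do

      else

    !  Every other process sends its coordinates to the master.
        coords(1) = myrow
        coords(2) = mycol
        call igesd2d ( ictxt, 2, 1, coords, 2, 0, 0 )

      end if

    !  Release the grid and shut down the BLACS.
      call blacs_gridexit ( ictxt )
      call blacs_exit ( 0 )

      stop
    end

Here IGESD2D and IGERV2D are the BLACS point-to-point send and receive routines for rectangular integer arrays; because the receive names an explicit source coordinate, the master can collect one message from each grid position regardless of the order in which the sends arrive.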