Channelflow 2.0

Memory leak in openmpi-2.1.x (Ubuntu 18.04 package)

Dear All,

This is a heads-up for Ubuntu users: there is a memory leak when channelflow is linked against the default Ubuntu openmpi package, which is based on openmpi-2.1.1. The leak is not a channelflow issue; it definitely comes from openmpi-2.1.1. I have seen this on Ubuntu 18.04 machines as well as on the two clusters I am using.

Since v2.1.x has long been superseded, I expect the issue is unlikely to occur on clusters, where v3.x or v4.x are usually available.

In summary, if channelflow is installed on an Ubuntu machine, the Ubuntu openmpi package is best avoided; a manual install of a newer version (3.0.0 and 4.0.1 both worked for me) is the better option.

Regards,
Moritz

Hello,

I am facing a similar issue, though I am not sure if it's a memory leak or some other problem causing excessive memory consumption. I am running channelflow on the IDRIS supercomputer facility (MPI version 4.1) in Paris and observing that the default memory of 3.5 GB per process is insufficient for higher resolutions.

(1) A resolution of Nx,Ny,Nz = 2560,33,5120 could not be run even with np=1024.
(2) A peculiar behaviour: Nx,Ny,Nz = 1024,121,1024 runs without problems even with np=32, while Nx,Ny,Nz = 1280,33,2560 with np=64 results in a core dump!
(3) For other resolutions of the same order or higher (2048,121,2048 with np~512), the simulations encounter “PMPI_Gatherv : Other MPI error, error stack:” or “std::bad_alloc()”.
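One thing worth checking, as a rough sketch only (this is not channelflow's actual memory layout, and `field_bytes` is a hypothetical helper): a single real-valued field of doubles at the failing resolutions is already close to, or above, the 3.5 GB per-process limit, so any operation that gathers a full field onto one rank, as the PMPI_Gatherv in the error message might suggest, would exceed the limit regardless of np.

```python
def field_bytes(Nx, Ny, Nz, bytes_per_value=8):
    """Rough memory footprint of ONE real-valued field of doubles.
    Hypothetical estimate; channelflow stores several fields plus
    spectral (complex) copies, so real usage is a multiple of this."""
    return Nx * Ny * Nz * bytes_per_value

GB = 1e9
for Nx, Ny, Nz in [(2560, 33, 5120), (1024, 121, 1024),
                   (1280, 33, 2560), (2048, 121, 2048)]:
    print(f"{Nx} x {Ny} x {Nz}: {field_bytes(Nx, Ny, Nz) / GB:.2f} GB per field")
```

By this estimate, 2560x33x5120 is about 3.46 GB per field and 2048x121x2048 about 4.06 GB, both at or beyond the 3.5 GB budget of a single rank, while 1024x121x1024 is only about 1 GB, which would be consistent with the pattern you describe. This is speculation, not a diagnosis.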

Can anybody give any guidance on what is happening here? What has been others' experience with channelflow's memory consumption?

Regards,

Pavan

You may both be interested in the bug report below. Related?

Hi Jake,

thank you for the info. I am not sure if it is related; we saw the memory leak in simulateflow, and it was quite drastic. I had similar problems with openmpi-2.1.x in another code a while ago, which is why I did not do any further detective work and simply moved to a more recent MPI version as a first step. The leak then disappeared.

Regards,
Moritz

Ok, bit of a shame. I have been looking for closure on that issue on and off for a while now.

You might consider submitting a bug report for posterity, since in that case the build system ought to guard against using openmpi-2.1.x.