[firedrake] Installing Firedrake on an HPC machine

Justin Chang jychang48 at gmail.com
Thu Aug 6 15:42:31 BST 2015


Lawrence, after renaming pyop2.py to run-pyop2.py, all three examples now run
without the ImportError, although each still prints the Open MPI fork()
warning:

>mpiexec -n 2 python run-pyop2.py

--------------------------------------------------------------------------
An MPI process has executed an operation involving a call to the
"fork()" system call to create a child process.  Open MPI is currently
operating in a condition that could result in memory corruption or
other system errors; your MPI job may hang, crash, or produce silent
data corruption.  The use of fork() (or system() or other calls that
create child processes) is strongly discouraged.

The process that invoked fork was:

  Local host:          compute-0-0 (PID 49631)
  MPI_COMM_WORLD rank: 1

If you are *absolutely sure* that your application will successfully
and correctly survive a call to fork(), you may disable this warning
by setting the mpi_warn_on_fork MCA parameter to 0.
--------------------------------------------------------------------------
[compute-0-0.local:49628] 1 more process has sent help message help-mpi-runtime.txt / mpi_init:warn-fork
[compute-0-0.local:49628] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
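
As the help text says, once the fork is known to be safe (for example PyOP2's
early fork, which happens before MPI_Init), the warning can be silenced by
setting the mpi_warn_on_fork MCA parameter to 0, either on the command line
(mpiexec --mca mpi_warn_on_fork 0 -n 2 python ...) or through Open MPI's
OMPI_MCA_* environment-variable convention.  A minimal sketch of the latter,
assuming the variable is set before anything initialises MPI (this snippet is
illustrative, not one of the test scripts from this thread):

    # Set the MCA parameter through the environment *before* MPI_Init runs;
    # importing firedrake is what typically initialises MPI.
    import os
    os.environ["OMPI_MCA_mpi_warn_on_fork"] = "0"

    from firedrake import *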

>mpiexec -n 2 python import-firedrake.py

--------------------------------------------------------------------------
An MPI process has executed an operation involving a call to the
"fork()" system call to create a child process.  Open MPI is currently
operating in a condition that could result in memory corruption or
other system errors; your MPI job may hang, crash, or produce silent
data corruption.  The use of fork() (or system() or other calls that
create child processes) is strongly discouraged.

The process that invoked fork was:

  Local host:          compute-0-0 (PID 49798)
  MPI_COMM_WORLD rank: 1

If you are *absolutely sure* that your application will successfully
and correctly survive a call to fork(), you may disable this warning
by setting the mpi_warn_on_fork MCA parameter to 0.
--------------------------------------------------------------------------
[compute-0-0.local:49795] 1 more process has sent help message help-mpi-runtime.txt / mpi_init:warn-fork
[compute-0-0.local:49795] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

>mpiexec -n 2 python firedrake-test.py

--------------------------------------------------------------------------
An MPI process has executed an operation involving a call to the
"fork()" system call to create a child process.  Open MPI is currently
operating in a condition that could result in memory corruption or
other system errors; your MPI job may hang, crash, or produce silent
data corruption.  The use of fork() (or system() or other calls that
create child processes) is strongly discouraged.

The process that invoked fork was:

  Local host:          compute-0-0 (PID 49534)
  MPI_COMM_WORLD rank: 1

If you are *absolutely sure* that your application will successfully
and correctly survive a call to fork(), you may disable this warning
by setting the mpi_warn_on_fork MCA parameter to 0.
--------------------------------------------------------------------------
Compiling form form

Compiler stage 1: Analyzing form(s)
-----------------------------------
  Geometric dimension:       2
  Number of cell subdomains: 0
  Rank:                      0
  Arguments:                 '()'
  Number of coefficients:    1
  Coefficients:              '[w_0]'
  Unique elements:           'R0(?)'
  Unique sub elements:       'R0(?)'

  representation:    quadrature
  quadrature_degree: auto --> 0
  quadrature_rule:   auto --> default

Compiler stage 1 finished in 0.0059092 seconds.

Compiler stage 2: Computing intermediate representation
-------------------------------------------------------
  Computing representation of integrals
  Computing quadrature representation
  Transforming cell integral

Compiler stage 2 finished in 0.0390902 seconds.

Compiler stage 3: Optimizing intermediate representation
--------------------------------------------------------
  Skipping optimizations, add -O to optimize

Compiler stage 3 finished in 0.000159979 seconds.

Compiler stage 4 finished in 0.00311899 seconds.

FFC finished in 0.0489531 seconds.

[0] pyop2:INFO Compiling wrapper...
[0] pyop2:INFO Compiling wrapper...done

1.0
1.0

[compute-0-0.local:49531] 1 more process has sent help message help-mpi-runtime.txt / mpi_init:warn-fork
[compute-0-0.local:49531] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
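
For what it's worth, the FFC and PyOP2 messages above correspond to compiling
and assembling a single constant (R0) coefficient form, and the two "1.0"
lines are the assembled value printed once per rank.  A rough sketch of a
script that produces output of this shape (a guess at the kind of thing
firedrake-test.py does, not its actual contents):

    from firedrake import *

    mesh = UnitSquareMesh(8, 8)
    # Integrating a constant 1 over the unit square gives its area; under
    # mpiexec -n 2 each rank prints the same globally assembled value, 1.0.
    print(assemble(Constant(1.0) * dx(domain=mesh)))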


Thanks,

Justin

On Thu, Aug 6, 2015 at 9:36 AM, Lawrence Mitchell <lawrence.mitchell at imperial.ac.uk> wrote:

> On 06/08/15 15:26, Justin Chang wrote:
> > Lawrence,
> >
> >> mpiexec -n 2 python no-fork.py
> > 2
> > 2
>
> OK, good.
>
> >> mpiexec -n 2 python fork-before.py
> > child exiting
> > child exiting
> > 2
> > 2
>
> Also good.
>
> >> mpiexec -n 2 python fork-after.py
> > 2
> > 2
> > child exiting
> > child exiting
> > --------------------------------------------------------------------------
> > An MPI process has executed an operation involving a call to the
> > "fork()" system call to create a child process.  Open MPI is currently
> > operating in a condition that could result in memory corruption or
> > other system errors; your MPI job may hang, crash, or produce silent
> > data corruption.  The use of fork() (or system() or other calls that
> > create child processes) is strongly discouraged.
> >
> > The process that invoked fork was:
> >
> >   Local host:          compute-0-0 (PID 43057)
> >   MPI_COMM_WORLD rank: 0
> >
> > If you are *absolutely sure* that your application will successfully
> > and correctly survive a call to fork(), you may disable this warning
> > by setting the mpi_warn_on_fork MCA parameter to 0.
> > --------------------------------------------------------------------------
> > [compute-0-0.local:43055] 1 more process has sent help message help-mpi-runtime.txt / mpi_init:warn-fork
> > [compute-0-0.local:43055] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
>
> As expected, we fork()ed after MPI_Init and so OpenMPI complains.
>
> >> mpiexec -n 2 python closer-test.py
> > In parent True
> > In parent True
> > In child False
> > In child False
>
> OK good, so if we fork before MPI_Init then OpenMPI doesn't complain.
>
> >> mpiexec -n 2 python fork-pyop2.py
> > 2
> > 2
>
> So this one just does the same as before, but using the PyOP2-internal
> "early fork", so that works too.
>
> >
> > The last three examples pyop2.py, import-firedrake.py, and
> > firedrake-test.py did not run because they say "ImportError:
> > cannot import name op2". And now all of my firedrake programs run
> > into this exact error, which is confusing.
>
> Ah, the pyop2.py example was badly named.  If you rename it to
> run-pyop2.py and do mpiexec -n 2 python run-pyop2.py, I hope things
> work again.  What has happened is that this file, named "pyop2", is
> now on your Python path, so when Python tries to import pyop2 it looks
> in this file for the symbols rather than in the proper PyOP2 package.
> Can you do that and then try again?
>
> Lawrence
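
The shadowing Lawrence describes is easy to confirm from an interactive
session: check which file Python actually resolves "pyop2" to.  This is just
an illustrative diagnostic, not part of the thread's test scripts:

    import pyop2
    # If this prints the path of the stray pyop2.py rather than a file inside
    # the installed PyOP2 package, the package is being shadowed.
    print(pyop2.__file__)

    from pyop2 import op2   # raises ImportError while the shadowing persists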