[firedrake] Firedrake Installation

Florian Rathgeber florian.rathgeber at imperial.ac.uk
Sun Nov 16 13:48:28 GMT 2014


It looks like you might have mpich2 installed in addition to Open MPI.
Can you please check? If so, please remove mpich2.
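
For reference, something like the following should show whether both MPI
stacks are present on Ubuntu 14.04 and remove the MPICH one (a sketch only;
the exact package names are an assumption, use whatever dpkg reports on the
machine):

    # list any MPICH packages installed alongside Open MPI
    dpkg -l | grep -i mpich

    # if anything shows up, purge it (substitute the package names dpkg listed)
    sudo apt-get remove --purge mpich2 libmpich2-dev

    # confirm that mpirun now resolves to Open MPI
    which mpirun && mpirun --version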

Florian

On 16/11/14 13:39, Andrew McRae wrote:
> Hi all,
> 
> Suet [cc'd], one of Colin's project students, is encountering the
> following error when using the PyOP2 'quick install' script.
> 
> Can anyone help?  IIRC, this is running in a (32-bit?) Ubuntu 14.04
> virtual machine.
> 
> (Last night, she was getting an mpi4py error, but I believe this
> <https://github.com/OP2/PyOP2/commit/e15b3ee8d72369f10c9a84d944a3ecbf06f7e927>
> fixed that).
> 
> Cheers,
> 
> Andrew
> 
> On 16 November 2014 13:28, Lee, Suet <suet.lee11 at imperial.ac.uk> wrote:
> 
>     I ran the install mpi4py command, but I have a new error now:
> 
> 
>     mosh at ubuntu:~$ wget -O - https://github.com/OP2/PyOP2/raw/master/install.sh | sudo bash
>     --2014-11-16 13:20:14--  https://github.com/OP2/PyOP2/raw/master/install.sh
>     Resolving github.com (github.com)... 192.30.252.131
>     Connecting to github.com (github.com)|192.30.252.131|:443... connected.
>     HTTP request sent, awaiting response... 302 Found
>     Location: https://raw.githubusercontent.com/OP2/PyOP2/master/install.sh [following]
>     --2014-11-16 13:20:16--  https://raw.githubusercontent.com/OP2/PyOP2/master/install.sh
>     Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 185.31.18.133
>     Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|185.31.18.133|:443... connected.
>     HTTP request sent, awaiting response... 200 OK
>     Length: 3667 (3.6K) [text/plain]
>     Saving to: ‘STDOUT’
> 
>     100%[======================================>] 3,667       --.-K/s   in 0s
> 
>     2014-11-16 13:20:16 (40.6 MB/s) - written to stdout [3667/3667]
> 
>     PyOP2 installation started at Sun Nov 16 13:20:16 GMT 2014
>       on Linux ubuntu 3.13.0-24-generic #46-Ubuntu SMP Thu Apr 10
>     19:08:14 UTC 2014 i686 i686 i686 GNU/Linux
> 
>     *** Privileged installation ***
>       Running unprivileged commands as mosh
> 
>     *** Preparing system ***
> 
>     *** Installing dependencies ***
> 
>     *** Installing PETSc ***
> 
>     *** Installing PyOP2 ***
> 
>     [ubuntu:31603] *** Process received signal ***
>     [ubuntu:31603] Signal: Floating point exception (8)
>     [ubuntu:31603] Signal code: Integer divide-by-zero (1)
>     [ubuntu:31603] Failing at address: 0xb74d6da0
>     [ubuntu:31603] [ 0] [0xb779a40c]
>     [ubuntu:31603] [ 1] /usr/lib/i386-linux-gnu/libhwloc.so.5(+0x2cda0)
>     [0xb74d6da0]
>     [ubuntu:31603] [ 2] /usr/lib/i386-linux-gnu/libhwloc.so.5(+0x2e71c)
>     [0xb74d871c]
>     [ubuntu:31603] [ 3] /usr/lib/i386-linux-gnu/libhwloc.so.5(+0x2ea8b)
>     [0xb74d8a8b]
>     [ubuntu:31603] [ 4] /usr/lib/i386-linux-gnu/libhwloc.so.5(+0x98f6)
>     [0xb74b38f6]
>     [ubuntu:31603] [ 5]
>     /usr/lib/i386-linux-gnu/libhwloc.so.5(hwloc_topology_load+0x1c6)
>     [0xb74b48ec]
>     [ubuntu:31603] [ 6]
>     /usr/lib/libopen-rte.so.4(orte_odls_base_open+0x7b1) [0xb76fa881]
>     [ubuntu:31603] [ 7]
>     /usr/lib/openmpi/lib/openmpi/mca_ess_hnp.so(+0x2445) [0xb7488445]
>     [ubuntu:31603] [ 8] /usr/lib/libopen-rte.so.4(orte_init+0x1cf)
>     [0xb76cfb3f]
>     [ubuntu:31603] [ 9] /usr/lib/libopen-rte.so.4(orte_daemon+0x256)
>     [0xb76ec1c6]
>     [ubuntu:31603] [10] orted() [0x80485b3]
>     [ubuntu:31603] [11]
>     /lib/i386-linux-gnu/libc.so.6(__libc_start_main+0xf3) [0xb7504a83]
>     [ubuntu:31603] [12] orted() [0x80485f8]
>     [ubuntu:31603] *** End of error message ***
>     [ubuntu:31601] [[INVALID],INVALID] ORTE_ERROR_LOG: Unable to start a
>     daemon on the local node in file ess_singleton_module.c at line 343
>     [ubuntu:31601] [[INVALID],INVALID] ORTE_ERROR_LOG: Unable to start a
>     daemon on the local node in file ess_singleton_module.c at line 140
>     [ubuntu:31601] [[INVALID],INVALID] ORTE_ERROR_LOG: Unable to start a
>     daemon on the local node in file runtime/orte_init.c at line 128
>     --------------------------------------------------------------------------
>     It looks like orte_init failed for some reason; your parallel process is
>     likely to abort.  There are many reasons that a parallel process can
>     fail during orte_init; some of which are due to configuration or
>     environment problems.  This failure appears to be an internal failure;
>     here's some additional information (which may only be relevant to an
>     Open MPI developer):
> 
>       orte_ess_set_name failed
>       --> Returned value Unable to start a daemon on the local node
>     (-128) instead of ORTE_SUCCESS
>     --------------------------------------------------------------------------
>     --------------------------------------------------------------------------
>     It looks like MPI_INIT failed for some reason; your parallel process is
>     likely to abort.  There are many reasons that a parallel process can
>     fail during MPI_INIT; some of which are due to configuration or
>     environment
>     problems.  This failure appears to be an internal failure; here's some
>     additional information (which may only be relevant to an Open MPI
>     developer):
> 
>       ompi_mpi_init: orte_init failed
>       --> Returned "Unable to start a daemon on the local node" (-128)
>     instead of "Success" (0)
>     --------------------------------------------------------------------------
>     [ubuntu:31601] *** An error occurred in MPI_Init_thread
>     [ubuntu:31601] *** on a NULL communicator
>     [ubuntu:31601] *** Unknown error
>     [ubuntu:31601] *** MPI_ERRORS_ARE_FATAL: your MPI job will now abort
>     --------------------------------------------------------------------------
>     An MPI process is aborting at a time when it cannot guarantee that all
>     of its peer processes in the job will be killed properly.  You should
>     double check that everything has shut down cleanly.
> 
>       Reason:     Before MPI_INIT completed
>       Local host: ubuntu
>       PID:        31601
>     --------------------------------------------------------------------------
>     PyOP2 installation failed
>       See /home/mosh/PyOP2/pyop2_install.log for details
> 
> 
>     Any idea what's going on?
> 
> 
>     Suet
