[firedrake] More detailed breakdown of PETSc timings / higher-order geometric MG

Eike Mueller e.mueller at bath.ac.uk
Wed Mar 18 14:11:04 GMT 2015


Dear all,

To get a more detailed breakdown of the PETSc fieldsplit preconditioner, I now tried

        ksp = up_solver.snes.getKSP()
        ksp.setMonitor(self._ksp_monitor)
        ksp_hdiv = ksp.getPC().getFieldSplitSubKSP()
        ksp_hdiv.setMonitor(self._ksp_monitor)

to attach my own KSP monitor to the solver for the HDiv system. I can then use that to work out the time per iteration and the number of iterations of the velocity mass matrix solve. I suspect that, for some reason, the same PC (preonly+bjacobi+ILU) is less efficient for my standalone velocity mass matrix solve, possibly because the ILU does not work due to the wrong dof ordering: I observe that preonly+bjacobi+ILU is no faster than cg+jacobi for my inversion, whereas in the fieldsplit case there is a significant difference.
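
For context, the monitor I pass to setMonitor is just a callable with the petsc4py monitor signature (ksp, iteration number, residual norm). The sketch below is not the actual code in gravitywaves.py, only an illustration of how such a monitor could record wall-clock time stamps so that the time per iteration can be read off afterwards (the class name TimedKSPMonitor is made up):

        import time

        class TimedKSPMonitor(object):
            """Record (iteration, elapsed wall-clock time, residual norm)
            for every KSP iteration."""
            def __init__(self):
                self._t0 = None
                self.history = []

            def __call__(self, ksp, its, rnorm):
                # petsc4py calls the monitor with iteration 0 before the
                # first Krylov step; use that to start the clock
                if its == 0:
                    self._t0 = time.time()
                self.history.append((its, time.time() - self._t0, rnorm))

        # usage: ksp.setMonitor(TimedKSPMonitor())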

However, the third line of the first code snippet above (the call to getFieldSplitSubKSP()) fails with an error inside PETSc:

  File "/Users/eikemueller/PostDocBath/EllipticSolvers/Firedrake_workspace/firedrake-helmholtzsolver/source/gravitywaves.py", line 475, in solve
    pc_hdiv = ksp.getPC().getFieldSplitSubKSP()
  File "PC.pyx", line 384, in petsc4py.PETSc.PC.getFieldSplitSubKSP (src/petsc4py.PETSc.c:136328)
petsc4py.PETSc.Error: error code 85
[0] PCFieldSplitGetSubKSP() line 1662 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c
[0] PCFieldSplitGetSubKSP_FieldSplit_Schur() line 1259 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c
[0] MatSchurComplementGetKSP() line 317 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/ksp/ksp/utils/schurm.c
[0] Null argument, when expecting valid pointer
[0] Null Object: Parameter # 1
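
If I read the traceback correctly, the Schur complement (and with it the inner KSPs) simply does not exist yet at the point where I call getFieldSplitSubKSP(), because the preconditioner has not been set up. A possible workaround, assuming that reading is right and that the operators have already been set on the KSP at that point, would be to force the set-up first and to index the list that petsc4py returns (one KSP per split; which index corresponds to the HDiv block is a guess here):

        ksp = up_solver.snes.getKSP()
        ksp.setMonitor(self._ksp_monitor)
        ksp.setUp()  # sets up the fieldsplit PC and creates the sub-KSPs
        # getFieldSplitSubKSP() returns a list of KSPs, one per split
        ksp_hdiv = ksp.getPC().getFieldSplitSubKSP()[0]
        ksp_hdiv.setMonitor(self._ksp_monitor)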

Thanks,

Eike

> On 17 Mar 2015, at 09:22, Eike Mueller <E.Mueller at bath.ac.uk> wrote:
> 
> Sorry, those times were with an unoptimised PETSc build.
> 
>> Data I have currently:
>> In the matrix-free solver, one velocity mass matrix inverse costs 2.27s, and I need two per iteration just for the forward/backward substitution. On the other hand, one GMRES iteration of the PETSc solver (which includes everything: applying the mixed operator, solving the pressure system, inverting the velocity mass matrices) takes 3.87s, so something is not right there.
> 
> If I use an optimised PETSc build, I get ~0.8s for one velocity mass matrix solve in the matrix-free solver (and a total time per iteration of 3.2s). The time per iteration of the PETSc solver with the AMG preconditioner is 0.8s.
> 
> Thanks,
> 
> Eike



