[firedrake] More detailed breakdown of PETSc timings / higher-order geometric MG
Eike Mueller
e.mueller at bath.ac.uk
Thu Mar 19 09:04:55 GMT 2015
Hi Lawrence,
It turns out it does work, but I did not get any output since I used KSP=PREONLY. Once I change this to e.g. JACOBI, my monitor prints what it should.
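
A KSP of type preonly applies the preconditioner exactly once and never iterates, so no monitor output is produced; switching the inner KSP to an iterative method is what makes the per-iteration output appear. Below is a minimal sketch of the kind of option change involved (not code from this thread; the fieldsplit prefixes and the outer solver choice are assumptions for illustration only):

sparams = {'ksp_type': 'gmres',
           'pc_type': 'fieldsplit',
           'pc_fieldsplit_type': 'schur',
           # 'preonly' applies the PC once and produces no monitor output;
           # an iterative inner KSP (e.g. cg) reports every iteration
           'fieldsplit_0_ksp_type': 'cg',
           'fieldsplit_0_pc_type': 'bjacobi',
           'fieldsplit_1_ksp_type': 'cg',
           'fieldsplit_1_pc_type': 'jacobi'}
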
Thanks,
Eike
> On 18 Mar 2015, at 20:15, Eike Mueller <E.Mueller at bath.ac.uk> wrote:
>
> Hi Lawrence,
>
> If I set up the KSP as below, I get rid of the error. However, I don't get any output from the KSP monitors attached to the sub-KSPs, only from the KSP monitor of the main KSP.
>
> up_solver = LinearVariationalSolver(up_problem, solver_parameters=sparams)
> ksp = up_solver.snes.getKSP()
> ksp.setUp()
> ksp.setMonitor(self._ksp_monitor)
> ksp_hdiv = ksp.getPC().getFieldSplitSubKSP()
> ksp_hdiv[0].setMonitor(KSPMonitor('fieldsplit_0',verbose=2))
> ksp_hdiv[1].setMonitor(KSPMonitor('fieldsplit_1',verbose=2))
> with self._ksp_monitor:
>     up_solver.solve()
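>
> (KSPMonitor above is a user-defined helper, presumably defined in gravitywaves.py, not something provided by petsc4py. A minimal sketch of the interface it appears to implement is given below: a callable with the petsc4py monitor signature monitor(ksp, its, rnorm), plus a context manager used to time the enclosing solve. The details are illustrative only.)
>
> import time
>
> class KSPMonitor(object):
>     '''Illustrative stand-in: petsc4py invokes it as monitor(ksp, its, rnorm).'''
>     def __init__(self, label='', verbose=1):
>         self.label = label
>         self.verbose = verbose
>     def __call__(self, ksp, its, rnorm):
>         if self.verbose > 0:
>             print('%s iteration %3d: residual norm = %8.4e' % (self.label, its, rnorm))
>     def __enter__(self):
>         # entered via "with self._ksp_monitor:" to time the enclosed solve
>         self._t_start = time.time()
>         return self
>     def __exit__(self, *exc):
>         if self.verbose > 1:
>             print('%s solve time = %8.4f s' % (self.label, time.time() - self._t_start))
>         return False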
>
> Thanks,
>
> Eike
>
> --
>
> Dr Eike Hermann Mueller
> Lecturer in Scientific Computing
>
> Department of Mathematical Sciences
> University of Bath
> Bath BA2 7AY, United Kingdom
>
> +44 1225 38 5557
> e.mueller at bath.ac.uk
> http://people.bath.ac.uk/em459/
>
>> On 18 Mar 2015, at 15:09, Lawrence Mitchell <lawrence.mitchell at imperial.ac.uk> wrote:
>>
>>
>> On 18 Mar 2015, at 08:11, Eike Mueller <e.mueller at bath.ac.uk> wrote:
>>
>>> Dear all,
>>>
>>> To get a more detailed breakdown of the PETSc fieldsplit preconditioner, I now tried
>>>
>>> ksp = up_solver.snes.getKSP()
>>> ksp.setMonitor(self._ksp_monitor)
>>> ksp_hdiv = ksp.getPC().getFieldSplitSubKSP()
>>> ksp_hdiv.setMonitor(self._ksp_monitor)
>>>
>>> to attach my own KSP monitor to the solver for the HDiv system. I can then use that to work out the time per iteration and the number of iterations of the velocity mass matrix solve. I suspect that for some reason the same PC (preonly+bjacobi+ILU) is less efficient for my standalone velocity mass matrix solve, possibly because the ILU does not work due to the wrong dof ordering (I observe that preonly+bjacobi+ILU is not faster than cg+jacobi for my inversion, but in the fieldsplit case there is a significant difference).
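>>>
>>> (For concreteness, the two option sets compared above, written as the usual PETSc/Firedrake solver parameters; this is only a sketch of the configurations named in the text, not code taken from the solver:)
>>>
>>> params_bjacobi_ilu = {'ksp_type': 'preonly',
>>>                       'pc_type': 'bjacobi',
>>>                       'sub_pc_type': 'ilu'}
>>> params_cg_jacobi = {'ksp_type': 'cg',
>>>                     'pc_type': 'jacobi'}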
>>>
>>> However, the third line of the code above aborts with a PETSc error:
>>>
>>> File "/Users/eikemueller/PostDocBath/EllipticSolvers/Firedrake_workspace/firedrake-helmholtzsolver/source/gravitywaves.py", line 475, in solve
>>> pc_hdiv = ksp.getPC().getFieldSplitSubKSP()
>>> File "PC.pyx", line 384, in petsc4py.PETSc.PC.getFieldSplitSubKSP (src/petsc4py.PETSc.c:136328)
>>> petsc4py.PETSc.Error: error code 85
>>> [0] PCFieldSplitGetSubKSP() line 1662 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c
>>> [0] PCFieldSplitGetSubKSP_FieldSplit_Schur() line 1259 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c
>>> [0] MatSchurComplementGetKSP() line 317 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/ksp/ksp/utils/schurm.c
>>> [0] Null argument, when expecting valid pointer
>>> [0] Null Object: Parameter # 1
>>>
>> You probably need to call
>>
>> up_solver.snes.setUp() (and maybe up_solver.snes.setFromOptions(), once you've set the PETSc options appropriately) before you can pull the Schur complement KSPs out.
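>>
>> Something along these lines (a sketch only; names as in your snippet, and note that getFieldSplitSubKSP returns a list of sub-KSPs):
>>
>> snes = up_solver.snes
>> snes.setFromOptions()  # pick up any PETSc options that were set
>> snes.setUp()           # should assemble the PC, creating the Schur sub-KSPs
>> ksp_hdiv = snes.getKSP().getPC().getFieldSplitSubKSP()
>> ksp_hdiv[0].setMonitor(self._ksp_monitor)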
>>
>> Lawrence
>>