[firedrake] hybridisation and tensor-product multigrid
Eike Mueller
e.mueller at bath.ac.uk
Tue Mar 17 21:12:03 GMT 2015
Dear all,
while doing this I updated my firedrake master, and suddenly my main solver code stopped working; I get the error below. I pulled the latest versions of PyOP2, FIAT, FFC and UFL, and I ran firedrake-clean. I used the master branch of all of those packages, except for PyOP2, where I use the columnwise_kernels branch (though I also tried master there).
The line it struggles with is
compiled_form = compile_form(mass, 'mass')[0]
where mass is a UFL form.
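A minimal standalone sketch of the failing call (the mesh and the space W2 below are just illustrative stand-ins, not my actual setup) is:

from firedrake import *
from firedrake.ffc_interface import compile_form

mesh = UnitIcosahedralSphereMesh(0)
W2 = FunctionSpace(mesh, 'BDFM', 2)
w = Function(W2)
mass = dot(w, TrialFunction(W2)) * dx          # the UFL form passed to compile_form
compiled_form = compile_form(mass, 'mass')[0]  # raises KeyError: 'assemble_inverse'
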
Thanks,
Eike
Traceback (most recent call last):
File "driver.py", line 508, in <module>
main(parameter_filename)
File "driver.py", line 303, in main
omega_N)
File "/Users/eikemueller/PostDocBath/EllipticSolvers/Firedrake_workspace/firedrake-helmholtzsolver/source/pressuresolver/hierarchy.py", line 30, in __init__
self._data = [Type(*x,**kwargs) for x in arglist]
File "/Users/eikemueller/PostDocBath/EllipticSolvers/Firedrake_workspace/firedrake-helmholtzsolver/source/pressuresolver/operators.py", line 170, in __init__
self._Mu_h = LumpedMass(dot(w_h,TrialFunction(self._W2_h))*self._dx)
File "/Users/eikemueller/PostDocBath/EllipticSolvers/Firedrake_workspace/firedrake-helmholtzsolver/source/pressuresolver/lumpedmass.py", line 34, in __init__
compiled_form = compile_form(mass, 'mass')[0]
File "/Users/eikemueller/PostDocBath/EllipticSolvers/firedrake/firedrake/ffc_interface.py", line 282, in compile_form
ffc_kernel = FFCKernel(f, name + str(i) + str(j), parameters)
File "/Users/eikemueller/PostDocBath/EllipticSolvers/PyOP2/pyop2/caching.py", line 202, in __new__
obj = make_obj()
File "/Users/eikemueller/PostDocBath/EllipticSolvers/PyOP2/pyop2/caching.py", line 192, in make_obj
obj.__init__(*args, **kwargs)
File "/Users/eikemueller/PostDocBath/EllipticSolvers/firedrake/firedrake/ffc_interface.py", line 199, in __init__
_kernel = kernel if not parameters["assemble_inverse"] else _inverse(kernel)
KeyError: 'assemble_inverse'
eikemueller at Eikes-MBP $ ls -l /Users/eikemueller/PostDocBath/EllipticSolvers/PyOP2/pyop2/caching.py
-rw-r--r-- 1 eikemueller staff 10699 17 Oct 10:50 /Users/eikemueller/PostDocBath/EllipticSolvers/PyOP2/pyop2/caching.py
> On 17 Mar 2015, at 17:16, Miklos Homolya <m.homolya14 at imperial.ac.uk> wrote:
>
> Hi Eike,
>
> can you please try the extruded case again with the refactor-flattening branches of FIAT and Firedrake?
>
> Thanks,
> Miklos
>
>
> On 17/03/15 14:08, Eike Mueller wrote:
>> Hi Miklos,
>>
>> thanks, that’s probably what I’m looking for. It works if I just create a BDFM1 element on a 2d grid, and then say
>>
>> U1 = FiniteElement('BDFM',triangle,2)
>> U1_broken = BrokenElement(U1)
>>
>> However, the 3d code below, which builds the elements on an extruded mesh, does not work. I’ve also tried to first create BrokenElement versions of U1 and V1 and then combine them, but that doesn’t work either.
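>>
>> For reference, that second variant (breaking the factors first, using the same element definitions as in the code below) looked roughly like this:
>>
>> W2_broken_elt = HDiv(OuterProductElement(BrokenElement(U1), BrokenElement(V1))) + HDiv(OuterProductElement(U2, V0))
>> W = FunctionSpace(mesh, W2_broken_elt)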
>>
>> Thanks,
>>
>> Eike
>>
>> PS: I started writing a class for locally assembling matrices (for this class it does not matter whether the function spaces are discontinuous or not, but multiplying the matrices locally of course only makes sense if the spaces are discontinuous), see here: https://github.com/firedrakeproject/firedrake-helmholtzsolver/blob/hybridization/source/bandedmatrix/locallyassembledmatrix.py
>> It has methods for assembling a UFL form, applying the locally assembled matrices to a vector, adding and multiplying locally assembled matrices, and calculating the inverse (using dgels from LAPACK).
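>>
>> To give a rough idea of the intended use, here is a sketch (the method names below are only illustrative, not necessarily the final API; assume u = TrialFunction(W2), v = TestFunction(W2) on a discontinuous velocity space W2, and r a Function in W2):
>>
>> # illustrative usage sketch only -- not the actual API of the class linked above
>> M_u = LocallyAssembledMatrix(W2, W2)     # hypothetical constructor
>> M_u.assemble_ufl_form(dot(u, v)*dx)      # locally assemble the velocity mass matrix
>> w = M_u.apply(r)                         # apply the local matrices to a vector
>> M_u_inv = M_u.inverse()                  # cell-local inverse via LAPACK dgels
>> B = M_u_inv.matmul(M_u)                  # multiply two locally assembled matrices
>> C = B.add(M_u)                           # add two locally assembled matrices
>>
>> (The failing 3d extruded-mesh code mentioned above follows below.)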
>>
>> from firedrake import *
>>
>> host_mesh = UnitIcosahedralSphereMesh(0)
>> mesh = ExtrudedMesh(host_mesh,layers=4,extrusion_type='radial')
>>
>> U1 = FiniteElement('BDFM',triangle,2)
>> U2 = FiniteElement('DG',triangle,1)
>> V0 = FiniteElement('CG',interval,2)
>> V1 = FiniteElement('DG',interval,1)
>>
>> W2_elt = HDiv(OuterProductElement(U1,V1))+HDiv(OuterProductElement(U2,V0))
>> W2_broken_elt = BrokenElement(W2_elt)
>>
>> W = FunctionSpace(mesh,W2_broken_elt)
>>
>> WARNING: Creating an EnrichedElement,
>> if you intended to create a MixedElement use '*' instead of '+'.
>> Traceback (most recent call last):
>> File "brokenelement.py", line 14, in <module>
>> W = FunctionSpace(mesh,W2_broken_elt)
>> File "/Users/eikemueller/PostDocBath/EllipticSolvers/PyOP2/pyop2/caching.py", line 160, in __new__
>> obj = make_obj()
>> File "/Users/eikemueller/PostDocBath/EllipticSolvers/PyOP2/pyop2/caching.py", line 141, in make_obj
>> obj.__init__(*args, **kwargs)
>> File "/Users/eikemueller/PostDocBath/EllipticSolvers/firedrake/firedrake/functionspace.py", line 540, in __init__
>> super(FunctionSpace, self).__init__(mesh, element, name, dim=1)
>> File "/Users/eikemueller/PostDocBath/EllipticSolvers/firedrake/firedrake/functionspace.py", line 52, in __init__
>> self.flattened_element = self.fiat_element.flattened_element()
>> AttributeError: Discontinuized instance has no attribute 'flattened_element'
>>
>> --
>>
>> Dr Eike Hermann Mueller
>> Lecturer in Scientific Computing
>>
>> Department of Mathematical Sciences
>> University of Bath
>> Bath BA2 7AY, United Kingdom
>>
>> +44 1225 38 5557
>> e.mueller at bath.ac.uk
>> http://people.bath.ac.uk/em459/
>>> On 17 Mar 2015, at 13:34, Miklos Homolya <m.homolya14 at imperial.ac.uk> wrote:
>>>
>>> Hi,
>>>
>>> I'm not quite sure, but I think BrokenElement is probably what you need.
>>> BrokenElement keeps all the dofs and transformations and their meaning, but re-associates all dofs with the cell interior.
>>>
>>> Regards,
>>> Miklos
>>>
>>>
>>> On 17/03/15 13:29, Eike Mueller wrote:
>>>> Hi again,
>>>>
>>>>> Another slight problem is that we don't have trace elements for quadrilaterals or tensor-product elements at the moment. Our approach to trace spaces is also rather hacked up: we extract the facet basis functions from an H(div) basis, and the tabulator returns DOFs by dotting the local basis functions with the local normal.
>>>>
>>>> we would also need a discontinuous HDiv space, e.g. discontinuous versions of RT0 and BDFM1. How can I get those?
>>>>
>>>> Thanks,
>>>>
>>>> Eike
>>>>
>>>>> Andrew: presumably you didn't implement them because you anticipated some fiddliness for tensor-products?
>>>>>
>>>>> cheers
>>>>> --cjc
>>>>>
>>>>> On 16 March 2015 at 08:49, Eike Mueller <E.Mueller at bath.ac.uk> wrote:
>>>>> Dear firedrakers,
>>>>>
>>>>> I have two questions regarding the extension of a hybridised solver to a tensor-product approach:
>>>>>
>>>>> (1) In firedrake, is there already a generic way of multiplying locally assembled matrices? I need this for the hybridised solver: for example, I want to (locally) assemble the velocity mass matrix M_u and the divergence operator D and then multiply them to get
>>>>>
>>>>> D^T M_u^{-1} D
>>>>>
>>>>> I can create a hack by assembling them into vector-valued DG0 fields and then writing the necessary operations to multiply them and abstracting that into a class (as I did for the column-assembled matrices), but I wanted to check whether this is supported generically in firedrake (i.e. whether there is support for working with a locally assembled matrix representation). If I can do that, then I can see how to build all the operators that are needed in the hybridised equation and for mapping between the Lagrange multipliers and pressure/velocity. For the columnwise smoother, I then need to extract bits of those locally assembled matrices and assemble them columnwise, as in the DG0 case.
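>>>>>
>>>>> To spell out where that product comes from (a sketch of the elimination step, assuming the usual mixed system and taking D here to be the coupling from pressure to velocity, so that D^T is the weak divergence; signs depend on the chosen weak form):
>>>>>
>>>>>   M_u u + D p = r_u
>>>>>   D^T u       = r_p
>>>>>
>>>>> eliminating u = M_u^{-1} (r_u - D p) gives the pressure system
>>>>>
>>>>>   (D^T M_u^{-1} D) p = D^T M_u^{-1} r_u - r_p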
>>>>>
>>>>> (2) The other ingredient we need for the Gopalakrishnan and Tan approach is a tensor-product solver in the P1 space. So can I already prolongate/restrict in the horizontal direction only in this space? I recall that Lawrence wrote a P1 multigrid, but I presume this is for an isotropic grid which is refined in all coordinate directions. Again, I can probably do it 'by hand' by just L2-projecting between the spaces, but this will not be the most efficient way. Getting the columnwise smoother should work as in the DG0 case: I need to assemble the matrix locally, then pick out the vertical couplings and build them into a columnwise matrix, which I store as a vector-valued P1 field on the horizontal host grid.
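>>>>>
>>>>> As a concrete sketch of that last storage idea (illustrative sizes only, not the actual implementation):
>>>>>
>>>>> from firedrake import *
>>>>>
>>>>> host_mesh = UnitIcosahedralSphereMesh(0)
>>>>> nlayers = 4
>>>>> mesh = ExtrudedMesh(host_mesh, layers=nlayers, extrusion_type='radial')
>>>>>
>>>>> # a P1 (CG1 x CG1) column has nlayers+1 vertically coupled dofs per host vertex
>>>>> n_col = nlayers + 1
>>>>> # store the vertical-coupling block of each column (kept dense here for
>>>>> # simplicity) as a vector-valued P1 field on the horizontal host grid
>>>>> V_col = VectorFunctionSpace(host_mesh, 'CG', 1, dim=n_col*n_col)
>>>>> columnwise_matrix = Function(V_col)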
>>>>>
>>>>> Thanks a lot,
>>>>>
>>>>> Eike
>>>>>
>>>>> --
>>>>> Dr Eike Hermann Mueller
>>>>> Lecturer in Scientific Computing
>>>>>
>>>>> Department of Mathematical Sciences
>>>>> University of Bath
>>>>> Bath BA2 7AY, United Kingdom
>>>>>
>>>>> +44 1225 38 6241
>>>>> e.mueller at bath.ac.uk
>>>>> http://people.bath.ac.uk/em459/
>>>>>
>>>>
>>>>
>>>>
>>>
>>
>>
>>
>
> _______________________________________________
> firedrake mailing list
> firedrake at imperial.ac.uk
> https://mailman.ic.ac.uk/mailman/listinfo/firedrake