[firedrake] parallel weirdness
Hiroe Yamazaki
h.yamazaki at imperial.ac.uk
Thu Nov 20 02:40:56 GMT 2014
Thanks, Lawrence. I see the residual behaviour you describe, and changed
'nits' (the number of iterations within each time step) from 2 to 3;
the parallelization problem then appears to go away without moving
terms. Does that make sense to you? (I still haven't understood why the
residual is reduced more slowly in parallel runs.)
Anyway, this change makes the calculation time a bit longer, but the
parallel efficiency seems good for larger problems, so it should still
be highly beneficial.
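
For context, here is a minimal sketch of the kind of time loop being
discussed: a fixed number of inner (Picard-style) sweeps per time step.
This is not the actual test_linear.py; apart from 'nits', the toy
equation and every name below are assumptions:

    from firedrake import *

    # Hedged sketch: a toy nonlinear heat equation with a fixed number
    # of Picard sweeps per time step. NOT the actual test_linear.py.
    mesh = UnitSquareMesh(16, 16)
    V = FunctionSpace(mesh, "CG", 1)
    un = Function(V)     # solution at the previous time level
    uk = Function(V)     # lagged Picard iterate within a step
    unp1 = Function(V)   # solution at the new time level
    u, v = TrialFunction(V), TestFunction(V)
    x, y = SpatialCoordinate(mesh)
    un.interpolate(sin(pi*x)*sin(pi*y))
    dtc = Constant(0.01)

    # Lagging the diffusivity on the previous iterate keeps each sweep
    # linear in the unknown.
    a = u*v*dx + dtc*(1 + uk**2)*inner(grad(u), grad(v))*dx
    L = un*v*dx

    nits = 3  # inner iterations per time step (raised from 2)
    for step in range(10):
        uk.assign(un)
        for it in range(nits):
            solve(a == L, unp1)
            uk.assign(unp1)  # update the lagged iterate between sweeps
        un.assign(unp1)      # accept the time step

With loose solver tolerances, each extra sweep knocks down whatever
residual the previous one left behind, which is consistent with the
problem disappearing at nits = 3.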
Cheers,
Hiroe
2014-11-19 17:23 GMT+00:00 Mitchell, Lawrence
<lawrence.mitchell at imperial.ac.uk>:
>
> On 19 Nov 2014, at 13:01, Hiroe Yamazaki <h.yamazaki at imperial.ac.uk> wrote:
>
>> Hi Lawrence,
>>
>> I updated the files with the exact settings under which I see the
>> problem, so could you run test_linear.py again with the current HEAD
>> of the branch "periodic_parallel" (267de84)? With this setting I got
>> results that looked different at t = 600*dt in serial and parallel.
>> I've attached a picture for your reference (left: serial, right:
>> parallel (2 cores); the serial run gives a symmetric result, but the
>> parallel run is slightly asymmetric).
>
> Thanks, I see the problem. I note that if I tighten the solver tolerances to 1e-10, the problem appears to go away, but beyond that I've got no idea. Note that by moving this one term, you'll change the subspaces the Krylov solves explore. If I monitor the convergence of the U solver, I see that the true residual is reduced more slowly than the preconditioned residual, so is the problem that you somehow haven't killed the errors well enough? Does this give any useful clues?
>
> Cheers,
>
> Lawrence
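
Regarding the diagnostic suggested above: solver tolerances and
residual monitoring can be set through Firedrake's solver_parameters
dictionary, which forwards options to PETSc. The sketch below uses a
toy Poisson problem as a stand-in for the actual solves:

    from firedrake import *

    # Hedged sketch: a toy Poisson solve, standing in for the real
    # solves, showing how to tighten the Krylov tolerance and monitor
    # the true residual via PETSc options.
    mesh = UnitSquareMesh(16, 16)
    V = FunctionSpace(mesh, "CG", 1)
    u, v = TrialFunction(V), TestFunction(V)
    a = inner(grad(u), grad(v))*dx
    L = Constant(1.0)*v*dx
    uh = Function(V)
    bc = DirichletBC(V, 0, "on_boundary")
    solve(a == L, uh, bcs=bc,
          solver_parameters={"ksp_type": "gmres",
                             "ksp_rtol": 1e-10,  # much tighter than the default
                             # print the preconditioned and true residual
                             # norms at every Krylov iteration:
                             "ksp_monitor_true_residual": None})

With ksp_monitor_true_residual enabled, PETSc prints both the
preconditioned and the true residual norm at each Krylov iteration,
which is exactly how a gap like the one described above shows up.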