Add documentation on the current state #18
Conversation
README.md (Outdated)

> In a first step, the model order reduction was applied to the deal.II (Neumann) participant. Since deal.II is written in `C++` and the model order reduction through pyMOR is carried out in the Python programming language, we compile the `C++` functions of deal.II into a Python-compatible library using `pybind11`, which is already included as a submodule within this project. The deal.II source code can therefore be found in the `lib` directory, and the function calls for the deal.II heat problem as well as the pyMOR model order reduction are located in `example/neumann-reduced/heat_equation_reduced.py`. Although model order reduction with FEniCS is supported by pyMOR, we don't apply any model order reduction on the FEniCS side, i.e., the example code `example/dirichlet-fenics/heat.py` always solves the full order model.
>
> In order to apply the model order reduction on the Neumann side, we parametrize the diffusion coefficient within this participant: we split the square domain on the right side once more into a square and an L-shaped remainder. In the offline phase, we perform multiple coupled simulations between the Dirichlet and Neumann participants, using a different diffusion coefficient in the sub-square on the Neumann side. During the offline phase, the FEniCS side always computes the same computational setup. By default, the sub-square with the varying diffusion coefficient is part of the coupling interface. The motivation for such a setup is the runtime reduction for the reduced order model during the online phase: Building the deal.II code in `Release` mode results in a speedup factor of around 10 for 8 global refinements, which corresponds to 66.049 degrees of freedom.
@sdrave The very last sentence #17 (comment)
That should be 66,049 degrees, no?
yes, my bad
Good summary 👍
A few suggestions below on how we could still improve the documentation.
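One way to make the pybind11 workflow more concrete would be a short usage sketch. All names below (`dealii_heat_equation`, `HeatProblem`, `assemble`, `solve`) are assumptions for illustration, not the repository's actual API; the real bindings live in `lib`:

```python
# Hypothetical usage of the pybind11-compiled deal.II library (module,
# class, and method names are assumed for illustration only).
import dealii_heat_equation  # the compiled pybind11 module (name assumed)

# Set up the Neumann participant for one diffusion coefficient.
problem = dealii_heat_equation.HeatProblem(refinements=8)  # hypothetical class
problem.assemble(diffusion=0.5)  # assemble the FEM system on the C++ side
u = problem.solve()              # solve and return the solution vector to Python
```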
> ## Description
>
> The repository contains a ready-to-run example in the `example` directory. It consists of a simplified and adapted version of the [partitioned-heat tutorial](https://precice.org/tutorials-partitioned-heat-conduction.html), where we split a domain artificially into two parts, solve the same problem in each subdomain, and carry out a Dirichlet-Neumann coupling across a common coupling interface in order to recover a global solution. However, instead of the time-dependent heat equation, we solve here a stationary Laplace problem with Dirichlet boundary conditions on the left side of the domain `u_D = 3` and homogeneous Dirichlet boundary conditions on `u_D = 0` on the right side of the domain. The left side ('Dirichlet participant') is computed using FEniCS and the right side ('Neumann participant') is computed using deal.II.
```suggestion
The repository contains a ready-to-run example in the `example` directory. It consists of a simplified and adapted version of the [partitioned-heat tutorial](https://precice.org/tutorials-partitioned-heat-conduction.html), where we split a domain artificially into two parts, solve the same problem in each subdomain, and carry out a Dirichlet-Neumann coupling across a common coupling interface in order to recover a global solution. However, instead of the time-dependent heat equation, we solve here a stationary Laplace problem with Dirichlet boundary conditions on the left side of the domain `u_D = 3` and homogeneous Dirichlet boundary conditions `u_D = 0` on the right side of the domain. The left side ('Dirichlet participant') is computed using FEniCS and the right side ('Neumann participant') is computed using deal.II.
```
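A minimal, uncoupled FEniCS sketch of the stationary Laplace problem described above might also help readers. Mesh size and boundary handling below are illustrative assumptions; the actual Dirichlet participant additionally exchanges interface data with deal.II via preCICE:

```python
# Uncoupled sketch of the stationary Laplace problem from the description
# above, using legacy FEniCS. The mesh resolution and boundary markers are
# assumptions for illustration only.
from fenics import (Constant, DirichletBC, Function, FunctionSpace,
                    TestFunction, TrialFunction, UnitSquareMesh, dot, dx,
                    grad, near, solve)

mesh = UnitSquareMesh(32, 32)
V = FunctionSpace(mesh, "P", 1)

# u_D = 3 on the left boundary, u_D = 0 on the right boundary
bcs = [DirichletBC(V, Constant(3.0), lambda x, on_bnd: on_bnd and near(x[0], 0.0)),
       DirichletBC(V, Constant(0.0), lambda x, on_bnd: on_bnd and near(x[0], 1.0))]

u, v = TrialFunction(V), TestFunction(V)
a = dot(grad(u), grad(v)) * dx  # weak form of the Laplace operator
L = Constant(0.0) * v * dx      # no source term
u_h = Function(V)
solve(a == L, u_h, bcs)
```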
README.md (Outdated)

> In order to apply the model order reduction on the Neumann side, we parametrize the diffusion coefficient within this participant: we split the square domain on the right side once more into a square and an L-shaped remainder. In the offline phase, we perform multiple coupled simulations between the Dirichlet and Neumann participants, using a different diffusion coefficient in the sub-square on the Neumann side. During the offline phase, the FEniCS side always computes the same computational setup. By default, the sub-square with the varying diffusion coefficient is part of the coupling interface. The motivation for such a setup is the runtime reduction for the reduced order model during the online phase: Building the deal.II code in `Release` mode results in a speedup factor of around 10 for 8 global refinements, which corresponds to 66.049 degrees of freedom.
Could we add a rough sketch here? domains and subdomains.
What could also be good is a flow chart: what happens in the offline phase, and what happens in the online phase.
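To complement such a flow chart, a hedged sketch of the offline phase could look as follows. The wrapper `solve_coupled` and the sampling interval are assumptions for illustration; the actual logic lives in `example/neumann-reduced/heat_equation_reduced.py`:

```python
# Hypothetical sketch of the offline phase: run the coupled full order model
# for several diffusion coefficients and compress the snapshots with POD.
# `solve_coupled` and the parameter interval are assumed, not the repo's API.
import numpy as np
from pymor.algorithms.pod import pod
from pymor.vectorarrays.numpy import NumpyVectorSpace

def solve_coupled(diffusion):
    """Assumed stand-in for one coupled preCICE run that returns the
    Neumann participant's solution as a 1D NumPy array."""
    raise NotImplementedError

space = NumpyVectorSpace(66049)        # FOM size for 8 global refinements
snapshots = space.empty()
for mu in np.linspace(0.1, 10.0, 5):   # offline: 5 uniform samples (interval assumed)
    snapshots.append(space.from_numpy(solve_coupled(mu).reshape((1, -1))))

# Compress the snapshots into a reduced basis; in the online phase, the
# parametrized system is projected onto `basis` and solved cheaply.
basis, svals = pod(snapshots, rtol=1e-7)
```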
> ```bash
> ./run.sh -n 15
> ```
What does the `n` here stand for?
> ## Running a simulation
>
> The example setup is located in the `example` directory. In a first step, the offline phase, multiple coupled simulations need to be performed in order to generate the parametrized reduced basis later on. By default, the coupled simulation is performed `5` times using uniform samples of the diffusion coefficient. Afterwards, `5` random diffusion coefficients are used in order to compare the full order model and the reduced order model. To run all simulations, execute
Uniform samples from a fixed interval?
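If the interval is fixed, the two sampling strategies could look like this in pyMOR (the bounds `0.1` and `10.0` are assumptions for illustration, not the repository's actual values):

```python
# Sketch of uniform (offline/training) and random (validation) parameter
# sampling with pyMOR; the interval bounds are assumed for illustration.
from pymor.parameters.base import Parameters

parameter_space = Parameters({'diffusion': 1}).space(0.1, 10.0)

training_mus = parameter_space.sample_uniformly(5)  # 5 equidistant samples
test_mus = parameter_space.sample_randomly(5)       # 5 random samples
```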
> ```bash
> python3 heat_equation_reduced.py
> ```
>
> from the `neumann-reduced` directory. The Neumann participant prints out statistics regarding the error and speedup of the reduced basis.
Running these two scripts triggers the offline phase and afterwards the online phase?
Maybe we could make this more explicit.
Please rectify in case anything is wrong or insufficient.
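For reference, the reported statistics boil down to a relative error and a runtime ratio. A minimal sketch of how they might be computed (variable names are illustrative; the actual script may differ):

```python
# Illustrative computation of the two reported quantities: the relative
# error of the ROM solution and the FOM/ROM runtime speedup.
import numpy as np

def report_statistics(u_fom, u_rom, t_fom, t_rom):
    """Relative l2 error of the ROM solution and FOM/ROM speedup."""
    rel_err = np.linalg.norm(u_fom - u_rom) / np.linalg.norm(u_fom)
    print(f"relative error: {rel_err:.3e}")
    print(f"speedup:        {t_fom / t_rom:.1f}x")
```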