xref: /petsc/doc/tutorials/handson.md (revision 7ab0661e7bf890d52e64979ba468952a11c5843a)
(handson)=

# Tutorials, by Mathematical Problem

TODO: Add link to Python example here

(handson-example-1)=

## Linear elliptic PDE on a 2D grid

WHAT THIS EXAMPLE DEMONSTRATES:

- Using command line options
- Using linear solvers
- Handling a simple structured grid

FURTHER DETAILS:

- <a href="PETSC_DOC_OUT_ROOT_PLACEHOLDER/src/ksp/ksp/tutorials/ex50.c.html#line1">Mathematical description of the problem</a>
- <a href="PETSC_DOC_OUT_ROOT_PLACEHOLDER/src/ksp/ksp/tutorials/ex50.c.html#line21">the source code</a>

DO THE FOLLOWING:

- Compile `src/ksp/ksp/tutorials/ex50.c`

  ```console
  $ cd petsc/src/ksp/ksp/tutorials
  $ make ex50
  ```

- Run a 1 processor example with a 3x3 mesh and view the assembled
  matrix

  ```console
  $ mpiexec -n 1 ./ex50 -da_grid_x 4 -da_grid_y 4 -mat_view
  ```

  Expected output:

  ```{literalinclude} /../src/ksp/ksp/tutorials/output/ex50_tut_1.out
  :language: none
  ```

- Run with a 120x120 mesh on 4 processors using superlu_dist and
  view the solver options used

  ```console
  $ mpiexec -n 4 ./ex50 -da_grid_x 120 -da_grid_y 120 -pc_type lu -pc_factor_mat_solver_type superlu_dist -ksp_monitor -ksp_view
  ```

  Expected output:

  ```{literalinclude} /../src/ksp/ksp/tutorials/output/ex50_tut_2.out
  :language: none
  ```

- Run with a 1025x1025 grid using a multigrid solver on 4
  processors with 9 multigrid levels

  ```console
  $ mpiexec -n 4 ./ex50 -da_grid_x 1025 -da_grid_y 1025 -pc_type mg -pc_mg_levels 9 -ksp_monitor
  ```

  Expected output:

  ```{literalinclude} /../src/ksp/ksp/tutorials/output/ex50_tut_3.out
  :language: none
  ```
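
To get a feel for the matrix that `-mat_view` prints, here is a rough sketch (plain Python, not PETSc) of the 5-point finite-difference Laplacian that a 2D grid problem like this assembles; the sign convention, scaling, and boundary handling of ex50 itself may differ:

```python
# Toy sketch (not PETSc): assemble the 5-point Laplacian on an n x n
# grid of unknowns as a dense list-of-lists matrix, with homogeneous
# Dirichlet boundary conditions. Each row couples a grid point to its
# four neighbors, which is the banded structure -mat_view displays.

def laplacian_2d(n):
    """Dense (n*n) x (n*n) matrix for the 5-point stencil."""
    N = n * n
    A = [[0.0] * N for _ in range(N)]
    for j in range(n):
        for i in range(n):
            row = j * n + i
            A[row][row] = 4.0
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ii, jj = i + di, j + dj
                if 0 <= ii < n and 0 <= jj < n:
                    A[row][jj * n + ii] = -1.0
    return A

A = laplacian_2d(3)
print(A[4])  # the middle point couples to all four of its neighbors
```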

(handson-example-2)=

## Nonlinear ODE arising from a time-dependent one-dimensional PDE

WHAT THIS EXAMPLE DEMONSTRATES:

- Using command line options
- Handling a simple structured grid
- Using the ODE integrator
- Using call-back functions

FURTHER DETAILS:

- <a href="PETSC_DOC_OUT_ROOT_PLACEHOLDER/src/ts/tutorials/ex2.c.html#line13">Mathematical description of the problem</a>
- <a href="PETSC_DOC_OUT_ROOT_PLACEHOLDER/src/ts/tutorials/ex2.c.html#line36">the source code</a>

DO THE FOLLOWING:

- Compile `src/ts/tutorials/ex2.c`

  ```console
  $ cd petsc/src/ts/tutorials
  $ make ex2
  ```

- Run a 1 processor example on the default grid with all the
  default solver options

  ```console
  $ mpiexec -n 1 ./ex2 -ts_max_steps 10 -ts_monitor
  ```

  Expected output:

  ```{literalinclude} /../src/ts/tutorials/output/ex2_tut_1.out
  :language: none
  ```

- Run with the same options on 4 processors, also monitoring the
  convergence of the nonlinear and linear solvers

  ```console
  $ mpiexec -n 4 ./ex2 -ts_max_steps 10 -ts_monitor -snes_monitor -ksp_monitor
  ```

  Expected output:

  ```{literalinclude} /../src/ts/tutorials/output/ex2_tut_2.out
  :language: none
  ```

- Run with the same options on 16 processors with 128 grid points

  ```console
  $ mpiexec -n 16 ./ex2 -ts_max_steps 10 -ts_monitor -M 128
  ```

  Expected output:

  ```{literalinclude} /../src/ts/tutorials/output/ex2_tut_3.out
  :language: none
  ```
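
The structure of such a problem can be sketched in a few lines: discretizing a time-dependent 1D PDE in space (the "method of lines") leaves a system of ODEs that an integrator advances in time. The toy below uses the heat equation with explicit Euler, not ex2's actual equations or PETSc's TS; in ex2 the right-hand-side evaluation is supplied to the integrator as a call-back function, playing the role of `heat_step`'s interior loop here.

```python
# Toy method-of-lines sketch (not ex2's physics): u_t = u_xx on M
# interior points with zero boundary values, advanced by explicit
# Euler. A TS integrator replaces the hand-written time loop.

def heat_step(u, dt, h):
    """One explicit Euler step of the semi-discrete heat equation."""
    M = len(u)
    unew = u[:]
    for i in range(M):
        left = u[i - 1] if i > 0 else 0.0
        right = u[i + 1] if i < M - 1 else 0.0
        unew[i] = u[i] + dt * (left - 2.0 * u[i] + right) / (h * h)
    return unew

M = 20
h = 1.0 / (M + 1)
dt = 0.4 * h * h          # explicit stability requires dt <= h^2 / 2
u = [1.0] * M             # initial condition
for step in range(200):
    u = heat_step(u, dt, h)
print(max(u))             # heat flows out through the boundaries, so
                          # the solution decays from its initial value
```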

(handson-example-3)=

## Nonlinear PDE on a structured grid

WHAT THIS EXAMPLE DEMONSTRATES:

- Handling a 2D structured grid
- Using the nonlinear solvers
- Changing the default linear solver

FURTHER DETAILS:

- <a href="PETSC_DOC_OUT_ROOT_PLACEHOLDER/src/snes/tutorials/ex19.c.html#line19">Mathematical description of the problem</a>
- <a href="PETSC_DOC_OUT_ROOT_PLACEHOLDER/src/snes/tutorials/ex19.c.html#line94">main program source code</a>
- <a href="PETSC_DOC_OUT_ROOT_PLACEHOLDER/src/snes/tutorials/ex19.c.html#line246">physics source code</a>

DO THE FOLLOWING:

- Compile `src/snes/tutorials/ex19.c`

  ```console
  $ cd petsc/src/snes/tutorials/
  $ make ex19
  ```

- Run a 4 processor example with 5 levels of grid refinement,
  monitoring the convergence of the nonlinear and linear solvers
  and examining the exact solver used

  ```console
  $ mpiexec -n 4 ./ex19 -da_refine 5 -snes_monitor -ksp_monitor -snes_view
  ```

  Expected output:

  ```{literalinclude} /../src/snes/tutorials/output/ex19_tut_1.out
  :language: none
  ```

- Run with the same options but use geometric multigrid as the
  linear solver

  ```console
  $ mpiexec -n 4 ./ex19 -da_refine 5 -snes_monitor -ksp_monitor -snes_view -pc_type mg
  ```

  Expected output:

  ```{literalinclude} /../src/snes/tutorials/output/ex19_tut_2.out
  :language: none
  ```

  Note that this requires many fewer iterations than the default
  solver.

- Run with the same options but use algebraic multigrid (hypre's
  BoomerAMG) as the linear solver

  ```console
  $ mpiexec -n 4 ./ex19 -da_refine 5 -snes_monitor -ksp_monitor -snes_view -pc_type hypre
  ```

  Expected output:

  ```{literalinclude} /../src/snes/tutorials/output/ex19_tut_3.out
  :language: none
  ```

  Note that this also requires many fewer iterations than the
  default solver, but more linear solver iterations than geometric
  multigrid.

- Run with the same options but use the ML preconditioner from
  Trilinos

  ```console
  $ mpiexec -n 4 ./ex19 -da_refine 5 -snes_monitor -ksp_monitor -snes_view -pc_type ml
  ```

  Expected output:

  ```{literalinclude} /../src/snes/tutorials/output/ex19_tut_8.out
  :language: none
  ```

- Run on 1 processor with the default linear solver and profile
  the run

  ```console
  $ mpiexec -n 1 ./ex19 -da_refine 5 -log_view
  ```

  Expected output:

  ```{literalinclude} /../src/snes/tutorials/output/ex19_tut_4.out
  :language: none
  ```

  Search for the line beginning with `SNESSolve`; the fourth
  column gives the time for the nonlinear solve.

- Run on 1 processor with the geometric multigrid linear solver
  and profile the run

  ```console
  $ mpiexec -n 1 ./ex19 -da_refine 5 -log_view -pc_type mg
  ```

  Expected output:

  ```{literalinclude} /../src/snes/tutorials/output/ex19_tut_5.out
  :language: none
  ```

  Compare the runtime for `SNESSolve` to the case with the default
  solver.

- Run on 4 processors with the default linear solver and profile
  the run

  ```console
  $ mpiexec -n 4 ./ex19 -da_refine 5 -log_view
  ```

  Expected output:

  ```{literalinclude} /../src/snes/tutorials/output/ex19_tut_6.out
  :language: none
  ```

  Compare the runtime for `SNESSolve` to the 1 processor case with
  the default solver. What is the speedup?

- Run on 4 processors with the geometric multigrid linear solver
  and profile the run

  ```console
  $ mpiexec -n 4 ./ex19 -da_refine 5 -log_view -pc_type mg
  ```

  Expected output:

  ```{literalinclude} /../src/snes/tutorials/output/ex19_tut_7.out
  :language: none
  ```

  Compare the runtime for `SNESSolve` to the 1 processor case with
  multigrid. What is the speedup? Why is the speedup for multigrid
  lower than the speedup for the default solver?
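
The iteration-count differences noted in the runs above can be felt even in a toy setting: for a single-level relaxation-type method, the number of iterations grows sharply as the grid is refined, which is exactly what multigrid avoids by correcting long-wavelength error on coarser grids. A minimal pure-Python illustration (1D Poisson, pointwise Jacobi sweeps; nothing here is PETSc code, and PETSc's default solver is a Krylov method, not plain Jacobi):

```python
# Toy illustration: count Jacobi sweeps needed on -u'' = 1 with n
# interior points. Doubling n multiplies the count several-fold; a
# multigrid cycle keeps it roughly constant instead.

def jacobi_sweeps(n, tol=1e-6):
    """Sweeps until the max change between iterates drops below tol."""
    h = 1.0 / (n + 1)
    b = [h * h] * n              # right-hand side f = 1, scaled by h^2
    u = [0.0] * n
    for k in range(1, 200000):
        unew = [0.0] * n
        for i in range(n):
            left = u[i - 1] if i > 0 else 0.0
            right = u[i + 1] if i < n - 1 else 0.0
            unew[i] = 0.5 * (left + right + b[i])
        change = max(abs(x - y) for x, y in zip(u, unew))
        u = unew
        if change < tol:
            return k
    return None

print(jacobi_sweeps(16), jacobi_sweeps(32))  # count grows sharply with n
```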
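
For the profiling questions above, the arithmetic is just the 1-processor time divided by the p-processor time (and parallel efficiency divides that by p). A small sketch of extracting the `SNESSolve` time and forming the speedup; the sample lines are made-up placeholders, not real ex19 output, though in the real `-log_view` event summary the event name is the first column and the time is the fourth:

```python
# Sketch: pull the SNESSolve timing column out of a -log_view event
# line, then compute speedup T_1 / T_p. Substitute the lines from your
# own runs; the strings below are hypothetical.

def snes_time(log_line):
    fields = log_line.split()
    assert fields[0] == "SNESSolve"
    return float(fields[3])      # fourth column: time in seconds

t1 = snes_time("SNESSolve 1 1.0 8.0000e+00 1.0")  # hypothetical 1-proc run
t4 = snes_time("SNESSolve 1 1.0 2.5000e+00 1.0")  # hypothetical 4-proc run
speedup = t1 / t4
print(speedup, speedup / 4)  # prints 3.2 0.8; ideal would be 4.0 and 1.0
```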

(handson-example-4)=

## Nonlinear time-dependent PDE on an unstructured grid

WHAT THIS EXAMPLE DEMONSTRATES:

- Changing the default ODE integrator
- Handling unstructured grids
- Registering your own interchangeable physics and algorithm
  modules
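
The registration idea can be sketched outside PETSc as a table of named constructors: a command line option selects which physics module to build. ex11 does this in C with function pointers; all names in the sketch below are illustrative, not PETSc's API.

```python
# Toy sketch of registering interchangeable physics modules: a registry
# maps the name given on the command line (e.g. -physics sw) to a
# constructor. Names and data here are illustrative only.

physics_registry = {}

def register_physics(name, create):
    """Make a physics module selectable by name."""
    physics_registry[name] = create

def create_physics(name):
    """Build the module chosen (as if) by a command line option."""
    return physics_registry[name]()

register_physics("advect", lambda: {"fields": ["u"]})
register_physics("sw", lambda: {"fields": ["height", "momentum"]})

model = create_physics("sw")
print(model["fields"])  # prints ['height', 'momentum']
```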

FURTHER DETAILS:

- <a href="PETSC_DOC_OUT_ROOT_PLACEHOLDER/src/ts/tutorials/ex11.c.html">Mathematical description of the problem</a>
- <a href="PETSC_DOC_OUT_ROOT_PLACEHOLDER/src/ts/tutorials/ex11.c.html#line1403">main program source code</a>
- <a href="PETSC_DOC_OUT_ROOT_PLACEHOLDER/src/ts/tutorials/ex11.c.html#line186">source code of physics modules</a>

DO THE FOLLOWING:

- Compile `src/ts/tutorials/ex11.c`

  ```console
  $ cd petsc/src/ts/tutorials
  $ make ex11
  ```

- Run simple advection through a tiny hybrid mesh

  ```console
  $ mpiexec -n 1 ./ex11 -f ${PETSC_DIR}/share/petsc/datafiles/meshes/sevenside.exo
  ```

  Expected output:

  ```{literalinclude} /../src/ts/tutorials/output/ex11_tut_1.out
  :language: none
  ```

- Run simple advection through a small mesh with a Rosenbrock-W
  solver

  ```console
  $ mpiexec -n 1 ./ex11 -f ${PETSC_DIR}/share/petsc/datafiles/meshes/sevenside.exo -ts_type rosw
  ```

  Expected output:

  ```{literalinclude} /../src/ts/tutorials/output/ex11_tut_2.out
  :language: none
  ```

- Run simple advection through a larger quadrilateral mesh of an
  annulus with least squares reconstruction and sin limiting,
  monitoring the error

  ```console
  $ mpiexec -n 4 ./ex11 -f ${PETSC_DIR}/share/petsc/datafiles/meshes/annulus-20.exo -monitor Error -advect_sol_type bump -petscfv_type leastsquares -petsclimiter_type sin
  ```

  Expected output:

  ```{literalinclude} /../src/ts/tutorials/output/ex11_tut_3.out
  :language: none
  ```

  Compare to the error after turning off reconstruction.

- Run shallow water on the larger mesh with least squares
  reconstruction and minmod limiting, monitoring water Height
  (whose integral is conserved) and Energy (which is not
  conserved)

  ```console
  $ mpiexec -n 4 ./ex11 -f ${PETSC_DIR}/share/petsc/datafiles/meshes/annulus-20.exo -physics sw -monitor Height,Energy -petscfv_type leastsquares -petsclimiter_type minmod
  ```

  Expected output:

  ```{literalinclude} /../src/ts/tutorials/output/ex11_tut_4.out
  :language: none
  ```
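
The minmod limiter selected above by `-petsclimiter_type minmod` has a short textbook definition: given two slope estimates, take the smaller-magnitude one when they agree in sign, and zero otherwise, so reconstruction never creates new extrema. A sketch of that generic definition (ex11's C implementation may differ in normalization):

```python
# Textbook minmod limiter: pick the smaller-magnitude slope when the
# two estimates agree in sign, otherwise return zero. This is the
# generic definition, not a transcription of ex11's code.

def minmod(a, b):
    if a > 0 and b > 0:
        return min(a, b)
    if a < 0 and b < 0:
        return max(a, b)
    return 0.0

print(minmod(1.0, 2.0), minmod(-3.0, -0.5), minmod(-1.0, 2.0))
# prints 1.0 -0.5 0.0
```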