xref: /petsc/src/snes/tutorials/output/ex12_tri_parmetis_hpddm.out (revision f13dfd9ea68e0ddeee984e65c377a1819eab8a8a)
  0 SNES Function norm 33.3967
  1 SNES Function norm 3.95646e-09
L_2 Error: 7.89093
Nonlinear solve converged due to CONVERGED_FNORM_RELATIVE iterations 1
SNES Object: 4 MPI processes
  type: newtonls
  maximum iterations=50, maximum function evaluations=10000
  tolerances: relative=1e-08, absolute=1e-50, solution=1e-08
  total number of linear solver iterations=16
  total number of function evaluations=2
  norm schedule ALWAYS
  SNESLineSearch Object: 4 MPI processes
    type: bt
      interpolation: cubic
      alpha=1.000000e-04
    maxstep=1.000000e+08, minlambda=1.000000e-12
    tolerances: relative=1.000000e-08, absolute=1.000000e-15, lambda=1.000000e-08
    maximum iterations=40
  KSP Object: 4 MPI processes
    type: gmres
      restart=100, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
      happy breakdown tolerance 1e-30
    maximum iterations=10000, initial guess is zero
    tolerances: relative=1e-10, absolute=1e-50, divergence=10000.
    left preconditioning
    using PRECONDITIONED norm type for convergence test
  PC Object: 4 MPI processes
    type: hpddm
    levels: 2
    Neumann matrix attached? TRUE
    shared subdomain KSP between SLEPc and PETSc? FALSE
    coarse correction: DEFLATED
    on process #0, value (+ threshold if available) for selecting deflation vectors: 20
    grid and operator complexities: 1.01463 1.10782
    KSP Object: (pc_hpddm_levels_1_) 4 MPI processes
      type: preonly
      maximum iterations=10000, initial guess is zero
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (pc_hpddm_levels_1_) 4 MPI processes
      type: shell
        no name
      linear system matrix = precond matrix:
      Mat Object: 4 MPI processes
        type: mpiaij
        total number of mallocs used during MatSetValues calls=0
          not using I-node (on process 0) routines
    PC Object: (pc_hpddm_levels_1_) 4 MPI processes
      type: bjacobi
        number of blocks = 4
        Local solver information for first block is in the following KSP and PC objects on rank 0:
        Use -pc_hpddm_levels_1_ksp_view ::ascii_info_detail to display information for all blocks
        KSP Object: (pc_hpddm_levels_1_sub_) 1 MPI process
          type: preonly
          maximum iterations=10000, initial guess is zero
          tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
          left preconditioning
          using NONE norm type for convergence test
        PC Object: (pc_hpddm_levels_1_sub_) 1 MPI process
          type: icc
            out-of-place factorization
            3 levels of fill
            tolerance for zero pivot 2.22045e-14
            using Manteuffel shift [POSITIVE_DEFINITE]
            matrix ordering: natural
            factor fill ratio given 1., needed 2.69786
              Factored matrix follows:
                Mat Object: (pc_hpddm_levels_1_sub_) 1 MPI process
                  type: seqsbaij
                  package used to perform factorization: petsc
                      block size is 1
          linear system matrix = precond matrix:
          Mat Object: (pc_hpddm_levels_1_sub_) 1 MPI process
            type: seqaij
            total number of mallocs used during MatSetValues calls=0
              not using I-node routines
      linear system matrix = precond matrix:
      Mat Object: 4 MPI processes
        type: mpiaij
        total number of mallocs used during MatSetValues calls=0
          not using I-node (on process 0) routines
      KSP Object: (pc_hpddm_coarse_) 2 MPI processes
        type: preonly
        maximum iterations=10000, initial guess is zero
        tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
        left preconditioning
        using NONE norm type for convergence test
      PC Object: (pc_hpddm_coarse_) 2 MPI processes
        type: redundant
          First (color=0) of 2 PCs follows
          KSP Object: (pc_hpddm_coarse_redundant_) 1 MPI process
            type: preonly
            maximum iterations=10000, initial guess is zero
            tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
            left preconditioning
            using NONE norm type for convergence test
          PC Object: (pc_hpddm_coarse_redundant_) 1 MPI process
            type: cholesky
              out-of-place factorization
              tolerance for zero pivot 2.22045e-14
              matrix ordering: natural
              factor fill ratio given 5., needed 1.1
                Factored matrix follows:
                  Mat Object: (pc_hpddm_coarse_redundant_) 1 MPI process
                    type: seqsbaij
                    package used to perform factorization: petsc
                        block size is 20
            linear system matrix = precond matrix:
            Mat Object: 1 MPI process
              type: seqsbaij
              total number of mallocs used during MatSetValues calls=0
                  block size is 20
        linear system matrix = precond matrix:
        Mat Object: (pc_hpddm_coarse_) 2 MPI processes
          type: mpisbaij
          total number of mallocs used during MatSetValues calls=0
              block size is 20
    linear system matrix = precond matrix:
    Mat Object: 4 MPI processes
      type: mpiaij
      total number of mallocs used during MatSetValues calls=0
        not using I-node (on process 0) routines