!
! Description: Solves a nonlinear system in parallel with SNES.
! We solve the Bratu (SFI - solid fuel ignition) problem in a 2D rectangular
! domain, using distributed arrays (DMDAs) to partition the parallel grid.
! The command line options include:
!   -par <parameter>, where <parameter> indicates the nonlinearity of the problem
!        problem SFI: <parameter> = Bratu parameter (0 <= par <= 6.81)
!

!
! --------------------------------------------------------------------------
!
!  Solid Fuel Ignition (SFI) problem.  This problem is modeled by
!  the partial differential equation
!
!          -Laplacian u - lambda*exp(u) = 0,  0 < x,y < 1,
!
!  with boundary conditions
!
!           u = 0  for  x = 0, x = 1, y = 0, y = 1.
!
!  A finite difference approximation with the usual 5-point stencil
!  is used to discretize the boundary value problem to obtain a nonlinear
!  system of equations.
!
!  The uniprocessor version of this code is snes/tutorials/ex4f.F
!
! --------------------------------------------------------------------------
!  The following define must be used before including any PETSc include files
!  into a module or interface, because modules and interfaces cannot contain
!  such declarations directly.
!

      module f90module
      use petscsys
      use petscis
      use petscvec
      use petscdm
      use petscdmda
      use petscmat
      use petscpc
      use petscksp
      use petscsnes
#include <petsc/finclude/petscsnes.h>
      type userctx
        PetscInt xs,xe,xm,gxs,gxe,gxm
        PetscInt ys,ye,ym,gys,gye,gym
        PetscInt mx,my
        PetscMPIInt rank
        PetscReal lambda
      end type userctx

      contains
! ---------------------------------------------------------------------
!
!  FormFunction - Evaluates the nonlinear function, F(x).
!
!  Input Parameters:
!  snes  - the SNES context
!  X     - input vector
!  dummy - optional user-defined context, as set by SNESSetFunction()
!          (not used here)
!
!  Output Parameter:
!  F - function vector
!
!  Notes:
!  This routine serves as a wrapper for the lower-level routine
!  "FormFunctionLocal", where the actual computations are
!  done using the standard Fortran style of treating the local
!  vector data as a multidimensional array over the local mesh.
!  This routine merely handles ghost point scatters and accesses
!  the local vector data via VecGetArrayF90() and VecRestoreArrayF90().
!
      subroutine FormFunction(snes,X,F,user,ierr)
      implicit none

! Input/output variables:
      SNES snes
      Vec X,F
      PetscErrorCode ierr
      type (userctx) user
      DM da

! Declarations for use with local arrays:
      PetscScalar,pointer :: lx_v(:),lf_v(:)
      Vec localX

!  Scatter ghost points to local vector, using the 2-step process
!     DMGlobalToLocalBegin(), DMGlobalToLocalEnd().
!  By placing code between these two statements, computations can
!  be done while messages are in transit.
      PetscCall(SNESGetDM(snes,da,ierr))
      PetscCall(DMGetLocalVector(da,localX,ierr))
      PetscCall(DMGlobalToLocalBegin(da,X,INSERT_VALUES,localX,ierr))
      PetscCall(DMGlobalToLocalEnd(da,X,INSERT_VALUES,localX,ierr))

!  Get a pointer to vector data.
!    - For default PETSc vectors, VecGetArrayF90() returns a pointer to
!      the data array.  Otherwise, the routine is implementation dependent.
!    - You MUST call VecRestoreArrayF90() when you no longer need access to
!      the array.
!    - Note that the interface to VecGetArrayF90() differs from VecGetArray(),
!      and is usable only from Fortran.

      PetscCall(VecGetArrayF90(localX,lx_v,ierr))
      PetscCall(VecGetArrayF90(F,lf_v,ierr))

!  Compute function over the locally owned part of the grid
      PetscCall(FormFunctionLocal(lx_v,lf_v,user,ierr))

!  Restore vectors
      PetscCall(VecRestoreArrayF90(localX,lx_v,ierr))
      PetscCall(VecRestoreArrayF90(F,lf_v,ierr))
!  Insert values into global vector

      PetscCall(DMRestoreLocalVector(da,localX,ierr))
      PetscCall(PetscLogFlops(11.0d0*user%ym*user%xm,ierr))

!      PetscCallA(VecView(X,PETSC_VIEWER_STDOUT_WORLD,ierr))
!      PetscCallA(VecView(F,PETSC_VIEWER_STDOUT_WORLD,ierr))
      return
      end subroutine FormFunction
      end module f90module

      module f90moduleinterfaces
      use f90module

      Interface SNESSetApplicationContext
        Subroutine SNESSetApplicationContext(snes,ctx,ierr)
          use f90module
          SNES snes
          type(userctx) ctx
          PetscErrorCode ierr
        End Subroutine
      End Interface SNESSetApplicationContext

      Interface SNESGetApplicationContext
        Subroutine SNESGetApplicationContext(snes,ctx,ierr)
          use f90module
          SNES snes
          type(userctx), pointer :: ctx
          PetscErrorCode ierr
        End Subroutine
      End Interface SNESGetApplicationContext
      end module f90moduleinterfaces

      program main
      use f90module
      use f90moduleinterfaces
      implicit none
!
! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
!                   Variable declarations
! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
!
!  Variables:
!     snes        - nonlinear solver
!     x, r        - solution, residual vectors
!     J           - Jacobian matrix
!     its         - iterations for convergence
!     Nx, Ny      - number of processors in x- and y-directions
!     matrix_free - flag; PETSC_TRUE indicates the matrix-free version
!
      SNES snes
      Vec x,r
      Mat J
      PetscErrorCode ierr
      PetscInt its
      PetscBool flg,matrix_free
      PetscInt ione,nfour
      PetscReal lambda_max,lambda_min
      type (userctx) user
      DM da

!  Note: Any user-defined Fortran routines (such as FormJacobian)
!  MUST be declared as external.
      external FormInitialGuess,FormJacobian

! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
!  Initialize program
! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
      PetscCallA(PetscInitialize(ierr))
      PetscCallMPIA(MPI_Comm_rank(PETSC_COMM_WORLD,user%rank,ierr))

!  Initialize problem parameters
      lambda_max  = 6.81
      lambda_min  = 0.0
      user%lambda = 6.0
      ione = 1
      nfour = 4
      PetscCallA(PetscOptionsGetReal(PETSC_NULL_OPTIONS,PETSC_NULL_CHARACTER,'-par',user%lambda,flg,ierr))
      if (user%lambda .ge. lambda_max .or. user%lambda .le. lambda_min) then
        SETERRA(PETSC_COMM_SELF,PETSC_ERR_USER,'Lambda provided with -par is out of range')
      endif

! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
!  Create nonlinear solver context
! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
      PetscCallA(SNESCreate(PETSC_COMM_WORLD,snes,ierr))

! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
!  Create vector data structures; set function evaluation routine
! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

!  Create distributed array (DMDA) to manage parallel grid and vectors

!  This problem really needs only the star-type stencil, but we use
!  the box stencil temporarily.
      PetscCallA(DMDACreate2d(PETSC_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_BOX,nfour,nfour,PETSC_DECIDE,PETSC_DECIDE,ione,ione,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,da,ierr))
      PetscCallA(DMSetFromOptions(da,ierr))
      PetscCallA(DMSetUp(da,ierr))

      PetscCallA(DMDAGetInfo(da,PETSC_NULL_INTEGER,user%mx,user%my,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,ierr))

!
!   Visualize the distribution of the array across the processors
!
!     PetscCallA(DMView(da,PETSC_VIEWER_DRAW_WORLD,ierr))

!  Extract global and local vectors from DMDA; then duplicate for remaining
!  vectors that are of the same type
      PetscCallA(DMCreateGlobalVector(da,x,ierr))
      PetscCallA(VecDuplicate(x,r,ierr))

!  Get local grid boundaries (for 2-dimensional DMDA)
      PetscCallA(DMDAGetCorners(da,user%xs,user%ys,PETSC_NULL_INTEGER,user%xm,user%ym,PETSC_NULL_INTEGER,ierr))
      PetscCallA(DMDAGetGhostCorners(da,user%gxs,user%gys,PETSC_NULL_INTEGER,user%gxm,user%gym,PETSC_NULL_INTEGER,ierr))

!  Here we shift the starting indices up by one so that we can easily
!  use the Fortran convention of 1-based indices (rather than 0-based).
      user%xs  = user%xs+1
      user%ys  = user%ys+1
      user%gxs = user%gxs+1
      user%gys = user%gys+1

      user%ye  = user%ys+user%ym-1
      user%xe  = user%xs+user%xm-1
      user%gye = user%gys+user%gym-1
      user%gxe = user%gxs+user%gxm-1

      PetscCallA(SNESSetApplicationContext(snes,user,ierr))

!  Set function evaluation routine and vector
      PetscCallA(SNESSetFunction(snes,r,FormFunction,user,ierr))

! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
!  Create matrix data structure; set Jacobian evaluation routine
! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

!  Set Jacobian matrix data structure and default Jacobian evaluation
!  routine. The user can override with:
!     -snes_fd : default finite differencing approximation of the Jacobian
!     -snes_mf : matrix-free Newton-Krylov method with no preconditioning
!                (unless the user explicitly sets a preconditioner)
!     -snes_mf_operator : form the preconditioning matrix as set by the user,
!                         but use matrix-free approximations for the
!                         Jacobian-vector products within the Newton-Krylov
!                         method
!
!  Note: For the parallel case, vectors and matrices MUST be partitioned
!  accordingly. When using distributed arrays (DMDAs) to create vectors,
!  the DMDAs determine the problem partitioning. We must explicitly
!  specify the local matrix dimensions upon its creation for compatibility
!  with the vector distribution. Thus, the generic MatCreate() routine
!  is NOT sufficient when working with distributed arrays.
!
!  Note: Here we only approximately preallocate storage space for the
!  Jacobian. See the users manual for a discussion of better techniques
!  for preallocating matrix memory.

      PetscCallA(PetscOptionsHasName(PETSC_NULL_OPTIONS,PETSC_NULL_CHARACTER,'-snes_mf',matrix_free,ierr))
      if (.not. matrix_free) then
        PetscCallA(DMSetMatType(da,MATAIJ,ierr))
        PetscCallA(DMCreateMatrix(da,J,ierr))
        PetscCallA(SNESSetJacobian(snes,J,J,FormJacobian,user,ierr))
      endif

! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
!  Customize nonlinear solver; set runtime options
! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
!  Set runtime options (e.g., -snes_monitor -snes_rtol <rtol> -ksp_type <type>)
      PetscCallA(SNESSetDM(snes,da,ierr))
      PetscCallA(SNESSetFromOptions(snes,ierr))

! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
!  Evaluate initial guess; then solve nonlinear system.
! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
!  Note: The user should initialize the vector, x, with the initial guess
!  for the nonlinear solver prior to calling SNESSolve(). In particular,
!  to employ an initial guess of zero, the user should explicitly set
!  this vector to zero by calling VecSet().

      PetscCallA(FormInitialGuess(snes,x,ierr))
      PetscCallA(SNESSolve(snes,PETSC_NULL_VEC,x,ierr))
      PetscCallA(SNESGetIterationNumber(snes,its,ierr))
      if (user%rank .eq. 0) then
        write(6,100) its
      endif
 100  format('Number of SNES iterations = ',i5)

! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
!  Free work space. All PETSc objects should be destroyed when they
!  are no longer needed.
! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
      if (.not. matrix_free) PetscCallA(MatDestroy(J,ierr))
      PetscCallA(VecDestroy(x,ierr))
      PetscCallA(VecDestroy(r,ierr))
      PetscCallA(SNESDestroy(snes,ierr))
      PetscCallA(DMDestroy(da,ierr))

      PetscCallA(PetscFinalize(ierr))
      end

! ---------------------------------------------------------------------
!
!  FormInitialGuess - Forms the initial approximation.
!
!  Input Parameters:
!  X - vector
!
!  Output Parameter:
!  X - vector
!
!  Notes:
!  This routine serves as a wrapper for the lower-level routine
!  "InitialGuessLocal", where the actual computations are
!  done using the standard Fortran style of treating the local
!  vector data as a multidimensional array over the local mesh.
!  This routine merely handles ghost point scatters and accesses
!  the local vector data via VecGetArrayF90() and VecRestoreArrayF90().
!
      subroutine FormInitialGuess(snes,X,ierr)
      use f90module
      use f90moduleinterfaces
      implicit none

! Input/output variables:
      SNES snes
      type(userctx), pointer :: puser
      Vec X
      PetscErrorCode ierr
      DM da

! Declarations for use with local arrays:
      PetscScalar,pointer :: lx_v(:)

      ierr = 0
      PetscCallA(SNESGetDM(snes,da,ierr))
      PetscCallA(SNESGetApplicationContext(snes,puser,ierr))
!  Get a pointer to vector data.
!    - For default PETSc vectors, VecGetArrayF90() returns a pointer to
!      the data array.  Otherwise, the routine is implementation dependent.
!    - You MUST call VecRestoreArrayF90() when you no longer need access to
!      the array.
!    - Note that the interface to VecGetArrayF90() differs from VecGetArray(),
!      and is usable only from Fortran.

      PetscCallA(VecGetArrayF90(X,lx_v,ierr))

!  Compute initial guess over the locally owned part of the grid
      PetscCallA(InitialGuessLocal(puser,lx_v,ierr))

!  Restore vector
      PetscCallA(VecRestoreArrayF90(X,lx_v,ierr))

!  Insert values into global vector

      return
      end

! ---------------------------------------------------------------------
!
!  InitialGuessLocal - Computes the initial approximation; called by
!  the higher-level routine FormInitialGuess().
!
!  Input Parameter:
!  x - local vector data
!
!  Output Parameters:
!  x    - local vector data
!  ierr - error code
!
!  Notes:
!  This routine uses standard Fortran-style computations over a 2-dim array.
!
      subroutine InitialGuessLocal(user,x,ierr)
      use f90module
      implicit none

! Input/output variables:
      type (userctx) user
      PetscScalar x(user%xs:user%xe,user%ys:user%ye)
      PetscErrorCode ierr

! Local variables:
      PetscInt i,j
      PetscReal temp1,temp,hx,hy
      PetscReal one

! Set parameters

      ierr  = 0
      one   = 1.0
      hx    = one/(user%mx-1)
      hy    = one/(user%my-1)
      temp1 = user%lambda/(user%lambda + one)

      do 20 j=user%ys,user%ye
        temp = min(j-1,user%my-j)*hy
        do 10 i=user%xs,user%xe
          if (i .eq. 1 .or. j .eq. 1 .or. i .eq. user%mx .or. j .eq. user%my) then
            x(i,j) = 0.0
          else
            x(i,j) = temp1 * sqrt(min(hx*min(i-1,user%mx-i),temp))
          endif
 10     continue
 20   continue

      return
      end

! ---------------------------------------------------------------------
!
!  FormFunctionLocal - Computes the nonlinear function; called by
!  the higher-level routine FormFunction().
!
!  Input Parameter:
!  x - local vector data
!
!  Output Parameters:
!  f    - local vector data, f(x)
!  ierr - error code
!
!  Notes:
!  This routine uses standard Fortran-style computations over a 2-dim array.
!
      subroutine FormFunctionLocal(x,f,user,ierr)
      use f90module

      implicit none

! Input/output variables:
      type (userctx) user
      PetscScalar x(user%gxs:user%gxe,user%gys:user%gye)
      PetscScalar f(user%xs:user%xe,user%ys:user%ye)
      PetscErrorCode ierr

! Local variables:
      PetscScalar two,one,hx,hy,hxdhy,hydhx,sc
      PetscScalar u,uxx,uyy
      PetscInt i,j

      one   = 1.0
      two   = 2.0
      hx    = one/(user%mx-1)
      hy    = one/(user%my-1)
      sc    = hx*hy*user%lambda
      hxdhy = hx/hy
      hydhx = hy/hx

!  Compute function over the locally owned part of the grid

      do 20 j=user%ys,user%ye
        do 10 i=user%xs,user%xe
          if (i .eq. 1 .or. j .eq. 1 .or. i .eq. user%mx .or. j .eq. user%my) then
            f(i,j) = x(i,j)
          else
            u = x(i,j)
            uxx = hydhx * (two*u - x(i-1,j) - x(i+1,j))
            uyy = hxdhy * (two*u - x(i,j-1) - x(i,j+1))
            f(i,j) = uxx + uyy - sc*exp(u)
          endif
 10     continue
 20   continue

      return
      end

! ---------------------------------------------------------------------
!
!  FormJacobian - Evaluates the Jacobian matrix.
!
!  Input Parameters:
!  snes  - the SNES context
!  x     - input vector
!  dummy - optional user-defined context, as set by SNESSetJacobian()
!          (not used here)
!
!  Output Parameters:
!  jac      - Jacobian matrix
!  jac_prec - optionally different preconditioning matrix (not used here)
!  flag     - flag indicating matrix structure
!
!  Notes:
!  This routine serves as a wrapper for the lower-level routine
!  "FormJacobianLocal", where the actual computations are
!  done using the standard Fortran style of treating the local
!  vector data as a multidimensional array over the local mesh.
!  This routine merely accesses the local vector data via
!  VecGetArrayF90() and VecRestoreArrayF90().
!
!  Notes:
!  Due to grid point reordering with DMDAs, we must always work
!  with the local grid points, and then transform them to the new
!  global numbering with the "ltog" mapping.
!  We cannot work directly with the global numbers for the original
!  uniprocessor grid!
!
!  Two methods are available for imposing this transformation
!  when setting matrix entries:
!    (A) MatSetValuesLocal(), using the local ordering (including
!        ghost points!)
!        - Set matrix entries using the local ordering
!          by calling MatSetValuesLocal()
!    (B) MatSetValues(), using the global ordering
!        - Set matrix entries using the global ordering by calling
!          MatSetValues()
!  Option (A) seems cleaner/easier in many cases, and is the procedure
!  used in this example.
!
      subroutine FormJacobian(snes,X,jac,jac_prec,user,ierr)
      use f90module
      implicit none

! Input/output variables:
      SNES snes
      Vec X
      Mat jac,jac_prec
      type(userctx) user
      PetscErrorCode ierr
      DM da

! Declarations for use with local arrays:
      PetscScalar,pointer :: lx_v(:)
      Vec localX

!  Scatter ghost points to local vector, using the 2-step process
!     DMGlobalToLocalBegin(), DMGlobalToLocalEnd().
!  Computations can be done while messages are in transit,
!  by placing code between these two statements.

      PetscCallA(SNESGetDM(snes,da,ierr))
      PetscCallA(DMGetLocalVector(da,localX,ierr))
      PetscCallA(DMGlobalToLocalBegin(da,X,INSERT_VALUES,localX,ierr))
      PetscCallA(DMGlobalToLocalEnd(da,X,INSERT_VALUES,localX,ierr))

!  Get a pointer to vector data
      PetscCallA(VecGetArrayF90(localX,lx_v,ierr))

!  Compute entries for the locally owned part of the Jacobian preconditioner.
      PetscCallA(FormJacobianLocal(lx_v,jac_prec,user,ierr))

!  Assemble matrix, using the 2-step process:
!     MatAssemblyBegin(), MatAssemblyEnd()
!  Computations can be done while messages are in transit,
!  by placing code between these two statements.

      PetscCallA(MatAssemblyBegin(jac,MAT_FINAL_ASSEMBLY,ierr))
      if (jac .ne. jac_prec) then
        PetscCallA(MatAssemblyBegin(jac_prec,MAT_FINAL_ASSEMBLY,ierr))
      endif
      PetscCallA(VecRestoreArrayF90(localX,lx_v,ierr))
      PetscCallA(DMRestoreLocalVector(da,localX,ierr))
      PetscCallA(MatAssemblyEnd(jac,MAT_FINAL_ASSEMBLY,ierr))
      if (jac .ne. jac_prec) then
        PetscCallA(MatAssemblyEnd(jac_prec,MAT_FINAL_ASSEMBLY,ierr))
      endif

!  Tell the matrix that we will never add a new nonzero location to it.
!  If we do, it will generate an error.

      PetscCallA(MatSetOption(jac,MAT_NEW_NONZERO_LOCATION_ERR,PETSC_TRUE,ierr))

      return
      end

! ---------------------------------------------------------------------
!
!  FormJacobianLocal - Computes the Jacobian preconditioner matrix;
!  called by the higher-level routine FormJacobian().
!
!  Input Parameters:
!  x - local vector data
!
!  Output Parameters:
!  jac_prec - Jacobian preconditioner matrix
!  ierr     - error code
!
!  Notes:
!  This routine uses standard Fortran-style computations over a 2-dim array.
!
!  Notes:
!  Due to grid point reordering with DMDAs, we must always work
!  with the local grid points, and then transform them to the new
!  global numbering with the "ltog" mapping.
!  We cannot work directly with the global numbers for the original
!  uniprocessor grid!
!
!  Two methods are available for imposing this transformation
!  when setting matrix entries:
!    (A) MatSetValuesLocal(), using the local ordering (including
!        ghost points!)
!        - Set matrix entries using the local ordering
!          by calling MatSetValuesLocal()
!    (B) MatSetValues(), using the global ordering
!        - Then apply this map explicitly yourself
!        - Set matrix entries using the global ordering by calling
!          MatSetValues()
!  Option (A) seems cleaner/easier in many cases, and is the procedure
!  used in this example.
!
      subroutine FormJacobianLocal(x,jac_prec,user,ierr)
      use f90module
      implicit none

! Input/output variables:
      type (userctx) user
      PetscScalar x(user%gxs:user%gxe,user%gys:user%gye)
      Mat jac_prec
      PetscErrorCode ierr

! Local variables:
      PetscInt row,col(5),i,j
      PetscInt ione,ifive
      PetscScalar two,one,hx,hy,hxdhy
      PetscScalar hydhx,sc,v(5)

! Set parameters
      ione  = 1
      ifive = 5
      one   = 1.0
      two   = 2.0
      hx    = one/(user%mx-1)
      hy    = one/(user%my-1)
      sc    = hx*hy
      hxdhy = hx/hy
      hydhx = hy/hx

!  Compute entries for the locally owned part of the Jacobian.
!   - Currently, all PETSc parallel matrix formats are partitioned by
!     contiguous chunks of rows across the processors.
!   - Each processor needs to insert only elements that it owns
!     locally (but any non-local elements will be sent to the
!     appropriate processor during matrix assembly).
!   - Here, we set all entries for a particular row at once.
!   - We can set matrix entries using either MatSetValuesLocal()
!     or MatSetValues(), as discussed above.
!   - Note that MatSetValues() uses 0-based row and column numbers
!     in Fortran as well as in C.

      do 20 j=user%ys,user%ye
        row = (j - user%gys)*user%gxm + user%xs - user%gxs - 1
        do 10 i=user%xs,user%xe
          row = row + 1
!  boundary points
          if (i .eq. 1 .or. j .eq. 1 .or. i .eq. user%mx .or. j .eq. user%my) then
            col(1) = row
            v(1)   = one
            PetscCallA(MatSetValuesLocal(jac_prec,ione,row,ione,col,v,INSERT_VALUES,ierr))
!  interior grid points
          else
            v(1) = -hxdhy
            v(2) = -hydhx
            v(3) = two*(hydhx + hxdhy) - sc*user%lambda*exp(x(i,j))
            v(4) = -hydhx
            v(5) = -hxdhy
            col(1) = row - user%gxm
            col(2) = row - 1
            col(3) = row
            col(4) = row + 1
            col(5) = row + user%gxm
            PetscCallA(MatSetValuesLocal(jac_prec,ione,row,ifive,col,v,INSERT_VALUES,ierr))
          endif
 10     continue
 20   continue

      return
      end

!
!/*TEST
!
!   test:
!      nsize: 4
!      args: -snes_mf -pc_type none -da_processors_x 4 -da_processors_y 1 -snes_monitor_short -ksp_gmres_cgs_refinement_type refine_always
!      requires: !single
!
!   test:
!      suffix: 2
!      nsize: 4
!      args: -da_processors_x 2 -da_processors_y 2 -snes_monitor_short -ksp_gmres_cgs_refinement_type refine_always
!      requires: !single
!
!   test:
!      suffix: 3
!      nsize: 3
!      args: -snes_fd -snes_monitor_short -ksp_gmres_cgs_refinement_type refine_always
!      requires: !single
!
!   test:
!      suffix: 4
!      nsize: 3
!      args: -snes_mf_operator -snes_monitor_short -ksp_gmres_cgs_refinement_type refine_always
!      requires: !single
!
!   test:
!      suffix: 5
!      requires: !single
!
!TEST*/
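!
!  A minimal usage sketch (kept as comments so this file stays compilable):
!  the executable name "ex5f90" is an assumption based on the tutorial naming
!  convention, and a working PETSc installation with PETSC_DIR/PETSC_ARCH set
!  is assumed.  The -par and monitoring options below are the ones this
!  example actually reads (see PetscOptionsGetReal('-par',...) above and the
!  TEST block args):
!
!      make ex5f90
!      mpiexec -n 4 ./ex5f90 -par 6.0 -snes_monitor_short
!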