!
!  Description: Solves a nonlinear system in parallel with SNES.
!  We solve the Bratu (SFI - solid fuel ignition) problem in a 2D rectangular
!  domain, using distributed arrays (DMDAs) to partition the parallel grid.
!  The command line options include:
!    -par <parameter>, where <parameter> indicates the nonlinearity of the problem
!       problem SFI:  <parameter> = Bratu parameter (0 <= par <= 6.81)
!
!!/*T
!  Concepts: SNES^parallel Bratu example
!  Concepts: DMDA^using distributed arrays;
!  Processors: n
!T*/

!
!  --------------------------------------------------------------------------
!
!  Solid Fuel Ignition (SFI) problem.  This problem is modeled by
!  the partial differential equation
!
!          -Laplacian u - lambda*exp(u) = 0,   0 < x,y < 1,
!
!  with boundary conditions
!
!           u = 0  for  x = 0, x = 1, y = 0, y = 1.
!
!  A finite difference approximation with the usual 5-point stencil
!  is used to discretize the boundary value problem to obtain a nonlinear
!  system of equations.
!
!  The uniprocessor version of this code is snes/tutorials/ex4f.F
!
!  --------------------------------------------------------------------------
!  The following define must be used before including any PETSc include files
!  into a module or interface, because the include files cannot handle
!  declarations placed inside them.
!

      module f90module
      use petscsys
      use petscis
      use petscvec
      use petscdm
      use petscdmda
      use petscmat
      use petscpc
      use petscksp
      use petscsnes
#include <petsc/finclude/petscsnes.h>
      type userctx
        PetscInt xs,xe,xm,gxs,gxe,gxm
        PetscInt ys,ye,ym,gys,gye,gym
        PetscInt mx,my
        PetscMPIInt rank
        PetscReal lambda
      end type userctx

      contains
! ---------------------------------------------------------------------
!
!  FormFunction - Evaluates nonlinear function, F(x).
!
!  Input Parameters:
!  snes - the SNES context
!  X    - input vector
!  user - user-defined context, as set by SNESSetFunction()
!
!  Output Parameter:
!  F - function vector
!
!  Notes:
!  This routine serves as a wrapper for the lower-level routine
!  "FormFunctionLocal", where the actual computations are
!  done using the standard Fortran style of treating the local
!  vector data as a multidimensional array over the local mesh.
!  This routine merely handles ghost point scatters and accesses
!  the local vector data via VecGetArrayF90() and VecRestoreArrayF90().
!
      subroutine FormFunction(snes,X,F,user,ierr)
      implicit none

!  Input/output variables:
      SNES snes
      Vec X,F
      PetscErrorCode ierr
      type (userctx) user
      DM da

!  Declarations for use with local arrays:
      PetscScalar,pointer :: lx_v(:),lf_v(:)
      Vec localX

!  Scatter ghost points to local vector, using the 2-step process
!     DMGlobalToLocalBegin(), DMGlobalToLocalEnd().
!  By placing code between these two statements, computations can
!  be done while messages are in transition.
      call SNESGetDM(snes,da,ierr);CHKERRQ(ierr)
      call DMGetLocalVector(da,localX,ierr);CHKERRQ(ierr)
      call DMGlobalToLocalBegin(da,X,INSERT_VALUES,localX,ierr);CHKERRQ(ierr)
      call DMGlobalToLocalEnd(da,X,INSERT_VALUES,localX,ierr);CHKERRQ(ierr)

!  Get a pointer to vector data.
!    - For default PETSc vectors, VecGetArrayF90() returns a pointer to
!      the data array.  Otherwise, the routine is implementation dependent.
!    - You MUST call VecRestoreArrayF90() when you no longer need access to
!      the array.
!    - Note that the interface to VecGetArrayF90() differs from VecGetArray(),
!      and is usable from Fortran-90 only.

      call VecGetArrayF90(localX,lx_v,ierr);CHKERRQ(ierr)
      call VecGetArrayF90(F,lf_v,ierr);CHKERRQ(ierr)

!  Compute function over the locally owned part of the grid
      call FormFunctionLocal(lx_v,lf_v,user,ierr);CHKERRQ(ierr)

!  Restore vectors
      call VecRestoreArrayF90(localX,lx_v,ierr);CHKERRQ(ierr)
      call VecRestoreArrayF90(F,lf_v,ierr);CHKERRQ(ierr)

!  Insert values into global vector

      call DMRestoreLocalVector(da,localX,ierr);CHKERRQ(ierr)
      call PetscLogFlops(11.0d0*user%ym*user%xm,ierr)

!      call VecView(X,PETSC_VIEWER_STDOUT_WORLD,ierr)
!      call VecView(F,PETSC_VIEWER_STDOUT_WORLD,ierr)
      return
      end subroutine FormFunction
      end module f90module

      module f90moduleinterfaces
        use f90module

      Interface SNESSetApplicationContext
        Subroutine SNESSetApplicationContext(snes,ctx,ierr)
          use f90module
          SNES snes
          type(userctx) ctx
          PetscErrorCode ierr
        End Subroutine
      End Interface SNESSetApplicationContext

      Interface SNESGetApplicationContext
        Subroutine SNESGetApplicationContext(snes,ctx,ierr)
          use f90module
          SNES snes
          type(userctx), pointer :: ctx
          PetscErrorCode ierr
        End Subroutine
      End Interface SNESGetApplicationContext
      end module f90moduleinterfaces

      program main
      use f90module
      use f90moduleinterfaces
      implicit none
!
!  - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
!  Variable declarations
!  - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
!
!  Variables:
!     snes        - nonlinear solver
!     x, r        - solution, residual vectors
!     J           - Jacobian matrix
!     its         - iterations for convergence
!     Nx, Ny      - number of processors in x- and y- directions
!     matrix_free - flag indicating the matrix-free version is used
!
      SNES snes
      Vec x,r
      Mat J
      PetscErrorCode ierr
      PetscInt its
      PetscBool flg,matrix_free
      PetscInt ione,nfour
      PetscReal lambda_max,lambda_min
      type (userctx) user
      DM da

!  Note: Any user-defined Fortran routines (such as FormJacobian)
!  MUST be declared as external.
      external FormInitialGuess,FormJacobian

!  - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
!  Initialize program
!  - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
      call PetscInitialize(PETSC_NULL_CHARACTER,ierr)
      if (ierr .ne. 0) then
        print*,'Unable to initialize PETSc'
        stop
      endif
      call MPI_Comm_rank(PETSC_COMM_WORLD,user%rank,ierr)

!  Initialize problem parameters
      lambda_max  = 6.81
      lambda_min  = 0.0
      user%lambda = 6.0
      ione  = 1
      nfour = 4
      call PetscOptionsGetReal(PETSC_NULL_OPTIONS,PETSC_NULL_CHARACTER,'-par',user%lambda,flg,ierr);CHKERRA(ierr)
      if (user%lambda .ge. lambda_max .or. user%lambda .le. lambda_min) then
        SETERRA(PETSC_COMM_SELF,PETSC_ERR_USER,'Lambda provided with -par is out of range')
      endif

!  - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
!  Create nonlinear solver context
!  - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
      call SNESCreate(PETSC_COMM_WORLD,snes,ierr);CHKERRA(ierr)

!  - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
!  Create vector data structures; set function evaluation routine
!  - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

!  Create distributed array (DMDA) to manage parallel grid and vectors

!  This really needs only the star-type stencil, but we use the box
!  stencil temporarily.
      call DMDACreate2d(PETSC_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_BOX,nfour,nfour,PETSC_DECIDE,PETSC_DECIDE,ione,ione, &
     &     PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,da,ierr);CHKERRA(ierr)
      call DMSetFromOptions(da,ierr);CHKERRA(ierr)
      call DMSetUp(da,ierr);CHKERRA(ierr)

      call DMDAGetInfo(da,PETSC_NULL_INTEGER,user%mx,user%my,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER, &
     &     PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,ierr);CHKERRA(ierr)

!
!  Visualize the distribution of the array across the processors
!
!     call DMView(da,PETSC_VIEWER_DRAW_WORLD,ierr)

!  Extract global and local vectors from DMDA; then duplicate for remaining
!  vectors that are the same types
      call DMCreateGlobalVector(da,x,ierr);CHKERRA(ierr)
      call VecDuplicate(x,r,ierr);CHKERRA(ierr)

!  Get local grid boundaries (for 2-dimensional DMDA)
      call DMDAGetCorners(da,user%xs,user%ys,PETSC_NULL_INTEGER,user%xm,user%ym,PETSC_NULL_INTEGER,ierr);CHKERRA(ierr)
      call DMDAGetGhostCorners(da,user%gxs,user%gys,PETSC_NULL_INTEGER,user%gxm,user%gym,PETSC_NULL_INTEGER,ierr);CHKERRA(ierr)

!  Here we shift the starting indices up by one so that we can easily
!  use the Fortran convention of 1-based indices (rather than 0-based indices).
      user%xs  = user%xs+1
      user%ys  = user%ys+1
      user%gxs = user%gxs+1
      user%gys = user%gys+1

      user%ye  = user%ys+user%ym-1
      user%xe  = user%xs+user%xm-1
      user%gye = user%gys+user%gym-1
      user%gxe = user%gxs+user%gxm-1

      call SNESSetApplicationContext(snes,user,ierr);CHKERRA(ierr)

!  Set function evaluation routine and vector
      call SNESSetFunction(snes,r,FormFunction,user,ierr);CHKERRA(ierr)

!  - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
!  Create matrix data structure; set Jacobian evaluation routine
!  - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

!  Set Jacobian matrix data structure and default Jacobian evaluation
!  routine.  The user can override with:
!     -snes_fd : default finite differencing approximation of Jacobian
!     -snes_mf : matrix-free Newton-Krylov method with no preconditioning
!                (unless user explicitly sets preconditioner)
!     -snes_mf_operator : form preconditioning matrix as set by the user,
!                         but use matrix-free approx for Jacobian-vector
!                         products within Newton-Krylov method
!
!  Note:  For the parallel case, vectors and matrices MUST be partitioned
!     accordingly.  When using distributed arrays (DMDAs) to create vectors,
!     the DMDAs determine the problem partitioning.  We must explicitly
!     specify the local matrix dimensions upon its creation for compatibility
!     with the vector distribution.  Thus, the generic MatCreate() routine
!     is NOT sufficient when working with distributed arrays.
!
!  Note: Here we only approximately preallocate storage space for the
!     Jacobian.  See the users manual for a discussion of better techniques
!     for preallocating matrix memory.

      call PetscOptionsHasName(PETSC_NULL_OPTIONS,PETSC_NULL_CHARACTER,'-snes_mf',matrix_free,ierr);CHKERRA(ierr)
      if (.not. matrix_free) then
        call DMSetMatType(da,MATAIJ,ierr);CHKERRA(ierr)
        call DMCreateMatrix(da,J,ierr);CHKERRA(ierr)
        call SNESSetJacobian(snes,J,J,FormJacobian,user,ierr);CHKERRA(ierr)
      endif

!  - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
!  Customize nonlinear solver; set runtime options
!  - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
!  Set runtime options (e.g., -snes_monitor -snes_rtol <rtol> -ksp_type <type>)
      call SNESSetDM(snes,da,ierr);CHKERRA(ierr)
      call SNESSetFromOptions(snes,ierr);CHKERRA(ierr)

!  - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
!  Evaluate initial guess; then solve nonlinear system.
!  - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
!  Note: The user should initialize the vector, x, with the initial guess
!  for the nonlinear solver prior to calling SNESSolve().  In particular,
!  to employ an initial guess of zero, the user should explicitly set
!  this vector to zero by calling VecSet().

      call FormInitialGuess(snes,x,ierr);CHKERRA(ierr)
      call SNESSolve(snes,PETSC_NULL_VEC,x,ierr);CHKERRA(ierr)
      call SNESGetIterationNumber(snes,its,ierr);CHKERRA(ierr)
      if (user%rank .eq. 0) then
        write(6,100) its
      endif
  100 format('Number of SNES iterations = ',i5)

!  - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
!  Free work space.  All PETSc objects should be destroyed when they
!  are no longer needed.
!  - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
      if (.not. matrix_free) then
        call MatDestroy(J,ierr);CHKERRA(ierr)
      endif
      call VecDestroy(x,ierr);CHKERRA(ierr)
      call VecDestroy(r,ierr);CHKERRA(ierr)
      call SNESDestroy(snes,ierr);CHKERRA(ierr)
      call DMDestroy(da,ierr);CHKERRA(ierr)

      call PetscFinalize(ierr)
      end

! ---------------------------------------------------------------------
!
!  FormInitialGuess - Forms initial approximation.
!
!  Input Parameters:
!  snes - the SNES context
!  X    - vector
!
!  Output Parameter:
!  X - vector
!
!  Notes:
!  This routine serves as a wrapper for the lower-level routine
!  "InitialGuessLocal", where the actual computations are
!  done using the standard Fortran style of treating the local
!  vector data as a multidimensional array over the local mesh.
!  This routine merely handles ghost point scatters and accesses
!  the local vector data via VecGetArrayF90() and VecRestoreArrayF90().
!
      subroutine FormInitialGuess(snes,X,ierr)
      use f90module
      use f90moduleinterfaces
      implicit none

!  Input/output variables:
      SNES snes
      type(userctx), pointer :: puser
      Vec X
      PetscErrorCode ierr
      DM da

!  Declarations for use with local arrays:
      PetscScalar,pointer :: lx_v(:)

      ierr = 0
      call SNESGetDM(snes,da,ierr);CHKERRQ(ierr)
      call SNESGetApplicationContext(snes,puser,ierr);CHKERRQ(ierr)
!  Get a pointer to vector data.
!    - For default PETSc vectors, VecGetArrayF90() returns a pointer to
!      the data array.  Otherwise, the routine is implementation dependent.
!    - You MUST call VecRestoreArrayF90() when you no longer need access to
!      the array.
!    - Note that the interface to VecGetArrayF90() differs from VecGetArray(),
!      and is usable from Fortran-90 only.

      call VecGetArrayF90(X,lx_v,ierr);CHKERRQ(ierr)

!  Compute initial guess over the locally owned part of the grid
      call InitialGuessLocal(puser,lx_v,ierr);CHKERRQ(ierr)

!  Restore vector
      call VecRestoreArrayF90(X,lx_v,ierr);CHKERRQ(ierr)

!  Insert values into global vector

      return
      end

! ---------------------------------------------------------------------
!
!  InitialGuessLocal - Computes initial approximation, called by
!  the higher level routine FormInitialGuess().
!
!  Input Parameter:
!  user - user-defined application context
!
!  Output Parameters:
!  x    - local vector data
!  ierr - error code
!
!  Notes:
!  This routine uses standard Fortran-style computations over a 2-dim array.
!
      subroutine InitialGuessLocal(user,x,ierr)
      use f90module
      implicit none

!  Input/output variables:
      type (userctx) user
      PetscScalar x(user%xs:user%xe,user%ys:user%ye)
      PetscErrorCode ierr

!  Local variables:
      PetscInt i,j
      PetscReal temp1,temp,hx,hy
      PetscReal one

!  Set parameters

      ierr  = 0
      one   = 1.0
      hx    = one/(user%mx-1)
      hy    = one/(user%my-1)
      temp1 = user%lambda/(user%lambda + one)

      do 20 j=user%ys,user%ye
        temp = min(j-1,user%my-j)*hy
        do 10 i=user%xs,user%xe
          if (i .eq. 1 .or. j .eq. 1 .or. i .eq. user%mx .or. j .eq. user%my) then
            x(i,j) = 0.0
          else
            x(i,j) = temp1 * sqrt(min(hx*min(i-1,user%mx-i),temp))
          endif
 10     continue
 20   continue

      return
      end

! ---------------------------------------------------------------------
!
!  FormFunctionLocal - Computes nonlinear function, called by
!  the higher level routine FormFunction().
!
!  Input Parameter:
!  x - local vector data
!
!  Output Parameters:
!  f    - local vector data, f(x)
!  ierr - error code
!
!
!  Notes:
!  This routine uses standard Fortran-style computations over a 2-dim array.
!
      subroutine FormFunctionLocal(x,f,user,ierr)
      use f90module

      implicit none

!  Input/output variables:
      type (userctx) user
      PetscScalar x(user%gxs:user%gxe,user%gys:user%gye)
      PetscScalar f(user%xs:user%xe,user%ys:user%ye)
      PetscErrorCode ierr

!  Local variables:
      PetscScalar two,one,hx,hy,hxdhy,hydhx,sc
      PetscScalar u,uxx,uyy
      PetscInt i,j

      one   = 1.0
      two   = 2.0
      hx    = one/(user%mx-1)
      hy    = one/(user%my-1)
      sc    = hx*hy*user%lambda
      hxdhy = hx/hy
      hydhx = hy/hx

!  Compute function over the locally owned part of the grid

      do 20 j=user%ys,user%ye
        do 10 i=user%xs,user%xe
          if (i .eq. 1 .or. j .eq. 1 .or. i .eq. user%mx .or. j .eq. user%my) then
            f(i,j) = x(i,j)
          else
            u = x(i,j)
            uxx = hydhx * (two*u - x(i-1,j) - x(i+1,j))
            uyy = hxdhy * (two*u - x(i,j-1) - x(i,j+1))
            f(i,j) = uxx + uyy - sc*exp(u)
          endif
 10     continue
 20   continue

      return
      end

! ---------------------------------------------------------------------
!
!  FormJacobian - Evaluates Jacobian matrix.
!
!  Input Parameters:
!  snes - the SNES context
!  X    - input vector
!  user - user-defined context, as set by SNESSetJacobian()
!
!  Output Parameters:
!  jac      - Jacobian matrix
!  jac_prec - optionally different matrix used to construct the
!             preconditioner (here the same as jac)
!
!  Notes:
!  This routine serves as a wrapper for the lower-level routine
!  "FormJacobianLocal", where the actual computations are
!  done using the standard Fortran style of treating the local
!  vector data as a multidimensional array over the local mesh.
!  This routine merely accesses the local vector data via
!  VecGetArrayF90() and VecRestoreArrayF90().
!
!  Notes:
!  Due to grid point reordering with DMDAs, we must always work
!  with the local grid points, and then transform them to the new
!  global numbering with the "ltog" mapping.
!  We cannot work directly with the global numbers for the original
!  uniprocessor grid!
!
!  Two methods are available for imposing this transformation
!  when setting matrix entries:
!    (A) MatSetValuesLocal(), using the local ordering (including
!        ghost points!)
!        - Set matrix entries using the local ordering
!          by calling MatSetValuesLocal()
!    (B) MatSetValues(), using the global ordering
!        - Set matrix entries using the global ordering by calling
!          MatSetValues()
!  Option (A) seems cleaner/easier in many cases, and is the procedure
!  used in this example.
!
      subroutine FormJacobian(snes,X,jac,jac_prec,user,ierr)
      use f90module
      implicit none

!  Input/output variables:
      SNES snes
      Vec X
      Mat jac,jac_prec
      type(userctx) user
      PetscErrorCode ierr
      DM da

!  Declarations for use with local arrays:
      PetscScalar,pointer :: lx_v(:)
      Vec localX

!  Scatter ghost points to local vector, using the 2-step process
!     DMGlobalToLocalBegin(), DMGlobalToLocalEnd().
!  Computations can be done while messages are in transition,
!  by placing code between these two statements.

      call SNESGetDM(snes,da,ierr);CHKERRQ(ierr)
      call DMGetLocalVector(da,localX,ierr);CHKERRQ(ierr)
      call DMGlobalToLocalBegin(da,X,INSERT_VALUES,localX,ierr);CHKERRQ(ierr)
      call DMGlobalToLocalEnd(da,X,INSERT_VALUES,localX,ierr);CHKERRQ(ierr)

!  Get a pointer to vector data
      call VecGetArrayF90(localX,lx_v,ierr);CHKERRQ(ierr)

!  Compute entries for the locally owned part of the Jacobian preconditioner.
      call FormJacobianLocal(lx_v,jac_prec,user,ierr);CHKERRQ(ierr)

!  Assemble matrix, using the 2-step process:
!     MatAssemblyBegin(), MatAssemblyEnd()
!  Computations can be done while messages are in transition,
!  by placing code between these two statements.

      call MatAssemblyBegin(jac,MAT_FINAL_ASSEMBLY,ierr);CHKERRQ(ierr)
      if (jac .ne. jac_prec) then
        call MatAssemblyBegin(jac_prec,MAT_FINAL_ASSEMBLY,ierr);CHKERRQ(ierr)
      endif
      call VecRestoreArrayF90(localX,lx_v,ierr);CHKERRQ(ierr)
      call DMRestoreLocalVector(da,localX,ierr);CHKERRQ(ierr)
      call MatAssemblyEnd(jac,MAT_FINAL_ASSEMBLY,ierr);CHKERRQ(ierr)
      if (jac .ne. jac_prec) then
        call MatAssemblyEnd(jac_prec,MAT_FINAL_ASSEMBLY,ierr);CHKERRQ(ierr)
      endif

!  Tell the matrix we will never add a new nonzero location to the
!  matrix.  If we do, it will generate an error.

      call MatSetOption(jac,MAT_NEW_NONZERO_LOCATION_ERR,PETSC_TRUE,ierr);CHKERRQ(ierr)

      return
      end

! ---------------------------------------------------------------------
!
!  FormJacobianLocal - Computes Jacobian preconditioner matrix,
!  called by the higher level routine FormJacobian().
!
!  Input Parameters:
!  x - local vector data
!
!  Output Parameters:
!  jac_prec - Jacobian preconditioner matrix
!  ierr     - error code
!
!  Notes:
!  This routine uses standard Fortran-style computations over a 2-dim array.
!
!  Notes:
!  Due to grid point reordering with DMDAs, we must always work
!  with the local grid points, and then transform them to the new
!  global numbering with the "ltog" mapping.
!  We cannot work directly with the global numbers for the original
!  uniprocessor grid!
!
!  Two methods are available for imposing this transformation
!  when setting matrix entries:
!    (A) MatSetValuesLocal(), using the local ordering (including
!        ghost points!)
!        - Set matrix entries using the local ordering
!          by calling MatSetValuesLocal()
!    (B) MatSetValues(), using the global ordering
!        - Apply the "ltog" map explicitly yourself
!        - Set matrix entries using the global ordering by calling
!          MatSetValues()
!  Option (A) seems cleaner/easier in many cases, and is the procedure
!  used in this example.
!
      subroutine FormJacobianLocal(x,jac_prec,user,ierr)
      use f90module
      implicit none

!  Input/output variables:
      type (userctx) user
      PetscScalar x(user%gxs:user%gxe,user%gys:user%gye)
      Mat jac_prec
      PetscErrorCode ierr

!  Local variables:
      PetscInt row,col(5),i,j
      PetscInt ione,ifive
      PetscScalar two,one,hx,hy,hxdhy
      PetscScalar hydhx,sc,v(5)

!  Set parameters
      ione  = 1
      ifive = 5
      one   = 1.0
      two   = 2.0
      hx    = one/(user%mx-1)
      hy    = one/(user%my-1)
      sc    = hx*hy
      hxdhy = hx/hy
      hydhx = hy/hx

!  Compute entries for the locally owned part of the Jacobian.
!   - Currently, all PETSc parallel matrix formats are partitioned by
!     contiguous chunks of rows across the processors.
!   - Each processor needs to insert only elements that it owns
!     locally (but any non-local elements will be sent to the
!     appropriate processor during matrix assembly).
!   - Here, we set all entries for a particular row at once.
!   - We can set matrix entries using either MatSetValuesLocal()
!     or MatSetValues(), as discussed above.
!   - Note that MatSetValues() uses 0-based row and column numbers
!     in Fortran as well as in C.

      do 20 j=user%ys,user%ye
        row = (j - user%gys)*user%gxm + user%xs - user%gxs - 1
        do 10 i=user%xs,user%xe
          row = row + 1
!  boundary points
          if (i .eq. 1 .or. j .eq. 1 .or. i .eq. user%mx .or. j .eq. user%my) then
            col(1) = row
            v(1)   = one
            call MatSetValuesLocal(jac_prec,ione,row,ione,col,v,INSERT_VALUES,ierr);CHKERRQ(ierr)
!  interior grid points
          else
            v(1) = -hxdhy
            v(2) = -hydhx
            v(3) = two*(hydhx + hxdhy) - sc*user%lambda*exp(x(i,j))
            v(4) = -hydhx
            v(5) = -hxdhy
            col(1) = row - user%gxm
            col(2) = row - 1
            col(3) = row
            col(4) = row + 1
            col(5) = row + user%gxm
            call MatSetValuesLocal(jac_prec,ione,row,ifive,col,v,INSERT_VALUES,ierr);CHKERRQ(ierr)
          endif
 10     continue
 20   continue

      return
      end

!
!/*TEST
!
!   test:
!      nsize: 4
!      args: -snes_mf -pc_type none -da_processors_x 4 -da_processors_y 1 -snes_monitor_short -ksp_gmres_cgs_refinement_type refine_always
!      requires: !single
!
!   test:
!      suffix: 2
!      nsize: 4
!      args: -da_processors_x 2 -da_processors_y 2 -snes_monitor_short -ksp_gmres_cgs_refinement_type refine_always
!      requires: !single
!
!   test:
!      suffix: 3
!      nsize: 3
!      args: -snes_fd -snes_monitor_short -ksp_gmres_cgs_refinement_type refine_always
!      requires: !single
!
!   test:
!      suffix: 4
!      nsize: 3
!      args: -snes_mf_operator -snes_monitor_short -ksp_gmres_cgs_refinement_type refine_always
!      requires: !single
!
!   test:
!      suffix: 5
!      requires: !single
!
!TEST*/