Lagrangian Objects

  • Mohamed Houssem Kasbaoui

The derived types lagrangian_obj and lagrangian_set provide abstract data types that define a single Lagrangian object (e.g., a particle) and a collection of Lagrangian objects, respectively. As abstract types, they cannot be instantiated directly; they must first be extended.

Basic and extended Lagrangian Objects

The most basic Lagrangian object is a trivial extension of the abstract lagrangian_obj type.

use leapKinds
use leapLagrangian

! Make a trivial extension of the abstract type
type, extends (lagrangian_obj) :: particle_obj
  ! Nothing else to add
end type particle_obj

type (particle_obj) :: p

p%id   = 1
p%p(:) = [0.0_wp, 0.0_wp, 0.0_wp]
p%c(:) = [0, 0, 0]

! ... (deferred subroutines still need to be implemented)

It represents a point in space. The minimum information needed to describe this point is an identifying number (p%id), a 3D position (p%p(:)), and the cell containing the object (p%c(:)). This assumes that a block_obj is available to localize the point on the grid.

Note that there are deferred subroutines that users must implement for their extensions. These are not covered in detail in this guide; users can refer to src/kernel/leaplagrangian.f90 for more information and follow the example in src/kernel/particle_point/particle_point.f90.

You may also add components to the base object, as shown in this example:

use leapKinds
use leapLagrangian

! Extend the abstract type with velocity components
type, extends (lagrangian_obj) :: particle_obj
  real(wp) :: v(3) !! Add velocity
end type particle_obj

type (particle_obj) :: p

p%id   = 1
p%p(:) = [0.0_wp, 0.0_wp, 0.0_wp]
p%c(:) = [0, 0, 0]
p%v(:) = [0.0_wp, 0.0_wp, 0.0_wp]

Now we can track the velocity of this particle in addition to the id, position, and containing cell.

You can extend with as many reals and integers as you want. However, extending with allocatable arrays is not supported.
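
For instance, an extension could carry several scalar components at once. A sketch (the component names v, T, and group are illustrative, not part of the library):

```fortran
use leapKinds
use leapLagrangian

! Extend the abstract type with several reals and integers
! (component names are illustrative)
type, extends (lagrangian_obj) :: particle_obj
  real(wp) :: v(3)  !! Velocity
  real(wp) :: T     !! Temperature
  integer  :: group !! Integer tag, e.g., a species or group index
end type particle_obj
```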

Collections of Lagrangian Objects

Collections of Lagrangian objects are handled by the lagrangian_set derived type. Since it is also an abstract type, it must first be extended:

use leapKinds
use leapParallel
use leapBlock
use leapLagrangian

! Make a trivial extension of the abstract type
type, extends (lagrangian_obj) :: particle_obj
end type particle_obj

type, extends (lagrangian_set) :: particle_set
  contains
    procedure :: SetObjectType => particle_SetObjectType
end type particle_set

type(block_obj)     :: block
type(parallel_obj)  :: parallel
type (particle_obj) :: p
type (particle_set) :: pset

! Perform initializations
call parallel%Initialize()
call block%Initialize(ngc=1,parallel=parallel)
! ... (setup the block)


! Initialize the particle_set
call pset%Initialize('particles',block,parallel)

contains
  subroutine particle_SetObjectType(this)
    implicit none
    class(particle_set), intent(inout) :: this
    type(particle_obj) :: sample
    allocate(this%sample,source=sample)
    return
  end subroutine particle_SetObjectType

The deferred subroutine SetObjectType configures particle_set to use particle_obj as its basic Lagrangian object.

Most Used Operations

  • Resize: This operation resizes the local array of Lagrangian objects, thereby adding or removing objects.
! Assuming we are running with 2 MPI ranks

! Initialize the particle_set
call pset%Initialize('particles',block,parallel)

! Create 2 objects on each MPI rank
call pset%Resize(2)

! Update the global count of objects
call pset%UpdateCount()

print *,"Number of objects owned by this rank", pset%count_ ! = 2
print *,"Total number of objects across all MPI ranks", pset%count ! = 2*2=4
  • Recycle: This operation deletes Lagrangian objects with %id = -1 and resizes the array accordingly.
! Assuming we are running with 2 MPI ranks

! Initialize the particle_set
call pset%Initialize('particles',block,parallel)

! Create 2 objects on each MPI rank
call pset%Resize(2)

pset%p(1)%id = 1
pset%p(2)%id = -1

call pset%Recycle()
call pset%UpdateCount()

print *,"Number of objects owned by this rank", pset%count_ ! = 1
print *,"Total number of objects across all MPI ranks", pset%count ! = 2*1=2
  • Communicate: This procedure checks whether particles assigned to an MPI rank are contained within the physical bounds of that rank's subblock. If a particle falls outside (i.e., it is owned by rank A but located within the subdomain owned by rank B), it is sent to the appropriate MPI rank.

  • Localize: This finds the cell containing each Lagrangian object and updates its component %c(:).

integer  :: n
real(wp) :: rand(3)

! Initialize the particle_set
call pset%Initialize('particles',block,parallel)

! Create 10 objects on each MPI rank
call pset%Resize(10)

! Place objects randomly in domain
do n=1,pset%count_
   call random_number(rand)
   pset%p(n)%p = block%pmin + rand*(block%pmax-block%pmin)
end do

! Send each object to its owner rank
call pset%Communicate()

! Localize on the grid
call pset%Localize()

! Print cell information for each particle
do n=1,pset%count_
   print*, "Particle ",pset%p(n)%id," is contained in cell (",pset%p(n)%c(:),")"
end do
  • ApplyPeriodicity: Objects that fall outside the domain are wrapped around in periodic directions.
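
As a sketch, a simple advection step followed by the maintenance operations above might look like this (dt and the velocity component v are assumptions, not part of the base type):

```fortran
! Advance particle positions, then restore consistency:
! wrap periodic directions, transfer ownership, and relocate on the grid.
! Assumes the extended type carries a velocity v and a time step dt is defined.
do n=1,pset%count_
   pset%p(n)%p = pset%p(n)%p + dt*pset%p(n)%v
end do

call pset%ApplyPeriodicity()
call pset%Communicate()
call pset%Localize()
```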

Reading/Writing Lagrangian Data

The following example illustrates I/O operations with a lagrangian_set.

! Initialize the particle_set
call pset%Initialize('particles',block,parallel)

! Name of file to read from
call pset%SetReadFileName('pset.h5')

! Read Lagrangian objects stored in file, as well as the iteration and time values
call pset%Read(iter,time)


! Name of file to write to
call pset%SetWriteFileName('pset2.h5')

! Set the overwrite value (default is true)
call pset%SetOverwrite(.false.)

! Write the lagrangian objects, iteration and time values.
call pset%Write(iter,time)