
Create a software platform for Technology CAD device simulation based on FEM

  • Should be flexible and modifiable (research use)

  • Interactive features: Geometry, meshing and visualization with user interface

  • Separation of IT (software) and physics

  • Ability to run simulations on remote servers

  • Able to run on Windows, Linux, Mac OS X

  • Extendible: users should be able to develop their own modules for the platform

  • Performance



Finding solutions of partial differential equations to evaluate characteristics (e.g. potential)

  • Discretizes the continuum (i.e. the modelled object) into a finite number of elements, e.g. triangles or tetrahedra

  • Characteristics are determined at the nodes of the elements

  • Leads to solving of linear systems (see the sketch below)

  • Complex to design and implement; a solid mathematical and informatics understanding is required for high performance
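
As a brief illustration (standard FEM background, not taken from the original slides): expanding the unknown potential in nodal basis functions turns the weak form of the PDE into a linear system A u = b that is assembled element by element and then solved.

    u(x) \approx \sum_j u_j \, \varphi_j(x)
    A_{ij} = \int_\Omega \varepsilon \, \nabla\varphi_i \cdot \nabla\varphi_j \, \mathrm{d}x
    b_i = \int_\Omega f \, \varphi_i \, \mathrm{d}x
    A u = b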



Suitable discretization of a continuous domain into simple volume cell elements

  • Partial differential equations (PDEs) can then be replaced by a system of non-linear algebraic equations

  • Very complex to write code that generates an FEM mesh on arbitrary structures

  • The mesh needs to be at least a Delaunay mesh (see the condition below)

  • Tetrahedra
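
For reference (a standard definition, not taken from the original slides), a tetrahedral mesh is Delaunay when it satisfies the empty-circumsphere condition:

    for every tetrahedron T with circumsphere S(T): no mesh vertex lies strictly inside S(T)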



Commercial codes (Synopsys, Silvaco, etc.)

  • Disadvantages: limited extendibility and modifiability

  • Free codes, 2D and 3D:

    • Archimedes, nextnano
  • Free meshers – NETGEN, TetGen

  • Existing simulation frameworks

  • Finite element solvers:

    • FEniCS/DOLFIN, libMesh, GetFEM++, Rheolef, Tahoe, OOFEM.org, OFELI






SALOME 3.2.6 supports a limited number of operating systems =>

  • Developed as a VMware image with:

    • Debian Linux 3.1 (codename Sarge)
    • SALOME 3.2.6
    • FEniCS/DOLFIN 0.7.1
    • MeshAPI – library for our FEM module/component
    • Component and additional codes for SALOME
    • KDevelop for development
    • Additional tools (Krusader, …)
  • Runs on VMware Server 1+ or Workstation 6+

  • Eventual distribution by providing this VMware image





SALOME (LGPL) is free software that provides a generic platform for pre- and post-processing for numerical simulation.

  • Interactive geometry modelling and meshing

  • Very good user interface (Qt4)

  • Visualization (2D and 3D graphs)

  • Python scripting can replace or assist the GUI: all functionality is also accessible through the integrated Python console

  • Modular architecture; we can create our own modules

  • Components can run on remote servers (CORBA)

  • Exchanging data: MEDMEM API, .HDF, .MED

  • Much more powerful than any other open-source finite element component/software we found





Storing and loading data associated with numerical meshes and fields

  • Exchange between codes and solvers

  • Comes with a C++/Python API

  • Data can be exchanged in memory

  • 3 levels – files, memory, CORBA (data on demand)

  • The persistent data storage is based upon the HDF format (developed by Boeing and NASA in the area of Computational Fluid Dynamics).



Free finite element library and solver

  • Supports both direct and iterative solvers (LU, Krylov solvers)

  • Uses the PETSc and uBLAS libraries for systems of linear/non-linear equations => high-performance linear algebra

  • Automatic generation of finite elements, evaluation of variational forms and assembly of matrices for FEM – linear systems

  • Support for general families of finite elements (Lagrange, BDM, RT, BDFM, Nédélec and Crouzeix-Raviart elements)

  • No deep knowledge of the FEM method is needed to use it or develop with it

  • Eigenvalue problems with SLEPc

  • Simple and intuitive C++ object interface



UFC – a unified framework for finite element assembly. More precisely, it defines a fixed (C++) interface for communicating low-level routines (functions) for evaluating and assembling finite element variational forms.

  • SyFi – generates a UFC form from symbolic finite elements

  • FFC – generates UFC code from the bilinear/linear forms of an equation written in variational form; generation goals: any form, any finite element, maximum efficiency

  • FIAT – automatic finite element tabulator – used by FFC or SyFi

  • DOLFIN – C++ finite element library / interface



Core – mesh, fields, groups

  • Linear and non-linear PDE classes

  • Selection from Krylov solver methods and preconditioners

  • XML material database (SAX parser)

  • Boundary conditions

  • Inherited MeshAPI based solvers

  • Wrappers for SALOME platform



Reading SALOME meshes from .med files (MEDMEM API)

  • Processing mesh coordinates and connectivities

  • Processing mesh groups (which can be defined in the SALOME editor)

  • Passing this information to DOLFIN (to build the mesh in memory)

  • Providing core fields (such as Source, Flux, Potential, and some visual debug fields)

  • Additional methods for working with the mesh and fields

  • Control over storing core and custom fields to .MED files

  • Clean code design in strictly object-oriented C++ (a usage sketch follows below)
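
A minimal driver sketch of how these pieces fit together. Only dolfinMesh, krylovMethod, krylovPc and the field-writing calls appear on the later slides; the header names, the MeshAPI constructor arguments, the PoissonEps constructor and the write() call are assumptions made purely for illustration:

    // Hypothetical driver, for illustration only.
    #include "MeshAPI.hxx"      // assumed header name
    #include "PoissonEps.hxx"   // assumed header for the solver shown later

    int main ()
    {
      // Load the SALOME-generated mesh and the XML material database
      // (assumed constructor signature).
      SC::MeshAPI mesh ("device.med", "materials.xml");

      // Krylov method and preconditioner, later passed to LinearPDE::setupKrylov.
      mesh.krylovMethod = "gmres";
      mesh.krylovPc = "ilu";

      // Run the Poisson solver shown on the later slides; it registers the
      // Potential/Source/Flux fields it wants written.
      SC::PoissonEps solver (mesh);   // assumed constructor
      solver.solve ();

      // Write the registered fields back to a .med file (assumed method name).
      mesh.write ("device_results.med");
      return 0;
    }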






Implemented by a naming convention for mesh groups, whose names are then read in code to retrieve materials and boundary conditions

  • Examples: bottomoxide[Si], metalplate1[dirichlet=1.0]

  • From the SALOME mesh we get the IDs of the nodes and assign a group number to them

  • From the group number we determine the material, Dirichlet boundary conditions, etc. These data are stored in numeric arrays [0..n], where n is the number of groups (a parsing sketch follows below)
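
A minimal sketch of how such group names could be parsed. The bracket convention and the example names come from the bullet above; the helper function and the GroupInfo structure are hypothetical and only illustrate the idea:

    #include <string>
    #include <cstdlib>

    // Parsed result of a group name such as "bottomoxide[Si]" or
    // "metalplate1[dirichlet=1.0]". Hypothetical structure, for illustration only.
    struct GroupInfo
    {
      std::string material;      // e.g. "Si"; empty if the group carries no material
      bool        hasDirichlet;  // true if a "dirichlet=" value is present
      double      dirichlet;     // Dirichlet potential for the group
    };

    // Parse the bracketed attribute of a group name following the convention above.
    static GroupInfo parseGroupName (const std::string &name)
    {
      GroupInfo info = { "", false, 0.0 };
      std::string::size_type open  = name.find ('[');
      std::string::size_type close = name.find (']', open);
      if (open == std::string::npos || close == std::string::npos)
        return info;                                   // no bracketed attribute
      std::string attr = name.substr (open + 1, close - open - 1);
      const std::string key = "dirichlet=";
      if (attr.compare (0, key.size (), key) == 0)
      {
        info.hasDirichlet = true;
        info.dirichlet = std::atof (attr.c_str () + key.size ());
      }
      else
      {
        info.material = attr;                          // e.g. "Si"
      }
      return info;
    }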



Classes for solving linear and non-linear PDEs using DOLFIN; they allow setting preconditioners and Krylov methods:

  • Available Krylov methods:

    • cg - The conjugate gradient method
    • gmres - The GMRES method (default)
    • bicgstab - The stabilized biconjugate gradient squared method
  • Preconditioners:

    • none - No preconditioning
    • jacobi - Simple Jacobi preconditioning
    • sor - SOR, successive over-relaxation
    • ilu - Incomplete LU factorization (default)
    • icc - Incomplete Cholesky factorization
    • amg - Algebraic multigrid (through Hypre when available)


Using MeshAPI, one can define DOLFIN solvers in a few lines, as classes inherited from the Dolfin class (see the sketch after this list)

  • Behaviour that can be generalized and reused is already defined in MeshAPI

  • It can be used in any current and future examples

  • Example solvers:

    • Poisson example from the DOLFIN manual, but using a SALOME mesh
    • Poisson equation computed on a partitioned group
    • Poisson equation computed on a partitioned group with permittivity (Eps)
    • Non-linear Poisson equation computed on a partitioned group with permittivity (not 100% done)
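
A sketch of what such an inherited solver declaration could look like. The base-class name Dolfin is taken from the slide text and the solve() signature from the PoissonEps example later; the constructor and member details are assumptions:

    // Hypothetical declaration, for illustration only; the definition of
    // solve() is shown on a later slide.
    namespace SC
    {
      class PoissonEps : public Dolfin
      {
      public:
        PoissonEps (MeshAPI &meshInstance) : Dolfin (meshInstance) {}
        int solve ();
      };
    }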


Solving a linear PDE: the Poisson equation

  • f(x,y,z) – source function (known), can be 0

  • ε(x,y,z) – permittivity of the material at a given point

  • u(x,y,z) – the potential that we are computing (the equation is written out below)
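
The equation itself was evidently an image in the original slides; written out in the standard form consistent with the symbols above and with the variational forms on the next slides:

    -\nabla \cdot \left( \varepsilon(x,y,z) \, \nabla u(x,y,z) \right) = f(x,y,z) \quad \text{in the device domain } \Omega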



Bilinear and linear forms of the Poisson equation:

  • g(x,y,z) – Neumann boundary condition (the forms are written out below)
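
The forms were likely shown as images in the original; written out here so that they match the FFC definition on the next slide (a = dot(grad(v), grad(u))*eps*dx, L = v*f*dx + eps*v*g*ds):

    a(v, u) = \int_\Omega \varepsilon \, \nabla v \cdot \nabla u \, \mathrm{d}x

    L(v) = \int_\Omega v \, f \, \mathrm{d}x + \int_{\partial\Omega} \varepsilon \, v \, g \, \mathrm{d}s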



    # The bilinear form a(v, U) and linear form L(v) for
    # Poisson's equation.
    # Compile this form with FFC: ffc -l dolfin PoissonEps.form

    element = FiniteElement("Lagrange", "tetrahedron", 1)

    v = TestFunction(element)
    u = TrialFunction(element)
    f = Function(element)
    g = Function(element)
    eps = Function(element)

    a = dot(grad(v), grad(u))*eps*dx
    L = v*f*dx + eps*v*g*ds

    # This generates 5239 lines, 191,359 characters
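
For orientation: compiling this form with ffc -l dolfin PoissonEps.form produces the C++ header PoissonEps.h, whose generated form classes (PoissonEpsBilinearForm and PoissonEpsLinearForm) are the ones instantiated in the solver code on the next slide.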



#include "PoissonEps.h“

  • #include "PoissonEps.h“

  • #include "LinearPDE.hxx"

  • int SC::PoissonEps::solve ()

  • {

    • Source f (mesh);
    • Flux g (mesh);
    • DirichletFunction u0 (mesh);
    • DirichletBoundary boundary(mesh);
    • DirichletBC bc (u0, mesh.dolfinMesh, boundary);
    • Eps eps (mesh);
    • PoissonEpsBilinearForm a (eps);
    • PoissonEpsLinearForm L (f, g, eps);
    • SC::LinearPDE pde (a, L, mesh.dolfinMesh, bc);
    • pde.setupKrylov (mesh.krylovMethod, mesh.krylovPc);
    • Function solution;
    • pde.solve(solution);
    • mesh.nodePotential.init (u); mesh.nodeSource.init (f); mesh.nodeFlux.init (g);
    • mesh.resetFieldsToWrite();
    • Field *fields[] = {&mesh.nodePotential, &mesh.nodeSource, &mesh.nodeFlux, NULL};
    • mesh.addFieldsToWrite (fields);
  • }



    class DirichletBoundary : public SubDomain
    {
      MeshAPI &mesh;

    public:
      DirichletBoundary (MeshAPI &meshInstance) : mesh (meshInstance)
      {
      }

      // A point is on the Dirichlet boundary if its group carries no material data.
      bool inside (const dolfin::real* x, bool on_boundary) const
      {
        int index = mesh.getGroupNumberFromCoordinates (x);
        MaterialFunction *nodeMaterial = mesh.materialFunctions[index];
        return nodeMaterial->materialData == NULL;
      }
    };


    class DirichletFunction : public Function
    {
      MeshAPI &mesh;

    public:
      DirichletFunction (MeshAPI &meshInstance)
        : Function (meshInstance.dolfinMesh), mesh (meshInstance)
      {
      }

      // Return the Dirichlet value stored for the group this point belongs to.
      dolfin::real eval (const dolfin::real* x) const
      {
        int index = mesh.getGroupNumberFromCoordinates (x);
        MaterialFunction *nodeMaterial = mesh.materialFunctions[index];
        return nodeMaterial->dirichlet;
      }
    };


    class Source : public Function
    {
      MeshAPI &mesh;

    public:
      Source (MeshAPI &meshInstance)
        : Function (meshInstance.dolfinMesh), mesh (meshInstance)
      {
      }

      // Zero source term f(x,y,z) in this example.
      dolfin::real eval (const dolfin::real* x) const
      {
        return 0;
      }
    };


    class Flux : public Function
    {
      MeshAPI &mesh;

    public:
      Flux (MeshAPI &meshInstance)
        : Function (meshInstance.dolfinMesh), mesh (meshInstance)
      {
      }

      // Zero Neumann flux g(x,y,z) in this example.
      dolfin::real eval (const dolfin::real* x) const
      {
        return 0;
      }
    };


    class Eps : public Function
    {
      MeshAPI &mesh;

    public:
      Eps (MeshAPI &meshInstance)
        : Function (meshInstance.dolfinMesh), mesh (meshInstance)
      {
      }

      // Permittivity of the material of the group this point belongs to.
      dolfin::real eval (const dolfin::real* x) const
      {
        int index = mesh.getGroupNumberFromCoordinates (x);
        MaterialFunction *nodeMaterial = mesh.materialFunctions[index];
        return nodeMaterial->permitivity;
      }
    };










The NanoFEM platform is a new research environment for TCAD simulations of nanoscale devices.

  • Based on the free LGPL components SALOME Platform and FEniCS/DOLFIN

  • We can concentrate only on developing our MeshAPI and computational modules

  • Physicists and developers can work independently

  • Simple definition of equations instead of programming



Interactive pre- and post-processing

  • Automated meshing

  • Modules can run on remote servers

  • High performance

  • Standard formats - .MED and .HDF

  • .XML material database

  • Good extendibility and modularity



More complex equations (drift, diffusion)

  • Compare performance with commercial codes

  • More modules with exchange of fields

  • Control of simulation flow and coupling

  • Tests of supervision (with scripting)

  • More complex boundary conditions

  • Run natively on Debian and on Windows



???

  • Do you have any questions?



