
ESR 11


Anh-Luan Phan, MSc.

Università degli Studi di Roma “Tor Vergata”

Multiscale simulation of novel III-Sb quantum materials and devices

Modern life with lots of electronic stuff

Nowadays it’s hard to imagine what our modern world would be like without electronic devices. We depend on them more and more to make our lives convenient, efficient, and comfortable. But is that enough? Imagine your boss demands that you finish an unscheduled and urgent task within 30 minutes. In such a situation, the one-minute boot time of your laptop may feel like a century. “Why can’t it boot in a couple of seconds, or more ideally, in a blink?”, you mutter to yourself. Then, after booting, the laptop complains that it has run out of memory, so you have to delete some of your old documents (sometimes not an easy decision). “The manufacturer should have integrated more memory!”, you mutter again. And just when you think nothing can stop you from finishing your task, the internet connection suddenly becomes sluggish for no reason, the battery drains rapidly, and the laptop gets as hot as a heater… This time you may lose control and start cursing at this stupid electronic device!

The above overly exaggerated situation is (hopefully!) not likely to happen to you, but you may see the point: there is always a need for higher performance, larger memory, longer battery life… or in short, better electronic devices. To this end, on one hand, the electronics industry has been trying to increase the integration density of devices, or in other words, to pack more and more tiny electronic components (e.g. transistors) into the same chip area. This trend, quantified by a statement known as Moore’s law, requires conventional electronic components to be scaled down to the nanometer scale. On the other hand, many new devices with unique and interesting functionalities have been proposed. They demand more advanced investigations, which may go beyond our traditional territory of knowledge. To deal with these new challenges, scientists need to be equipped with powerful research tools. One of the must-haves is efficient and reliable simulation software.

Why simulation?

From the academic viewpoint, simulation is an extended arm of theorists, reaching toward the experimentalists’ results. One general idea in science is that the predictions of a good physical model or theory should agree well with the corresponding experimentally measured results. For not-too-complicated situations, the theoretical results can be deduced just “by hand”, that is, theorists do the calculations themselves (pen and paper, and possibly a calculator). However, most realistic physical systems of interest are very complicated, with lots of objects and lots of equations to be considered at the same time. This makes by-hand calculations inefficient, or even impossible. Luckily, computers, with their outstanding computing power, can perform this boring but laborious kind of task much better than we could. With the support of computers, the remaining task for us humans is to “instruct” them on how to carry out the calculations. That is the meaning of simulation.

Moreover, from the viewpoint of efficiency in industry, simulation helps companies save many things: money, time, effort, opportunities… To constantly bring new and innovative products to market, electronics companies have to perform an enormous number of experiments. Just imagine how much expense, effort, and time these tasks can consume. It is in this situation that simulation plays the role of a life-saver. Good simulation results can give engineers reasonable initial guesses about what kind of experiments they should perform and what the measured results should look like. The contribution here is very valuable, not because the simulation gives you exact predictions, but in the sense that it helps you rule out a bunch of irrelevant or bound-to-fail experimental setups before you spend your valuable resources on them. A good simulation tool, like a good tour guide, will keep you from going the wrong way.

Why “multiscale/multiphysics”?

So now we see the importance of a good simulator. But what about the concept of “multiscale”, the first word in the title of this blog? Well, there are a few facets we should consider to see why “multiscale/multiphysics” is needed:

  • So far, we do not have an “omnipotent” physical model for simulation.
  • Modern devices require investigation in various physical contexts and at different length scales.
  • Computational resources (CPU speed, RAM, time…) are always limited.

A proper treatment of the aforementioned highly scaled or novel devices requires investigation in various physical contexts (from classical and semi-classical to quantum-mechanical) as well as at different length scales (from continuous media down to discrete atomistic structure). Ideally, one would like a single omnipotent model that can cover such a wide and deep simulation range in practice. The bad news is that we don’t have one, at least not yet. All the physical models presently available for simulation are built on their own assumptions and approximations; that is, they are all tailored in some way to fit certain specific physical contexts. Thus, from the theoretical standpoint, each of them comes with its own range of applicability and its own limitations.

On the other hand, the technical limitations of hardware, time, architecture… prevent us from overusing heavy and time-consuming calculation methods, which are in general expected to give more accurate results. Consider the active part of a device, which is often small compared to the overall simulation domain but needs a more elaborate and careful quantum-mechanical treatment. When we zoom into this part down to the atomistic length scale, continuous-media models like the Envelope Function Approximation (EFA) eventually break down. At that point, an atomistic description is inevitable. One may immediately think of performing so-called “first-principles” or “ab initio” calculations such as Density Functional Theory (DFT), which give highly accurate results. These approaches, however, consume so many computational resources that they are typically limited to systems of hundreds or thousands of atoms. For systems of up to millions of atoms, empirical approaches such as the Empirical Tight-Binding (ETB) method are preferable. It is also worth noting that for the other parts of the device, classical or semi-classical models for the mechanical and thermal properties are enough, and an advanced quantum-mechanical treatment would be overkill.
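To get a feeling for why an atomistic first-principles treatment of a whole device is out of reach, here is a back-of-envelope estimate in Python. It is only an illustration: the lattice constant is an approximate literature value for GaSb and the device sizes are arbitrary examples, not anything taken from TiberCAD.

    # Rough estimate of how many atoms sit in a small device region.
    # Assumptions: zinc-blende crystal (8 atoms per conventional cubic cell)
    # and an approximate GaSb lattice constant of ~0.61 nm.
    a_nm = 0.61            # approximate GaSb lattice constant (nm)
    atoms_per_cell = 8     # zinc-blende conventional cell

    def atom_count(lx_nm, ly_nm, lz_nm):
        """Number of atoms in a box of size lx x ly x lz (nm)."""
        cells = (lx_nm / a_nm) * (ly_nm / a_nm) * (lz_nm / a_nm)
        return int(cells * atoms_per_cell)

    # A 20 nm cube already holds ~3x10^5 atoms -- far beyond typical DFT,
    # but still within reach of empirical tight-binding.
    print(f"{atom_count(20, 20, 20):,} atoms in a 20 nm cube")
    print(f"{atom_count(50, 50, 50):,} atoms in a 50 nm cube")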

In other words, in device simulation we have to cope with a kind of limited-resource problem. The best deal is to find a balance between what we want (i.e. the quality of the simulation results) and what we can pay for it (i.e. our resources). To obtain such a balance, multiple physical models at different scales (from continuous to atomistic) and with different physical treatments (from classical to quantum-mechanical) may need to be invoked in the same device simulation. This raises the problem of coupling all the relevant models in a unified and consistent framework: a multiscale/multiphysics device simulation software is therefore naturally desirable. That is why TiberCAD was born, declaring itself a multiscale/multiphysics simulation software for electronic and optoelectronic devices.

Components of TiberCAD

The physical models used in TiberCAD are divided into three families:

  • The (semi-)classical transport models for the particles in the system, based on drift-diffusion or hydrodynamic models, possibly with quantum corrections;
  • The models for mechanical strain, which deal with lattice mismatch or external mechanical forces, based on continuum elasticity theory and the atomistic valence-force-field model (see the small strain sketch right after this list);
  • The quantum-mechanical models to calculate electronic and optical properties, including k·p and empirical tight-binding theories as well as the non-equilibrium Green’s function method (development in progress).
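As a flavour of the strain family, here is a minimal sketch of the simplest continuum-elasticity result: the biaxial strain of a thin layer grown pseudomorphically on a thick (001) substrate. The lattice and elastic constants below are approximate literature values, and the function is only an illustration of the textbook formula, not TiberCAD code.

    # Biaxial strain of a pseudomorphic layer on a thick (001) substrate,
    # from linear continuum elasticity. Material values are approximate
    # literature numbers, used for illustration only.
    def biaxial_strain(a_layer, a_substrate, c11, c12):
        """Return (in-plane strain, out-of-plane strain)."""
        eps_par = (a_substrate - a_layer) / a_layer      # in-plane strain
        eps_perp = -2.0 * (c12 / c11) * eps_par          # Poisson-like response
        return eps_par, eps_perp

    # Example: InAs grown on GaSb (both members of the ~6.1 A III-Sb family).
    a_InAs, a_GaSb = 6.058, 6.096        # lattice constants (angstrom, approx.)
    c11_InAs, c12_InAs = 83.3, 45.3      # elastic constants of InAs (GPa, approx.)
    e_par, e_perp = biaxial_strain(a_InAs, a_GaSb, c11_InAs, c12_InAs)
    print(f"in-plane strain   : {e_par:+.3%}")   # small tensile strain (~+0.6%)
    print(f"out-of-plane      : {e_perp:+.3%}")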

TiberCAD and its basic modules

These three sets of models, together with Poisson’s equation and the material database, form a unified and consistent framework for multiscale/multiphysics simulations. The models depend on each other: the mechanical strain provides input for the calculation of the band properties of the materials by the quantum-mechanical models, and these quantum results are in turn needed for the drift-diffusion simulation. Moreover, the coupling of different models requires self-consistency, which is obtained through a general iterative scheme. Also, because different spatial domains of the system may be described by different models, TiberCAD takes special care of the common boundaries between these domains to ensure that the simulation results are consistent.
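To make the idea of such an iterative scheme concrete, here is a minimal conceptual sketch of a self-consistent loop coupling two mutually dependent solvers. The function names, the linear mixing, and the convergence criterion are hypothetical illustrations of the general technique, not TiberCAD’s actual interface.

    # Conceptual sketch of a self-consistent coupling loop.
    # solve_quantum() and solve_transport() stand for any two mutually
    # dependent solvers; the names and the linear mixing are illustrative
    # assumptions, not TiberCAD's real API.
    import numpy as np

    def self_consistent_loop(potential0, solve_quantum, solve_transport,
                             mixing=0.3, tol=1e-6, max_iter=100):
        potential = potential0
        for iteration in range(max_iter):
            density = solve_quantum(potential)        # e.g. charge density from the quantum model
            new_potential = solve_transport(density)  # e.g. potential from Poisson/drift-diffusion
            error = np.max(np.abs(new_potential - potential))
            # Linear mixing damps oscillations between the two solvers.
            potential = (1 - mixing) * potential + mixing * new_potential
            if error < tol:
                print(f"converged after {iteration + 1} iterations")
                return potential
        raise RuntimeError("self-consistent loop did not converge")

    # Toy usage with dummy stand-in solvers that settle to a fixed point:
    phi = self_consistent_loop(np.zeros(5),
                               solve_quantum=lambda v: 1.0 / (1.0 + v**2),
                               solve_transport=lambda n: 0.5 * n)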

TiberCAD’s codebase is organized in a highly modular manner that facilitates the addition of new physical models and mathematical solvers in the future. Each model is contained in its own module, which exposes a common interface so that it can be easily handled by the general control module of the simulator. The material database holds the specific parameters of the various materials and is accessible to the different modules.
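The idea of a common module interface can be sketched conceptually as follows. This is a hypothetical illustration in Python of the general design pattern, not TiberCAD’s actual code or class names.

    # Hypothetical sketch of a "common module interface": every physical
    # model exposes the same entry point, so a generic controller can
    # drive any of them without knowing the physics inside.
    from abc import ABC, abstractmethod

    class PhysicalModule(ABC):
        def __init__(self, material_db, region):
            self.material_db = material_db   # shared material parameters
            self.region = region             # spatial domain this module acts on

        @abstractmethod
        def solve(self, inputs):
            """Run the model and return its output fields."""

    class DriftDiffusionModule(PhysicalModule):
        def solve(self, inputs):
            # ... assemble and solve the drift-diffusion equations here ...
            return {"current": None, "carrier_density": None}

    # A generic controller relies only on the common interface:
    def run(modules, inputs):
        results = {}
        for module in modules:
            results.update(module.solve(inputs))
        return results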

Some illustrative examples

Ok! It’s time to look at some simulation results by TiberCAD.

One of the most important concepts when we talk about crystalline materials is their electronic band structure E(k), which is basically a plot of the possible energy levels E an electron in the material can have as a function of its crystal momentum k. The band structure is very important because it encodes a lot of information about the electronic and optical properties of the material. Below are some band structures of materials and superlattices simulated by TiberCAD using the Empirical Tight-Binding method (the capital letters on the abscissa represent special k-points):
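To see how a band structure arises from a tight-binding description in the simplest possible case, here is a minimal sketch for a one-dimensional chain of atoms with one orbital per site and nearest-neighbour hopping. This is a textbook toy model with arbitrary parameter values, not the multi-orbital ETB parametrization used in realistic calculations.

    # Minimal 1D tight-binding band structure: a chain of identical atoms,
    # one orbital per atom, nearest-neighbour hopping t. The dispersion is
    # E(k) = eps0 - 2 t cos(k a). Parameter values are arbitrary.
    import numpy as np
    import matplotlib.pyplot as plt

    eps0 = 0.0     # on-site energy (eV)
    t = 1.0        # hopping integral (eV)
    a = 1.0        # lattice constant (arbitrary units)

    k = np.linspace(-np.pi / a, np.pi / a, 200)   # first Brillouin zone
    E = eps0 - 2.0 * t * np.cos(k * a)

    plt.plot(k, E)
    plt.xlabel("crystal momentum k")
    plt.ylabel("energy E(k) (eV)")
    plt.title("1D nearest-neighbour tight-binding band")
    plt.show()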

More information

About TiberCAD: http://tibercad.org/

About TiberLab, owner of TiberCAD: http://www.tiberlab.com/

About me: https://www.researchgate.net/profile/Anh-Luan-Phan