How Density Functional Theory Has Improved Chemistry

Submitted by novasssweat
How has density functional theory improved for chemical applications? Discussion of some of the recent developments

[0]

Table of Contents

Introduction
Roots of DFT
Modern DFT
Modifications
Basis Sets
Finding the Exchange-Correlation Energy
Hybrid Functionals
Double Hybrid
Strengths of DFT
Challenges for DFT
Current Research
Conclusions

Introduction

Computational chemistry has become an accepted partner to experimental chemistry, because computational experiments supplement real-world experimental data very well. There are many systems for which there is no practical way of obtaining data directly, and so we must turn to computational methods. One example is the study of transition states, which in the real world may exist for only fractions of a second; with the help of computational methods we are able to investigate them more easily, cheaply and quickly. There are many computational chemistry methods, but in this paper we will focus on how density functional theory (DFT) has impacted the chemistry community. The review will show how DFT started as an alternative to Schrödinger wave-function methods, with simulated homogeneous electron gases, and moved on to become non-local, incorporating short- and long-range potentials and combining empirical data to improve the functionals. We will discuss the strengths and weaknesses of DFT, the challenges it faces and how they might be overcome. We will then look at more recent pieces of research and how they contribute to this wealth of information.

Roots of DFT

Density functional theory, and computational chemistry in general, has its roots in the work done by Erwin Schrödinger in 1926 [1], which gave the following equation, assuming the Born-Oppenheimer approximation for a time-independent system:

Hψ = Eψ
(Eq 1.)

Known as the Schrödinger equation, it states that applying the Hamiltonian operator to the wave function returns the energy of the system times the wave function, where, for a one-particle system, the Hamiltonian operator is given by:

H=T+V
(Eq 2.)

Where H is the sum of the operators T, the kinetic energy, and V, the potential energy of the system.

The problem is that the equation is exactly solvable only for a one-electron system; most chemical systems have more than one electron, which limits its direct use for real chemical applications.

Soon after, Hartree and Fock [2] introduced the self-consistent field (SCF) method, their attempt to solve the Schrödinger equation approximately, whereby each electron is treated as moving in the average field of all the others. In this method the n-electron wave function of the system is approximated by a single Slater determinant of n orbitals. The method invokes the variational principle: the equations provide a provisional solution, which is fed back into the equations to give a better solution, and this is repeated iteratively until the result converges; the energy can never fall below the true ground-state energy. This has some big drawbacks. Firstly, the method requires a lot of computational power and its cost rises steeply with the number of electrons. Also, by treating the field of the other electrons as an average, it often overestimates the overall binding energy of the electrons with respect to each other, and it completely ignores relativistic effects and the correlation between electrons of opposite spin. The original Hartree-Fock method also restricted the electrons to doubly occupied orbitals, so that they were paired up, which of course is not always the case in real systems. It has since been improved to include these factors, which we will come to later.
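
The self-consistent cycle described above can be sketched as a damped fixed-point iteration. The sketch below is a deliberately oversimplified stand-in: a single scalar "effective field" replaces the real Fock operator, and the function toy_field is invented purely for illustration. Only the iterate-mix-converge structure reflects the actual SCF method.

```python
# Toy illustration of the SCF idea: start from a guess, rebuild the
# "field" from the current solution, re-solve, and repeat until the
# answer stops changing. toy_field is a made-up stand-in, not a real
# Fock operator build.
def toy_field(density):
    # hypothetical effective potential that depends on the density
    return 1.0 / (1.0 + density)

def scf(guess=1.0, tol=1e-10, max_iter=200):
    density = guess
    for iteration in range(max_iter):
        new_density = toy_field(density)      # "solve" in the current field
        if abs(new_density - density) < tol:
            return new_density, iteration     # converged
        density = 0.5 * density + 0.5 * new_density  # damping (mixing)
    raise RuntimeError("SCF did not converge")

density, n_iter = scf()
print(f"converged density = {density:.6f} after {n_iter} iterations")
```

Without the 50/50 mixing step the iteration can oscillate; real SCF codes use the same trick (density mixing) for exactly this reason.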

Also shortly after the introduction of the Schrödinger equation, and around the same time as the Hartree-Fock papers, Llewellyn Thomas and Enrico Fermi [3,4], working independently, came up with a statistical model to approximate the distribution of electrons in an atom. In this model the electrons are not distributed evenly across the whole body; instead the density is taken as uniform within each small volume, but each small volume need not be identical to the others.

The total energy of the system in this model can be represented as:

E = T + V_eN + V_ee
(Eq 3.)

Where T is the kinetic energy and the second and third terms are the potential energies of electron-nucleus attraction and electron-electron repulsion, respectively. This is an important step in the development of density functional theory, as it was the first true density-based method for obtaining the energy of a system. However, the model gave very poor results for most systems and is not useful in a chemical sense, as it ignored many factors of a true chemical system. Dirac [5] improved upon the Thomas-Fermi model by adding a new term to represent the exchange energy, in what became known as the local density approximation. This was still found to be inaccurate for most applications, as it represented the kinetic energy poorly and completely ignored the correlation energy of the electrons. The Thomas-Fermi kinetic energy was improved in 1935 by von Weizsäcker [6], giving the Thomas-Fermi-Dirac-Weizsäcker density functional theory (TFDW-DFT). This remained the standard for DFT, still producing poor results, until 1964.

Modern DFT

In 1964 Hohenberg and Kohn [7] published the paper "Inhomogeneous Electron Gas", in which they laid the foundations for modern DFT methods. The Hohenberg-Kohn theorems apply to any system of electrons moving under the influence of an external potential. The first theorem states that the external potential:

v_ext(r)
(Eq 4.)

and hence the total energy, is a unique functional of the electron density:

n(r)
(Eq 5.)

The energy functional can be written in terms of the external potential:

E[n(r)] = ∫ n(r) v_ext(r) dr + F[n(r)]
(Eq 6.)

Therefore the Hamiltonian can be written as:

H = F + v_ext
(Eq 7.)

Where F is the electronic Hamiltonian, consisting of the kinetic energy operator T and the electron-electron operator V_ee:

F = T + V_ee
(Eq 8.)

The second theorem states that the ground-state energy can be obtained variationally: the density that minimises the total energy is the exact ground-state density. These theorems are powerful, but they do not give a way of practically computing the energy of the system; that came a year later, when Kohn and Sham put forward their method of computing DFT.

The Kohn-Sham [8] approach can be generalised to the equation:

E[ρ] = T_s[ρ] + V_ext[ρ] + V_H[ρ] + E_XC[ρ]
(Eq 9.)

Where T_s is the Kohn-Sham kinetic energy of the non-interacting reference system, expressed in terms of the Kohn-Sham orbitals φ_i:

T_s[ρ] = −(1/2) Σ_i ∫ φ_i*(r) ∇² φ_i(r) dr
(Eq 10.)

v_ext is the external potential acting upon the system, and V_H is the Coulomb (Hartree) energy:

V_H[ρ] = (1/2) ∫∫ ρ(r) ρ(r′) / |r − r′| dr dr′
(Eq 11.)

And E_XC is the exchange-correlation energy. Finding the best exchange-correlation functional is at the heart of improving DFT methods.
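
To make the Coulomb term concrete, the snippet below evaluates the Hartree energy numerically on a radial grid for a normalized Gaussian density. The choice of density is an assumption of this sketch, not anything from the text; it is picked because the integral has the closed form √(α/2π) in atomic units, so the quadrature can be checked.

```python
import math

# Normalized Gaussian density rho(r) = (alpha/pi)^(3/2) exp(-alpha r^2).
# Its electrostatic potential is V(r) = erf(sqrt(alpha) r) / r, so the
# Hartree energy E_H = (1/2) * Int rho(r) V(r) 4 pi r^2 dr
# has the closed form sqrt(alpha / (2 pi)).
alpha = 1.0

def rho(r):
    return (alpha / math.pi) ** 1.5 * math.exp(-alpha * r * r)

def v_hartree(r):
    return math.erf(math.sqrt(alpha) * r) / r

# simple Riemann sum on a radial grid (skip r = 0, where the integrand -> 0)
n, r_max = 20000, 12.0
h = r_max / n
e_h = 0.0
for i in range(1, n):
    r = i * h
    e_h += 0.5 * rho(r) * v_hartree(r) * 4.0 * math.pi * r * r * h

print(f"numerical E_H = {e_h:.6f}, analytic = {math.sqrt(alpha / (2 * math.pi)):.6f}")
```

Real codes evaluate the same double integral over the molecular density, which is why fast Coulomb algorithms matter so much for DFT's cost.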

Modifications

Since the Kohn-Sham papers, DFT has been extended to include spin-polarised states and to be time-dependent, which has allowed the approximation of reaction barriers. Standard DFT is also non-relativistic, so modifications must be made to include relativistic behaviour. One idea is the inclusion of an effective core potential (ECP), which incorporates relativistic effects for atoms where those effects begin to dominate, such as heavy metals.

Basis sets

A basis set is a set of mathematical functions which can be linearly combined to form atomic orbitals (LCAO). The LCAO approach allows the density of the system to be decomposed into mathematical atomic orbitals. Prior to 1950, Slater-type orbitals (STOs) were the only functions used for this. Boys [9] proposed the use of Gaussian-type orbitals (GTOs), which, unlike STOs, can be multiplied together to produce new GTOs. A single GTO is a poorer approximation to a real orbital than an STO (GTOs decay as e^(−αr²), where STOs decay as e^(−βr)), but GTOs are mathematically much quicker to compute and can be combined to produce more accurate orbitals. It wasn't until Taketa et al. [10] presented a set of mathematical formulae for obtaining matrix elements in a Gaussian basis that the approach took off; after this paper a flurry of others followed, building on one another and leading to the popular split-valence Pople [11] basis sets. These basis sets are very refined and allow for diffuse functions, for the case of weak electron-nuclear attraction, and polarisation functions, for the case where orbitals interact with another atom's orbitals, giving more accurate molecular orbital descriptions.
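
The computational advantage of GTOs mentioned above rests on the Gaussian product theorem: the product of two Gaussians on different centres is again a single Gaussian, which collapses multi-centre integrals; STOs have no such closure property. Below is a minimal numerical check in one dimension, with centres and exponents chosen arbitrarily for illustration.

```python
import math, random

# Gaussian product theorem (1D for simplicity): the product of Gaussians
# centred at A and B is a single Gaussian centred at a point P between
# them, scaled by a prefactor K.
alpha, A = 0.7, -0.3   # arbitrary exponent and centre
beta,  B = 1.9,  0.8   # arbitrary exponent and centre

p = alpha + beta                                  # combined exponent
P = (alpha * A + beta * B) / p                    # combined centre
K = math.exp(-alpha * beta / p * (A - B) ** 2)    # prefactor

random.seed(0)
for _ in range(5):
    x = random.uniform(-3, 3)
    lhs = math.exp(-alpha * (x - A) ** 2) * math.exp(-beta * (x - B) ** 2)
    rhs = K * math.exp(-p * (x - P) ** 2)
    assert abs(lhs - rhs) < 1e-12
print("Gaussian product theorem verified at 5 random points")
```

Because of this identity, a four-centre two-electron integral over GTOs reduces to a two-centre one, which is the reason Boys's proposal made large calculations tractable.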

Finding the Exchange-Correlation energy

Modern-day DFT methods perform well for most, but not all, chemical applications. The accuracy of DFT comes from the refinement of the exchange-correlation functional. In 2001 the hierarchy of these functionals was pictured by Perdew and Schmidt [12] as "Jacob's ladder" (Figure 1): at the bottom is the Hartree world, with no electron correlation, and the higher up the ladder one climbs, the closer the functionals come to an exact, fully non-local representation of exchange and correlation.

Fig 1. Jacob's ladder of exchange-correlation approximations (Perdew and Schmidt [12]).

The ladder starts at the bottom with the local density approximation (LDA), which has been extended to spin-polarised states in the local spin density approximation (LSDA). The most successful local approximations are those derived from the homogeneous electron gas (HEG) model. These methods approximate the energy at each point in space using only the local density, which is acceptable for some systems, but as a system becomes less homogeneous the approximation becomes more inadequate. The LDA exchange functional as given by Dirac [13] has the form:

E_X^LDA[ρ] = −(3/4)(3/π)^(1/3) ∫ ρ(r)^(4/3) dr
(Eq 12.)
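
As a concrete illustration of the Dirac exchange integral, the sketch below evaluates it by simple radial quadrature for the hydrogen 1s density. The choice of density is an assumption made here for illustration, and the spin-unpolarized formula is used, so the number is illustrative rather than a literature LSDA value; atomic units throughout.

```python
import math

# Dirac/LDA exchange: E_x = -(3/4)(3/pi)^(1/3) * Int rho^(4/3) d^3r,
# evaluated for the hydrogen 1s density rho(r) = exp(-2r)/pi.
C_x = 0.75 * (3.0 / math.pi) ** (1.0 / 3.0)

def rho(r):
    return math.exp(-2.0 * r) / math.pi

# radial Riemann sum: Int rho^(4/3) 4 pi r^2 dr
n, r_max = 40000, 20.0
h = r_max / n
integral = sum(rho(i * h) ** (4.0 / 3.0) * 4.0 * math.pi * (i * h) ** 2 * h
               for i in range(1, n))
e_x = -C_x * integral
print(f"E_x^LDA for the H 1s density ~= {e_x:.4f} hartree")
```

Note that only the density enters: this locality is exactly what makes LDA cheap, and also why it degrades for strongly inhomogeneous densities.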

Moving up the Perdew ladder, the next more complicated class of functionals, the generalised gradient approximations (GGA), is given by the generalised equation:

E_XC^GGA[ρ] = ∫ f(ρ(r), ∇ρ(r)) dr
(Eq 13.)

Very good results for molecular geometries and ground-state energies have been achieved using GGAs [14]. GGAs are still local, but also take into account the gradient of the density at each coordinate, and they have reduced the errors of LDA atomization energies by roughly a factor of five.
The first GGA was proposed by Perdew in 1982 [15] and followed up in 1986 [16], building on the first; but it wasn't until 1988, when Becke [17] proposed the GGA now known as the B88 exchange functional, that this area of research started making bigger leaps. This exchange energy is given by:

E_X^B88 = E_X^LDA − β Σ_σ ∫ ρ_σ^(4/3) x_σ² / (1 + 6β x_σ sinh⁻¹(x_σ)) dr,  where x_σ = |∇ρ_σ| / ρ_σ^(4/3)
(Eq 14.)

Another common GGA, PBE [18], developed by Perdew, Burke and Ernzerhof, writes the exchange energy with an enhancement factor F_X of the reduced density gradient s:

E_X^PBE = ∫ ρ(r) ε_X^HEG(ρ) F_X(s) dr,  with F_X(s) = 1 + κ − κ / (1 + μs²/κ)
(Eq 15.)

Meta-GGA functionals in their original form include the second derivative of the electron density, whereas GGAs include only the density and its first derivative in the exchange-correlation potential; including higher-order derivatives was the obvious next step. These functionals add a further term to the expansion, depending on the density, the gradient of the density and the Laplacian of the density. However, they did not provide a major advance in the development of functionals.

Hybrid Functionals
First attempts at hybrid functionals combining DFT and HF methods, from Pérez-Jordá et al., were mostly unsuccessful and lacked "the beauty and practicality of the original 'Hartree-Fock plus density functionals' idea". So Becke proposed "A new mixing of Hartree-Fock and local density-functional theories" [19]. In this paper he proposed that the exchange-correlation energy could be split into an exchange energy and a correlation energy: the exchange energy was given an exact solution using HF theory, and the correlation was given by a local density approximation. The overall equation for the exchange-correlation energy is:

E_XC = c_0 E_X^HF + c_1 E_C^LDA
(Eq 16.)

Where c_0 and c_1 are mixing parameters, and the HF exchange energy is evaluated with the Kohn-Sham orbitals in the form:

E_X^HF = −(1/2) Σ_ij ∫∫ φ_i*(r) φ_j*(r′) (1/|r − r′|) φ_j(r) φ_i(r′) dr dr′
(Eq 17.)

And the correlation energy given by a modified local density approximation:

E_C^LDA = ∫ U_XC[ρ_α(r), ρ_β(r)] d³r
(Eq 18.)

Hybridization with Hartree-Fock exchange provides improvements in many molecular properties, such as atomization energies, bond lengths and vibrational frequencies, which were poorly represented by either method alone.

Fig 2. Exchange-correlation energies, expressed in kcal/mol [19].

The figure above shows how the exchange energy is vastly underestimated when compared to experimental data. Becke initially mixed the exchange and correlation energies with c_0 and c_1 both equal to 0.5, shown in the column E_XC^HH. Refining the parameters against empirical data, he arrived at c_0 = 0.332 and c_1 = 0.575, which gave the results shown in the last column, labelled E_XC^SE. This paper led the way for accurate hybrid functionals.

Within a year Becke published another paper, "Density-functional thermochemistry. III. The role of exact exchange" [20], in which he refined the idea of the first paper, giving a new equation for a hybrid functional:

E_XC = E_XC^LDA + α_0 (E_X^exact − E_X^LDA) + α_X ΔE_X^B88 + α_C ΔE_C^PW91
(Eq 19.)

Equation 19 combines HF exchange with LDA and two GGA corrections, one from Becke's earlier paper and one from Perdew and Wang's 1991 functional [21,22]. This scheme has become known as the Becke three-parameter (B3) hybrid. In 1994 [23], the B3 scheme was combined with the correlation functional devised by Lee, Yang and Parr [24] and the local correlation parametrisation of Vosko, Wilk and Nusair (VWN) [25] to produce what is commonly known as B3LYP. B3LYP provides consistently very good energies for a wide range of chemical applications, and finding an all-round better E_XC is challenging. Many modifications have been made to B3LYP to improve its accuracy, but it remains a widely used functional.

E_XC^B3LYP = 0.2 E_X^HF + 0.8 E_X^LDA + 0.72 ΔE_X^B88 + 0.81 E_C^LYP + 0.19 E_C^VWN
(Eq 20.)

Equation 20 gives the formula for B3LYP, which remains a benchmark for DFT methods: the exchange energy is a mixture of HF, LDA and Becke's B88 functional, and the correlation energy a mixture of LYP and VWN.
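
The mixing in Equation 20 is simple arithmetic once the component energies are known. In the sketch below the component values are invented placeholders (no real molecule); only the mixing weights come from B3LYP.

```python
# Hypothetical component energies in hartree -- placeholder numbers for
# illustration only; the three B3LYP parameters a0, ax, ac are real.
E_x_HF, E_x_LDA, dE_x_B88 = -12.10, -11.80, -0.35
E_c_LYP, E_c_VWN = -0.52, -0.61

def b3lyp_xc(ex_hf, ex_lda, dex_b88, ec_lyp, ec_vwn,
             a0=0.20, ax=0.72, ac=0.81):
    """Three-parameter B3LYP mixing of exchange and correlation."""
    exchange = a0 * ex_hf + (1.0 - a0) * ex_lda + ax * dex_b88
    correlation = ac * ec_lyp + (1.0 - ac) * ec_vwn
    return exchange + correlation

e_xc = b3lyp_xc(E_x_HF, E_x_LDA, dE_x_B88, E_c_LYP, E_c_VWN)
print(f"E_xc(B3LYP) = {e_xc:.4f} hartree")
```

The point of the exercise is that the "three parameters" (0.20, 0.72, 0.81) are just fixed weights: the expensive work is computing the component energies, not combining them.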

The success of B3LYP remains unmatched by any other currently available method; the graph below shows the occurrence of B3LYP in the literature.

Fig 3. Occurrence of B3LYP in the literature [14].

Double Hybrid

Double-hybrid methods combine exact HF exchange with a Møller-Plesset second-order (MP2) [26] -like correlation treatment on top of a DFT calculation. These methods provide very good energies for both covalent and non-covalent systems (where DFT normally fails). However, they are expensive, computationally equivalent to MP2, which limits their appeal: the energies of systems that DFT already performs well on are not improved much by these methods.

Strengths of DFT

The biggest strength of DFT is that it gives good energies at a low computational cost. The table below shows how DFT scales compared with other computational methods: DFT has the lowest cost of any of them, performs better than HF, and some DFT methods do better than MP2/MP3.

Scaling behaviour | Methods
N^3 | DFT
N^4 | HF
N^5 | MP2
N^6 | MP3, CISD, CCSD, QCISD
N^7 | MP4, CCSD(T), QCISD(T)
N^8 | MP5, CISDT, CCSDT
N^9 | MP6
Table 1.
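
The practical meaning of Table 1 is how steeply cost grows with system size N: a method scaling as N^k costs 2^k times more work when the system doubles. A quick illustration:

```python
# Relative cost of doubling the system size for each scaling exponent
# in Table 1: an N^k method costs 2^k times more work for 2N.
methods = {"DFT": 3, "HF": 4, "MP2": 5, "CCSD": 6, "CCSD(T)": 7}
for name, k in methods.items():
    print(f"{name:8s} N^{k}: doubling the system costs {2 ** k:4d}x more work")
```

So doubling a system costs 8x for DFT but 128x for CCSD(T), which is why DFT dominates for large molecules.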

DFT currently performs very well for bond lengths, bond angles and therefore overall molecular structures. It also performs well for ionisation and atomisation energies. DFT methods overcome one of the main disadvantages of ab initio methods such as Hartree-Fock, which completely neglect electron correlation; post-HF methods have since included it, but, as seen in Table 1, at a much bigger cost. DFT can also perform calculations on some systems which are impractical with ab initio methods, most notably transition metal complexes.

Challenges for DFT

DFT still has many weaknesses: it does not perform well, compared to post-HF methods, for (but not limited to) heats of formation, charge-transfer excitations, and non-bonding interactions such as weak dispersion and dipole interactions.

B3LYP in recent years has become almost synonymous with DFT because so many people use it. One of the challenges for current DFT research is finding a functional that performs better than B3LYP overall without becoming too complicated. If DFT becomes as complicated as full configuration interaction then it starts to lose what makes it so good: its simplicity. However, this computational simplicity must not come at the cost of accuracy, and the functionals must not become entirely empirical.

Another challenge for DFT is the need to improve the description of reaction barriers: in local density and gradient-approximated methods, the energy of the transition state is systematically underestimated by several kcal/mol, which leads to big problems in time-dependent excited-state and reaction-barrier chemistry.

DFT also needs to improve its description of weak dispersion/van der Waals interactions. Currently many DFT methods fail for systems such as hydrogen-bonded complexes due to this lack of long-range dispersion, and for the same reasons DFT also fails for liquids. LDA/GGA do not include long-range van der Waals forces, due to their local nature; to include them would require some sort of attractive term that decays as 1/R^6 as the distance between interacting particles increases [27]. Hybrid functionals are also not correct here, since they all show long-range repulsive behaviour.

One thing all DFT methods have in common, whatever the complexity and accuracy of their description of the electron density, is that they still make many approximations. This is both the brilliance and the weakness of DFT: it makes systems quickly computable, but the missing physics comes at a cost in accuracy. A balance needs to be struck between keeping the speed and aiming for the heaven of Jacob's ladder, accuracy within 0.1 kcal/mol. Another important limitation of approximate DFT is that it is not variational: energy values below the true ground-state energy can be obtained, and the use of more complete basis sets does not necessarily lead to an improvement in accuracy.

Current research

Current research is still mostly concerned with devising new functionals that perform better overall than B3LYP, and this comes in several forms. Many groups attempt to modify B3LYP to include extra terms that improve its treatment of van der Waals forces, long-range dispersion or strong correlation. One recent improvement on B3LYP is the Coulomb-attenuating method B3LYP (CAM-B3LYP) [28]. Yanai et al. (2004) set out to overcome some issues with B3LYP that they considered important challenges: they noticed that B3LYP failed for the polarizability of long chains, excitations in time-dependent DFT (TDDFT) and charge-transfer excitations. In their paper they attempted to overcome the problem of Coulomb forces acting over large distances by splitting the functional into a short-range and a long-range part.

The CAM-B3LYP functional comprises 0.19 Hartree-Fock plus 0.81 B88 exchange at short range, and 0.65 HF plus 0.35 B88 at long range. The intermediate region is smoothly interpolated through the standard error function with parameter 0.33.

They did this by partitioning the 1/r_12 operator of the two-electron term as:

1/r_12 = [1 − (α + β·erf(μ·r_12))]/r_12 + (α + β·erf(μ·r_12))/r_12
(Eq 21.)

Where α is the mixing parameter for the HF exchange and β controls the additional HF/DFT mixing at long range. The first term accounts for the short-range interaction, and the second term accounts for the long-range interaction.
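
Because erf(0) = 0 and erf(∞) = 1, the partition in Eq 21 smoothly interpolates the fraction of exact exchange between α at short range and α + β at long range, while the two pieces always sum exactly to 1/r_12. A small check using the CAM-B3LYP values quoted above (α = 0.19, long-range total 0.65, hence β = 0.46, and μ = 0.33):

```python
import math

# Range separation in CAM-B3LYP (Eq 21): 1/r12 is split with erf into a
# short-range piece (weighted toward DFT exchange) and a long-range
# piece (weighted toward HF exchange).
alpha, beta, mu = 0.19, 0.46, 0.33   # alpha + beta = 0.65 at long range

def hf_fraction(r12):
    """Fraction of exact (HF) exchange at interelectronic distance r12."""
    return alpha + beta * math.erf(mu * r12)

for r in (0.01, 1.0, 5.0, 50.0):
    sr = (1.0 - hf_fraction(r)) / r    # short-range (DFT-weighted) part
    lr = hf_fraction(r) / r            # long-range (HF-weighted) part
    assert abs(sr + lr - 1.0 / r) < 1e-12   # the split is exact
    print(f"r12 = {r:6.2f}  HF fraction = {hf_fraction(r):.3f}")
```

The split itself introduces no approximation; the physics enters only through which functional is asked to handle each piece.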

Fig 4. Weighting of HF versus DFT exchange as a function of interelectronic distance [28].

In standard B3LYP the HF and DFT exchange have fixed weightings at all distances from 0 to ∞; in CAM-B3LYP this is modified so that the HF exchange carries more weight at longer distances, restoring the correct long-range attractive behaviour.

They found that applying these modifications allowed Rydberg states, charge-transfer excitations and long-chain polarizabilities to be predicted to within 0.1 eV accuracy.

Civalleri et al. [29] made the point that the critical issue in DFT studies of molecular crystals is the proper description of non-covalent interactions between molecules (i.e. hydrogen bonding, dispersive interactions, dipolar effects), since these dictate the crystal structure and the thermodynamic properties controlling phase transitions. They proposed the B3LYP-D method, which adds another term to B3LYP:

E^B3LYP-D = E^B3LYP + E_disp
(Eq 22.)

Where E_disp is an empirical dispersion correction of the form:

E_disp = −s_6 Σ_(i<j) (C_6^ij / R_ij^6) f_dmp(R_ij)
(Eq 23.)

Where s_6 is a scaling factor determined by the DFT method used (for B3LYP, s_6 = 1.05), R_ij is the interatomic distance between atoms i and j, and C_6^ij is the dispersion coefficient for the pair of atoms. A damping function f_dmp is used to suppress the effect of this term at very small interatomic distances, given by:

f_dmp(R_ij) = 1 / (1 + exp[−d(R_ij/R_vdw − 1)])
(Eq 24.)

Where R_vdw is the sum of the atomic van der Waals radii and d determines the steepness of the damping function.
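
Putting the dispersion term and its damping function together for a single atom pair gives the sketch below; s_6 = 1.05 is the B3LYP value from the text, while the C_6, R_vdw and d values are placeholder assumptions for illustration only.

```python
import math

# Grimme-style pairwise dispersion correction: an attractive -C6/R^6
# term, damped at short range so it does not corrupt the region the
# functional already describes well. Parameter values below other than
# s6 are illustrative placeholders (atomic units).
s6, d = 1.05, 20.0

def f_damp(r, r_vdw):
    # switches smoothly from ~0 at small r to ~1 at large r
    return 1.0 / (1.0 + math.exp(-d * (r / r_vdw - 1.0)))

def e_disp_pair(r, c6, r_vdw):
    return -s6 * c6 / r ** 6 * f_damp(r, r_vdw)

c6, r_vdw = 1.75, 3.0   # hypothetical pair parameters
for r in (2.0, 3.0, 4.0, 6.0):
    print(f"R = {r:.1f}  E_disp = {e_disp_pair(r, c6, r_vdw):+.6f}")
```

At short range the damping kills the correction (where the -1/R^6 term would diverge), and at long range it restores exactly the attractive tail that local functionals miss.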

Their results, using empirical dispersion data from Grimme [30], showed much better agreement with experiment than standard B3LYP.

Fig 5. B3LYP-D versus standard B3LYP results for molecular crystals, compared with experiment [29].

These papers show the importance of using empirical data to supplement the equations, as we do not yet fully understand how electrons behave with respect to each other and how they correlate in a system. The holy grail of DFT would be to understand this with great accuracy; if that could be done, it would be possible to construct functionals based on how electrons actually behave, rather than forcing functionals to fit empirical data.

Roy [31] presented a DFT implementation based on a Cartesian coordinate grid (CCG), in which the atom-centred localized basis set, electron density, molecular orbitals and two-body potentials are all built directly on the grid. A systematic analysis of the results obtained for various properties, such as total energy, ionization energy, potential energy curves and atomization energies, demonstrated that the method is capable of producing accurate and competitive results. The most promising thing about this method is that it is variational.

An area of research that had been largely overlooked until the last few years is the method of computation itself. Many research calculations are performed on large, powerful supercomputers, and an even bigger number of smaller studies are performed on consumer-grade computers. Despite their raw power, even supercomputers benefit from optimised processing: by designing computer systems better suited to the heavy number-crunching nature of DFT, calculations run faster, and heavier calculations can be handled within time and cost constraints. One piece of research that addressed this was Nitsche et al. (2014) [32], who used GPU-accelerated calculations instead of purely CPU-based ones. GPUs are well suited to applications that require many repeated calculations to converge, which is why this worked so well. In their paper they used moderately sized molecules, with molecular weights between 600 and 1000 g/mol, and calculated different parts of the total energy with different methods. They found that in the self-consistent field iteration, the use of a hybrid partitioning scheme reduced the computational cost by 35%, and the use of the GPU contributed an additional 20-30x speed-up (compared with the best single-core CPU implementation).

Fig 6. Timings for GPU-accelerated versus CPU-only calculations [32].

Overall, the new method reduced the total computation time. They concluded that the steps to consider for further optimization are the QM/MM forces, the Coulomb forces and, to a lesser degree, other terms (basis changes, matrix diagonalization).

Conclusions

DFT has come a long way from the Dirac LDA and now leads the way in computational chemistry. It has been improved to include short- and long-range correlation and has been applied to many different chemical systems with overall good results.

It seems that existing theories can be modified and tailored to one's needs; however, there is still no grand unified functional which encompasses all factors and can accurately model all systems. Density functional theory has changed relatively little since the introduction of B3LYP in 1994: most papers are concerned with modifying it rather than starting from scratch, and by using B3LYP as a starting point a new method inherits all of the weaknesses intrinsic to B3LYP.

DFT equations still contain a great deal of local density and many approximations; until we can devise a theory that is fully non-local and removes these approximations, DFT will not be perfect.

References

0 Avogadro: an open-source molecular builder and visualization tool. Version 1.XX. http://avogadro.openmolecules.net/ Marcus D Hanwell, Donald E Curtis, David C Lonie, Tim Vandermeersch, Eva Zurek and Geoffrey R Hutchison; "Avogadro: An advanced semantic chemical editor, visualization, and analysis platform" Journal of Cheminformatics 2012, 4:17.
1 E. Schrödinger, Phys. Rev., 1926, 28, 1049–1070.
2 J. Slater, Phys. Rev., 1951, 81, 385–390.
3 L. H. Thomas, Math. Proc. Cambridge Philos. Soc., 1927, 23, 542–548.
4 E. Fermi, Collected Papers (Note e Memorie), Chicago, 1962.
5 P. A. M. Dirac, Proc. R. Soc. A Math. Phys. Eng. Sci., 1926, 112, 661–677.
6 C. F. v. Weizsacker, Zeitschrift fur Phys., 1935, 96, 431–458.
7 P. Hohenberg and W. Kohn, Phys. Rev., 1964, 136, B864–B871.
8 W. Kohn and L. J. Sham, Phys. Rev., 1965, 140, A1133–A1138.
9 S. F. Boys, Proc. R. Soc. A Math. Phys. Eng. Sci., 1950, 200, 542–554.
10 H. Taketa, S. Huzinaga and K. O-ohata, J. Phys. Soc. Japan, 1966, 21, 2313–2324.
11 J. A. Pople and W. J. Hehre, J. Comput. Phys., 1978, 27, 161–168.
12 J. P. Perdew and K. Schmidt, AIP Conf. Proc., 2001, 577, 1–20.
13 P. A. M. Dirac, Math. Proc. Cambridge Philos. Soc., 2008, 26, 376.
14 S. F. Sousa, P. A. Fernandes and M. J. Ramos, J. Phys. Chem. A, 2007, 111, 10439–52.
15 J. P. Perdew, M. Levy and J. L. Balduz, Phys. Rev. Lett., 1982, 49, 1691–1694.
16 J. Perdew, Phys. Rev. B, 1986, 33, 8822–8824.
17 A. D. Becke, Phys. Rev. A, 1988, 38, 3098–3100.
18 J. P. Perdew, K. Burke and M. Ernzerhof, Phys. Rev. Lett., 1996, 77, 3865–3868.
19 A. D. Becke, J. Chem. Phys., 1993, 98, 1372–1377.
20 A. Becke, J. Chem. Phys., 1993, 98, 5648–5652.
21 P. Ziesche and H. Eschrig (eds.), Electronic Structure of Solids, Akademie Verlag, 1991.
22 J. P. Perdew, J. A. Chevary, S. H. Vosko, K. A. Jackson, M. R. Pederson, D. J. Singh and C. Fiolhais, Phys. Rev. B, 1992, 46, 6671–6687.
23 P. J. Stephens, F. J. Devlin, C. F. Chabalowski and M. J. Frisch, J. Phys. Chem., 1994, 98, 11623–11627.
24 C. Lee, W. Yang and R. G. Parr, Phys. Rev. B, 1988, 37, 785–789.
25 S. H. Vosko, L. Wilk and M. Nusair, Can. J. Phys., 1980, 58, 1200–1211.
26 C. Møller and M. S. Plesset, Phys. Rev., 1934, 46, 618–622.
27 A. T. Amos and J. A. Yoffe, Chem. Phys. Lett., 1976, 39, 53–56.
28 T. Yanai, D. P. Tew and N. C. Handy, Chem. Phys. Lett., 2004, 393, 51–57.
29 B. Civalleri, C. M. Zicovich-Wilson, L. Valenzano and P. Ugliengo, CrystEngComm, 2008, 10, 405–410.
30 S. Grimme, J. Comput. Chem., 2006, 27, 1787–1799.
31 A. K. Roy, J. Math. Chem., 2011, 49, 1687–1699.
32 M. A. Nitsche, M. Ferreria, E. E. Mocskos and M. C. González Lebrero, J. Chem. Theory Comput., 2014, 10, 959–967.
Figures:

[1] AIP Conf. Proc. (2001) 577, 1

[2] J. Chem. Phys. 98, 1372 (1993)

[3] J. Phys. Chem. A (2007), 111, 10439-10452

[4] Chemical Physics Letters 393 (2004) 51–57

[5] CrystEngComm, (2008) ,10, 405-410

[6] J. Chem. Theory Comput. (2014), 10, 959−967


Words: 22215 - Pages: 89

Free Essay

Analytical Chem

...Chemistry Modern Analytical Chemistry David Harvey DePauw University Boston Burr Ridge, IL Dubuque, IA Madison, WI New York San Francisco St. Louis Bangkok Bogotá Caracas Lisbon London Madrid Mexico City Milan New Delhi Seoul Singapore Sydney Taipei Toronto McGraw-Hill Higher Education A Division of The McGraw-Hill Companies MODERN ANALYTICAL CHEMISTRY Copyright © 2000 by The McGraw-Hill Companies, Inc. All rights reserved. Printed in the United States of America. Except as permitted under the United States Copyright Act of 1976, no part of this publication may be reproduced or distributed in any form or by any means, or stored in a data base or retrieval system, without the prior written permission of the publisher. This book is printed on acid-free paper. 1 2 3 4 5 6 7 8 9 0 KGP/KGP 0 9 8 7 6 5 4 3 2 1 0 ISBN 0–07–237547–7 Vice president and editorial director: Kevin T. Kane Publisher: James M. Smith Sponsoring editor: Kent A. Peterson Editorial assistant: Jennifer L. Bensink Developmental editor: Shirley R. Oberbroeckling Senior marketing manager: Martin J. Lange Senior project manager: Jayne Klein Production supervisor: Laura Fuller Coordinator of freelance design: Michelle D. Whitaker Senior photo research coordinator: Lori Hancock Senior supplement coordinator: Audrey A. Reiter Compositor: Shepherd, Inc. Typeface: 10/12 Minion Printer: Quebecor Printing Book Group/Kingsport Freelance cover/interior designer: Elise Lansdon Cover image: © George Diebold/The...

Words: 88362 - Pages: 354

Free Essay

Hsc Chemistry Notes

...Chemistry Notes 2010 Core Module 1: Production of Materials Contextual Outline Humans have always exploited their natural environment for all their needs including food, clothing and shelter. As the cultural development of humans continued, they looked for a greater variety of materials to cater for their needs. The twentieth century saw an explosion in both the use of traditional materials and in the research for development of a wider range of materials to satisfy technological developments. Added to this was a reduction in availability of the traditional resources to supply the increasing world population. Chemists and chemical engineers continue to play a pivotal role in the search for new sources of traditional materials such as those from the petrochemical industry. As the fossil organic reserves dwindle, new sources of the organic chemicals presently used have to be found. In addition, chemists are continually searching for compounds to be used in the design and production of new materials to replace those that have been deemed no longer satisfactory for needs. This module increases students’ understanding of the implications of chemistry for society and the environment and the current issues, research and developments in chemistry. 1.1 Construct word and balanced formulae equations of all chemical reactions as they are encountered in this module: • Acid reactions: o acid (aq) + base (aq)  salt (aq) + water (l) o acid (aq) + active metal (s)  salt (aq) + hydrogen (g)...

Words: 34562 - Pages: 139

Premium Essay

Damsel

...change. Please consult your faculty or the Registrar’s office if you require clarification regarding the contents of this document. Note: Program map information located in the faculty sections of this document are relevant to students beginning their studies in 2014-2015, students commencing their UOIT studies during a different academic year should consult their faculty to ensure they are following the correct program map. i Message from President Tim McTiernan I am delighted to welcome you to the University of Ontario Institute of Technology (UOIT), one of Canada’s most modern and dynamic university communities. We are a university that lives by three words: challenge, innovate and connect. You have chosen a university known for how it helps students meet the challenges of the future. We have created a leading-edge, technology-enriched learning environment. We have invested in state-of-the-art research and teaching facilities. We have developed industry-ready programs that align with the university’s visionary research portfolio. UOIT is known for its innovative approaches to learning. In many cases, our undergraduate and graduate students are working alongside their professors on research projects and gaining valuable hands-on learning, which we believe is integral in preparing you to lead and succeed. I encourage you to take advantage of these opportunities to become the best you can be. We also invite our students to connect to the campus and the neighbouring communities...

Words: 195394 - Pages: 782

Free Essay

Bio 2f03

...Biology 2F03: Lecture 1 Chapter 2: Life on Land • • • • • • • • Labs start on the Sept 17 Why horses and cattle help restore Guanacaste forest of Costa Rica? o This forest was in decline for thousands of years, when Indians colonized central America, it caused its decline. o Its regenerated when the Europeans came with the cattle o The trees only produce a new plant after processes: the fallen fruit has to be eaten by a larger animal (mule, or horse or cow) à it has to pass through the body and ends up in a pile of fertilizer only then it can regenerate and produce a tree o Why did it evolve to be depended to this process? § There must be animals there in the past, in the past it was a camel (llama, alpaca). When the Indians came from asia (50000 years ago) these animals went extinct and the tree lost its major dispersal system What is the most obvious foundation of life on land? o Is landà soil Climate defines biomes, the ‘shapes’ of vegetation o Defines the major types of land on earth o Temperature and precipitation to be specific Soils in turn greatly affect the aspects (roots, water, nutrient) à rentention, root attachment, etc. Soil typically form layers (horizontal) retaining a range of physical and chemical layers: o Classification of soil: O= organic, A, B, C Soil horizons: description o O: organic, litter on top, fine litter deeper (gets broken down, hence fine), pollen, dead organisms o A: mineral soil, some organic matter...

Words: 18026 - Pages: 73

Premium Essay

Development

...the ability of the earth and its finite resources to feed an exponentially growing population. The purpose of this study is to review the literature on population and environment and to identify the main strands of thought and the assumptions that lie behind them. The author begins with a review of the historical perspective. He then reviews and assesses the evidence on the relationship between population and environment, focusing on selected natural and environmental resources: land use, water use, local pollution, deforestation and climate change. The author also reviews selected recent macro and micro perspectives. The new macro perspective introduces the environment-income relationship and examines the role of population growth and density in mediating this relationship. The new micro perspective introduces the close relationship between poverty and environmental degradation, also examining the roles of gender in decision-making and the role of children as economic assets in fertility decisions. Finally, the author carries out a comparative assessment of the approaches and methods employed in the literature to explain the wide variation in findings and predictions. This literature review demonstrates...

Words: 19985 - Pages: 80

Premium Essay

Boogie Nights

...the ability of the earth and its finite resources to feed an exponentially growing population. The purpose of this study is to review the literature on population and environment and to identify the main strands of thought and the assumptions that lie behind them. The author begins with a review of the historical perspective. He then reviews and assesses the evidence on the relationship between population and environment, focusing on selected natural and environmental resources: land use, water use, local pollution, deforestation and climate change. The author also reviews selected recent macro and micro perspectives. The new macro perspective introduces the environment-income relationship and examines the role of population growth and density in mediating this relationship. The new micro perspective introduces the close relationship between poverty and environmental degradation, also examining the roles of gender in decision-making and the role of children as economic assets in fertility decisions. Finally, the author carries out a comparative assessment of the approaches and methods employed in the literature to explain the wide variation in findings and predictions. This literature review...

Words: 19985 - Pages: 80

Free Essay

Nit-Silchar B.Tech Syllabus

...Technology Programmes amï´>r¶ JH$s g§ñWmZ, m¡Úmo{ à VO o pñ Vw dZ m dY r V ‘ ñ Syllabi and Regulations for Undergraduate PROGRAMME OF STUDY (wef 2012 entry batch) Ma {gb Course Structure for B.Tech (4years, 8 Semester Course) Civil Engineering ( to be applicable from 2012 entry batch onwards) Course No CH-1101 /PH-1101 EE-1101 MA-1101 CE-1101 HS-1101 CH-1111 /PH-1111 ME-1111 Course Name Semester-1 Chemistry/Physics Basic Electrical Engineering Mathematics-I Engineering Graphics Communication Skills Chemistry/Physics Laboratory Workshop Physical Training-I NCC/NSO/NSS L 3 3 3 1 3 0 0 0 0 13 T 1 0 1 0 0 0 0 0 0 2 1 1 1 1 0 0 0 0 4 1 1 0 0 0 0 0 0 2 0 0 0 0 P 0 0 0 3 0 2 3 2 2 8 0 0 0 0 0 2 2 2 2 0 0 0 0 0 2 2 2 6 0 0 8 2 C 8 6 8 5 6 2 3 0 0 38 8 8 8 8 6 2 0 0 40 8 8 6 6 6 2 2 2 40 6 6 8 2 Course No EC-1101 CS-1101 MA-1102 ME-1101 PH-1101/ CH-1101 CS-1111 EE-1111 PH-1111/ CH-1111 Course Name Semester-2 Basic Electronics Introduction to Computing Mathematics-II Engineering Mechanics Physics/Chemistry Computing Laboratory Electrical Science Laboratory Physics/Chemistry Laboratory Physical Training –II NCC/NSO/NSS Semester-4 Structural Analysis-I Hydraulics Environmental Engg-I Structural Design-I Managerial Economics Engg. Geology Laboratory Hydraulics Laboratory Physical Training-IV NCC/NSO/NSS Semester-6 Structural Design-II Structural Analysis-III Foundation Engineering Transportation Engineering-II Hydrology &Flood Control Concrete Lab Structural Engineering Lab L...

Words: 126345 - Pages: 506

Free Essay

Jon Von Nomann

...László Rátz | Doctoral students | Donald B. Gillies Israel Halperin | Other notable students | Paul Halmos Clifford Hugh Dowker Benoit Mandelbrot[1] | Known for |  [show] | Notable awards | Bôcher Memorial Prize (1938) Enrico Fermi Award (1956) | Signature | John von Neumann (/vɒn ˈnɔɪmən/; December 28, 1903 – February 8, 1957) was a Hungarian and American pure and applied mathematician, physicist, inventor and polymath. He made major contributions to a number of fields,[2] including mathematics (foundations of mathematics, functional analysis, ergodic theory, geometry, topology, and numerical analysis), physics (quantum mechanics, hydrodynamics, and fluid dynamics), economics (game theory), computing (Von Neumann architecture, linear programming, self-replicating machines, stochastic computing), and statistics.[3] He was a pioneer of the application of operator theory to quantum mechanics, in the development of functional analysis, a principal member of the Manhattan Project and the Institute for Advanced Study in Princeton (as one of the few...

Words: 9454 - Pages: 38

Free Essay

Photoelectrochemistry

...1 1 Fundamentals of Semiconductor Electrochemistry and Photoelectrochemistry Krishnan Rajeshwar The University of Texas at Arlington, Arlington, Texas 1.1 1.2 1.3 1.3.1 1.3.2 1.3.3 1.3.4 1.4 1.4.1 1.4.2 1.4.3 1.5 1.5.1 1.5.2 1.5.3 1.5.4 1.5.5 1.6 1.7 1.7.1 1.7.2 1.7.3 1.7.4 1.7.5 Introduction and Scope . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . Electron Energy Levels in Semiconductors and Energy Band Model . The Semiconductor–Electrolyte Interface at Equilibrium . . . . . . . . The Equilibration Process . . . . . . . . . . . . . . . . . . . . . . . . . . . . . The Depletion Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . Mapping of the Semiconductor Band-edge Positions Relative to Solution Redox Levels . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . Surface States and Other Complications . . . . . . . . . . . . . . . . . . . Charge Transfer Processes in the Dark . . . . . . . . . . . . . . . . . . . . Current-potential Behavior . . . . . . . . . . . . . . . . . . . . . . . . . . . . Dark Processes Mediated by Surface States or by Space Charge Layer Recombination . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . Rate-limiting Steps in Charge Transfer Processes in the Dark . . . . . Light Absorption by the Semiconductor Electrode and Carrier Collection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . Light Absorption...

Words: 180197 - Pages: 721

Free Essay

Women Affair.Pdf

...WOMEN DEVELOPMENT AND NATIONAL POLICY ON WOMEN IN NIGERIA Olubunmi Aderemi Sokefun Abstract This paper discusses the document on women in Nigeria (National Policy on Women). Several past administrations in this country have treated women issues and affairs with calculated levity: Carefully side - tracking or blatantly refusing to accord it the necessary attention. It is now a thing to gladden the hearts of all women of Nigeria that, "after four attempts by four former heads of Nigeria's Government," Chief Obasanjo's administration finally granted government recognition to women's issues in this country. The official document .on Human Rights' issues as it relates to Nigerian women; this document is known as the NATIONAL POLICY ON WOMEN. This paper therefore focuses on the document which promises to bring delight to the heart of every woman in this country. Introduction When late Mrs. Olufunmilayo Ransome Kuti joined the vanguard team as the only nationalist and activist during the early struggle for Nigerian independence, hardly did .anybody realize then that she had a dream, a clear vision of a future Nigerian woman, that vision was crystal clear in her heart, and like a pivot, it stood firmly on three stand posts-known today as women's rights, women emancipation and women empowerment.. . Mrs. Olufunmilayo Ransome-Kuti later joined by some educated women of like minds, fought daringly and relentlessly for these three .pivotal goals of women emergency and relevance in the socio-political...

Words: 71889 - Pages: 288