1 Correlation functions in QFT

In physics it makes sense to ask questions only about observables. In the realm of elementary particles and their interactions, we have primarily two kinds of observables, viz.

• cross sections and decay widths of elementary particles undergoing scattering processes,
• masses of bound states of elementary particles.

Computation of bound state masses in quantum field theory necessarily requires non-perturbative techniques, and to date the only systematic way known for this is lattice gauge theory. We will not discuss this in this course. Computation of cross sections in scattering processes, on the other hand, is possible in perturbation theory, and we will mostly be concerned with this in this course.

It turns out that scattering cross sections are related to correlations between different space-time points of the quantum field. The basic correlation function, the 2-point Green's function, can be expressed in terms of functional integrals (which are a generalization of path integrals) as

    G_2(x_f, t_f; x_i, t_i) = \frac{\int \mathcal{D}\phi\, \mathcal{D}\phi^\dagger\; \phi^\dagger(x_f, t_f)\, \phi(x_i, t_i)\; e^{i S_{cl}[\phi,\phi^\dagger]/\hbar}}{\int \mathcal{D}\phi\, \mathcal{D}\phi^\dagger\; e^{i S_{cl}[\phi,\phi^\dagger]/\hbar}}    (1)

where one integrates over all possible classical fields φ, with weights given by the classical action S_{cl} = \int_M L(\phi, \phi^\dagger, \partial\phi, \partial\phi^\dagger)\, dx\, dt. A very important point is that it is possible to make sense of the definition of G_2 even if L contains non-linear interaction terms in φ and φ†, without knowledge of the eigenfunctions of the system. In the presence of interaction terms, G_2 is a highly singular mathematical object, and a lot of quantum field theory is about how to make sense of such singular objects. The Green's functions G_2, G_4, G_6, … of a quantum field theory are closely related to the moments of the quantum field, which contain information on its probability distribution.
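The functional integral in eq. (1) is easiest to appreciate in a finite-dimensional analogue. The sketch below is an illustration, not part of the notes: it replaces the field by a real variable on a 1-D periodic lattice with a free (Gaussian) Euclidean weight e^{-S}, S = ½ φᵀMφ, where the 2-point function is exactly (M⁻¹)_{ij}, and checks this against a Monte Carlo average over field configurations. The lattice size N and mass m are arbitrary illustrative choices.

```python
import numpy as np

# Free real scalar field on a 1-D periodic lattice: Euclidean weight
# exp(-S) with S = (1/2) phi^T M phi, where M = -laplacian + m^2.
# For this Gaussian weight, <phi_i phi_j> = (M^{-1})_{ij} exactly.
N, m = 32, 0.5
M = np.zeros((N, N))
for i in range(N):
    M[i, i] = 2.0 + m**2            # lattice -laplacian + m^2, diagonal
    M[i, (i + 1) % N] = -1.0        # nearest-neighbour couplings
    M[i, (i - 1) % N] = -1.0

exact = np.linalg.inv(M)            # exact 2-point function G2 = M^{-1}

# Monte Carlo estimate: sample field configurations and average phi_i phi_j.
rng = np.random.default_rng(0)
fields = rng.multivariate_normal(np.zeros(N), exact, size=20000)
mc = fields.T @ fields / len(fields)

print(np.max(np.abs(mc - exact)))   # small: agreement up to sampling error
```

For a Gaussian weight this identity is exact; as the notes emphasize, interaction terms spoil it and force perturbative or lattice Monte Carlo treatments.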
2 Gaussian probability distribution

The Gaussian probability distribution is used throughout statistics, the natural sciences, and the social sciences as a simple model for complex phenomena. For example, the observational error in an experiment is usually assumed to follow a Gaussian distribution. The graph of the associated probability density function is "bell"-shaped, with peak at the mean, and is known as the Gaussian function or bell curve

    f(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(x-\mu)^2/(2\sigma^2)},    (2)

where the parameters µ and σ² are the mean and the variance. The distribution with µ = 0 and σ² = 1 is called the standard normal.

Properties:

• The function f(x) is symmetric around the point x = µ, which is at the same time the mode, the median, and the mean of the distribution.
• The inflection points of the curve occur one standard deviation away from the mean (i.e., at x = µ − σ and x = µ + σ).

The probability density function (pdf) of a random variable describes the relative frequencies of different values of that random variable. The pdf of the Gaussian distribution is given by

    f(x; \mu, \sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(x-\mu)^2/(2\sigma^2)}.    (3)

This is a proper function only when the variance σ² is not equal to zero. In that case it is a continuous smooth function, defined on the entire real line, called the "Gaussian function". When σ² = 0,

    f(x; \mu, 0) = \delta(x - \mu).    (4)

This is the Dirac delta function; it is infinite at x = µ and zero elsewhere.

3 Moments of the Gaussian Distribution

For k = 0, 1, 2, …, the quantity

    M_k = \frac{\int_{\mathbb{R}} x^k\, e^{-x^2/2\sigma^2}\, dx}{\int_{\mathbb{R}} e^{-x^2/2\sigma^2}\, dx}    (5)

is called the kth moment of the Gaussian distribution in probability theory.

Theorem: A probability distribution is uniquely determined by its infinite series M_0, M_1, M_2, … of moments.

Moments of the distribution can be computed easily by introducing the function

    Z(J) = C \int_{\mathbb{R}} e^{-x^2/2\sigma^2}\, e^{Jx}\, dx    (6)

of a real variable J, with C chosen such that Z(0) = 1. Then for k = 0, 1, 2, …,

    M_k = \left. \frac{d^k Z(J)}{dJ^k} \right|_{J=0}.    (7)

The function Z(J) is called the generating function of the moments. In quantum field theories, correlation functions can be obtained from such moment generating functions; only ordinary integrals have to be replaced by functional integrals. Let us define a moment generating function as

    Z(J, J^\dagger) = C \int \mathcal{D}\phi\, \mathcal{D}\phi^\dagger\; e^{i S_{cl}[\phi,\phi^\dagger]/\hbar}\, e^{\int (\phi J^\dagger + \phi^\dagger J)\, dx\, dt}    (8)

where again the normalization constant C is chosen such that Z(0, 0) = 1. The 2-point Green's function between φ and φ† for the action S_{cl}[φ, φ†] is given by

    G_2(x_f, t_f; x_i, t_i) = \left(\frac{\hbar}{i}\right)^2 \left. \frac{\delta^2 Z(J, J^\dagger)}{\delta J(x_f, t_f)\, \delta J^\dagger(x_i, t_i)} \right|_{J=0,\, J^\dagger=0}    (9)

where the operator δ/δJ(x, t) is known as a functional derivative. Higher order correlation functions are obtained by applying higher order functional derivatives to the generating function Z(J, J†). The functions J and J† are known as sources, and Z(0, 0) is just the partition function. Functional derivatives and functional integrals can be thought of as natural generalizations of classical partial derivatives and multi-dimensional integrals to infinite dimensions.

4 Cumulants

In probability theory and statistics, the cumulants of a probability distribution are a set of quantities that provide an alternative to the moments of the distribution. The moments determine the cumulants in the sense that any two probability distributions whose moments are identical will have identical cumulants as well, and similarly the cumulants determine the moments. In some cases theoretical treatments of problems in terms of cumulants are simpler than those using moments. The cumulants are defined via the cumulant generating function, which is the natural logarithm of the moment generating function,

    W(J) = \ln Z(J).    (10)

For the Gaussian distribution with expectation value µ and variance σ², the cumulant generating function is W(J) = µJ + σ²J²/2.
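As a quick numerical illustration (with arbitrarily chosen µ and σ, not values from the notes), the moment formula (5) and the cumulant generating function (10) can be checked by simple quadrature: build Z(J) from the Gaussian weight, take W(J) = ln Z(J), and read off the first two cumulants by finite differences at J = 0.

```python
import numpy as np

# Quadrature check of moments and cumulants for a Gaussian with mean mu
# and variance sigma^2 (illustrative values).
mu, sigma = 0.7, 1.1
x = np.linspace(mu - 12 * sigma, mu + 12 * sigma, 200001)
w = np.exp(-(x - mu) ** 2 / (2 * sigma**2))     # unnormalized Gaussian weight

def Z(J):
    # normalized moment generating function, cf. eq. (6): Z(0) = 1
    return np.sum(w * np.exp(J * x)) / np.sum(w)

def moment(k):
    # k-th moment as a ratio of ordinary integrals, cf. eq. (5)
    return np.sum(x**k * w) / np.sum(w)

# Cumulants from W(J) = ln Z(J) by central finite differences at J = 0.
h = 1e-3
C1 = (np.log(Z(h)) - np.log(Z(-h))) / (2 * h)                    # C1 = mu
C2 = (np.log(Z(h)) - 2 * np.log(Z(0)) + np.log(Z(-h))) / h**2    # C2 = sigma^2
print(moment(1), moment(2), C1, C2)
```

The first moment about zero equals µ and the second equals µ² + σ², while the cumulants come out as C1 = µ and C2 = σ², matching the quadratic W(J) above.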
The first and second derivatives of the cumulant generating function are W′(J) = µ + σ²J and W″(J) = σ². The cumulants are C_1 = µ, C_2 = σ², and C_3 = C_4 = … = 0.

Exercise: Show that the even moments of the Gaussian distribution are non-zero but can be expressed in terms of the cumulants.

5 Perturbation on the Gaussian

Consider the integral

    Z(J, \lambda) = \int_{-\infty}^{\infty} dx\; e^{-\frac{1}{2} a x^2}\, e^{i\lambda x^4}\, e^{iJx}    (11)

where λ is a small positive constant called the coupling constant. Doing a power series expansion,

    Z(J, \lambda) = \int_{-\infty}^{\infty} dx\; e^{-\frac{1}{2} a x^2} \left( 1 + i\lambda x^4 - \frac{1}{2}\lambda^2 x^8 + \cdots \right) e^{iJx}.    (12)

This can be rewritten as

    Z(J, \lambda) = \int_{-\infty}^{\infty} dx\; e^{-\frac{1}{2} a x^2} \left\{ 1 + i\lambda \left( \frac{1}{i}\frac{d}{dJ} \right)^4 - \frac{1}{2}\lambda^2 \left( \frac{1}{i}\frac{d}{dJ} \right)^8 + \cdots \right\} e^{iJx}    (13)

                 = \left\{ 1 + i\lambda \left( \frac{1}{i}\frac{d}{dJ} \right)^4 - \frac{1}{2}\lambda^2 \left( \frac{1}{i}\frac{d}{dJ} \right)^8 + \cdots \right\} \int_{-\infty}^{\infty} dx\; e^{-\frac{1}{2} a x^2}\, e^{iJx}.    (14)

Formally, this can be written as

    Z(J, \lambda) = \exp\left\{ i\lambda \left( \frac{1}{i}\frac{d}{dJ} \right)^4 \right\} Z(J, 0).    (15)

More generally, if V = V(x) is a polynomial with real coefficients, then

    Z(J, \lambda) = \int_{-\infty}^{\infty} dx\; e^{-\frac{1}{2} a x^2 + i\lambda V(x)}\, e^{iJx} = \exp\left\{ i\lambda V\!\left( \frac{1}{i}\frac{d}{dJ} \right) \right\} Z(J, 0).    (16)

Exercise: Calculate Z to second order in perturbation theory for a perturbation λx³.
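The truncation in eq. (12) can be tested numerically at J = 0; the values of a and λ below are illustrative, not taken from the notes. The full oscillatory integral is compared with its first-order truncation √(2π/a) (1 + iλ⟨x⁴⟩), using the Gaussian moment ⟨x⁴⟩ = 3/a²; the discrepancy should be of order λ², dominated by the ½λ²x⁸ term.

```python
import numpy as np

# Compare Z(0, lam) = int exp(-a x^2/2) exp(i lam x^4) dx with its
# first-order truncation sqrt(2*pi/a) * (1 + i*lam*3/a^2), cf. eq. (12).
a, lam = 1.0, 0.005
x = np.linspace(-10.0, 10.0, 400001)
dx = x[1] - x[0]

full = np.sum(np.exp(-0.5 * a * x**2) * np.exp(1j * lam * x**4)) * dx
first_order = np.sqrt(2 * np.pi / a) * (1 + 1j * lam * 3 / a**2)

rel_err = abs(full - first_order) / abs(full)
print(rel_err)    # of order (1/2) lam^2 <x^8>, i.e. O(lam^2)
```

Halving λ should shrink the discrepancy by roughly a factor of four, which is a direct check that the neglected terms start at second order in the coupling.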