Dirac Notation
We will introduce the notion of a (finite-dimensional) linear vector space, V, informally. For a rigorous discussion see, for example, P. R. Halmos, "Finite-Dimensional Vector Spaces".
A linear vector space, V, is a set of vectors, with an abstract vector denoted by |v⟩ (and read "ket vee"). This notation, introduced by Paul Adrien Maurice Dirac (1902-1984), is elegant and extremely useful, and it is imperative that you master it.¹ The space is endowed with the operation of addition (+) for each pair of vectors and multiplication by "scalars" (which belong to the field of complex numbers, C, in our case):
If |u⟩ and |v⟩ are vectors, so are |u⟩ + |v⟩ and c|v⟩, where c is a complex number (c ∈ C).
Vector addition is commutative: |u⟩ + |v⟩ = |v⟩ + |u⟩.
Vector addition is associative: |u⟩ + ( |v⟩ + |w⟩ ) = ( |u⟩ + |v⟩ ) + |w⟩.
Scalar multiplication is distributive in
(a) the scalars: ( c₁ + c₂ ) |v⟩ = c₁ |v⟩ + c₂ |v⟩ , c₁, c₂ ∈ C;
(b) the vectors: c ( |u⟩ + |v⟩ ) = c |u⟩ + c |v⟩ , c ∈ C.
Scalar multiplication is associative: c₁ ( c₂ |v⟩ ) = ( c₁ c₂ ) |v⟩.
There exists a null vector, denoted by |0⟩, such that |v⟩ + |0⟩ = |v⟩.
One can show uniqueness of the null vector and that |0⟩ = 0|v⟩ for any vector |v⟩.
For every ket |v⟩ there exists a vector, denoted by |−v⟩, such that |v⟩ + |−v⟩ = |0⟩.
One can show that the inverse is unique; we also have |v⟩ + (−1)|v⟩ = 0|v⟩ = |0⟩. So we have denoted (−1)|v⟩ by |−v⟩.
Roughly speaking, we can summarize the axioms by saying that all the normal operations with
which you are familiar while studying ordinary vectors and scalars are legal.
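As a quick concrete check (a sketch, not part of the formal development), all of these axioms are satisfied by complex NumPy arrays under componentwise addition and scalar multiplication; the particular vectors and scalars below are illustrative.

    import numpy as np

    u = np.array([1 + 2j, 0, 3j])   # a ket |u> realized as an array of complex numbers
    v = np.array([2, 1 - 1j, 4])    # a ket |v>
    c1, c2 = 2 - 1j, 0.5j           # scalars from C

    assert np.allclose(u + v, v + u)                  # commutativity
    assert np.allclose((c1 + c2) * v, c1*v + c2*v)    # distributivity in the scalars
    assert np.allclose(c1 * (u + v), c1*u + c1*v)     # distributivity in the vectors
    assert np.allclose(c1 * (c2 * v), (c1 * c2) * v)  # associativity of scalar multiplication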
One can endow the linear vector space with an inner product (the generalization of the dot product) to make it an inner product space. The inner product is a complex number denoted by ⟨u|v⟩. This is represented by the bracket symbol; hence the terms bra for ⟨u| and ket for |v⟩. The inner product has the following properties:
(P1) ⟨u|v⟩ = ⟨v|u⟩*. Since we have defined the vector space over the complex numbers, this differs from the more familiar case of a vector space over the real numbers, in which the dot product is real and the order is immaterial.
¹ The seminal book by Dirac, The Principles of Quantum Mechanics, published in 1930, introduced the formalism of quantum mechanics in general and this notation in particular. This is not an easy book to read!
(P2) Let |u′⟩ = c₁|v′⟩ + c₂|w′⟩; we have ⟨u|u′⟩ = c₁⟨u|v′⟩ + c₂⟨u|w′⟩, i.e., the inner product is linear in the kets.
(P3) Clearly ⟨v|v⟩ is real and is defined to be non-negative: ⟨v|v⟩ ≥ 0, with equality if and only if |v⟩ = |0⟩.
From properties P1 and P2 we also have ⟨u′|u⟩ = c₁*⟨v′|u⟩ + c₂*⟨w′|u⟩. Show this!
Note therefore that if |u′⟩ = c₁|v′⟩ + c₂|w′⟩, then ⟨u′| = c₁*⟨v′| + c₂*⟨w′|. Observe the complex conjugation.
Two vectors, |u⟩ and |v⟩, are said to be orthogonal if ⟨u|v⟩ = 0, just as with ordinary vectors.
The norm or length of a vector |v⟩ is defined to be the non-negative real number √⟨v|v⟩; it is sometimes denoted by ‖v‖.
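In the same concrete sketch, NumPy's vdot conjugates its first argument, so np.vdot(u, v) computes exactly ⟨u|v⟩; property (P1) and the norm can then be checked numerically (the vectors are just examples):

    import numpy as np

    u = np.array([1 + 1j, 2, -1j])
    v = np.array([0.5, 1j, 3])

    uv = np.vdot(u, v)                     # <u|v> = sum_j conj(u_j) v_j
    vu = np.vdot(v, u)                     # <v|u>
    assert np.isclose(uv, np.conj(vu))     # property (P1): <u|v> = <v|u>*

    norm_v = np.sqrt(np.vdot(v, v).real)   # ||v|| = sqrt(<v|v>), a non-negative real number
    assert norm_v >= 0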
Our definition of the inner product has tacitly used the definition of a bra, ⟨u|. The bra vectors are in one-to-one correspondence with the ket vectors: with every |v⟩ we associate a unique ⟨v|. So the bras share the same linear-vector-space structure as the kets. The one key property, which makes the definition of the scalar product well-defined as observed above, is that the bra associated with the ket
|w⟩ = c₁|u⟩ + c₂|v⟩ is ⟨w| = c₁*⟨u| + c₂*⟨v| .
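Continuing the sketch, if the bra is realized as the complex-conjugated ket (anticipating the column/row realization below), the rule ⟨w| = c₁*⟨u| + c₂*⟨v| follows mechanically:

    import numpy as np

    u = np.array([1 + 1j, 2j])
    v = np.array([3, 1 - 2j])
    c1, c2 = 1 + 1j, -2j

    w = c1*u + c2*v            # |w> = c1|u> + c2|v>
    bra_w = w.conj()           # <w| as the conjugated vector
    assert np.allclose(bra_w, np.conj(c1)*u.conj() + np.conj(c2)*v.conj())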
A set of l vectors |v₁⟩, |v₂⟩, ..., |v_l⟩ is said to be linearly independent if

Σ_{j=1}^{l} c_j |v_j⟩ = 0  implies  c_j = 0, for every j .    (1)
Exercise: In ordinary three-dimensional space, write down three vectors that are mutually orthogonal. Are they linearly independent? Give a set of three linearly independent vectors that are not orthogonal.
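One way to test linear independence numerically (an illustration, not a replacement for doing the exercise) is to stack the vectors as the columns of a matrix and compute its rank; the set is independent exactly when the rank equals the number of vectors:

    import numpy as np

    vs = [np.array([1, 0, 0]), np.array([1, 1, 0]), np.array([1, 1, 1])]
    A = np.column_stack(vs)
    print(np.linalg.matrix_rank(A) == len(vs))   # True: independent, though not orthogonal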
A set of vectors is said to constitute a basis if it is linearly independent and spans the space, i.e., every vector |v⟩ in the space can be expressed as a linear combination of the elements of the basis set (with complex coefficients). A linear space is said to be n-dimensional if and only if it has a basis of n vectors. Clearly, then, in an n-dimensional vector space one can express any vector |v⟩ as a linear combination of a set of n linearly independent vectors; if this were not possible we would have n + 1 linearly independent vectors, contradicting our assumption. The basis vectors can be made mutually orthogonal and normalized to unity. We will use the notation |e_j⟩, for j = 1, 2, ..., n, for one such set of n orthonormal vectors:

⟨e_i|e_j⟩ = δ_ij    (2)

where δ_ij is the Kronecker delta, which is one if the two indices are the same and zero otherwise.
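For a concrete check of equation (2), one can manufacture a nontrivial orthonormal basis from, say, the QR decomposition of a random complex matrix (a sketch; the seed is arbitrary):

    import numpy as np

    n = 4
    rng = np.random.default_rng(0)
    M = rng.normal(size=(n, n)) + 1j*rng.normal(size=(n, n))
    Q, _ = np.linalg.qr(M)            # the columns of Q form an orthonormal basis
    G = Q.conj().T @ Q                # Gram matrix: entries <e_i|e_j>
    assert np.allclose(G, np.eye(n))  # <e_i|e_j> = delta_ij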
Every vector (ket) |v⟩ can be expanded in terms of the orthonormal basis as

|v⟩ = Σ_{j=1}^{n} v_j |e_j⟩ .    (3)
Take the inner product of the above equation with ⟨e_i|; upon using the orthonormality of the basis vectors, one obtains the useful result v_i = ⟨e_i|v⟩.² Similarly, the corresponding bra vector can be expressed in terms of the dual basis:

⟨v| = Σ_{j=1}^{n} ⟨e_j| v_j* = Σ_{j=1}^{n} v_j* ⟨e_j| .

Note that sometimes the complex number v_j* is placed before the bra, as in the last expression. It represents the same linear combination of the basis bra vectors multiplied by the complex numbers v_j*.
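The component formula v_i = ⟨e_i|v⟩ and the expansion (3) can be verified in the same sketch:

    import numpy as np

    n = 4
    rng = np.random.default_rng(1)
    Q, _ = np.linalg.qr(rng.normal(size=(n, n)) + 1j*rng.normal(size=(n, n)))
    e = [Q[:, j] for j in range(n)]        # orthonormal basis kets |e_j>
    v = rng.normal(size=n) + 1j*rng.normal(size=n)

    coeffs = [np.vdot(ej, v) for ej in e]  # v_j = <e_j|v>
    v_rebuilt = sum(c*ej for c, ej in zip(coeffs, e))
    assert np.allclose(v_rebuilt, v)       # equation (3)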
One concrete realization of this formalism is obtained by thinking of the abstract vectors in a specific basis and associating with each ket a column vector whose elements are complex numbers. Clearly we can add such column vectors (assuming all of them have the same number of components) and multiply them by complex numbers in such a way that they obey the axioms. Now one can think of the bra associated with each ket as simply the complex-conjugated transpose (sometimes referred to as the adjoint) of the column vector: it is a row vector, each element being the complex conjugate of the corresponding element of the column. Explicitly, with |v⟩ in an n-dimensional space we associate a column vector (in a particular basis)

|v⟩ → (v₁, v₂, v₃, ..., vₙ)ᵀ    (4)
where the v_j are complex numbers. Note that if one chooses a different basis, the same abstract vector |v⟩ is represented by a column vector with different entries. The corresponding bra is given by

⟨v| → (v₁*, v₂*, v₃*, ..., vₙ*) .    (5)
This concrete identification is helpful in keeping track of the concepts of bras and kets. In more mathematical parlance, the space of bras and the space of kets are dual to each other. The dot product is given by

⟨u|v⟩ = (u₁*, u₂*, u₃*, ..., uₙ*)(v₁, v₂, v₃, ..., vₙ)ᵀ = Σ_{j=1}^{n} u_j* v_j .    (6)
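Equation (6) can be checked concretely: the componentwise sum, the row-times-column matrix product, and np.vdot all give the same number (the vectors below are illustrative):

    import numpy as np

    u = np.array([1 + 1j, 2, -3j])
    v = np.array([0, 1j, 2 - 1j])

    bra_u = u.conj()                                      # the row vector (u1*, u2*, ..., un*)
    assert np.isclose(bra_u @ v, np.vdot(u, v))           # matrix-product form of (6)
    assert np.isclose(np.sum(u.conj()*v), np.vdot(u, v))  # componentwise sum in (6)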
It is obvious, for example, that ⟨u|v⟩ = ⟨v|u⟩*. The various properties of the inner product are also clear in this representation. Note that |v⟩⟨u| is a very different beast from ⟨u|v⟩: it is clear from the representation we have employed that it is an n × n matrix, i.e., it is an operator! Please write out the operator explicitly. Once more, the advantage of the formal notation is that many general results can be proved compactly and without explicitly writing vectors and operators in a basis.

² Note that apart from the unfamiliar notation this is nothing more than what you know from vector algebra: given a⃗ = aₓ î + a_y ĵ + a_z k̂, then a_y = ĵ · a⃗, generalized to n dimensions and complex vectors.
The basis vectors (of the chosen basis) are then given by

|e₁⟩ → (1, 0, 0, ..., 0)ᵀ ,    |e₂⟩ → (0, 1, 0, ..., 0)ᵀ ,    etc.    (7)
The jth element of |e_j⟩ is 1 while all the other elements are 0. It is easy to check that these n unit vectors constitute an orthonormal basis. The representation of any complex vector in terms of the basis vectors is obvious.
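In code, the standard basis kets of equation (7) are simply the columns of the identity matrix:

    import numpy as np

    n = 5
    e = [np.eye(n)[:, j] for j in range(n)]    # |e_j>: 1 in the jth slot, 0 elsewhere
    assert np.isclose(np.vdot(e[1], e[1]), 1)  # normalized
    assert np.isclose(np.vdot(e[0], e[2]), 0)  # orthogonal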
Linear Operators in Dirac notation
We define an operator  as a map that associates with each vector |u⟩ belonging to the linear vector space V a vector |w⟩; this is represented by Â|u⟩ = |w⟩. An operator is said to be linear if it obeys

Â[ c₁|u⟩ + c₂|v⟩ ] = c₁Â|u⟩ + c₂Â|v⟩

for any pair of vectors |u⟩ and |v⟩ and any pair of complex numbers c₁ and c₂.
The linear operators themselves form a linear space, in that the sum of two operators  and B̂ is defined by

( Â + B̂ )|u⟩ = Â|u⟩ + B̂|u⟩

and multiplication by a complex scalar is defined by the action of c on any ket as follows:

(cÂ)|v⟩ = c( Â|v⟩ ) ,    (8)

so that we have

Ĉ = c₁Â + c₂B̂  ⇒  Ĉ|v⟩ = c₁Â|v⟩ + c₂B̂|v⟩  for all |v⟩ .    (9)
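In the matrix realization, any n × n complex matrix acts as a linear operator, and the definitions above can be checked entrywise (a sketch with arbitrary random matrices):

    import numpy as np

    rng = np.random.default_rng(2)
    A = rng.normal(size=(2, 2)) + 1j*rng.normal(size=(2, 2))
    B = rng.normal(size=(2, 2)) + 1j*rng.normal(size=(2, 2))
    u, v = np.array([1j, 2]), np.array([3, 1 - 1j])
    c1, c2 = 2j, 1 - 1j

    assert np.allclose(A @ (c1*u + c2*v), c1*(A @ u) + c2*(A @ v))  # linearity of A
    assert np.allclose((c1*A + c2*B) @ v, c1*(A @ v) + c2*(B @ v))  # equation (9)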