Checking Sufficiency for Optimization Problems - Some Results
Concavity/Convexity of Lagrangian
Suppose the Lagrangian $L(x)$ is concave (convex) in $x$; then the $x^*$ that solves the first-order conditions obtained from the Lagrangian solves the maximization (minimization) problem.
Thus, to check whether the demand functions obtained by differentiating the Lagrangian with respect to $x_1$, $x_2$, and $\lambda$ indeed solve the consumer's maximization problem, one can examine the concavity of the Lagrangian function. While doing this, remember that the sum of concave functions is concave and the sum of convex functions is convex; i.e., if $f$ and $g$ are concave functions, then $f + g$ is also concave. Also, if $f$ is concave, then $-f$ is convex. (Note that since a linear function is concave and convex at the same time, one can ignore the budget constraint while checking concavity/convexity of the Lagrangian.)
Definition of concave/convex two-variable functions:

Theorem 1 (a) $f$ is concave $\iff f''_{11} \le 0$, $f''_{22} \le 0$, and $f''_{11} f''_{22} - (f''_{12})^2 \ge 0$.
(b) $f$ is convex $\iff f''_{11} \ge 0$, $f''_{22} \ge 0$, and $f''_{11} f''_{22} - (f''_{12})^2 \ge 0$.
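The conditions of Theorem 1 can be sanity-checked numerically with sympy. This is a sketch only; the sample function $f(x_1, x_2) = x_1^{1/3} x_2^{1/3}$ and the test point $(2, 5)$ are arbitrary illustrative choices, not from the original:

```python
import sympy as sp

# Sketch: verify Theorem 1's concavity conditions for a sample function
# f(x1, x2) = x1**(1/3) * x2**(1/3); the exponents and the test point
# below are arbitrary illustrative choices.
x1, x2 = sp.symbols("x1 x2", positive=True)
a, b = sp.Rational(1, 3), sp.Rational(1, 3)
f = x1**a * x2**b

f11 = sp.diff(f, x1, 2)   # second partial in x1
f22 = sp.diff(f, x2, 2)   # second partial in x2
f12 = sp.diff(f, x1, x2)  # cross partial

pt = {x1: 2, x2: 5}       # arbitrary interior point
print(float(f11.subs(pt)) <= 0)                 # f11 <= 0 holds
print(float(f22.subs(pt)) <= 0)                 # f22 <= 0 holds
print(float((f11*f22 - f12**2).subs(pt)) >= 0)  # f11*f22 - f12^2 >= 0 holds
```

All three conditions print `True` here since $1/3 + 1/3 \le 1$, consistent with the worked example below.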
Example: Problem Set 1, Question 2.
The consumer solves the following maximization problem:

Problem 2 $\max\, x_1^{\alpha} x_2^{\beta}$ s.t. $p_1 x_1 + p_2 x_2 \le M$

We get the following Lagrangian:

$L(x_1, x_2) = x_1^{\alpha} x_2^{\beta} - \lambda (p_1 x_1 + p_2 x_2 - M)$

By showing that the Lagrangian function is concave, one shows that the demand functions (derived in class) are indeed a solution of the maximization problem. For this it is sufficient to check concavity of the utility function $x_1^{\alpha} x_2^{\beta}$.
$f_1 = \alpha x_1^{\alpha-1} x_2^{\beta}$

$f_{11} = \alpha(\alpha-1) x_1^{\alpha-2} x_2^{\beta} \le 0 \iff \alpha \le 1$

$f_2 = \beta x_1^{\alpha} x_2^{\beta-1}$

$f_{22} = \beta(\beta-1) x_1^{\alpha} x_2^{\beta-2} \le 0 \iff \beta \le 1$

$f_{12} = \alpha\beta x_1^{\alpha-1} x_2^{\beta-1}$

$f_{11} f_{22} - (f_{12})^2 = \alpha\beta(\alpha-1)(\beta-1) x_1^{2\alpha-2} x_2^{2\beta-2} - (\alpha\beta)^2 x_1^{2\alpha-2} x_2^{2\beta-2} = \alpha\beta(1-\alpha-\beta)\, x_1^{2\alpha-2} x_2^{2\beta-2} \ge 0$

$\iff \alpha\beta(\alpha-1)(\beta-1) - (\alpha\beta)^2 \ge 0 \iff \alpha + \beta \le 1$
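The factorization above can be double-checked symbolically. A minimal sketch with sympy, verifying that $f_{11}f_{22} - (f_{12})^2$ indeed reduces to $\alpha\beta(1-\alpha-\beta)\, x_1^{2\alpha-2} x_2^{2\beta-2}$:

```python
import sympy as sp

# Sketch: confirm symbolically that for f = x1**a * x2**b,
# f11*f22 - f12**2 factors as a*b*(1 - a - b) * x1**(2a-2) * x2**(2b-2),
# which is nonnegative (for a, b > 0) exactly when a + b <= 1.
x1, x2, a, b = sp.symbols("x1 x2 a b", positive=True)
f = x1**a * x2**b

f11 = sp.diff(f, x1, 2)
f22 = sp.diff(f, x2, 2)
f12 = sp.diff(f, x1, x2)

det = f11*f22 - f12**2
target = a*b*(1 - a - b) * x1**(2*a - 2) * x2**(2*b - 2)
print(sp.simplify(det - target))  # 0: the factorization holds
```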
(General Result applied to 2 variables, 1 constraint)
Sufficient Condition for Optimization Problems with Two Variables and One Constraint
If the consumer wants to solve the following maximization problem:

Max $x_1^{\alpha} x_2^{\beta}$ subject to the constraint $p_1 x_1 + p_2 x_2 \le M$

one could use Lagrange's method to find the solution. The Lagrangian would be:

$L(x_1, x_2) = x_1^{\alpha} x_2^{\beta} - \lambda(p_1 x_1 + p_2 x_2 - M)$
The first-order necessary conditions for a maximum are:

1. $\dfrac{\partial L}{\partial x_1} = \alpha x_1^{\alpha-1} x_2^{\beta} - \lambda p_1 = 0$

2. $\dfrac{\partial L}{\partial x_2} = \beta x_1^{\alpha} x_2^{\beta-1} - \lambda p_2 = 0$

3. $\dfrac{\partial L}{\partial \lambda} = -(p_1 x_1 + p_2 x_2 - M) = 0$
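The three conditions above can be solved in the usual way: dividing condition 1 by condition 2 eliminates $\lambda$ and gives the tangency condition $\alpha x_2 / (\beta x_1) = p_1/p_2$, and substituting into the budget constraint pins down $x_1$. A sketch of that elimination with sympy (the symbol names are illustrative):

```python
import sympy as sp

# Sketch: derive the demand function for x1 from the first-order
# conditions. Dividing condition 1 by condition 2 eliminates lam and
# yields the tangency condition a*x2/(b*x1) = p1/p2; substituting into
# the budget constraint then pins down x1.
x1, x2, lam = sp.symbols("x1 x2 lam", positive=True)
a, b, p1, p2, M = sp.symbols("a b p1 p2 M", positive=True)

L = x1**a * x2**b - lam*(p1*x1 + p2*x2 - M)
L3 = sp.diff(L, lam)             # condition 3: -(p1*x1 + p2*x2 - M)

x2_of_x1 = b*p1*x1 / (a*p2)      # tangency: x2 as a function of x1
budget = L3.subs(x2, x2_of_x1)   # budget constraint in x1 alone
x1_star = sp.solve(budget, x1)[0]

print(sp.simplify(x1_star - a*M/((a + b)*p1)))  # 0: x1* = a*M/((a+b)*p1)
```

This recovers the familiar Cobb-Douglas demand function $x_1^* = \alpha M / ((\alpha+\beta) p_1)$; the expression for $x_2^*$ follows symmetrically.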
To check second-order conditions, let's define the following matrix, called the Bordered Hessian:

Definition 3 Bordered Hessian:
$\begin{pmatrix} 0 & g_1 & g_2 \\ g_1 & L_{11} & L_{12} \\ g_2 & L_{21} & L_{22} \end{pmatrix}$

where $g_1$, for example, stands for the partial derivative of the constraint with respect to the first variable, and $L_{11}$ respectively for the second partial of the Lagrangian with respect to the first variable.

Define the determinant of the Bordered Hessian matrix:

$D \equiv \begin{vmatrix} 0 & g_1 & g_2 \\ g_1 & L_{11} & L_{12} \\ g_2 & L_{21} & L_{22} \end{vmatrix} = -[L_{11}(g_2)^2 - 2L_{12} g_1 g_2 + L_{22}(g_1)^2]$

Theorem 4 If $(x_1^*, x_2^*, \lambda^*)$ solve the first-order conditions given by equations (1,2,3) and if $D > 0$ ($< 0$) when evaluated at $(x_1^*, x_2^*, \lambda^*)$, then $(x_1^*, x_2^*)$ is a local maximum (minimum) of $f(x_1, x_2)$ subject to the constraint $g(x_1, x_2) = 0$.
Calculating all the derivatives of the Bordered Hessian matrix:

$L_{11} = \alpha(\alpha-1) x_1^{\alpha-2} x_2^{\beta}$

$L_{22} = \beta(\beta-1) x_1^{\alpha} x_2^{\beta-2}$

$L_{12} = \alpha\beta x_1^{\alpha-1} x_2^{\beta-1}$

$g_1 = p_1$

$g_2 = p_2$
To calculate the determinant of the Bordered Hessian in our example, one should evaluate the determinant at $(x_1^*, x_2^*, \lambda^*)$, i.e. at the demand functions found by solving the first-order conditions, and also at $\lambda^*$; but as one can see, the determinant does not depend on $\lambda$ in this case.
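Both facts, that $D$ equals the expansion from Definition 3 and that $D$ does not involve $\lambda$, can be sketched with sympy (illustrative symbol names; since the budget constraint is linear, $\lambda$ drops out of all second derivatives of the Lagrangian):

```python
import sympy as sp

# Sketch: build the bordered Hessian for this example, check that its
# determinant matches the expansion -[L11*g2^2 - 2*L12*g1*g2 + L22*g1^2],
# and confirm that it does not involve the multiplier lam (the constraint
# is linear, so lam drops out of all second derivatives).
x1, x2, lam = sp.symbols("x1 x2 lam", positive=True)
a, b, p1, p2, M = sp.symbols("a b p1 p2 M", positive=True)

L = x1**a * x2**b - lam*(p1*x1 + p2*x2 - M)
g = p1*x1 + p2*x2 - M

L11, L22 = sp.diff(L, x1, 2), sp.diff(L, x2, 2)
L12 = sp.diff(L, x1, x2)
g1, g2 = sp.diff(g, x1), sp.diff(g, x2)

H = sp.Matrix([[0,  g1,  g2],
               [g1, L11, L12],
               [g2, L12, L22]])
D = H.det()

expansion = -(L11*g2**2 - 2*L12*g1*g2 + L22*g1**2)
print(sp.simplify(D - expansion))  # 0: determinant matches the expansion
print(lam in D.free_symbols)       # False: D does not depend on lam
```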