Fundamental of Optimization - WS1

The document introduces optimization methods for linear and non-linear problems. It discusses fundamental concepts such as local and global optima and convexity, and provides an overview of deterministic and stochastic optimization algorithms, including gradient descent, conjugate gradient, simulated annealing, and genetic algorithms.


Introduction to Optimization Methods

Introduction to Non-Linear Optimization

© 2008 Solutions 4U Sdn Bhd. All Rights Reserved


Fundamental of Optimization

Optimization Tree

Figure 1: Optimization tree.




What is Optimization?

• Optimization is an iterative process by which a desired solution (max/min) of a problem is found while satisfying all of its constraints or bound conditions.

Figure 2: The optimum solution is found while satisfying its constraint (the derivative must be zero at the optimum).

• An optimization problem can be linear or non-linear.
• Non-linear optimization is accomplished by numerical "search methods".
• Search methods are applied iteratively until a solution is reached.
• The search procedure is termed an algorithm; a minimal sketch of such an iterative search follows below.
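To make the idea of an iterative search concrete, here is a minimal sketch (not from the slides): fixed-step gradient descent on a one-variable function. The function, step size, and tolerance are illustrative choices.

```python
# Minimal iterative search sketch: fixed-step gradient descent on a
# one-variable function. f, its derivative, the step size, and the
# tolerance are illustrative choices, not values from the slides.

def f(x):
    return (x - 3.0) ** 2 + 1.0   # minimum at x = 3

def dfdx(x):
    return 2.0 * (x - 3.0)

x = 0.0            # starting guess
step = 0.1         # fixed step size
for _ in range(1000):
    g = dfdx(x)
    if abs(g) < 1e-8:             # derivative ~ 0 at the optimum
        break
    x -= step * g                 # move against the gradient

print(x)  # converges to ~3.0, where f'(x) = 0
```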
Local and global optima

• Different methods are needed to deal with these different types of minima (a small illustration follows below).
• Finding an optimal point x* is always defined in relation to its neighboring points; this is in contrast to, e.g., solving F(x*) = 0.
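As a small illustration of why local and global minima need different treatment, the following sketch (my example, not the slides') evaluates a multimodal function on a grid; the function is an arbitrary choice with one local and one global minimum.

```python
import numpy as np

# Illustrative multimodal function (my choice, not from the slides):
# it has two minima, so a local search can get trapped in the wrong one.
def f(x):
    return x**4 - 4.0 * x**2 + x

xs = np.linspace(-3.0, 3.0, 10001)
print("global minimum near x =", xs[np.argmin(f(xs))])   # ~ -1.47

# Restricting the grid to x > 0 isolates the other (local) minimum:
right = xs[xs > 0]
print("local minimum near x =", right[np.argmin(f(right))])  # ~ 1.35
```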

What is Optimization? (Cont.)

• Linear problems are solved by the Simplex or graphical methods.
• The solution of a linear problem lies on the boundary of the feasible region.
• The solution of a non-linear problem may lie within the feasible region as well as on its boundary.

Figure 3: Solution of a linear problem. Figure 4: Three-dimensional solution of a non-linear problem.

Fundamentals of Non-Linear Optimization

• Single objective function f(x)
  • Maximization
  • Minimization
• Design variables xi, i = 0, 1, 2, 3, ...
• Constraints
  • Inequality
  • Equality
• Optimal points
  • Local minimum/maximum: a point (solution) x* is a local optimum if no other x in its neighborhood has a better objective value than x*.
  • Global minimum/maximum: a point (solution) x** is a global optimum if no other x in the entire search space has a better objective value than x**.

Figure 5: Example of design variables and constraints used in non-linear optimization (solved in the sketch below):
  Maximize X1 + 1.5 X2
  Subject to:
    X1 + X2 ≤ 150
    0.25 X1 + 0.5 X2 ≤ 50
    X1 ≥ 50, X2 ≥ 25
    X1 ≥ 0, X2 ≥ 0
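Since the Figure 5 example happens to be linear, it can be checked with an off-the-shelf LP solver. A minimal sketch using scipy.optimize.linprog (assuming SciPy is available); linprog minimizes, so the objective is negated. It also confirms the earlier claim that a linear optimum lies on the boundary of the feasible region.

```python
from scipy.optimize import linprog

# Maximize x1 + 1.5*x2 from Figure 5. linprog minimizes, so we negate
# the objective coefficients.
c = [-1.0, -1.5]

# Inequality constraints in A_ub @ x <= b_ub form; the lower bounds
# x1 >= 50 and x2 >= 25 are expressed via `bounds`.
A_ub = [[1.0, 1.0],        # x1 + x2 <= 150
        [0.25, 0.5]]       # 0.25*x1 + 0.5*x2 <= 50
b_ub = [150.0, 50.0]
bounds = [(50.0, None), (25.0, None)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res.x, -res.fun)   # expected: x = [100, 50], objective 175,
                         # on the boundary of the feasible region
```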


Fundamentals of Non-Linear Optimization (Cont.)

Figure 6: Global versus local optimization.
Figure 7: A local optimum is also the global optimum if the function is convex.


Fundamentals of Non-Linear Optimization (Cont.)


• A function f is convex if, at any point Xa on the segment between X1 and X2, f(Xa) lies below the corresponding point on the chord joining f(X1) and f(X2); that is, f(λX1 + (1-λ)X2) ≤ λf(X1) + (1-λ)f(X2) for 0 ≤ λ ≤ 1.
• Convexity condition: the Hessian (2nd-order derivative) matrix of f must be positive semi-definite (all eigenvalues positive or zero); a numerical check follows below.

Figure 8: Convex and nonconvex sets.
Figure 9: Convex function.
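The semi-definiteness condition is easy to test numerically. A small sketch, assuming the Hessian of f(x, y) = x² + xy + y² as an illustrative example:

```python
import numpy as np

# Convexity check sketch: f is convex on a region if its Hessian is
# positive semi-definite there (eigenvalues >= 0). The Hessian below
# is for f(x, y) = x**2 + x*y + y**2, an illustrative choice.
H = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals = np.linalg.eigvalsh(H)   # eigvalsh: for symmetric matrices
print(eigvals)                    # [1. 3.] -> all >= 0, so f is convex
print(bool(np.all(eigvals >= -1e-12)))
```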


Mathematical Background
• The slope or gradient of the objective function f represents the direction in which the function decreases/increases most rapidly:

$$\frac{df}{dx} = \lim_{\Delta x \to 0} \frac{f(x + \Delta x) - f(x)}{\Delta x} = \lim_{\Delta x \to 0} \frac{\Delta f}{\Delta x}$$

• Taylor series expansion:

$$f(x_p + \Delta x) = f(x_p) + \left.\frac{df}{dx}\right|_{x_p} \Delta x + \frac{1}{2!} \left.\frac{d^2 f}{dx^2}\right|_{x_p} (\Delta x)^2 + \cdots$$
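A quick numeric check of the truncated expansion, using f(x) = eˣ about x_p = 0 as an illustrative choice (all derivatives equal 1 there):

```python
import math

# Numeric check of the second-order Taylor expansion above, using
# f(x) = exp(x) about x_p = 0 (illustrative: every derivative is 1).
xp, dx = 0.0, 0.1
taylor2 = math.exp(xp) * (1.0 + dx + dx**2 / 2.0)
exact = math.exp(xp + dx)
print(taylor2, exact, abs(taylor2 - exact))  # error ~ dx**3 / 6
```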

• Jacobian: matrix of the gradients of f and g with respect to several variables:

$$J = \begin{bmatrix} \frac{\partial f}{\partial x} & \frac{\partial f}{\partial y} & \frac{\partial f}{\partial z} \\ \frac{\partial g}{\partial x} & \frac{\partial g}{\partial y} & \frac{\partial g}{\partial z} \end{bmatrix}$$
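A forward-difference sketch that builds the 2×3 Jacobian above numerically; the vector function and evaluation point are illustrative choices:

```python
import numpy as np

# Numerical Jacobian sketch via forward differences, matching the 2x3
# layout above. f_vec and the evaluation point are illustrative.
def f_vec(v):
    x, y, z = v
    return np.array([x * y + z,        # f(x, y, z)
                     x**2 + y * z])    # g(x, y, z)

def jacobian(func, v, dv=1e-6):
    v = np.asarray(v, dtype=float)
    f0 = func(v)
    J = np.zeros((f0.size, v.size))
    for j in range(v.size):
        step = np.zeros_like(v)
        step[j] = dv
        J[:, j] = (func(v + step) - f0) / dv
    return J

print(jacobian(f_vec, [1.0, 2.0, 3.0]))
# analytic rows: [y, x, 1] = [2, 1, 1] and [2x, z, y] = [2, 3, 2]
```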


Mathematical Background (Cont.)


• Slope - First-Order Condition (FOC): provides the function's slope information; at an optimum the gradient vanishes:

$$\nabla f(X^*) = 0$$

• Hessian: the matrix of second derivatives of a function of several variables. Its definiteness indicates the type of optimum: positive definite (+ve) for a minimum, negative definite (-ve) for a maximum:

$$H = \begin{bmatrix} \frac{\partial^2 f}{\partial x^2} & \frac{\partial^2 f}{\partial y\,\partial x} \\ \frac{\partial^2 f}{\partial x\,\partial y} & \frac{\partial^2 f}{\partial y^2} \end{bmatrix}$$

• Second-Order Condition (SOC), for a minimum:
  • The eigenvalues of H(X*) are all positive, or equivalently
  • the determinants of all leading principal (lower-order) submatrices of H(X*) are +ve.
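A sketch checking FOC and SOC at a stationary point, using the illustrative quadratic f(x, y) = 3x² + 2xy + 2y²:

```python
import numpy as np

# FOC/SOC check sketch for f(x, y) = 3x**2 + 2xy + 2y**2 (illustrative).
def grad(v):
    x, y = v
    return np.array([6.0 * x + 2.0 * y, 2.0 * x + 4.0 * y])

H = np.array([[6.0, 2.0],
              [2.0, 4.0]])   # constant Hessian for this quadratic

x_star = np.array([0.0, 0.0])
print(grad(x_star))            # FOC: gradient is [0, 0] at x*
print(np.linalg.eigvalsh(H))   # SOC: eigenvalues ~[2.76, 7.24] > 0 -> minimum
```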


Optimization Algorithm

• Deterministic: specific rules move the search from one iteration to the next, using gradient and/or Hessian information.

• Stochastic: probabilistic rules are used for the subsequent iteration.

• Optimal design: engineering design based on an optimization algorithm.

• Lagrangian method: the sum of the objective function and a linear combination of the constraints (written out below).
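In standard textbook notation (not spelled out on the slide), with a multiplier λᵢ attached to each constraint gᵢ(x):

$$L(x, \lambda) = f(x) + \sum_{i} \lambda_i \, g_i(x)$$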


Optimization Methods
• Deterministic
  • Direct search: uses objective function values only to locate the minimum.
  • Gradient based: uses the first or second derivatives of the objective function.
  • A minimization routine handles a maximization problem by minimizing -f(x).
• Single-variable techniques
  • Newton-Raphson: gradient-based technique (FOC).
  • Golden-section search: step-size-reducing iterative method (sketched after this list).
• Multivariable techniques (make use of single-variable techniques, especially golden-section search)
  • Unconstrained optimization:
    a) Powell's method: fits a quadratic (degree-2) polynomial to the objective; non-gradient based.
    b) Gradient based: steepest descent (FOC) or least-mean-square (LMS) minimization.
    c) Hessian based: conjugate gradient (FOC) and BFGS (SOC).
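A golden-section search sketch, as referenced in the list above; the bracketing interval and test function are illustrative choices:

```python
import math

# Golden-section search sketch: brackets a minimum and shrinks the
# interval by the golden ratio each iteration. The test function and
# bracket are illustrative choices.
INV_PHI = (math.sqrt(5.0) - 1.0) / 2.0   # ~0.618

def golden_section(f, a, b, tol=1e-6):
    c = b - INV_PHI * (b - a)
    d = a + INV_PHI * (b - a)
    while b - a > tol:
        if f(c) < f(d):       # minimum lies in [a, d]
            b, d = d, c
            c = b - INV_PHI * (b - a)
        else:                 # minimum lies in [c, b]
            a, c = c, d
            d = a + INV_PHI * (b - a)
    return 0.5 * (a + b)

print(golden_section(lambda x: (x - 3.0) ** 2 + 1.0, 0.0, 10.0))  # ~3.0
```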

Optimization Methods - Constrained

• Constrained Optimization
  a) Indirect approach: transform the constrained problem into an unconstrained one,
  b) e.g., via the Exterior Penalty Function (EPF) or Augmented Lagrange Multiplier (ALM) methods.
  c) Direct methods: Sequential Linear Programming (SLP), Sequential Quadratic Programming (SQP), and the Generalized Reduced Gradient (GRG) method; an SQP-family sketch follows this list.
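A constrained-minimization sketch using SciPy's SLSQP solver (an SQP-family method from the direct category); the problem itself is an illustrative choice, not the slides' example:

```python
import numpy as np
from scipy.optimize import minimize

# Constrained optimization sketch with SLSQP (an SQP-family method).
# Problem (illustrative): minimize (x-1)**2 + (y-2)**2
# subject to x + y <= 2 and x, y >= 0.
def objective(v):
    x, y = v
    return (x - 1.0) ** 2 + (y - 2.0) ** 2

# SciPy "ineq" constraints require fun(x) >= 0, so x + y <= 2
# becomes 2 - x - y >= 0.
constraints = [{"type": "ineq", "fun": lambda v: 2.0 - v[0] - v[1]}]
bounds = [(0.0, None), (0.0, None)]

res = minimize(objective, x0=np.array([0.0, 0.0]), method="SLSQP",
               bounds=bounds, constraints=constraints)
print(res.x, res.fun)   # optimum on the constraint boundary: ~[0.5, 1.5]
```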

Figure 10: Gradient descent (LMS).



Optimization Methods (Cont.)

• Global optimization - stochastic techniques
  • Simulated Annealing (SA): based on the minimum-energy principle of a cooling metal's crystalline structure (a sketch follows below).
  • Genetic Algorithm (GA): based on the survival-of-the-fittest principle of evolutionary theory.
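A bare-bones simulated-annealing sketch; the multimodal test function, proposal width, and cooling schedule are all illustrative choices:

```python
import math
import random

# Simulated-annealing sketch: occasionally accept uphill moves with a
# temperature-dependent probability so the search can escape local
# minima. The test function and schedule are illustrative choices.
def f(x):
    return x**4 - 4.0 * x**2 + x   # two minima; the global one is near x ~ -1.47

random.seed(0)
x = 2.0                  # start near the *local* minimum
temperature = 5.0
for _ in range(20000):
    candidate = x + random.gauss(0.0, 0.5)
    delta = f(candidate) - f(x)
    # accept improvements always; accept worse moves with prob e^(-delta/T)
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        x = candidate
    temperature = max(1e-3, temperature * 0.999)  # cooling schedule

print(x)   # typically ends near the global minimum at x ~ -1.47
```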
