Textbook in PDF format
This book presents a short introduction to the main tools of optimization methodology, including linear programming, steepest descent, conjugate gradients, and the Karush–Kuhn–Tucker–John conditions. Each topic is developed in terms of a specific physical model, so that the strategy behind every step is motivated by a logical, concrete, easily visualized objective. A quick perusal of the Fibonacci search algorithm provides a simple and tantalizing first encounter with optimization theory, and a review of the max-min exposition of one-dimensional calculus prepares readers for the more sophisticated topics found later in the book. Notable features are the innovative perspectives on the simplex algorithm and the Karush–Kuhn–Tucker–John conditions, as well as a wealth of helpful diagrams. The author provides pointers to references for readers who would like to learn more about rigorous definitions, proofs, elegant reformulations and extensions, and case studies. However, the book is sufficiently self-contained to serve as a reliable resource for readers who wish to exploit commercially available optimization software without investing the time to develop expertise in its theoretical aspects.
This book also:
Features innovative perspectives on the simplex algorithm and the Karush–Kuhn–Tucker–John conditions
Serves as a resource for readers to use the tools of optimization without needing to acquire expertise in the theory
Provides plentiful pointers to resources that focus on rigorous definitions, proofs, and case studies
Preface
Fibonacci Search
Unimodal Functions and Fibonacci Search
Details (Optional Reading): Proof of the Optimality of Fibonacci Search
References
Linear Programming
An Example in Linear Programming
The Two-Dimensional Linear Program
Review of Matrix Basics
The Diamond Analogy and the Simplex Method
Simple Generalizations
The Basic Simplex Algorithm
A Two-Dimensional Example
Streamlining the Simplex Algorithm
Further Refining the Simplex Algorithm
Phase 1: Finding the First Corner
Degeneracy (Optional Reading)
Duality
References
Nonlinear Programming in One Dimension
The Zero Derivative Rule and Its Limitations
Nonlinear Search
Reference
Nonlinear Multidimensional Optimization
Visualizing the Objective Function
Multidimensional Search
Mathematical Characteristics of the Objective Function
Coordinate Descent
Method of Steepest Descent
Conjugate Directions
Details (Optional Reading): The Conjugate Gradient Algorithm
References
Constrained Optimization
The Karush–Kuhn–Tucker–John Conditions
Examples
References
What’s Left?
Reference