ISBN-13: 9781119490548 / English / Hardcover / 2018 / 384 pp.
List of Figures xv
List of Tables xix
Preface xxi
Acknowledgments xxiii
Acronyms xxv
Introduction xxvii
References xxxiii
PART I FUNDAMENTALS
1 Mathematical Foundations 3
1.1 Functions and Continuity 3
1.1.1 Functions 3
1.1.2 Continuity 4
1.1.3 Upper and Lower Bounds 5
1.2 Review of Calculus 6
1.2.1 Differentiation 6
1.2.2 Taylor Expansions 9
1.2.3 Partial Derivatives 12
1.2.4 Lipschitz Continuity 14
1.2.5 Integration 14
1.3 Vectors 17
1.3.1 Vector Algebra 17
1.3.2 Norms 18
1.3.3 2D Norms 19
1.4 Matrix Algebra 19
1.4.1 Matrices 19
1.4.2 Determinant 24
1.4.3 Rank of a Matrix 24
1.4.4 Frobenius Norm 25
1.5 Eigenvalues and Eigenvectors 26
1.5.1 Definiteness 29
1.5.2 Quadratic Form 30
1.6 Optimization and Optimality 31
1.6.1 Minimum and Maximum 31
1.6.2 Feasible Solution 32
1.6.3 Gradient and Hessian Matrix 33
1.6.4 Optimality Conditions 35
1.7 General Formulation of Optimization Problems 36
Exercises 36
References 37
2 Algorithms, Complexity and Convexity 39
2.1 What is an Algorithm? 39
2.2 Order Notations 41
2.3 Convergence Rate 43
2.4 Computational Complexity 44
2.4.1 Time and Space Complexity 44
2.4.2 Class P 45
2.4.3 Class NP 46
2.4.4 NP-Completeness 46
2.4.5 Complexity of Algorithms 47
2.5 Convexity 47
2.5.1 Linear and Affine Functions 47
2.5.2 Convex Functions 50
2.5.3 Subgradients 52
2.6 Stochastic Nature in Algorithms 53
2.6.1 Algorithms with Randomization 53
2.6.2 Random Variables 53
2.6.3 Poisson Distribution and Gaussian Distribution 56
2.6.4 Monte Carlo 58
2.6.5 Common Probability Distributions 60
Exercises 63
References 63
PART II OPTIMIZATION TECHNIQUES AND ALGORITHMS
3 Optimization 67
3.1 Unconstrained Optimization 67
3.1.1 Univariate Functions 67
3.1.2 Multivariate Functions 70
3.2 Gradient-Based Methods 73
3.2.1 Newton's Method 73
3.2.2 Convergence Analysis 74
3.2.3 Steepest Descent Method 76
3.2.4 Line Search 80
3.2.5 Conjugate Gradient Method 80
3.2.6 Stochastic Gradient Descent 82
3.2.7 Subgradient Method 83
3.3 Gradient-Free Nelder–Mead Method 83
3.3.1 A Simplex 83
3.3.2 Nelder–Mead Downhill Simplex Method 84
Exercises 86
References 86
4 Constrained Optimization 89
4.1 Mathematical Formulation 89
4.2 Lagrange Multipliers 90
4.3 Slack Variables 93
4.4 Generalized Reduced Gradient Method 97
4.5 KKT Conditions 99
4.6 Penalty Method 102
Exercises 103
References 104
5 Optimization Techniques: Approximation Methods 105
5.1 BFGS Method 105
5.2 Trust-Region Method 107
5.3 Sequential Quadratic Programming 109
5.3.1 Quadratic Programming 109
5.3.2 Sequential Quadratic Programming 109
5.4 Convex Optimization 111
5.5 Equality Constrained Optimization 115
5.6 Barrier Functions 117
5.7 Interior-Point Methods 121
5.8 Stochastic and Robust Optimization 122
Exercises 124
References 125
PART III APPLIED OPTIMIZATION
6 Linear Programming 129
6.1 Linear Programming 129
6.2 Simplex Method 131
6.2.1 Slack Variables 131
6.2.2 Standard Formulation 132
6.2.3 Duality 133
6.2.4 Augmented Form 134
6.3 Worked Example by Simplex Method 135
6.4 Interior-Point Method for LP 138
Exercises 141
References 142
7 Integer Programming 143
7.1 Integer Linear Programming 143
7.1.1 Review of LP 143
7.1.2 Integer Linear Programming 144
7.2 LP Relaxation 145
7.3 Branch and Bound 148
7.3.1 Example 1 150
7.3.2 Example 2 152
7.3.3 How to Branch 155
7.4 Mixed Integer Programming 157
7.5 Applications of LP, IP and MIP 158
7.5.1 Transport Problem 158
7.5.2 Product Portfolio 160
7.5.3 Scheduling 162
7.5.4 Knapsack Problem 163
7.5.5 Travelling Salesman Problem 163
Exercises 164
References 165
8 Regression and Regularization 167
8.1 Sample Mean and Variance 167
8.2 Regression Analysis 170
8.2.1 Maximum Likelihood 170
8.2.2 Regression 170
8.2.3 Linearization 175
8.2.4 Generalized Linear Regression 177
8.2.5 Goodness of Fit 181
8.3 Nonlinear Least Squares 181
8.3.1 Gauss–Newton Algorithm 182
8.3.2 Levenberg–Marquardt Algorithm 184
8.3.3 Weighted Least Squares 185
8.4 Over-fitting and Information Criteria 186
8.5 Regularization and Lasso Method 188
8.6 Logistic Regression 190
8.7 Principal Component Analysis 193
Exercises 197
References 198
9 Machine Learning Algorithms 201
9.1 Data Mining 201
9.1.1 Hierarchy Clustering 202
9.1.2 K-Means Clustering 203
9.1.3 Distance Metric 204
9.2 Data Mining for Big Data 205
9.2.1 Characteristics of Big Data 205
9.2.2 Statistical Nature of Big Data 205
9.2.3 Mining Big Data 206
9.3 Artificial Neural Networks 208
9.3.1 Neuron Model 209
9.3.2 Artificial Neural Networks 210
9.3.3 Back Propagation Algorithm 212
9.3.4 Loss Functions in ANN 214
9.3.5 Stochastic Gradient Descent 215
9.3.6 Restricted Boltzmann Machine 216
9.4 Support Vector Machine 217
9.4.1 Statistical Learning Theory 218
9.4.2 Linear Support Vector Machine 219
9.4.3 Kernel Functions and Nonlinear SVM 222
9.5 Deep Learning 223
9.5.1 Learning 223
9.5.2 Deep Neural Nets 224
9.5.3 Tuning of Hyper-Parameters 224
Exercises 225
References 225
10 Queueing Theory and Simulation 229
10.1 Introduction 229
10.1.1 Components of Queueing 229
10.1.2 Notations 231
10.2 Arrival Model 232
10.2.1 Poisson Distribution 232
10.2.2 Inter-arrival Time 235
10.3 Service Model 235
10.3.1 Exponential Distribution 236
10.3.2 Service Time Model 237
10.3.3 Erlang Distribution 238
10.4 Basic Queueing Model 238
10.4.1 M/M/1 Queue 238
10.4.2 M/M/s Queue 243
10.5 Little's Law 245
10.6 Queue Management and Optimization 246
Exercises 247
References 248
PART IV MULTIOBJECTIVE OPTIMIZATION
11 Multiobjective Optimization 253
11.1 Introduction 253
11.2 Pareto Front and Pareto Optimality 255
11.3 Choice and Challenges 257
11.4 Transformation to Single Objective Optimization 258
11.4.1 Weighted Sum Method 258
11.4.2 Utility Function 261
11.5 The Constraint Method 263
11.6 Evolutionary Approaches 266
11.6.1 Metaheuristics 266
11.6.2 Non-Dominated Sorting Genetic Algorithm 267
Exercises 268
References 268
12 Constraint-Handling Techniques 271
12.1 Introduction and Overview 271
12.2 Method of Lagrange Multipliers 273
12.3 Barrier Function Method 274
12.4 Penalty Method 274
12.5 Equality Constraints via Tolerance 275
12.6 Feasibility Criteria 276
12.7 Stochastic Ranking 276
12.8 Multiobjective Constraint-Handling and Ranking 277
Exercises 278
References 278
PART V EVOLUTIONARY COMPUTATION AND NATURE-INSPIRED ALGORITHMS
13 Evolutionary Algorithms 283
13.1 Evolutionary Computation 283
13.2 Evolutionary Strategy 284
13.3 Genetic Algorithms 285
13.3.1 Basic Procedure 286
13.3.2 Choice of Parameters 288
13.4 Simulated Annealing 289
13.5 Differential Evolution 292
Exercises 295
References 295
14 Nature-Inspired Algorithms 299
14.1 Introduction to Swarm Intelligence 299
14.2 Ant and Bee Algorithms 300
14.3 Particle Swarm Optimization 301
14.3.1 Accelerated PSO 303
14.3.2 Binary PSO 304
14.4 Firefly Algorithm 305
14.5 Cuckoo Search 308
14.5.1 Cuckoo Search Algorithm 309
14.5.2 Lévy Flight 311
14.5.3 Advantages of CS 314
14.6 Bat Algorithm 314
14.7 Flower Pollination Algorithm 317
14.8 Other Algorithms 320
Exercises 320
References 321
A Notes on Software Packages 323
A.1 Software Packages 323
A.2 Matlab Codes in This Book 324
A.3 Optimization Software 325
A.4 Data Mining Software 326
A.5 Machine Learning Software 326
B Problem Solutions 329
References 345
Index 357
Xin-She Yang, PhD, is Reader/Professor in Modelling and Optimization at Middlesex University London. He is also an elected Bye-Fellow and College Lecturer at Cambridge University, Adjunct Professor at Reykjavik University, Iceland, as well as Distinguished Chair Professor at Xi'an Polytechnic University, China.
A guide to modern optimization applications and techniques in newly emerging areas spanning optimization, data science, machine intelligence, engineering, and computer sciences
Optimization Techniques and Applications with Examples introduces the fundamentals of all the commonly used optimization techniques, encompassing the breadth and diversity of both traditional and new methods and algorithms. The author, a noted expert in the field, covers a wide range of topics, including mathematical foundations, optimization formulation, optimality conditions, algorithmic complexity, linear programming, convex optimization, and integer programming. In addition, the book discusses artificial neural networks, clustering and classification, constraint handling, queueing theory, support vector machines, multi-objective optimization, evolutionary computation, nature-inspired algorithms, and many other topics.
Designed as a practical resource, the book explains every topic in detail, with step-by-step examples that show how each method works. The book's exercises test the acquired knowledge, which can then be applied to real problem solving. By taking an informal approach to the subject, the author helps readers rapidly acquire the basics of optimization, operational research, and applied data mining.
Written for upper undergraduates and graduates in a standard course on optimization, operations research, and data mining, Optimization Techniques and Applications with Examples is a highly accessible guide to the fundamentals of all the commonly used techniques in optimization.