Nature-inspired optimization algorithms
Main author: Yang, Xin-She, 1965-
Format: Electronic eBook
Language: English
Published: London, United Kingdom ; San Diego, CA, United States ; Cambridge, MA, United States ; Kidlington, Oxford, United Kingdom : Academic Press, [2021]
Edition: Second edition
Subjects: Nature-inspired algorithms; Optimierung; Natur; Algorithmus
Online access: TUM01
Description: 1 online resource (xvii, 292 pages), illustrations, diagrams
ISBN: 9780128219898
Internal format (MARC)
LEADER 00000nmm a2200000zc 4500
001    BV047441967
003    DE-604
005    20240219
007    cr|uuu---uuuuu
008    210827s2021 |||| o||u| ||||||eng d
020    |a 9780128219898 |9 978-0-12-821989-8
035    |a (ZDB-30-PQE)EBC6346763
035    |a (ZDB-30-PAD)EBC6346763
035    |a (ZDB-89-EBL)EBL6346763
035    |a (OCoLC)1197809406
035    |a (DE-599)BVBBV047441967
040    |a DE-604 |b ger |e rda
041 0  |a eng
049    |a DE-91
082 0  |a 571.0284
084    |a ST 134 |0 (DE-625)143590: |2 rvk
084    |a DAT 718 |2 stub
084    |a DAT 530 |2 stub
100 1  |a Yang, Xin-She |d 1965- |e Verfasser |0 (DE-588)1043733906 |4 aut
245 10 |a Nature-inspired optimization algorithms |c Xin-She Yang
250    |a Second edition
264  1 |a London, United Kingdom ; San Diego, CA, United States ; Cambridge, MA, United States ; Kidlington, Oxford, United Kingdom |b Academic Press |c [2021]
264  4 |c © 2021
300    |a 1 Online-Ressource (xvii, 292 Seiten) |b Illustrationen, Diagramme
336    |b txt |2 rdacontent
337    |b c |2 rdamedia
338    |b cr |2 rdacarrier
505 8  |a Front Cover -- Nature-Inspired Optimization Algorithms -- Copyright -- Contents -- About the Author -- Preface -- Acknowledgements -- 1 Introduction to Algorithms -- 1.1 What Is an Algorithm? -- 1.2 Newton's Method -- 1.3 Formulation of Optimization Problems -- 1.3.1 Optimization Formulation -- 1.3.2 Classification of Optimization Problems -- 1.3.3 Classification of Optimization Algorithms -- 1.4 Optimization Algorithms -- 1.4.1 Gradient-Based Algorithms -- 1.4.2 Hill Climbing With Random Restart -- 1.5 Search for Optimality -- 1.6 No-Free-Lunch Theorems -- 1.6.1 NFL Theorems -- 1.6.2 Choice of Algorithms -- 1.7 Nature-Inspired Metaheuristics -- 1.8 A Brief History of Metaheuristics -- References -- 2 Mathematical Foundations -- 2.1 Introduction -- 2.2 Norms, Eigenvalues and Eigenvectors -- 2.2.1 Norms -- 2.2.2 Eigenvalues and Eigenvectors -- 2.2.3 Optimality Conditions -- 2.3 Sequences and Series -- 2.3.1 Convergence Sequences -- 2.3.2 Series -- 2.4 Computational Complexity -- 2.5 Convexity -- 2.6 Random Variables and Probability Distributions -- 2.6.1 Random Variables -- 2.6.2 Common Probability Distributions -- 2.6.3 Distributions With Long Tails -- 2.6.4 Entropy and Information Measures -- References -- 3 Analysis of Algorithms -- 3.1 Introduction -- 3.2 Analysis of Optimization Algorithms -- 3.2.1 Algorithm as an Iterative Process -- 3.2.2 An Ideal Algorithm? -- 3.2.3 A Self-Organization System -- 3.2.4 Exploration and Exploitation -- 3.2.5 Evolutionary Operators -- 3.3 Nature-Inspired Algorithms -- 3.3.1 Simulated Annealing -- 3.3.2 Genetic Algorithms -- 3.3.3 Differential Evolution -- 3.3.4 Ant and Bee Algorithms -- 3.3.5 Particle Swarm Optimization -- 3.3.6 The Firefly Algorithm -- 3.3.7 Cuckoo Search -- 3.3.8 The Bat Algorithm -- 3.3.9 The Flower Algorithm -- 3.4 Other Algorithms and Recent Developments
505 8  |a 3.5 Parameter Tuning and Parameter Control -- 3.5.1 Parameter Tuning -- 3.5.1.1 Hyperoptimization -- 3.5.1.2 Multi-Objective View -- 3.5.2 Parameter Control -- 3.6 Discussions -- 3.7 Summary -- References -- 4 Random Walks and Optimization -- 4.1 Isotropic Random Walks -- 4.2 Lévy Distribution and Lévy Flights -- 4.3 Optimization as Markov Chains -- 4.3.1 Markov Chain -- 4.3.2 Optimization as a Markov Chain -- 4.4 Step Sizes and Search Efficiency -- 4.4.1 Step Sizes, Stopping Criteria, and Efficiency -- 4.4.2 Why Lévy Flights are more Efficient -- 4.5 Modality and Optimal Balance -- 4.5.1 Modality and Intermittent Search Strategy -- 4.5.2 Optimal Balance of Exploration and Exploitation -- 4.6 Importance of Randomization -- 4.6.1 Ways to Carry Out Random Walks -- 4.6.2 Importance of Initialization -- 4.6.3 Importance Sampling -- 4.6.4 Low-Discrepancy Sequences -- 4.7 Eagle Strategy -- 4.7.1 Basic Ideas of Eagle Strategy -- 4.7.2 Why Eagle Strategy is so Efficient -- References -- 5 Simulated Annealing -- 5.1 Annealing and Boltzmann Distribution -- 5.2 SA Parameters -- 5.3 SA Algorithm -- 5.4 Basic Convergence Properties -- 5.5 SA Behavior in Practice -- 5.6 Stochastic Tunneling -- References -- 6 Genetic Algorithms -- 6.1 Introduction -- 6.2 Genetic Algorithms -- 6.3 Role of Genetic Operators -- 6.4 Choice of Parameters -- 6.5 GA Variants -- 6.6 Schema Theorem -- 6.7 Convergence Analysis -- References -- 7 Differential Evolution -- 7.1 Introduction -- 7.2 Differential Evolution -- 7.3 Variants -- 7.4 Choice of Parameters -- 7.5 Convergence Analysis -- 7.6 Implementation -- References -- 8 Particle Swarm Optimization -- 8.1 Swarm Intelligence -- 8.2 PSO Algorithm -- 8.3 Accelerated PSO -- 8.4 Implementation -- 8.5 Convergence Analysis -- 8.5.1 Dynamical System -- 8.5.2 Markov Chain Approach -- 8.6 Binary PSO -- References -- 9 Firefly Algorithms
505 8  |a 9.1 The Firefly Algorithm -- 9.1.1 Firefly Behavior -- 9.1.2 Standard Firefly Algorithm -- 9.1.3 Variations of Light Intensity and Attractiveness -- 9.1.4 Controlling Randomization -- 9.2 Algorithm Analysis -- 9.2.1 Scalings and Limiting Cases -- 9.2.2 Attraction and Diffusion -- 9.2.3 Special Cases of FA -- 9.3 Implementation -- 9.4 Variants of the Firefly Algorithm -- 9.4.1 FA Variants -- 9.4.2 How Can We Discretize FA? -- 9.5 Firefly Algorithm in Applications -- 9.6 Why the Firefly Algorithm Is Efficient -- References -- 10 Cuckoo Search -- 10.1 Cuckoo Breeding Behavior -- 10.2 Lévy Flights -- 10.3 Cuckoo Search -- 10.3.1 Special Cases of Cuckoo Search -- 10.3.2 How to Carry out Lévy Flights -- 10.3.3 Choice of Parameters -- 10.4 Implementation -- 10.5 Variants of Cuckoo Search -- 10.6 Why Cuckoo Search Is so Efficient -- 10.7 Global Convergence: Brief Mathematical Analysis -- 10.8 Applications -- References -- 11 Bat Algorithms -- 11.1 Echolocation of Bats -- 11.1.1 Behavior of Microbats -- 11.1.2 Acoustics of Echolocation -- 11.2 Bat Algorithms -- 11.2.1 Movement of Virtual Bats -- 11.2.2 Loudness and Pulse Emission -- 11.3 Implementation -- 11.4 Binary Bat Algorithms -- 11.5 Variants of the Bat Algorithm -- 11.6 Convergence and Stability Analysis -- 11.7 Why the Bat Algorithm is Efficient -- 11.8 Applications -- References -- 12 Flower Pollination Algorithms -- 12.1 Introduction -- 12.2 Flower Pollination Algorithm -- 12.2.1 Characteristics of Flower Pollination -- 12.2.2 Flower Pollination Algorithm -- 12.3 Implementation -- 12.4 Multi-Objective Flower Pollination Algorithm -- 12.5 Validation and Numerical Experiments -- 12.5.1 Single-Objective Test Functions -- 12.5.2 Multi-Objective Test Functions -- 12.5.3 Analysis of Results and Comparison -- 12.6 Engineering Design Benchmarks -- 12.6.1 Single-Objective Design Benchmarks
505 8  |a 12.6.1.1 Spring Design Optimization -- 12.6.1.2 Welded Beam Design -- 12.6.1.3 Pressure Vessel Design -- 12.6.2 Bi-Objective Disc Design -- 12.7 Variants and Applications -- References -- 13 A Framework for Self-Tuning Algorithms -- 13.1 Introduction -- 13.2 Algorithm Analysis and Parameter Tuning -- 13.2.1 A General Formula for Algorithms -- 13.2.2 Type of Optimality -- 13.2.3 Parameter Tuning -- 13.3 Framework for Self-Tuning Algorithms -- 13.3.1 Hyperoptimization -- 13.3.2 A Multi-Objective View -- 13.3.3 Self-Tuning Framework -- 13.4 Self-Tuning Firefly Algorithm -- 13.5 Some Remarks -- References -- 14 How to Deal With Constraints -- 14.1 Introduction and Overview -- 14.2 Method of Lagrange Multipliers -- 14.3 KKT Conditions -- 14.4 Classic Constraint-Handling Techniques -- 14.4.1 Penalty Method -- 14.4.2 Barrier Function Method -- 14.4.3 Adaptive and Dynamic Penalty Method -- 14.4.4 Equality With Tolerance -- 14.5 Modern Constraint-Handling Techniques -- 14.5.1 Feasibility Rules -- 14.5.2 Stochastic Ranking -- 14.5.3 The ε-Constrained Approach -- 14.5.4 Multi-Objective Approach to Constraints -- 14.5.5 Recent Developments -- 14.6 An Example: Pressure Vessel Design -- 14.7 Concluding Remarks -- References -- 15 Multi-Objective Optimization -- 15.1 Multi-Objective Optimization -- 15.2 Pareto Optimality -- 15.3 Weighted Sum Method -- 15.4 Utility Method -- 15.5 The ε-Constraint Method -- 15.6 Nature-Inspired Metaheuristics -- 15.6.1 Metaheuristic Approaches -- 15.6.2 NSGA-II -- 15.7 Recent Trends -- References -- 16 Data Mining and Deep Learning -- 16.1 Introduction to Data Mining -- 16.2 Clustering -- 16.2.1 Clustering and Distances -- 16.2.2 The kNN Algorithm -- 16.2.3 The k-Means Algorithm -- 16.2.4 Nature-Inspired Algorithms for Data Analysis -- 16.3 Support Vector Machine -- 16.3.1 Linear SVM -- 16.3.2 Nonlinear SVM.
505 8  |a 16.3.3 Nature-Inspired Algorithms for SVM -- 16.4 Artificial Neural Networks -- 16.4.1 Machine Learning -- 16.4.2 Neural Models -- 16.4.3 Neural Networks -- 16.5 Optimizers for Machine Learning -- 16.6 Deep Learning -- 16.6.1 Recent Developments -- 16.6.2 Hyperparameter Tuning -- 16.6.3 Nature-Inspired Algorithms for Deep Learning -- References -- A Test Function Benchmarks for Global Optimization -- References -- B Matlab Programs -- B.1 Simulated Annealing -- B.2 Accelerated Particle Swarm Optimization -- B.3 Differential Evolution -- B.4 The Firefly Algorithm -- B.5 Cuckoo Search -- B.6 The Bat Algorithm -- B.7 The Flower Pollination Algorithm -- References -- Index -- Back Cover
650  4 |a Nature-inspired algorithms
650 07 |a Optimierung |0 (DE-588)4043664-0 |2 gnd |9 rswk-swf
650 07 |a Natur |0 (DE-588)4041358-5 |2 gnd |9 rswk-swf
650 07 |a Algorithmus |0 (DE-588)4001183-5 |2 gnd |9 rswk-swf
689 00 |a Algorithmus |0 (DE-588)4001183-5 |D s
689 01 |a Optimierung |0 (DE-588)4043664-0 |D s
689 02 |a Natur |0 (DE-588)4041358-5 |D s
689 0  |5 DE-604
776 08 |i Erscheint auch als |a Yang, Xin-She |t Nature-Inspired Optimization Algorithms |d San Diego : Elsevier Science & Technology, c2020 |n Druck-Ausgabe |z 978-0-12-821986-7
912    |a ZDB-30-PQE
999    |a oai:aleph.bib-bvb.de:BVB01-032844119
966 e  |u https://ebookcentral.proquest.com/lib/munchentech/detail.action?docID=6346763 |l TUM01 |p ZDB-30-PQE |q TUM_PDA_PQE_Kauf |x Aggregator |3 Volltext
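For readers who want to reuse this record programmatically, here is a minimal sketch of how the key fields (245 title statement, 100 author heading, 020 ISBN) could be read from a MARCXML export of the record above. It assumes the pymarc library and an export file named "BV047441967.xml"; the file name is an illustrative assumption, not part of the catalog record.

# Minimal sketch: reading title, author, and ISBN from a MARCXML export.
# Assumes pymarc is installed and the record above was exported as
# "BV047441967.xml" (hypothetical file name).
from pymarc import parse_xml_to_array

records = parse_xml_to_array("BV047441967.xml")
for record in records:
    # 245 |a : title proper, as in the 245 field above
    title = record.get_fields("245")[0].get_subfields("a")[0]
    # 100 |a : main author heading (here "Yang, Xin-She")
    author = record.get_fields("100")[0].get_subfields("a")[0]
    # 020 |a : ISBN of this electronic edition
    isbn = record.get_fields("020")[0].get_subfields("a")[0]
    print(f"{title} / {author} / ISBN {isbn}")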
Record in the search index
_version_ | 1804182734727282688 |
adam_txt | |
any_adam_object | |
any_adam_object_boolean | |
author | Yang, Xin-She 1965- |
author_GND | (DE-588)1043733906 |
author_facet | Yang, Xin-She 1965- |
author_role | aut |
author_sort | Yang, Xin-She 1965- |
author_variant | x s y xsy |
building | Verbundindex |
bvnumber | BV047441967 |
classification_rvk | ST 134 |
classification_tum | DAT 718 DAT 530 |
collection | ZDB-30-PQE |
ctrlnum | (ZDB-30-PQE)EBC6346763 (ZDB-30-PAD)EBC6346763 (ZDB-89-EBL)EBL6346763 (OCoLC)1197809406 (DE-599)BVBBV047441967 |
dewey-full | 571.0284 |
dewey-hundreds | 500 - Natural sciences and mathematics |
dewey-ones | 571 - Physiology & related subjects |
dewey-raw | 571.0284 |
dewey-search | 571.0284 |
dewey-sort | 3571.0284 |
dewey-tens | 570 - Biology |
discipline | Biologie Informatik |
discipline_str_mv | Biologie Informatik |
edition | Second edition |
format | Electronic eBook |
id | DE-604.BV047441967 |
illustrated | Not Illustrated |
index_date | 2024-07-03T18:01:24Z |
indexdate | 2024-07-10T09:12:16Z |
institution | BVB |
isbn | 9780128219898 |
language | English |
oai_aleph_id | oai:aleph.bib-bvb.de:BVB01-032844119 |
oclc_num | 1197809406 |
open_access_boolean | |
owner | DE-91 DE-BY-TUM |
owner_facet | DE-91 DE-BY-TUM |
physical | 1 Online-Ressource (xvii, 292 Seiten) Illustrationen, Diagramme |
psigel | ZDB-30-PQE ZDB-30-PQE TUM_PDA_PQE_Kauf |
publishDate | 2021 |
publishDateSearch | 2021 |
publishDateSort | 2021 |
publisher | Academic Press |
record_format | marc |
subject_GND | (DE-588)4043664-0 (DE-588)4041358-5 (DE-588)4001183-5 |
title | Nature-inspired optimization algorithms |
title_auth | Nature-inspired optimization algorithms |
title_exact_search | Nature-inspired optimization algorithms |
title_exact_search_txtP | Nature-inspired optimization algorithms |
title_full | Nature-inspired optimization algorithms Xin-She Yang
title_fullStr | Nature-inspired optimization algorithms Xin-She Yang
title_full_unstemmed | Nature-inspired optimization algorithms Xin-She Yang
title_short | Nature-inspired optimization algorithms |
title_sort | nature inspired optimization algorithms |
topic | Nature-inspired algorithms Optimierung (DE-588)4043664-0 gnd Natur (DE-588)4041358-5 gnd Algorithmus (DE-588)4001183-5 gnd |
topic_facet | Nature-inspired algorithms Optimierung Natur Algorithmus |
work_keys_str_mv | AT yangxinshe natureinspiredoptimizationalgorithms |
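The field list above is the Solr search-index document for this record. As an illustration only, such a document could be fetched by querying the index on its id field; the host, port, and core name ("biblio", the VuFind default) below are assumptions about a local installation, not details taken from this page.

# Minimal sketch: fetching this record's Solr index document by id.
# Assumes a local VuFind-style Solr instance at localhost:8983 with the
# default "biblio" core; adjust host, port, and core name to your setup.
import json
from urllib.parse import quote
from urllib.request import urlopen

query = quote('id:"DE-604.BV047441967"')
url = f"http://localhost:8983/solr/biblio/select?q={query}&wt=json"
with urlopen(url) as response:
    doc = json.load(response)["response"]["docs"][0]
print(doc.get("id"), "-", doc.get("title"))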