|
|
|
|
LEADER |
00000cam a2200000 i 4500 |
001 |
SCIDIR_ocn866583452 |
003 |
OCoLC |
005 |
20231117044941.0 |
006 |
m o d |
007 |
cr ||||||||||| |
008 |
131115s2014 ne ob 000 0 eng d |
040 |
|
|
|a UKMGB
|b eng
|e pn
|c UKMGB
|d OPELS
|d YDXCP
|d TEFOD
|d CHVBK
|d IDEBK
|d B24X7
|d OCLCQ
|d COO
|d OCLCO
|d OCLCF
|d TPH
|d TEFOD
|d OCLCQ
|d REB
|d OCLCO
|d AU@
|d VT2
|d EBLCP
|d UMC
|d DEBSZ
|d OCLCQ
|d IOG
|d OCLCO
|d Z5A
|d LIV
|d OCLCQ
|d ESU
|d YDX
|d OCLCQ
|d OCLCO
|d MERUC
|d OCLCO
|d OCLCA
|d FEM
|d OCLCO
|d OCLCA
|d U3W
|d D6H
|d STF
|d CUY
|d ZCU
|d ICG
|d K6U
|d OTZ
|d CNCEN
|d OCLCQ
|d OCLCO
|d WYU
|d OCLCO
|d OCLCA
|d S8J
|d TKN
|d OCLCA
|d AUD
|d DKC
|d OCLCQ
|d UAB
|d LQU
|d OCLCQ
|d ERF
|d OCLCQ
|d OCLCO
|d OCLCQ
|d OCLCO
|
066 |
|
|
|c (S
|
016 |
7 |
|
|a 016585959
|2 Uk
|
016 |
7 |
|
|a 016584327
|2 Uk
|
019 |
|
|
|a 871224210
|a 880315949
|a 887852028
|a 969036048
|a 1026441447
|a 1055390780
|a 1065811010
|a 1081297038
|a 1103277579
|a 1105175873
|a 1105568376
|a 1129344318
|a 1153004073
|
020 |
|
|
|a 9780124167452
|q (electronic bk.)
|
020 |
|
|
|a 0124167454
|q (electronic bk.)
|
020 |
|
|
|z 9780124167438
|
020 |
|
|
|a 0124167438
|
020 |
|
|
|a 9780124167438
|
024 |
8 |
|
|a ebc1637335
|
035 |
|
|
|a (OCoLC)866583452
|z (OCoLC)871224210
|z (OCoLC)880315949
|z (OCoLC)887852028
|z (OCoLC)969036048
|z (OCoLC)1026441447
|z (OCoLC)1055390780
|z (OCoLC)1065811010
|z (OCoLC)1081297038
|z (OCoLC)1103277579
|z (OCoLC)1105175873
|z (OCoLC)1105568376
|z (OCoLC)1129344318
|z (OCoLC)1153004073
|
050 |
|
4 |
|a QA402.5
|
060 |
|
4 |
|a Online Book
|
082 |
0 |
4 |
|a 519.6
|2 23
|
100 |
1 |
|
|a Yang, Xin-She,
|e author.
|
245 |
1 |
0 |
|a Nature-inspired optimization algorithms /
|c by Xin-She Yang.
|
264 |
|
1 |
|a Amsterdam :
|b Elsevier,
|c 2014.
|
300 |
|
|
|a 1 online resource
|
336 |
|
|
|a text
|b txt
|2 rdacontent
|
337 |
|
|
|a computer
|b c
|2 rdamedia
|
338 |
|
|
|a online resource
|b cr
|2 rdacarrier
|
347 |
|
|
|a text file
|
490 |
1 |
|
|a Elsevier insights
|
520 |
|
|
|a Nature-Inspired Optimization Algorithms provides a systematic introduction to all major nature-inspired algorithms for optimization. The book's unified approach, balancing algorithm introduction, theoretical background and practical implementation, complements extensive literature with well-chosen case studies to illustrate how these algorithms work. Topics include particle swarm optimization, ant and bee algorithms, simulated annealing, cuckoo search, firefly algorithm, bat algorithm, flower algorithm, harmony search, algorithm analysis, constraint handling, hybrid methods, parameter tuning and control, as well as multi-objective optimization. This book can serve as an introductory book for graduates, doctoral students and lecturers in computer science, engineering and natural sciences. It can also serve as a source of inspiration for new applications. Researchers and engineers as well as experienced experts will also find it a handy reference. Discusses and summarizes the latest developments in nature-inspired algorithms with comprehensive, timely literature. Provides a theoretical understanding as well as practical implementation hints. Provides a step-by-step introduction to each algorithm.
|
588 |
0 |
|
|a CIP data: resource not viewed.
|
504 |
|
|
|a Includes bibliographical references.
|
505 |
0 |
|
|6 880-01
|a 1. Introduction to algorithms -- 2. Analysis of algorithms -- 3. Random walks and optimization -- 4. Simulated annealing -- 5. Genetic algorithms -- 6. Differential evolution -- 7. Particle swarm optimization -- 8. Firefly algorithms -- 9. Cuckoo search -- 10. Bat algorithms -- 11. Flower pollination algorithms -- 12. A framework for self-tuning algorithms -- 13. How to deal with constraints -- 14. Multi-objective optimization -- 15. Other algorithms and hybrid algorithms -- Appendices.
|
650 |
|
0 |
|a Mathematical optimization.
|
650 |
|
0 |
|a Algorithms.
|
650 |
1 |
2 |
|a Algorithms
|0 (DNLM)D000465
|
650 |
|
4 |
|a Algorithms.
|
650 |
|
4 |
|a Mathematical optimization.
|
650 |
|
6 |
|a Optimisation mathématique.
|0 (CaQQLa)201-0007680
|
650 |
|
6 |
|a Algorithmes.
|0 (CaQQLa)201-0001230
|
650 |
|
7 |
|a algorithms.
|2 aat
|0 (CStmoGRI)aat300065585
|
650 |
|
7 |
|a Algorithms
|2 fast
|0 (OCoLC)fst00805020
|
650 |
|
7 |
|a Mathematical optimization
|2 fast
|0 (OCoLC)fst01012099
|
650 |
|
7 |
|a Optimierung
|2 gnd
|0 (DE-588)4043664-0
|
650 |
|
7 |
|a Algorithmus
|2 gnd
|0 (DE-588)4001183-5
|
650 |
|
7 |
|a Bionik
|2 gnd
|0 (DE-588)4006888-2
|
650 |
|
7 |
|a Evolutionärer Algorithmus
|2 gnd
|0 (DE-588)4366912-8
|
650 |
|
7 |
|a Schwarmintelligenz
|2 gnd
|0 (DE-588)4793676-9
|
776 |
0 |
8 |
|i Print version:
|a Yang, Xin-She.
|t Nature-Inspired Optimization Algorithms.
|d Burlington : Elsevier Science, ©2014
|z 9780124167438
|
830 |
|
0 |
|a Elsevier insights.
|
856 |
4 |
0 |
|u https://sciencedirect.uam.elogim.com/science/book/9780124167438
|z Texto completo
|
880 |
0 |
0 |
|6 505-01/(S
|g Machine generated contents note:
|g 1.
|t Introduction to Algorithms --
|g 1.1.
|t What is an Algorithm? --
|g 1.2.
|t Newton's Method --
|g 1.3.
|t Optimization --
|g 1.3.1.
|t Gradient-Based Algorithms --
|g 1.3.2.
|t Hill Climbing with Random Restart --
|g 1.4.
|t Search for Optimality --
|g 1.5.
|t No-Free-Lunch Theorems --
|g 1.5.1.
|t NFL Theorems --
|g 1.5.2.
|t Choice of Algorithms --
|g 1.6.
|t Nature-Inspired Metaheuristics --
|g 1.7.
|t A Brief History of Metaheuristics --
|t References --
|g 2.
|t Analysis of Algorithms --
|g 2.1.
|t Introduction --
|g 2.2.
|t Analysis of Optimization Algorithms --
|g 2.2.1.
|t Algorithm as an Iterative Process --
|g 2.2.2.
|t An Ideal Algorithm? --
|g 2.2.3.
|t A Self-Organization System --
|g 2.2.4.
|t Exploration and Exploitation --
|g 2.2.5.
|t Evolutionary Operators --
|g 2.3.
|t Nature-Inspired Algorithms --
|g 2.3.1.
|t Simulated Annealing --
|g 2.3.2.
|t Genetic Algorithms --
|g 2.3.3.
|t Differential Evolution --
|g 2.3.4.
|t Ant and Bee Algorithms --
|g 2.3.5.
|t Particle Swarm Optimization --
|g 2.3.6.
|t The Firefly Algorithm --
|g 2.3.7.
|t Cuckoo Search --
|g 2.3.8.
|t The Bat Algorithm --
|g 2.3.9.
|t Harmony Search --
|g 2.3.10.
|t The Flower Algorithm --
|g 2.3.11.
|t Other Algorithms --
|g 2.4.
|t Parameter Tuning and Parameter Control --
|g 2.4.1.
|t Parameter Tuning --
|g 2.4.2.
|t Hyperoptimization --
|g 2.4.3.
|t Multiobjective View --
|g 2.4.4.
|t Parameter Control --
|g 2.5.
|t Discussions --
|g 2.6.
|t Summary --
|t References --
|g 3.
|t Random Walks and Optimization --
|g 3.1.
|t Random Variables --
|g 3.2.
|t Isotropic Random Walks --
|g 3.3.
|t Levy Distribution and Levy Flights --
|g 3.4.
|t Optimization as Markov Chains --
|g 3.4.1.
|t Markov Chain --
|g 3.4.2.
|t Optimization as a Markov Chain --
|g 3.5.
|t Step Sizes and Search Efficiency --
|g 3.5.1.
|t Step Sizes, Stopping Criteria, and Efficiency --
|g 3.5.2.
|t Why Levy Flights are More Efficient --
|g 3.6.
|t Modality and Intermittent Search Strategy --
|g 3.7.
|t Importance of Randomization --
|g 3.7.1.
|t Ways to Carry Out Random Walks --
|g 3.7.2.
|t Importance of Initialization --
|g 3.7.3.
|t Importance Sampling --
|g 3.7.4.
|t Low-Discrepancy Sequences --
|g 3.8.
|t Eagle Strategy --
|g 3.8.1.
|t Basic Ideas of Eagle Strategy --
|g 3.8.2.
|t Why Eagle Strategy is So Efficient --
|t References --
|g 4.
|t Simulated Annealing --
|g 4.1.
|t Annealing and Boltzmann Distribution --
|g 4.2.
|t Parameters --
|g 4.3.
|t SA Algorithm --
|g 4.4.
|t Unconstrained Optimization --
|g 4.5.
|t Basic Convergence Properties --
|g 4.6.
|t SA Behavior in Practice --
|g 4.7.
|t Stochastic Tunneling --
|t References --
|g 5.
|t Genetic Algorithms --
|g 5.1.
|t Introduction --
|g 5.2.
|t Genetic Algorithms --
|g 5.3.
|t Role of Genetic Operators --
|g 5.4.
|t Choice of Parameters --
|g 5.5.
|t GA Variants --
|g 5.6.
|t Schema Theorem --
|g 5.7.
|t Convergence Analysis --
|t References --
|g 6.
|t Differential Evolution --
|g 6.1.
|t Introduction --
|g 6.2.
|t Differential Evolution --
|g 6.3.
|t Variants --
|g 6.4.
|t Choice of Parameters --
|g 6.5.
|t Convergence Analysis --
|g 6.6.
|t Implementation --
|t References --
|g 7.
|t Particle Swarm Optimization --
|g 7.1.
|t Swarm Intelligence --
|g 7.2.
|t PSO Algorithm --
|g 7.3.
|t Accelerated PSO --
|g 7.4.
|t Implementation --
|g 7.5.
|t Convergence Analysis --
|g 7.5.1.
|t Dynamical System --
|g 7.5.2.
|t Markov Chain Approach --
|g 7.6.
|t Binary PSO --
|t References --
|g 8.
|t Firefly Algorithms --
|g 8.1.
|t The Firefly Algorithm --
|g 8.1.1.
|t Firefly Behavior --
|g 8.1.2.
|t Standard Firefly Algorithm --
|g 8.1.3.
|t Variations of Light Intensity and Attractiveness --
|g 8.1.4.
|t Controlling Randomization --
|g 8.2.
|t Algorithm Analysis --
|g 8.2.1.
|t Scalings and Limiting Cases --
|g 8.2.2.
|t Attraction and Diffusion --
|g 8.2.3.
|t Special Cases of FA --
|g 8.3.
|t Implementation --
|g 8.4.
|t Variants of the Firefly Algorithm --
|g 8.4.1.
|t FA Variants --
|g 8.4.2.
|t How Can We Discretize FA? --
|g 8.5.
|t Firefly Algorithms in Applications --
|g 8.6.
|t Why the Firefly Algorithm is Efficient --
|t References --
|g 9.
|t Cuckoo Search --
|g 9.1.
|t Cuckoo Breeding Behavior --
|g 9.2.
|t Levy Flights --
|g 9.3.
|t Cuckoo Search --
|g 9.3.1.
|t Special Cases of Cuckoo Search --
|g 9.3.2.
|t How to Carry Out Levy Flights --
|g 9.3.3.
|t Choice of Parameters --
|g 9.3.4.
|t Variants of Cuckoo Search --
|g 9.4.
|t Why Cuckoo Search is so Efficient --
|g 9.5.
|t Global Convergence: Brief Mathematical Analysis --
|g 9.6.
|t Applications --
|t References --
|g 10.
|t Bat Algorithms --
|g 10.1.
|t Echolocation of Bats --
|g 10.1.1.
|t Behavior of Microbats --
|g 10.1.2.
|t Acoustics of Echolocation --
|g 10.2.
|t Bat Algorithms --
|g 10.2.1.
|t Movement of Virtual Bats --
|g 10.2.2.
|t Loudness and Pulse Emission --
|g 10.3.
|t Implementation --
|g 10.4.
|t Binary Bat Algorithms --
|g 10.5.
|t Variants of the Bat Algorithm --
|g 10.6.
|t Convergence Analysis --
|g 10.7.
|t Why the Bat Algorithm is Efficient --
|g 10.8.
|t Applications --
|g 10.8.1.
|t Continuous Optimization --
|g 10.8.2.
|t Combinatorial Optimization and Scheduling --
|g 10.8.3.
|t Inverse Problems and Parameter Estimation --
|g 10.8.4.
|t Classifications, Clustering, and Data Mining --
|g 10.8.5.
|t Image Processing --
|g 10.8.6.
|t Fuzzy Logic and Other Applications --
|t References --
|g 11.
|t Flower Pollination Algorithms --
|g 11.1.
|t Introduction --
|g 11.2.
|t Flower Pollination Algorithms --
|g 11.2.1.
|t Characteristics of Flower Pollination --
|g 11.2.2.
|t Flower Pollination Algorithms --
|g 11.3.
|t Multi-Objective Flower Pollination Algorithms --
|g 11.4.
|t Validation and Numerical Experiments --
|g 11.4.1.
|t Single-Objective Test Functions --
|g 11.4.2.
|t Multi-Objective Test Functions --
|g 11.4.3.
|t Analysis of Results and Comparison --
|g 11.5.
|t Applications --
|g 11.5.1.
|t Single-Objective Design Benchmarks --
|g 11.5.2.
|t Multi-Objective Design Benchmarks --
|g 11.6.
|t Further Research Topics --
|t References --
|g 12.
|t A Framework for Self-Tuning Algorithms --
|g 12.1.
|t Introduction --
|g 12.2.
|t Algorithm Analysis and Parameter Tuning --
|g 12.2.1.
|t A General Formula for Algorithms --
|g 12.2.2.
|t Type of Optimality --
|g 12.2.3.
|t Parameter Tuning --
|g 12.3.
|t Framework for Self-Tuning Algorithms --
|g 12.3.1.
|t Hyperoptimization --
|g 12.3.2.
|t A Multi-Objective View --
|g 12.3.3.
|t Self-Tuning Framework --
|g 12.4.
|t A Self-Tuning Firefly Algorithm --
|g 12.5.
|t Some Remarks --
|t References --
|g 13.
|t How to Deal with Constraints --
|g 13.1.
|t Introduction and Overview --
|g 13.2.
|t Method of Lagrange Multipliers --
|g 13.3.
|t KKT Conditions --
|g 13.4.
|t Penalty Method --
|g 13.5.
|t Equality with Tolerance --
|g 13.6.
|t Feasibility Rules and Stochastic Ranking --
|g 13.7.
|t Multi-objective Approach to Constraints --
|g 13.8.
|t Spring Design --
|g 13.9.
|t Cuckoo Search Implementation --
|t References --
|g 14.
|t Multi-Objective Optimization --
|g 14.1.
|t Multi-Objective Optimization --
|g 14.2.
|t Pareto Optimality --
|g 14.3.
|t Weighted Sum Method --
|g 14.4.
|t Utility Method --
|g 14.5.
|t The ε-Constraint Method --
|g 14.6.
|t Metaheuristic Approaches --
|g 14.7.
|t NSGA-II --
|t References --
|g 15.
|t Other Algorithms and Hybrid Algorithms --
|g 15.1.
|t Ant Algorithms --
|g 15.1.1.
|t Ant Behavior --
|g 15.1.2.
|t Ant Colony Optimization --
|g 15.1.3.
|t Virtual Ant Algorithms --
|g 15.2.
|t Bee-Inspired Algorithms --
|g 15.2.1.
|t Honeybee Behavior --
|g 15.2.2.
|t Bee Algorithms --
|g 15.2.3.
|t Honeybee Algorithm --
|g 15.2.4.
|t Virtual Bee Algorithm --
|g 15.2.5.
|t Artificial Bee Colony Optimization --
|g 15.3.
|t Harmony Search --
|g 15.3.1.
|t Harmonics and Frequencies --
|g 15.3.2.
|t Harmony Search --
|g 15.4.
|t Hybrid Algorithms --
|g 15.4.1.
|t Other Algorithms --
|g 15.4.2.
|t Ways to Hybridize --
|g 15.5.
|t Final Remarks --
|t References --
|g Appendix A
|t Test Function Benchmarks for Global Optimization --
|t References --
|g Appendix B
|t Matlab Programs --
|g B.1.
|t Simulated Annealing --
|g B.2.
|t Particle Swarm Optimization --
|g B.3.
|t Differential Evolution --
|g B.4.
|t Firefly Algorithm --
|g B.5.
|t Cuckoo Search --
|g B.6.
|t Bat Algorithm.
|