


Theory of Global Random Search

ISBN-13: 9780792311225 / English / Hardcover / 1991 / 341 pp.

A. A. Zhigliavskii; Anatoly A. Zhigljavsky; Janos D. Pinter
Price: 201.72 zł
(net: 192.11 zł, VAT: 5%)

Lowest price over the last 30 days: 192.74 zł
Order fulfilment time:
approx. 22 working days
No guarantee of delivery before the holidays

Free delivery!

'Et moi, ..., si j'avais su comment en revenir, je n'y serais point allé.' (Jules Verne)

One service mathematics has rendered the human race. It has put common sense back where it belongs, on the topmost shelf next to the dusty canister labelled 'discarded nonsense'. (Eric T. Bell)

The series is divergent; therefore we may be able to do something with it. (O. Heaviside)

Mathematics is a tool for thought. A highly necessary tool in a world where both feedback and nonlinearities abound. Similarly, all kinds of parts of mathematics serve as tools for other parts and for other sciences. Applying a simple rewriting rule to the quote on the right above one finds such statements as: 'One service topology has rendered mathematical physics ...'; 'One service logic has rendered computer science ...'; 'One service category theory has rendered mathematics ...'. All arguably true. And all statements obtainable this way form part of the raison d'être of this series.

Categories:
Science, Mathematics
BISAC categories:
Mathematics > Probability & Statistics
Mathematics > Applied Mathematics
Business & Economics > Operations Research
Publisher:
Springer
Series:
Mathematics and Its Applications. Soviet Series
Language:
English
ISBN-13:
9780792311225
Publication year:
1991
Edition:
1991
Series number:
000151467
Pages:
341
Weight:
1.52 kg
Dimensions:
23.5 x 15.5 cm
Binding:
Hardcover
Volumes:
1

Table of contents:

Part 1. Global Optimization: An Overview
1. Global Optimization Theory: General Concepts
  1.1. Statements of the global optimization problem
  1.2. Types of prior information about the objective function and a classification of methods
    1.2.1. Types of prior information
    1.2.2. Classification of principal approaches and methods of the global optimization
    1.2.3. General properties of multiextremal functions
  1.3. Comparison and practical use of global optimization algorithms
    1.3.1. Numerical comparison
    1.3.2. Theoretical comparison criteria
    1.3.3. Practical optimization problems
2. Global Optimization Methods
  2.1. Global optimization algorithms based on the use of local search techniques
    2.1.1. Local optimization algorithms
    2.1.2. Use of local algorithms in constructing global optimization strategies
    2.1.3. Multistart
    2.1.4. Tunneling algorithms
    2.1.5. Methods of transition from one local minimizer into another
    2.1.6. Algorithms based on smoothing the objective function
  2.2. Set covering methods
    2.2.1. Grid algorithms (passive coverings)
    2.2.2. Sequential covering methods
    2.2.3. Optimality of global minimization algorithms
  2.3. One-dimensional optimization, reduction and partition techniques
    2.3.1. One-dimensional global optimization
    2.3.2. Dimension reduction in multiextremal problems
    2.3.3. Reducing global optimization to other problems in computational mathematics
    2.3.4. Branch and bound methods
  2.4. An approach based on stochastic and axiomatic models of the objective function
    2.4.1. Stochastic models
    2.4.2. Global optimization methods based on stochastic models
    2.4.3. The Wiener process case
    2.4.4. Axiomatic approach
    2.4.5. Information-statistical approach
Part 2. Global Random Search
3. Main Concepts and Approaches of Global Random Search
  3.1. Construction of global random search algorithms: basic approaches
    3.1.1. Uniform random sampling
    3.1.2. General (nonuniform) random sampling
    3.1.3. Ways of improving the efficiency of random sampling algorithms
    3.1.4. Random coverings
    3.1.5. Formal scheme of global random search
    3.1.6. Local behaviour of global random search algorithms
  3.2. General results on the convergence of global random search algorithms
  3.3. Markovian algorithms
    3.3.1. General scheme of Markovian algorithms
    3.3.2. Simulated annealing
    3.3.3. Methods based on solving stochastic differential equations
    3.3.4. Global stochastic approximation: Zielinski's method
    3.3.5. Convergence rate of Baba's algorithm
    3.3.6. The case of high dimension
4. Statistical Inference in Global Random Search
  4.1. Some ways of applying statistical procedures to construct global random search algorithms
    4.1.1. Regression analysis and design
    4.1.2. Cluster analysis and pattern recognition
    4.1.3. Estimation of the cumulative distribution function, its density, mode and level surfaces
    4.1.4. Statistical modelling (Monte Carlo method)
    4.1.5. Design of experiments
  4.2. Statistical inference concerning the maximum of a function
    4.2.1. Statement of the problem and a survey of the approaches for its solution
    4.2.2. Statistical inference construction for estimating M
    4.2.3. Statistical inference for M, when the value of the tail index α is known
    4.2.4. Statistical inference, when the value of the tail index α is unknown
    4.2.5. Estimation of F(t)
    4.2.6. Prior determination of the value of the tail index α
    4.2.7. Exponential complexity of the uniform random sampling algorithm
  4.3. Branch and probability bound methods
    4.3.1. Prospectiveness criteria
    4.3.2. The essence of branch and bound procedures
    4.3.3. Principal construction of branch and probability bound methods
    4.3.4. Typical variants of the branch and probability bound method
  4.4. Stratified sampling
    4.4.1. Organization of stratified sampling
    4.4.2. Statistical inference for the maximum of a function based on its values at the points of stratified sample
    4.4.3. Dominance of stratified over independent sampling
  4.5. Statistical inference in random multistart
    4.5.1. Problem statement
    4.5.2. Bounded number of local maximizers
    4.5.3. Bayesian approach
  4.6. An approach based on distributions neutral to the right
    4.6.1. Random distributions neutral to the right and their properties
    4.6.2. Bayesian testing about quantiles of random distributions
    4.6.3. Application of distributions neutral to the right to construct global random search algorithms
5. Methods of Generations
  5.1. Description of algorithms and formulation of the basic probabilistic model
    5.1.1. Algorithms
    5.1.2. The basic probabilistic model
  5.2. Convergence of probability measure sequences generated by the basic model
    5.2.1. Assumptions
    5.2.2. Auxiliary statements
    5.2.3. Convergence of the sequences (5.2.7) and (5.2.8) to the limiting measure
  5.3. Methods of generations for eigen-measure functional estimation of linear integral operators
    5.3.1. Eigen-measures of linear integral operators
    5.3.2. Closeness of eigen-measures to the limiting measure
    5.3.3. Description of the generation methods
    5.3.4. Convergence and rate of convergence of the generation methods
  5.4. Sequential analogues of the methods of generations
    5.4.1. Functionals of eigen-measures
    5.4.2. Sequential maximization algorithms
    5.4.3. Narrowing the search area
6. Random Search Algorithms for Solving Specific Problems
  6.1. Distribution sampling in random search algorithms for solving constrained optimization problems
    6.1.1. Basic concepts
    6.1.2. Properties of D(x)
    6.1.3. General remarks on sampling
    6.1.4. Manifold defined by linear constraints
    6.1.5. Uniform distribution on an ellipsoid
    6.1.6. Sampling on a hyperboloid
    6.1.7. Sampling on a paraboloid
    6.1.8. Sampling on a cone
  6.2. Random search algorithm construction for optimization in functional spaces, in discrete and in multicriterial problems
    6.2.1. Optimization in functional spaces
    6.2.2. Random search in multicriterial optimization problems
    6.2.3. Discrete optimization
    6.2.4. Relative efficiency of discrete random search
Part 3. Auxiliary Results
7. Statistical Inference for the Bounds of Random Variables
  7.1. Statistical inference when the tail index of the extreme value distribution is known
    7.1.1. Motivation and problem statement
    7.1.2. Auxiliary statements
    7.1.3. Estimation of M
    7.1.4. Confidence intervals for M
    7.1.5. Testing statistical hypotheses about M
  7.2. Statistical inference when the tail index is unknown
    7.2.1. Statistical inference for M
    7.2.2. Estimation of α
    7.2.3. Construction of confidence intervals and statistical hypothesis test for α
  7.3. Asymptotic properties of optimal linear estimates
    7.3.1. Results and consequences
    7.3.2. Auxiliary statements and proofs of Theorem 7.3.2 and Proposition 7.1.3
    7.3.3. Proof of Theorem 7.3.1
8. Several Problems Connected with Global Random Search
  8.1. Optimal design in extremal experiments
    8.1.1. Extremal experiment design
    8.1.2. Optimal selection of the search direction
    8.1.3. Experimental design applying the search direction (8.1.15)
  8.2. Optimal simultaneous estimation of several integrals by the Monte Carlo method
    8.2.1. Problem statement
    8.2.2. Assumptions
    8.2.3. Existence and uniqueness of optimal densities
    8.2.4. Necessary and sufficient optimality conditions
    8.2.5. Construction and structure of optimal densities
    8.2.6. Structure of optimal densities for nondifferentiable criteria
    8.2.7. Connection with the regression design theory
  8.3. Projection estimation of multivariate regression
    8.3.1. Problem statement
    8.3.2. Bias and random inaccuracies of nonparametric estimates
    8.3.3. Examples of asymptotically optimal projection procedures with deterministic designs
    8.3.4. Projection estimation via evaluations at random points
References
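The baseline method of Part 2, uniform random sampling (Section 3.1.1), draws points uniformly in the search region and keeps the record objective value. A minimal sketch of that idea, not taken from the book; the function name `pure_random_search` and the toy objective are assumptions for illustration:

```python
import random

def pure_random_search(f, bounds, n_samples=10_000, seed=0):
    """Uniform random sampling (pure random search): draw points
    uniformly in a box and keep the record value of f. The record
    converges to the global maximum in probability as n_samples grows."""
    rng = random.Random(seed)
    best_x, best_val = None, float("-inf")
    for _ in range(n_samples):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        fx = f(x)
        if fx > best_val:
            best_x, best_val = x, fx
    return best_x, best_val

# Toy objective with global maximum 0 at (1, 1), searched on [-5, 5]^2.
f = lambda x: -((x[0] - 1) ** 2 + (x[1] - 1) ** 2)
x_star, m_hat = pure_random_search(f, [(-5, 5), (-5, 5)])
```

With 10,000 uniform draws over the 10 x 10 box, the record value falls in a small neighbourhood of the maximum with overwhelming probability; Chapter 4 of the book is concerned with sharpening this naive record estimate of the maximum M using extreme-value statistics.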



© 1997-2022 krainaksiazek.pl