

Information Measures: Information and Its Description in Science and Engineering

ISBN-13: 9783540408550 / English / Paperback / 2003 / 548 pp.

Christoph Arndt
Price: 441.75
(net: 420.71, VAT: 5%)

Lowest price in the last 30 days: 424.07
Order processing time:
approx. 22 working days
Delivery in 2026

Free delivery!

This book is intended as an introduction to the mathematical description of information in science. The necessary mathematical theory is treated in a more vivid way than in the usual theorem-proof structure, which enables us to develop an idea of the connections between different information measures and to understand the trains of thought in their derivation, a crucial point for correct applications. Our intention in the mathematical descriptions is therefore to bring out the important ideas behind the derivations, so that we obtain not only the resulting functions but also the main lines of reasoning and the conditions under which the results are valid. This simplifies the handling of the information measures, which are sometimes hard to classify without additional background. Although the mathematical descriptions are the exact formulations of the measures examined, we do not restrict ourselves to rigorous mathematical considerations; we also place the different measures within the structure and context of possible information measures. Nevertheless, the mathematical approach is unavoidable when we look for an objective description and for possible applications in optimization.

Categories:
Technology
BISAC categories:
Technology & Engineering > Electrical
Science > Reference
Computers > Information Theory
Publisher:
Springer
Series:
Signals and Communication Technology (Hardcover)
Language:
English
ISBN-13:
9783540408550
Publication year:
2003
Edition:
Softcover Reprint
Series number:
000352380
Number of pages:
548
Weight:
0.84 kg
Dimensions:
22.96 x 15.44 x 3.23 cm
Binding:
Paperback
Volumes:
1
Additional information:
Bibliography
Illustrated edition

"Bioinformaticians are facing the challenge of how to handle immense amounts of raw data, such as are generated from genome mapping, make sense of them, and render them accessible to scientists working on a wide variety of problems. "Information Measures: Information and its Description in Science and Engineering" can be such a tool."

IEEE Engineering in Medicine and Biology

Abstract.- Structure and Structuring.
1 Introduction.- Science and information.- Man as control loop.- Information, complexity and typical sequences.- Concepts of information.- Information, its technical dimension and the meaning of a message.- Information as a central concept.
2 Basic considerations.- 2.1 Formal derivation of information.- 2.1.1 Unit and reference scale.- 2.1.2 Information and the unit element.- 2.2 Application of the information measure (Shannon's information).- 2.2.1 Summary.- 2.3 The law of Weber and Fechner.- 2.4 Information of discrete random variables.
3 Historic development of information theory.- 3.1 Development of information transmission.- 3.1.1 Samuel F. B. Morse 1837.- 3.1.2 Thomas Edison 1874.- 3.1.3 Nyquist 1924.- 3.1.4 Optimal number of characters of the alphabet used for the coding.- 3.2 Development of information functions.- 3.2.1 Hartley 1928.- 3.2.2 Dennis Gabor 1946.- 3.2.3 Shannon 1948.- 3.2.3.1 Validity of the postulates for Shannon's information.- 3.2.3.2 Shannon's information (another possibility of a derivation).- 3.2.3.3 Properties of Shannon's information, entropy.- 3.2.3.4 Shannon's entropy or Shannon's information.- 3.2.3.5 The Kraft inequality.- Kraft's inequality.- Proof of Kraft's inequality.- 3.2.3.6 Limits of the optimal length of codewords.- 3.2.3.6.1 Shannon's coding theorem.- 3.2.3.6.2 A sequence of n symbols (elements).- 3.2.3.6.3 Application of the previous results.- 3.2.3.7 Information and utility (coding, portfolio analysis).
4 The concept of entropy in physics.- The laws of thermodynamics.- 4.1 Macroscopic entropy.- 4.1.1 Sadi Carnot 1824.- 4.1.2 Clausius's entropy 1850.- 4.1.3 Increase of entropy in a closed system.- 4.1.4 Prigogine's entropy.- 4.1.5 Entropy balance equation.- 4.1.6 Gibbs's free energy and the quality of the energy.- 4.1.7 Considerations on the macroscopic entropy.- 4.1.7.1 Irreversible transformations.- 4.1.7.2 Perpetuum mobile and transfer of heat.- 4.2 Statistical entropy.- 4.2.1 Boltzmann's entropy.- 4.2.2 Derivation of Boltzmann's entropy.- 4.2.2.1 Variation, permutation and the formula of Stirling.- 4.2.2.2 Special case: Two states.- 4.2.2.3 Example: Lottery.- 4.2.3 The Boltzmann factor.- 4.2.4 Maximum entropy in equilibrium.- 4.2.5 Statistical interpretation of entropy.- 4.2.6 Examples regarding statistical entropy.- 4.2.6.1 Energy and fluctuation.- 4.2.6.2 Quantized oscillator.- 4.2.7 Brillouin-Schrödinger negentropy.- 4.2.7.1 Brillouin: Precise definition of information.- 4.2.7.2 Negentropy as a generalization of Carnot's principle.- Maxwell's demon.- 4.2.8 Information measures of Hartley and Boltzmann.- 4.2.8.1 Examples.- 4.2.9 Shannon's entropy.- 4.3 Dynamic entropy.- 4.3.1 Eddington and the arrow of time.- 4.3.2 Kolmogorov's entropy.- 4.3.3 Rényi's entropy.
5 Extension of Shannon's information.- 5.1 Rényi's Information 1960.- 5.1.1 Properties of Rényi's entropy.- 5.1.2 Limits in the interval 0 ≤ α < ∞.- 5.1.3 Nonnegativity for discrete events.- 5.1.4 Additivity and a connection to Minkowski's norm.- 5.1.5 The meaning of Sα(A) for α → 1.- 5.1.6 Graphical presentations of Rényi's information.- 5.2 Another generalized entropy (logical expansion).- 5.3 Gain of information via conditional probabilities.- 5.4 Other entropy or information measures.- 5.4.1 Daroczy's entropy.- 5.4.2 Quadratic entropy.- 5.4.3 R-norm entropy.
6 Generalized entropy measures.- 6.1 The corresponding measures of divergence.- 6.2 Weighted entropies and expectation values of entropies.
7 Information functions and gaussian distributions.- 7.1 Rényi's information of a gaussian distributed random variable.- 7.1.1 Rényi's α-information.- 7.1.2 Rényi's G-divergence.- 7.2 Shannon's information.
8 Shannon's information of discrete probability distributions.- 8.1 Continuous and discrete random variables.- 8.1.1 Summary.- 8.2 Shannon's information of a gaussian distribution.- 8.3 Shannon's information as the possible gain of information in an observation.- 8.4 Limits of the information, limitations of the resolution.- 8.4.1 The resolution or the precision of the measurements.- 8.4.2 The uncertainty relation of the Fourier transformation.- 8.5 Maximization of the entropy of a continuous random variable.
9 Information functions for gaussian distributions part II.- 9.1 Kullback's information.- 9.1.1 G1 for gaussian distribution densities.- 9.2 Kullback's divergence.- 9.2.1 Jensen's inequality for G1.- 9.3 Kolmogorov's information.- 9.4 Transformation of the coordinate system and the effects on the information.- 9.4.1 Sα-information.- 9.4.2 G-divergence.- 9.4.3 S-information.- 9.4.3.1 Example.- 9.4.4 Discrimination information.- 9.4.5 Kolmogorov's information.- 9.4.6 Prerequisites for the transformations.- 9.5 Transformation, discrete and continuous measures of entropy.- 9.6 Summary of the information functions.
10 Bounds of the variance.- 10.1 Cramér-Rao bound.- 10.1.1 Fisher's information for gaussian distribution densities.- 10.1.2 Fisher's information and Kullback's information.- 10.1.3 Fisher's information and the metric tensor.- 10.1.4 Fisher's information and the stochastic observability.- 10.1.4.1 Fisher's information and the Matrix-Riccati equation.- 10.1.5 Fisher's information and maximum likelihood estimation.- 10.1.6 Fisher's information and weighted least-squares estimation.- 10.1.7 The availability of the Cramér-Rao bound.- 10.1.8 Efficiency, asymptotic efficiency, consistency, bias.- 10.1.8.1 Unbiased estimator.- 10.1.8.2 Consistency.- 10.1.8.3 Efficiency.- 10.1.9 Summary.- 10.2 Chapman-Robbins bound.- 10.2.1 Cramér-Rao bound versus Chapman-Robbins bound.- 10.3 Bhattacharyya bound.- Remark.- Remark.- 10.3.1 Bhattacharyya bound and Cramér-Rao bound.- 10.3.2 Bhattacharyya's bound for gaussian distribution densities.- 10.4 Barankin bound.- 10.5 Other bounds.- Fraser-Guttman bound.- Kiefer bound.- Extended Fraser-Guttman bound.- 10.6 Summary.- 10.7 Biased estimator.- 10.7.1 Biased estimator versus unbiased estimator.
11 Ambiguity function.- 11.1 The ambiguity function and Kullback's information.- 11.2 Connection between ambiguity function and Fisher's information.- 11.3 Maximum likelihood estimation and the ambiguity function.- 11.3.1 Maximum likelihood estimation = minimum Kullback estimation = maximum ambiguity estimation = minimum variance estimation.- 11.3.2 Maximum likelihood estimation.- 11.3.2.1 Application: Discriminator (Demodulation).- 11.4 The ML estimation is asymptotically efficient.- 11.5 Transition to the Akaike information criterion.
12 Akaike's information criterion.- 12.1 Akaike's information criterion and regression.- 12.1.1 Least-squares regression.- 12.1.2 Application of the results to the ambiguity function.- 12.2 BIC, SC or HQ.
13 Channel information.- 13.1 Redundancy.- 13.1.1 Knowledge, redundancy, utility.- 13.2 Rate of transmission and equivocation.- 13.3 Hadamard's inequality and Gibbs's second theorem.- 13.4 Kolmogorov's information.- 13.5 Kullback's divergence.- 13.6 An example of a transmission.- 13.7 Communication channel and information processing.- 13.7.1 Semantic, syntactic and pragmatic information.- 13.7.2 Information, first-time occurrence, confirmation.- 13.8 Shannon's bound.- 13.9 Example of the channel capacity.
14 'Deterministic' and stochastic information.- 14.1 Information in state space models.- 14.2 The observation equation.- 14.3 Transmission faster than light.- 14.4 Information about state space variables.
15 Maximum entropy estimation.- 15.1 The difference between maximum entropy and minimum variance.- 15.2 The difference from bootstrap or resampling methods.- 15.3 A maximum entropy example.- 15.4 Maximum entropy: The method.- 15.4.1 Maximum Shannon entropy.- 15.4.2 Minimum Kullback-Leibler distance.- 15.5 Maximum entropy and minimum discrimination information.- 15.6 Generation of generalized entropy measures.- 15.6.1 Example: Gaussian distribution and Shannon's information.
16 Concluding remarks.- 16.1 Information, entropy and self-organization.- 16.2 Complexity theory.- 16.3 Data reduction.- 16.4 Cryptology.- 16.5 Concluding considerations.- 16.5.1 Information, entropy and probability.- 16.6 Information.
A.1 Inequality for Kullback's information.- A.2 The log-sum inequality.- A.3 Generalized entropy, divergence and distance measures.- A.3.1 Entropy measures.- A.3.2 Generalized measures of distance.- A.3.3 Generalized measures of the directed divergence.- A.3.4 Generalized measures of divergence.- A.3.4.1 Information radius and the J-divergence.- A.3.4.2 Generalization of the R-divergence.- A.3.4.3 Generalization of the J-divergence.- A.4 A short introduction to probability theory.- A.4.1 Axiomatic definition of probability.- A.4.1.1 Events, elementary events, sample space.- A.4.1.2 Classes of subsets, fields.- A.4.1.3 Axiomatic definition of probability according to Kolmogorov.- Probability space.- A.4.1.4 Random variables.- A.4.1.5 Probability distribution.- A.4.1.6 Probability space, sample space, realization space.- A.4.1.7 Probability distribution and distribution density function.- A.4.1.8 Probability distribution density function (PDF).- A.5 The regularity conditions.- A.6 State space description.
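For orientation, the central quantities in chapters 5, 7 and 8 of the contents above are Shannon's and Rényi's entropies. Their standard textbook definitions for a discrete distribution p = (p_1, ..., p_n), stated here for reference rather than quoted from the book, are (in LaTeX notation):

H(p) = -\sum_i p_i \log p_i, \qquad S_\alpha(p) = \frac{1}{1-\alpha} \log \sum_i p_i^{\alpha}, \quad \alpha > 0,\ \alpha \neq 1,

with \lim_{\alpha \to 1} S_\alpha(p) = H(p), so Shannon's information is recovered as the limiting case of Rényi's measure.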

Information Measures introduces the mathematical description of information in science and engineering, treating the necessary mathematical theory in a more vivid way than in the usual theorem-proof structure. This enables readers to develop an idea of the connections between different information measures and to understand the trains of thought in their derivation. As there are a great number of possible ways to describe information, these measures are presented in a coherent manner. Examples of the measures treated include: Shannon information, applied in coding theory; the Akaike information criterion, used in system identification to determine auto-regressive models and in neural networks to identify the number of neurons; and the Cramér-Rao bound or Fisher information, describing the minimal variances achievable by unbiased estimators. "A tool [to show] bioinformaticians... how to handle immense amounts of raw data, such as are generated from genome mapping, make sense of them, and render them accessible to scientists." (IEEE Engineering in Medicine and Biology)

This book is an introduction to the mathematical description of information in science and engineering. The necessary mathematical theory is treated in a more vivid way than in the usual theorem-proof structure. This enables the reader to develop an idea of the connections between different information measures and to understand the trains of thought in their derivation. As there exist a great number of possible ways to describe information, these measures are presented in a coherent manner. Some examples of the information measures examined are: Shannon information, applied in coding theory; the Akaike information criterion, used in system identification to determine auto-regressive models and in neural networks to identify the number of neurons; and the Cramér-Rao bound or Fisher information, describing the minimal variances achieved by unbiased estimators. This softcover edition addresses researchers and students in electrical engineering, particularly in control and communications, physics, and applied mathematics.
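Two of the measures named above, Shannon information and the Cramér-Rao bound / Fisher information, can be made concrete with a small sketch. The snippet below is not code from the book; it is a minimal illustration using NumPy, and the helper names (shannon_entropy, cramer_rao_bound_gaussian_mean) are ad-hoc choices made here. It computes the Shannon entropy of a discrete distribution and the Cramér-Rao lower bound for estimating the mean of a Gaussian from n independent samples.

import numpy as np


def shannon_entropy(p, base=2.0):
    """Shannon information H(p) = -sum_i p_i * log(p_i) of a discrete
    probability distribution, in the given logarithm base
    (base 2 -> bits, base e -> nats)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # terms with p_i = 0 contribute nothing to the sum
    return float(-np.sum(p * np.log(p)) / np.log(base))


def cramer_rao_bound_gaussian_mean(sigma, n):
    """Cramér-Rao lower bound on the variance of any unbiased estimator of
    the mean mu of N(mu, sigma^2) from n i.i.d. samples.  The per-sample
    Fisher information for mu is 1 / sigma^2, so the bound is sigma^2 / n."""
    fisher_information = n / sigma**2
    return 1.0 / fisher_information


if __name__ == "__main__":
    # A fair coin carries exactly 1 bit of information per toss ...
    print(shannon_entropy([0.5, 0.5]))        # 1.0
    # ... a heavily biased coin carries much less.
    print(shannon_entropy([0.9, 0.1]))        # ~0.469
    # With sigma = 2 and n = 100 samples, no unbiased estimator of the mean
    # can have variance below sigma^2 / n = 0.04; the sample mean attains it.
    print(cramer_rao_bound_gaussian_mean(sigma=2.0, n=100))  # 0.04

The base argument switches between bits (base 2) and nats (natural logarithm); the Fisher information value 1/sigma^2 used here is the standard result for a Gaussian with known variance.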

Christoph Arndt is an assistant professor in the Depa...

