ISBN-13: 9780470542965 / English / Hardcover / 2013 / 1176 pp.
Originally published in 1968, Harry Van Trees's Detection, Estimation, and Modulation Theory, Part I is one of the great time-tested classics of signal processing. Highly readable and practically organized, it is as imperative today for professionals, researchers, and students in optimum signal processing as it was over forty years ago. The Second Edition is a thorough revision and expansion that nearly doubles the size of the first edition and accounts for developments since its publication, making it once again the most comprehensive and up-to-date treatment of the subject. With applications spanning radar, sonar, communications, seismology, biomedical engineering, and radar astronomy, among others, the important field of detection and estimation has rarely received such expert treatment as it does here. Each chapter includes section summaries, realistic examples, and a large number of challenging problems that provide excellent study material. This volume, Part I of a four-volume set, is the most important and widely used textbook and professional reference in the field.
Preface xv
Preface to the First Edition xix
1 Introduction 1
1.1 Introduction 1
1.2 Topical Outline 1
1.3 Possible Approaches 11
1.4 Organization 14
2 Classical Detection Theory 17
2.1 Introduction 17
2.2 Simple Binary Hypothesis Tests 20
2.2.1 Decision Criteria 20
2.2.2 Performance: Receiver Operating Characteristic 35
2.3 M Hypotheses 51
2.4 Performance Bounds and Approximations 63
2.5 Monte Carlo Simulation 80
2.5.1 Monte Carlo Simulation Techniques 80
2.5.2 Importance Sampling 86
2.5.2.1 Simulation of P_F 87
2.5.2.2 Simulation of P_M 91
2.5.2.3 Independent Observations 94
2.5.2.4 Simulation of the ROC 94
2.5.2.5 Examples 96
2.5.2.6 Iterative Importance Sampling 106
2.5.3 Summary 108
2.6 Summary 109
2.7 Problems 110
3 General Gaussian Detection 125
3.1 Detection of Gaussian Random Vectors 126
3.1.1 Real Gaussian Random Vectors 126
3.1.2 Circular Complex Gaussian Random Vectors 127
3.1.3 General Gaussian Detection 132
3.1.3.1 Real Gaussian Vectors 132
3.1.3.2 Circular Complex Gaussian Vectors 136
3.1.3.3 Summary 137
3.2 Equal Covariance Matrices 138
3.2.1 Independent Components with Equal Variance 142
3.2.2 Independent Components with Unequal Variances 146
3.2.3 General Case: Eigendecomposition 147
3.2.4 Optimum Signal Design 156
3.2.5 Interference Matrix: Estimator Subtractor 160
3.2.6 Low–Rank Models 165
3.2.7 Summary 173
3.3 Equal Mean Vectors 174
3.3.1 Diagonal Covariance Matrix on H_0: Equal Variance 175
3.3.1.1 Independent, Identically Distributed Signal Components 177
3.3.1.2 Independent Signal Components: Unequal Variances 178
3.3.1.3 Correlated Signal Components 179
3.3.1.4 Low–Rank Signal Model 184
3.3.1.5 Symmetric Hypotheses, Uncorrelated Noise 186
3.3.2 Nondiagonal Covariance Matrix on H_0 191
3.3.2.1 Signal on H_1 Only 191
3.3.2.2 Signal on Both Hypotheses 195
3.3.3 Summary 196
3.4 General Gaussian 197
3.4.1 Real Gaussian Model 197
3.4.2 Circular Complex Gaussian Model 198
3.4.3 Single Quadratic Form 201
3.4.4 Summary 208
3.5 M Hypotheses 209
3.6 Summary 213
3.7 Problems 215
4 Classical Parameter Estimation 230
4.1 Introduction 230
4.2 Scalar Parameter Estimation 232
4.2.1 Random Parameters: Bayes Estimation 232
4.2.2 Nonrandom Parameter Estimation 246
4.2.3 Bayesian Bounds 261
4.2.3.1 Lower Bound on the MSE 261
4.2.3.2 Asymptotic Behavior 265
4.2.4 Case Study 268
4.2.5 Exponential Family 279
4.2.5.1 Nonrandom Parameters 279
4.2.5.2 Random Parameters 287
4.2.6 Summary of Scalar Parameter Estimation 292
4.3 Multiple Parameter Estimation 293
4.3.1 Estimation Procedures 293
4.3.1.1 Random Parameters 293
4.3.1.2 Nonrandom Parameters 296
4.3.2 Measures of Error 296
4.3.2.1 Nonrandom Parameters 296
4.3.2.2 Random Parameters 299
4.3.3 Bounds on Estimation Error 299
4.3.3.1 Nonrandom Parameters 299
4.3.3.2 Random Parameters 316
4.3.4 Exponential Family 321
4.3.4.1 Nonrandom Parameters 321
4.3.4.2 Random Parameters 324
4.3.5 Nuisance Parameters 325
4.3.5.1 Nonrandom Parameters 325
4.3.5.2 Random Parameters 326
4.3.5.3 Hybrid Parameters 328
4.3.6 Hybrid Parameters 328
4.3.6.1 Joint ML and MAP Estimation 329
4.3.6.2 Nuisance Parameters 331
4.3.7 Summary of Multiple Parameter Estimation 331
4.4 Global Bayesian Bounds 332
4.4.1 Covariance Inequality Bounds 333
4.4.1.1 Covariance Inequality 333
4.4.1.2 Bayesian Bounds 334
4.4.1.3 Scalar Parameters 334
4.4.1.4 Vector Parameters 340
4.4.1.5 Combined Bayesian Bounds 341
4.4.1.6 Functions of the Parameter Vector 342
4.4.1.7 Summary of Covariance Inequality Bounds 344
4.4.2 Method of Interval Estimation 345
4.4.3 Summary of Global Bayesian Bounds 348
4.5 Composite Hypotheses 348
4.5.1 Introduction 348
4.5.2 Random Parameters 350
4.5.3 Nonrandom Parameters 352
4.5.4 Simulation 372
4.5.5 Summary of Composite Hypotheses 375
4.6 Summary 375
4.7 Problems 377
5 General Gaussian Estimation 400
5.1 Introduction 400
5.2 Nonrandom Parameters 401
5.2.1 General Gaussian Estimation Model 401
5.2.2 Maximum Likelihood Estimation 407
5.2.3 Cramér–Rao Bound 409
5.2.4 Fisher Linear Gaussian Model 412
5.2.4.1 Introduction 412
5.2.4.2 White Noise 418
5.2.4.3 Low–Rank Interference 424
5.2.5 Separable Models for Mean Parameters 429
5.2.6 Covariance Matrix Parameters 442
5.2.6.1 White Noise 443
5.2.6.2 Colored Noise 444
5.2.6.3 Rank One Signal Matrix Plus White Noise 445
5.2.6.4 Rank One Signal Matrix Plus Colored Noise 450
5.2.7 Linear Gaussian Mean and Covariance Matrix Parameters 450
5.2.7.1 White Noise 450
5.2.7.2 Colored Noise 451
5.2.7.3 General Covariance Matrix 452
5.2.8 Computational Algorithms 452
5.2.8.1 Introduction 452
5.2.8.2 Gradient Techniques 453
5.2.8.3 Alternating Projection Algorithm 457
5.2.8.4 Expectation Maximization Algorithm 461
5.2.8.5 Summary 469
5.2.9 Equivalent Estimation Algorithms 469
5.2.9.1 Least Squares 470
5.2.9.2 Minimum Variance Distortionless Response 470
5.2.9.3 Summary 472
5.2.10 Sensitivity, Mismatch, and Diagonal Loading 473
5.2.10.1 Sensitivity and Array Perturbations 474
5.2.10.2 Diagonal Loading 477
5.2.11 Summary 481
5.3 Random Parameters 483
5.3.1 Model, MAP Estimation, and the BCRB 483
5.3.2 Bayesian Linear Gaussian Model 487
5.3.3 Summary 494
5.4 Sequential Estimation 495
5.4.1 Sequential Bayes Estimation 495
5.4.2 Recursive Maximum Likelihood 504
5.4.3 Summary 506
5.5 Summary 507
5.6 Problems 510
6 Representation of Random Processes 519
6.1 Introduction 519
6.2 Orthonormal Expansions: Deterministic Signals 520
6.3 Random Process Characterization 528
6.3.1 Random Processes: Conventional Characterizations 528
6.3.2 Series Representation of Sample Functions of Random Processes 532
6.3.3 Gaussian Processes 536
6.4 Homogeneous Integral Equations and Eigenfunctions 540
6.4.1 Rational Spectra 540
6.4.2 Bandlimited Spectra 545
6.4.3 Nonstationary Processes 548
6.4.4 White Noise Processes 550
6.4.5 Low–Rank Kernels 552
6.4.6 The Optimum Linear Filter 553
6.4.7 Properties of Eigenfunctions and Eigenvalues 559
6.4.7.1 Monotonic Property 559
6.4.7.2 Asymptotic Behavior Properties 560
6.5 Vector Random Processes 564
6.6 Summary 568
6.7 Problems 569
7 Detection of Signals and Estimation of Signal Parameters 584
7.1 Introduction 584
7.1.1 Models 584
7.1.1.1 Detection 584
7.1.1.2 Estimation 587
7.1.2 Format 589
7.2 Detection and Estimation in White Gaussian Noise 591
7.2.1 Detection of Signals in Additive White Gaussian Noise 591
7.2.1.1 Simple Binary Detection 591
7.2.1.2 General Binary Detection in White Gaussian Noise 597
7.2.1.3 M–ary Detection in White Gaussian Noise 601
7.2.1.4 Sensitivity 611
7.2.2 Linear Estimation 614
7.2.3 Nonlinear Estimation 616
7.2.4 Summary of Known Signals in White Gaussian Noise 628
7.2.4.1 Detection 628
7.2.4.2 Estimation 628
7.3 Detection and Estimation in Nonwhite Gaussian Noise 629
7.3.1 Whitening Approach 632
7.3.1.1 Structures 632
7.3.1.2 Construction of Q_n(t, u) and g(t) 635
7.3.1.3 Summary 639
7.3.2 A Direct Derivation Using the Karhunen–Loève Expansion 639
7.3.3 A Direct Derivation with a Sufficient Statistic 641
7.3.4 Detection Performance 643
7.3.4.1 Performance: Simple Binary Detection Problem 643
7.3.4.2 Optimum Signal Design: Coincident Intervals 644
7.3.4.3 Singularity 645
7.3.4.4 General Binary Receivers 647
7.3.5 Estimation 648
7.3.6 Solution Techniques for Integral Equations 650
7.3.6.1 Infinite Observation Interval: Stationary Noise 650
7.3.6.2 Finite Observation Interval: Rational Spectra 654
7.3.6.3 Finite Observation Time: Separable Kernels 662
7.3.7 Sensitivity, Mismatch, and Diagonal Loading 667
7.3.7.1 Sensitivity 667
7.3.7.2 Mismatch and Diagonal Loading 673
7.3.8 Known Linear Channels 673
7.3.8.1 Summary 675
7.4 Signals with Unwanted Parameters: The Composite Hypothesis Problem 675
7.4.1 Random Phase Angles 677
7.4.2 Random Amplitude and Phase 694
7.4.3 Other Target Models 706
7.4.4 Nonrandom Parameters 709
7.4.4.1 Summary 711
7.5 Multiple Channels 712
7.5.1 Vector Karhunen–Loève 712
7.5.1.1 Application 714
7.6 Multiple Parameter Estimation 716
7.6.1 Known Signal in Additive White Gaussian Noise 717
7.6.2 Separable Models 718
7.6.3 Summary 720
7.7 Summary 721
7.8 Problems 722
8 Estimation of Continuous–Time Random Processes 771
8.1 Optimum Linear Processors 771
8.2 Realizable Linear Filters: Stationary Processes, Infinite Past: Wiener Filters 787
8.2.1 Solution of Wiener–Hopf Equation 788
8.2.2 Errors in Optimum Systems 798
8.2.3 Unrealizable Filters 801
8.2.4 Closed–Form Error Expressions 803
8.3 Gaussian Markov Processes: Kalman Filter 807
8.3.1 Differential Equation Representation of Linear Systems and Random Process Generation 808
8.3.2 Kalman Filter 825
8.3.3 Realizable Whitening Filter 839
8.3.4 Generalizations 841
8.3.5 Implementation Issues 842
8.4 Bayesian Estimation of Non–Gaussian Models 842
8.4.1 The Extended Kalman Filter 843
8.4.1.1 Linear AWGN Process and Observations 844
8.4.1.2 Linear AWGN Process, Nonlinear AWGN Observations 845
8.4.1.3 Nonlinear AWGN Process and Observations 848
8.4.1.4 General Nonlinear Process and Observations 849
8.4.2 Bayesian Cramér–Rao Bounds: Continuous–Time 849
8.4.3 Summary 852
8.5 Summary 852
8.6 Problems 855
9 Estimation of Discrete–Time Random Processes 880
9.1 Introduction 880
9.2 Discrete–Time Wiener Filtering 882
9.2.1 Model 882
9.2.2 Random Process Models 883
9.2.3 Optimum FIR Filters 894
9.2.4 Unrealizable IIR Wiener Filters 900
9.2.5 Realizable IIR Wiener Filters 904
9.2.6 Summary: Discrete–Time Wiener Filter 918
9.3 Discrete–Time Kalman Filter 919
9.3.1 Random Process Models 920
9.3.2 Kalman Filter 926
9.3.2.1 Derivation 927
9.3.2.2 Reduced Dimension Implementations 934
9.3.2.3 Applications 939
9.3.2.4 Estimation in Nonwhite Noise 954
9.3.2.5 Sequential Processing of Estimators 955
9.3.2.6 Square–Root Filters 958
9.3.2.7 Divergence 962
9.3.2.8 Sensitivity and Model Mismatch 966
9.3.2.9 Summary: Kalman Filters 972
9.3.3 Kalman Predictors 973
9.3.3.1 Fixed–Lead Prediction 974
9.3.3.2 Fixed–Point Prediction 975
9.3.3.3 Fixed–Interval Prediction 977
9.3.3.4 Summary: Kalman Predictors 977
9.3.4 Kalman Smoothing 978
9.3.4.1 Fixed–Interval Smoothing 978
9.3.4.2 Fixed–Lag Smoothing 979
9.3.4.3 Summary: Kalman Smoothing 982
9.3.5 Bayesian Estimation of Nonlinear Models 982
9.3.5.1 General Nonlinear Model: MMSE and MAP Estimation 983
9.3.5.2 Extended Kalman Filter 985
9.3.5.3 Recursive Bayesian Cramér–Rao Bounds 987
9.3.5.4 Applications 992
9.3.5.5 Joint State and Parameter Estimation 1005
9.3.5.6 Continuous–Time Processes and Discrete–Time Observations 1009
9.3.5.7 Summary 1013
9.3.6 Summary: Kalman Filters 1013
9.4 Summary 1016
9.5 Problems 1016
10 Detection of Gaussian Signals 1030
10.1 Introduction 1030
10.2 Detection of Continuous–Time Gaussian Processes 1030
10.2.1 Sampling 1032
10.2.2 Optimum Continuous–Time Receivers 1034
10.2.3 Performance of Optimum Receivers 1046
10.2.4 State–Variable Realization 1049
10.2.5 Stationary Process–Long Observation Time (SPLOT) Receiver 1051
10.2.6 Low–Rank Kernels 1061
10.2.7 Summary 1066
10.3 Detection of Discrete–Time Gaussian Processes 1067
10.3.1 Second Moment Characterization 1067
10.3.1.1 Known Means and Covariance Matrices 1067
10.3.1.2 Means and Covariance Matrices with Unknown Parameters 1068
10.3.2 State Variable Characterization 1070
10.3.3 Summary 1076
10.4 Summary 1076
10.5 Problems 1077
11 Epilogue 1084
11.1 Classical Detection and Estimation Theory 1084
11.1.1 Classical Detection Theory 1084
11.1.2 General Gaussian Detection 1086
11.1.3 Classical Parameter Estimation 1088
11.1.4 General Gaussian Estimation 1089
11.2 Representation of Random Processes 1093
11.3 Detection of Signals and Estimation of Signal Parameters 1095
11.4 Linear Estimation of Random Processes 1098
11.5 Observations 1105
11.5.1 Models and Mismatch 1105
11.5.2 Bayes vis-à-vis Fisher 1105
11.5.3 Bayesian and Fisher Bounds 1105
11.5.4 Eigenspace 1106
11.5.5 Whitening 1106
11.5.6 The Gaussian Model 1106
11.6 Conclusion 1106
Appendix A: Probability Distributions and Mathematical Functions 1107
Appendix B: Example Index 1119
References 1125
Index 1145
HARRY L. VAN TREES, Sc.D., received his B.S. from the United States Military Academy and his Sc.D. from the Massachusetts Institute of Technology. During his fourteen years as a Professor of Electrical Engineering at MIT, he wrote Parts I, II, and III of the DEMT series. On loan from MIT, he served in four senior DoD positions, including Chief Scientist of the U.S. Air Force and Principal Deputy Assistant Secretary of Defense (C3I). Returning to academia as an endowed professor at George Mason University, he founded the C3I Center and published Part IV of the DEMT series, Optimum Array Processing. He is currently a University Professor Emeritus.
KRISTINE L. BELL, PhD, is a Senior Scientist at Metron, Inc., and an affiliate faculty member in the Statistics Department at George Mason University. She coedited with Dr. Van Trees the Wiley–IEEE book Bayesian Bounds for Parameter Estimation and Nonlinear Filtering/Tracking.
ZHI TIAN, PhD, is a Professor of Electrical and Computer Engineering at Michigan Technological University. She is a Fellow of the IEEE.
"Since 1968 and after 30 printings of the first edition, Part I of DEMT has been the textbook for the two generations of students and researchers that have designed the signal processing in many of our operational systems. The Second Edition includes subsequent advances, retains clarity of explanation, and promises to be the text and reference for future generations."
Dr. Arthur B. Baggeroer, Ford Professor Emeritus, MIT
The First Edition of Detection, Estimation, and Modulation Theory, Part I enjoyed a long and useful life. In the forty-four years since its publication, however, the field has seen a large number of changes, and the Second Edition, a significant expansion with 450 pages of new material, accounts for them:
- Chapter 2 of the First Edition, Classical Detection and Estimation Theory, is expanded into four chapters.
- Many more examples are developed in detail to enhance readability, and more non-Gaussian models are included.
- A large number of significant developments appropriate for an introductory text are added, including global Bayesian bounds, efficient computational algorithms, equivalent estimation algorithms, sequential estimation, and importance sampling.
- The Fisher and Bayesian linear Gaussian models are studied in more detail.
- Whereas the First Edition emphasized continuous-time random processes, the Second Edition includes a comprehensive development of linear estimation of discrete-time random processes, leading to discrete-time Wiener and Kalman filters.
- A brief introduction to Bayesian estimation of non-Gaussian processes is included.
- An expanded version of material from Part III develops optimum detectors for continuous-time and discrete-time random processes that can be implemented using Wiener or Kalman filters.
As imperative today as it has been since its original publication in 1968, this work is sure to remain the leading reference for engineers who need to apply detection and estimation theory in diverse systems.