ISBN-13: 9780471647355 / English / Hardcover / 2013 / 800 pages
Offers the most complete, up-to-date coverage available on the principles of digital communications. Focuses on basic issues, relating theory to practice wherever possible. Numerous examples, worked out in detail, are included to help the reader develop an intuitive grasp of the theory. Topics covered include the sampling process, digital modulation techniques, error-control coding, robust quantization for pulse-code modulation, coding of speech at low bit rates, information-theoretic concepts, coding, and computer communication. Because the book covers a broad range of topics in digital communications, it suits readers with a variety of backgrounds and interests.
1 Introduction 1
1.1 Historical Background 1
1.2 The Communication Process 2
1.3 Multiple–Access Techniques 4
1.4 Networks 6
1.5 Digital Communications 9
1.6 Organization of the Book 11
2 Fourier Analysis of Signals and Systems 13
2.1 Introduction 13
2.2 The Fourier Series 13
2.3 The Fourier Transform 16
2.4 The Inverse Relationship between Time–Domain and Frequency–Domain Representations 25
2.5 The Dirac Delta Function 28
2.6 Fourier Transforms of Periodic Signals 34
2.7 Transmission of Signals through Linear Time–Invariant Systems 37
2.8 Hilbert Transform 42
2.9 Pre–envelopes 45
2.10 Complex Envelopes of Band–Pass Signals 47
2.11 Canonical Representation of Band–Pass Signals 49
2.12 Complex Low–Pass Representations of Band–Pass Systems 52
2.13 Putting the Complex Representations of Band–Pass Signals and Systems All Together 54
2.14 Linear Modulation Theory 58
2.15 Phase and Group Delays 66
2.16 Numerical Computation of the Fourier Transform 69
2.17 Summary and Discussion 78
3 Probability Theory and Bayesian Inference 87
3.1 Introduction 87
3.2 Set Theory 88
3.3 Probability Theory 90
3.4 Random Variables 97
3.5 Distribution Functions 98
3.6 The Concept of Expectation 105
3.7 Second–Order Statistical Averages 108
3.8 Characteristic Function 111
3.9 The Gaussian Distribution 113
3.10 The Central Limit Theorem 118
3.11 Bayesian Inference 119
3.12 Parameter Estimation 122
3.13 Hypothesis Testing 126
3.14 Composite Hypothesis Testing 132
3.15 Summary and Discussion 133
4 Stochastic Processes 145
4.1 Introduction 145
4.2 Mathematical Definition of a Stochastic Process 145
4.3 Two Classes of Stochastic Processes: Strictly Stationary and Weakly Stationary 147
4.4 Mean, Correlation, and Covariance Functions of Weakly Stationary Processes 149
4.5 Ergodic Processes 157
4.6 Transmission of a Weakly Stationary Process through a Linear Time–invariant Filter 158
4.7 Power Spectral Density of a Weakly Stationary Process 160
4.8 Another Definition of the Power Spectral Density 170
4.9 Cross–spectral Densities 172
4.10 The Poisson Process 174
4.11 The Gaussian Process 176
4.12 Noise 179
4.13 Narrowband Noise 183
4.14 Sine Wave Plus Narrowband Noise 193
4.15 Summary and Discussion 195
5 Information Theory 207
5.1 Introduction 207
5.2 Entropy 207
5.3 Source–coding Theorem 214
5.4 Lossless Data Compression Algorithms 215
5.5 Discrete Memoryless Channels 223
5.6 Mutual Information 226
5.7 Channel Capacity 230
5.8 Channel–coding Theorem 232
5.9 Differential Entropy and Mutual Information for Continuous Random Ensembles 237
5.10 Information Capacity Law 240
5.11 Implications of the Information Capacity Law 244
5.12 Information Capacity of Colored Noise Channel 248
5.13 Rate Distortion Theory 253
5.14 Summary and Discussion 256
6 Conversion of Analog Waveforms into Coded Pulses 267
6.1 Introduction 267
6.2 Sampling Theory 268
6.3 Pulse–Amplitude Modulation 274
6.4 Quantization and its Statistical Characterization 278
6.5 Pulse–Code Modulation 285
6.6 Noise Considerations in PCM Systems 290
6.7 Prediction–Error Filtering for Redundancy Reduction 294
6.8 Differential Pulse–Code Modulation 301
6.9 Delta Modulation 305
6.10 Line Codes 309
6.11 Summary and Discussion 312
7 Signaling over AWGN Channels 323
7.1 Introduction 323
7.2 Geometric Representation of Signals 324
7.3 Conversion of the Continuous AWGN Channel into a Vector Channel 332
7.4 Optimum Receivers Using Coherent Detection 337
7.5 Probability of Error 344
7.6 Phase–Shift Keying Techniques Using Coherent Detection 352
7.7 M–ary Quadrature Amplitude Modulation 370
7.8 Frequency–Shift Keying Techniques Using Coherent Detection 375
7.9 Comparison of M–ary PSK and M–ary FSK from an Information–Theoretic Viewpoint 398
7.10 Detection of Signals with Unknown Phase 400
7.11 Noncoherent Orthogonal Modulation Techniques 404
7.12 Binary Frequency–Shift Keying Using Noncoherent Detection 410
7.13 Differential Phase–Shift Keying 411
7.14 BER Comparison of Signaling Schemes over AWGN Channels 415
7.15 Synchronization 418
7.16 Recursive Maximum Likelihood Estimation for Synchronization 419
7.17 Summary and Discussion 431
8 Signaling over Band–Limited Channels 445
8.1 Introduction 445
8.2 Error Rate Due to Channel Noise in a Matched–Filter Receiver 446
8.3 Intersymbol Interference 447
8.4 Signal Design for Zero ISI 450
8.5 Ideal Nyquist Pulse for Distortionless Baseband Data Transmission 450
8.6 Raised–Cosine Spectrum 454
8.7 Square–Root Raised–Cosine Spectrum 458
8.8 Post–Processing Techniques: The Eye Pattern 463
8.9 Adaptive Equalization 469
8.10 Broadband Backbone Data Network: Signaling over Multiple Baseband Channels 474
8.11 Digital Subscriber Lines 475
8.12 Capacity of AWGN Channel Revisited 477
8.13 Partitioning Continuous–Time Channel into a Set of Subchannels 478
8.14 Water–Filling Interpretation of the Constrained Optimization Problem 484
8.15 DMT System Using Discrete Fourier Transform 487
8.16 Summary and Discussion 494
9 Signaling over Fading Channels 501
9.1 Introduction 501
9.2 Propagation Effects 502
9.3 Jakes Model 506
9.4 Statistical Characterization of Wideband Wireless Channels 511
9.5 FIR Modeling of Doubly Spread Channels 520
9.6 Comparison of Modulation Schemes: Effects of Flat Fading 525
9.7 Diversity Techniques 527
9.8 “Space Diversity–on–Receive” Systems 528
9.9 “Space Diversity–on–Transmit” Systems 538
9.10 “Multiple–Input, Multiple–Output” Systems: Basic Considerations 546
9.11 MIMO Capacity for Channel Known at the Receiver 551
9.12 Orthogonal Frequency Division Multiplexing 556
9.13 Spread Spectrum Signals 557
9.14 Code–Division Multiple Access 560
9.15 The RAKE Receiver and Multipath Diversity 564
9.16 Summary and Discussion 566
10 Error–Control Coding 577
10.1 Introduction 577
10.2 Error Control Using Forward Error Correction 578
10.3 Discrete Memoryless Channels 579
10.4 Linear Block Codes 582
10.5 Cyclic Codes 593
10.6 Convolutional Codes 605
10.7 Optimum Decoding of Convolutional Codes 613
10.8 Maximum Likelihood Decoding of Convolutional Codes 614
10.9 Maximum a Posteriori Probability Decoding of Convolutional Codes 623
10.10 Illustrative Procedure for MAP Decoding in the Log–Domain 638
10.11 New Generation of Probabilistic Compound Codes 644
10.12 Turbo Codes 645
10.13 EXIT Charts 657
10.14 Low–Density Parity–Check Codes 666
10.15 Trellis–Coded Modulation 675
10.16 Turbo Decoding of Serial Concatenated Codes 681
10.17 Summary and Discussion 688
Appendices
A Advanced Probabilistic Models A1
A.1 The Chi–Square Distribution A1
A.2 The Log–Normal Distribution A3
A.3 The Nakagami Distribution A6
B Bounds on the Q–Function A11
C Bessel Functions A13
C.1 Series Solution of Bessel’s Equation A13
C.2 Properties of the Bessel Function A14
C.3 Modified Bessel Function A16
D Method of Lagrange Multipliers A19
D.1 Optimization Involving a Single Equality Constraint A19
E Information Capacity of MIMO Channels A21
E.1 Log–Det Capacity Formula of MIMO Channels A21
E.2 MIMO Capacity for Channel Known at the Transmitter A24
F Interleaving A29
F.1 Block Interleaving A30
F.2 Convolutional Interleaving A32
F.3 Random Interleaving A33
G The Peak–Power Reduction Problem in OFDM A35
G.1 PAPR Properties of OFDM Signals A35
G.2 Maximum PAPR in OFDM Using M–ary PSK A36
G.3 Clipping–Filtering: A Technique for PAPR Reduction A37
H Nonlinear Solid–State Power Amplifiers A39
H.1 Power Amplifier Nonlinearities A39
H.2 Nonlinear Modeling of Band–Pass Power Amplifiers A42
I Monte Carlo Integration A45
J Maximal–Length Sequences A47
J.1 Properties of Maximal–Length Sequences A47
J.2 Choosing a Maximal–Length Sequence A50
K Mathematical Tables A55
Glossary G1
Bibliography B1
Index I1
Credits C1
Simon Haykin, PhD, is Distinguished University Professor and Director of the Cognitive Systems Laboratory in the Faculty of Engineering at McMaster University. A world-renowned authority on adaptive and learning systems, Dr. Haykin has pioneered signal-processing techniques and systems for radar and communication applications, culminating in the study of cognitive dynamic systems, which has become his research passion.