ISBN-13: 9783540042549 / English / Paperback / 1968 / 209 pp.
This book is based on lectures given by the author at the IBM European Systems Research Institute (ESRI) in Geneva. Information Theory on the syntactic level, as introduced by Claude Shannon in 1949, has many limitations when applied to information processing by computers. But in spite of some obvious shortcomings, the underlying principles are of fundamental importance for systems engineers in understanding the nature of the problems of handling information: its acquisition, storage, processing, and interpretation. The lectures, as presented in this book, attempt to give an exposition of the logical foundation and basic principles, and at the same time to provide a basis for further study in more specific areas of this expanding theory, such as coding, detection, pattern recognition, and filtering.

Most of the problems in Appendix C are intended as extensions of the text, calling for active participation by the student. Others are direct applications of the theory to specific situations. Some problems require extensive numerical calculations; in those cases it is assumed that the student has access to a computer and is capable of writing the necessary programs. The student is assumed to have a good command of calculus, probability theory, and statistics, so no basic mathematical concepts are discussed in this book. The Fourier transform and some related mathematical concepts are introduced in Appendix A.