ISBN-13: 9783659480294 / English / Paperback / 2013 / 136 pp.
In communication engineering, we often hear that some data is lost during message transmission. The basic aim of this book is to use channel capacity more efficiently in order to minimize this data loss. It addresses a number of imbalance problems that occur in communication engineering and that can be solved readily by optimization techniques. On the basis of entropy, Rohit Kumar Verma also develops mathematical models useful for predicting future statistical data to support planning and strategy. This book covers the following topics in detail:
- Some generating functions for the measures of entropies and directed divergence
- Some logistic-type growth models based on measures of information
- Some multivariate normal distributions based on parametric and non-parametric measures of entropies
- An unorthodox parametric measure of information and corresponding measures of fuzzy information
- Optimization problems for the new parametric measure of entropy in the dual space
- Some inequalities on the basis of the new parametric measure of information
- The channel capacity when non-Shannon information measures are maximized by the normal distribution
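The topics above center on entropy and channel capacity. As a minimal illustration of the standard Shannon quantities the book builds on (this sketch is not taken from the book itself), the following computes the entropy of a discrete distribution and the capacity of an additive white Gaussian noise channel:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum p_i * log2(p_i), in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def awgn_capacity(snr):
    """Capacity of an AWGN channel, C = 0.5 * log2(1 + SNR), in bits per channel use."""
    return 0.5 * math.log2(1 + snr)

# The uniform distribution maximizes entropy over a fixed alphabet:
print(shannon_entropy([0.25] * 4))            # -> 2.0 bits
print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))  # -> less than 2.0 bits

# At a signal-to-noise ratio of 15, C = 0.5 * log2(16) = 2 bits per use:
print(awgn_capacity(15))                      # -> 2.0
```

The same maximum-entropy principle underlies the book's normal-distribution results: among all densities with fixed variance, the Gaussian maximizes differential entropy, which is why it achieves the AWGN capacity above.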