Entropy and Information Theory

Robert M. Gray · Springer Science & Business Media
Ebook · 332 pages

About this ebook

This book is devoted to the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. The eventual goal is a general development of Shannon's mathematical theory of communication, but much of the space is devoted to the tools and methods required to prove the Shannon coding theorems. These tools form an area common to ergodic theory and information theory and comprise several quantitative notions of the information in random variables, random processes, and dynamical systems. Examples include entropy, mutual information, conditional entropy, conditional information, and discrimination (relative entropy), along with the limiting normalized versions of these quantities, such as entropy rate and information rate. Much of the book is concerned with their properties, especially the long-term asymptotic behavior of sample information and expected information. This is the only up-to-date treatment of traditional information theory emphasizing ergodic theory.
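For reference, the quantities named above have the following standard discrete-alphabet definitions, written here in conventional notation as a reader's sketch rather than the book's own development (which works with more general alphabets and measures):

H(X) = -\sum_x p(x) \log p(x)  \quad \text{(entropy)}

H(X \mid Y) = -\sum_{x,y} p(x,y) \log p(x \mid y)  \quad \text{(conditional entropy)}

I(X;Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)}  \quad \text{(mutual information)}

D(P \| Q) = \sum_x p(x) \log \frac{p(x)}{q(x)}  \quad \text{(relative entropy, or discrimination)}

\bar{H}(X) = \lim_{n \to \infty} \frac{1}{n} H(X_1, \dots, X_n)  \quad \text{(entropy rate)}

The long-term asymptotic behavior of sample information mentioned above refers to almost-everywhere limits of such quantities, as in the Shannon-McMillan-Breiman theorem, which is where ergodic theory enters.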
