This enthusiastic introduction to the fundamentals of information theory builds from classical Shannon theory through to modern applications in statistical learning, equipping students with a uniquely well-rounded and rigorous foundation for further study. It introduces core topics such as data compression, channel coding, and rate-distortion theory using a unique finite block-length approach. With over 210 end-of-part exercises and numerous examples, the book introduces students to contemporary applications in statistics, machine learning, and modern communication theory. This textbook presents information-theoretic methods with applications…