Politeknik Siber dan Sandi Negara

Knowledge Center of Cybersecurity and Cryptography


Elements of Information Theory

Cover, Thomas M.; Thomas, Joy A.

Table of Contents
Preface to the Second Edition.
Preface to the First Edition.
Acknowledgments for the Second Edition.
Acknowledgments for the First Edition.
1. Introduction and Preview.
1.1 Preview of the Book.

2. Entropy, Relative Entropy, and Mutual Information.
2.1 Entropy.
2.2 Joint Entropy and Conditional Entropy.
2.3 Relative Entropy and Mutual Information.
2.4 Relationship Between Entropy and Mutual Information.
2.5 Chain Rules for Entropy, Relative Entropy, and Mutual Information.
2.6 Jensen's Inequality and Its Consequences.
2.7 Log Sum Inequality and Its Applications.
2.8 Data-Processing Inequality.
2.9 Sufficient Statistics.
2.10 Fano's Inequality. Summary. Problems. Historical Notes.

3. Asymptotic Equipartition Property.
3.1 Asymptotic Equipartition Property Theorem.
3.2 Consequences of the AEP: Data Compression.
3.3 High-Probability Sets and the Typical Set. Summary. Problems. Historical Notes.

4. Entropy Rates of a Stochastic Process.
4.1 Markov Chains.
4.2 Entropy Rate.
4.3 Example: Entropy Rate of a Random Walk on a Weighted Graph.
4.4 Second Law of Thermodynamics.
4.5 Functions of Markov Chains. Summary. Problems. Historical Notes.

5. Data Compression.
5.1 Examples of Codes.
5.2 Kraft Inequality.
5.3 Optimal Codes.
5.4 Bounds on the Optimal Code Length.
5.5 Kraft Inequality for Uniquely Decodable Codes.
5.6 Huffman Codes.
5.7 Some Comments on Huffman Codes.
5.8 Optimality of Huffman Codes.
5.9 Shannon-Fano-Elias Coding.
5.10 Competitive Optimality of the Shannon Code.
5.11 Generation of Discrete Distributions from Fair Coins.

6. Gambling and Data Compression.
6.1 The Horse Race.
6.2 Gambling and Side Information.
6.3 Dependent Horse Races and Entropy Rate.
6.4 The Entropy of English.
6.5 Data Compression and Gambling.
6.6 Gambling Estimate of the Entropy of English.

7. Channel Capacity.
7.1 Examples of Channel Capacity.
7.2 Symmetric Channels.
7.3 Properties of Channel Capacity.
7.4 Preview of the Channel Coding Theorem.
7.5 Definitions.
7.6 Jointly Typical Sequences.
7.7 Channel Coding Theorem.
7.8 Zero-Error Codes.
7.9 Fano's Inequality and the Converse to the Coding Theorem.
7.10 Equality in the Converse to the Channel Coding Theorem.
7.11 Hamming Codes.
7.12 Feedback Capacity.
7.13 Source-Channel Separation Theorem.

8. Differential Entropy.
8.1 Definitions.
8.2 AEP for Continuous Random Variables.
8.3 Relation of Differential Entropy to Discrete Entropy.
8.4 Joint and Conditional Differential Entropy.
8.5 Relative Entropy and Mutual Information.
8.6 Properties of Differential Entropy, Relative Entropy, and Mutual Information.

9. Gaussian Channel.
9.1 Gaussian Channel: Definitions.
9.2 Converse to the Coding Theorem for Gaussian Channels.
9.3 Bandlimited Channels.
9.4 Parallel Gaussian Channels.
9.5 Channels with Colored Gaussian Noise.
9.6 Gaussian Channels with Feedback.

10. Rate Distortion Theory.
10.1 Quantization.
10.2 Definitions.
10.3 Calculation of the Rate Distortion Function.
10.4 Converse to the Rate Distortion Theorem.
10.5 Achievability of the Rate Distortion Function.
10.6 Strongly Typical Sequences and Rate Distortion.
10.7 Characterization of the Rate Distortion Function.
10.8 Computation of Channel Capacity and the Rate Distortion Function.

11. Information Theory and Statistics.
11.1 Method of Types.
11.2 Law of Large Numbers.
11.3 Universal Source Coding.
11.4 Large Deviation Theory.
11.5 Examples of Sanov's Theorem.
11.6 Conditional Limit Theorem.
11.7 Hypothesis Testing.
11.8 Chernoff-Stein Lemma.
11.9 Chernoff Information.
11.10 Fisher Information and the Cramér-Rao Inequality.

12. Maximum Entropy.
12.1 Maximum Entropy Distributions.
12.2 Examples.
12.3 Anomalous Maximum Entropy Problem.
12.4 Spectrum Estimation.
12.5 Entropy Rates of a Gaussian Process.
12.6 Burg's Maximum Entropy Theorem.

13. Universal Source Coding.
13.1 Universal Codes and Channel Capacity.
13.2 Universal Coding for Binary Sequences.
13.3 Arithmetic Coding.
13.4 Lempel-Ziv Coding.
13.5 Optimality of Lempel-Ziv Algorithms.

14. Kolmogorov Complexity.
14.1 Models of Computation.
14.2 Kolmogorov Complexity: Definitions and Examples.
14.3 Kolmogorov Complexity and Entropy.
14.4 Kolmogorov Complexity of Integers.
14.5 Algorithmically Random and Incompressible Sequences.
14.6 Universal Probability.
14.7 Kolmogorov Complexity.
14.8 Ω.
14.9 Universal Gambling.
14.10 Occam's Razor.
14.11 Kolmogorov Complexity and Universal Probability.
14.12 Kolmogorov Sufficient Statistic.
14.13 Minimum Description Length Principle.

15. Network Information Theory.
15.1 Gaussian Multiple-User Channels.
15.2 Jointly Typical Sequences.
15.3 Multiple-Access Channel.
15.4 Encoding of Correlated Sources.
15.5 Duality Between Slepian-Wolf Encoding and Multiple-Access Channels.
15.6 Broadcast Channel.
15.7 Relay Channel.
15.8 Source Coding with Side Information.
15.9 Rate Distortion with Side Information.
15.10 General Multiterminal Networks.

16. Information Theory and Portfolio Theory.
16.1 The Stock Market: Some Definitions.
16.2 Kuhn-Tucker Characterization of the Log-Optimal Portfolio.
16.3 Asymptotic Optimality of the Log-Optimal Portfolio.
16.4 Side Information and the Growth Rate.
16.5 Investment in Stationary Markets.
16.6 Competitive Optimality of the Log-Optimal Portfolio.
16.7 Universal Portfolios.
16.8 Shannon-McMillan-Breiman Theorem (General AEP).

17. Inequalities in Information Theory.
17.1 Basic Inequalities of Information Theory.
17.2 Differential Entropy.
17.3 Bounds on Entropy and Relative Entropy.
17.4 Inequalities for Types.
17.5 Combinatorial Bounds on Entropy.
17.6 Entropy Rates of Subsets.
17.7 Entropy and Fisher Information.
17.8 Entropy Power Inequality and Brunn-Minkowski Inequality.
17.9 Inequalities for Determinants.
17.10 Inequalities for Ratios of Determinants.

List of Symbols.
Index.


Availability
  • Perpustakaan Poltek SSN (Rak 000), call number 003.54 COV e, item code 00000572: Available but not for loan (Missing)
  • Perpustakaan Poltek SSN (Rak 000), call number 003.54 COV e/2, item code b0001244: Available
Detail Information
Series Title: --
Call Number: 003.54 COV e
Publisher: New Jersey: Wiley, 2006
Collation: xxiii, 776 pages; illus.; 23 cm
Language: English
ISBN/ISSN: 9780471241959
Classification: 003.54
Content Type: --
Media Type: --
Carrier Type: --
Edition: Second Edition
Subject(s): Electronics; Information Theory
Specific Detail Info: --
Statement of Responsibility: Thomas M. Cover and Joy A. Thomas
