Publication date: 2012-6  Publisher: Science Press  Author: Gray  Pages: 409  Word count: 580,000
內(nèi)容概要
This reprint edition of Entropy and Information Theory by Gray retains the clear, concise writing style of the first edition. Its treatment of information theory covers entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The book aims to give readers a solid foundation for both theoretical research and applications. Each chapter ends with a problem set, a summary of key points, and a review of the chapter's main results.
Entropy and Information Theory (reprint edition) is an ideal reference for senior undergraduates and graduate students in electrical engineering, statistics, and communications taking an introductory course in information theory.
Table of Contents
Preface
Introduction
1 Information Sources
1.1 Probability Spaces and Random Variables
1.2 Random Processes and Dynamical Systems
1.3 Distributions
1.4 Standard Alphabets
1.5 Expectation
1.6 Asymptotic Mean Stationarity
1.7 Ergodic Properties
2 Pair Processes: Channels, Codes, and Couplings
2.1 Pair Processes
2.2 Channels
2.3 Stationarity Properties of Channels
2.4 Extremes: Noiseless and Completely Random Channels
2.5 Deterministic Channels and Sequence Coders
2.6 Stationary and Sliding-Block Codes
2.7 Block Codes
2.8 Random Punctuation Sequences
2.9 Memoryless Channels
2.10 Finite-Memory Channels
2.11 Output Mixing Channels
2.12 Block Independent Channels
2.13 Conditionally Block Independent Channels
2.14 Stationarizing Block Independent Channels
2.15 Primitive Channels
2.16 Additive Noise Channels
2.17 Markov Channels
2.18 Finite-State Channels and Codes
2.19 Cascade Channels
2.20 Communication Systems
2.21 Couplings
2.22 Block to Sliding-Block: The Rohlin-Kakutani Theorem
3 Entropy
3.1 Entropy and Entropy Rate
3.2 Divergence Inequality and Relative Entropy
3.3 Basic Properties of Entropy
3.4 Entropy Rate
3.5 Relative Entropy Rate
3.6 Conditional Entropy and Mutual Information
3.7 Entropy Rate Revisited
3.8 Markov Approximations
3.9 Relative Entropy Densities
4 The Entropy Ergodic Theorem
4.1 History
4.2 Stationary Ergodic Sources
4.3 Stationary Nonergodic Sources
4.4 AMS Sources
4.5 The Asymptotic Equipartition Property
5 Distortion and Approximation
5.1 Distortion Measures
5.2 Fidelity Criteria
5.3 Average Limiting Distortion
5.4 Communications Systems Performance
5.5 Optimal Performance
5.6 Code Approximation
5.7 Approximating Random Vectors and Processes
5.8 The Monge/Kantorovich/Vasershtein Distance
5.9 Variation and Distribution Distance
5.10 Coupling Discrete Spaces with the Hamming Distance
5.11 Process Distance and Approximation
5.12 Source Approximation and Codes
5.13 d-bar Continuous Channels
6 Distortion and Entropy
6.1 The Fano Inequality
6.2 Code Approximation and Entropy Rate
6.3 Pinsker's and Marton's Inequalities
6.4 Entropy and Isomorphism
6.5 Almost Lossless Source Coding
6.6 Asymptotically Optimal Almost Lossless Codes
6.7 Modeling and Simulation
7 Relative Entropy
7.1 Divergence
7.2 Conditional Relative Entropy
7.3 Limiting Entropy Densities
7.4 Information for General Alphabets
7.5 Convergence Results
8 Information Rates
8.1 Information Rates for Finite Alphabets
8.2 Information Rates for General Alphabets
8.3 A Mean Ergodic Theorem for Densities
8.4 Information Rates of Stationary Processes
8.5 The Data Processing Theorem
8.6 Memoryless Channels and Sources
9 Distortion and Information
9.1 The Shannon Distortion-Rate Function
9.2 Basic Properties
9.3 Process Definitions of the Distortion-Rate Function
9.4 The Distortion-Rate Function as a Lower Bound
9.5 Evaluating the Rate-Distortion Function
10 Relative Entropy Rates
10.1 Relative Entropy Densities and Rates
10.2 Markov Dominating Measures
10.3 Stationary Processes
10.4 Mean Ergodic Theorems
11 Ergodic Theorems for Densities
11.1 Stationary Ergodic Sources
11.2 Stationary Nonergodic Sources
11.3 AMS Sources
11.4 Ergodic Theorems for Information Densities
12 Source Coding Theorems
12.1 Source Coding and Channel Coding
12.2 Block Source Codes for AMS Sources
12.3 Block Source Code Mismatch
12.4 Block Coding Stationary Sources
12.5 Block Coding AMS Ergodic Sources
12.6 Subadditive Fidelity Criteria
12.7 Asynchronous Block Codes
12.8 Sliding-Block Source Codes
12.9 A Geometric Interpretation
13 Properties of Good Source Codes
13.1 Optimal and Asymptotically Optimal Codes
13.2 Block Codes
13.3 Sliding-Block Codes
14 Coding for Noisy Channels
14.1 Noisy Channels
14.2 Feinstein's Lemma
14.3 Feinstein's Theorem
14.4 Channel Capacity
14.5 Robust Block Codes
14.6 Block Coding Theorems for Noisy Channels
14.7 Joint Source and Channel Block Codes
14.8 Synchronizing Block Channel Codes
14.9 Sliding-Block Source and Channel Coding
References
Index