Publication date: 2003-11-1   Publisher: Tsinghua University Press   Authors: Thomas M. Cover, Joy A. Thomas   Pages: 545
Preface
Information theory is a foundational specialized course offered to graduate students and senior undergraduates in electrical engineering, computer science, and statistics departments at universities in China and abroad. Since the 1980s, more than thirty textbooks on the subject have circulated widely at home and abroad. Although Thomas M. Cover's "Elements of Information Theory" is not the newest of these, it is the most widely adopted textbook in the United States: leading American universities such as MIT, Stanford University, and the University of California at Berkeley use it as the textbook or principal reference for this course. It is therefore well worth recommending to readers in China. The main strengths of the book are:
(1) Clear concepts. Information theory involves a great deal of mathematics, and its concepts are often buried in derivations. This book puts conceptual clarity first and explains deep ideas in accessible terms, so readers quickly grasp the essentials.
(2) The use of mathematical tools and the presentation of derivations are well judged, neither oversimplified nor bogged down in mathematical detail.
(3) Theory and application receive equal weight. The book preserves the completeness and systematic structure of the theory while emphasizing its application-oriented character, so readers can study with concrete problems in mind; this makes it genuinely illuminating.
(4) Although the book was published relatively early (three more American textbooks have appeared since), its coverage remains an advantage, and it retains a degree of currency even today.
(5) Like other American textbooks, this one inevitably bears the stamp of its authors' own tastes: some material was included because the authors favored it, and most general information theory courses do not cover it. As a reference, such material broadens students' horizons, which is valuable in its own right. Instructors who adopt the book can select a subset of chapters to teach.
Synopsis
This book systematically introduces the fundamentals of information theory and their applications in communication theory, statistics, computer science, probability theory, and investment theory. Proceeding step by step, the authors present the basic definitions of information measures, relative entropy, and mutual information, and show how these quantities arise naturally in solving problems of data compression, channel capacity, rate distortion, statistical hypothesis testing, and network information flow. Beyond this, the book explores many topics rarely covered in other textbooks, such as the connection between the second law of thermodynamics and Markov chains, the optimality of Huffman codes, the duality of data compression, Lempel-Ziv coding, Kolmogorov complexity, portfolio theory, and information-theoretic inequalities and their mathematical consequences. The book is suitable as a textbook or reference for senior undergraduates and graduate students in communications, electronics, computer science, automatic control, statistics, economics, and related fields, and as a reference for researchers and practitioners in these areas.
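To make the basic quantities mentioned above concrete, here is a minimal Python sketch (not from the book) that computes entropy, relative entropy (Kullback-Leibler divergence), and mutual information for small discrete distributions; the function names and example distributions are ours, chosen only for illustration.

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(X) in bits of a discrete distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # convention: 0 * log 0 = 0
    return -np.sum(p * np.log2(p))

def relative_entropy(p, q):
    """Relative entropy (KL divergence) D(p || q) in bits."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

def mutual_information(pxy):
    """Mutual information I(X; Y) = D(p(x, y) || p(x) p(y)) in bits."""
    pxy = np.asarray(pxy, dtype=float)
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y
    return relative_entropy(pxy.ravel(), (px * py).ravel())

# Toy examples: a biased coin and a small joint distribution.
print(entropy([0.7, 0.3]))                       # ~0.881 bits
print(mutual_information([[0.4, 0.1],
                          [0.1, 0.4]]))          # ~0.278 bits
```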
About the Author
Thomas M. Cover is a professor in the Departments of Electrical Engineering and Statistics at Stanford University. A past President of the IEEE Information Theory Society, he is a Fellow of the Institute of Mathematical Statistics and a Fellow of the IEEE. His 1972 paper "Broadcast Channels" received the Outstanding Paper Award in Information Theory, and in 1990 he was selected as the Shannon Lecturer, the highest honor in the field of information theory. For the past twenty years he has devoted his research to the relationship between information theory and statistics.
Table of Contents
List of Figures
1 Introduction and Preview
  1.1 Preview of the Book
2 Entropy, Relative Entropy and Mutual Information
  2.1 Entropy
  2.2 Joint Entropy and Conditional Entropy
  2.3 Relative Entropy and Mutual Information
  2.4 Relationship Between Entropy and Mutual Information
  2.5 Chain Rules for Entropy, Relative Entropy and Mutual Information
  2.6 Jensen's Inequality and Its Consequences
  2.7 The Log Sum Inequality and Its Applications
  2.8 Data Processing Inequality
  2.9 The Second Law of Thermodynamics
  2.10 Sufficient Statistics
  2.11 Fano's Inequality
  Summary of Chapter 2
  Problems for Chapter 2
  Historical Notes
3 The Asymptotic Equipartition Property
  3.1 The AEP
  3.2 Consequences of the AEP: Data Compression
  3.3 High Probability Sets and the Typical Set
  Summary of Chapter 3
  Problems for Chapter 3
  Historical Notes
4 Entropy Rates of a Stochastic Process
  4.1 Markov Chains
  4.2 Entropy Rate
  4.3 Example: Entropy Rate of a Random Walk on a Weighted Graph
  4.4 Hidden Markov Models
  Summary of Chapter 4
  Problems for Chapter 4
  Historical Notes
5 Data Compression
6 Gambling and Data Compression
7 Kolmogorov Complexity
8 Channel Capacity
9 Differential Entropy
10 The Gaussian Channel
11 Maximum Entropy and Spectral Estimation
12 Information Theory and Statistics
13 Rate Distortion Theory
14 Network Information Theory
15 Information Theory and the Stock Market
16 Inequalities in Information Theory
Index
Excerpt
Philosophy of Science (Occam's Razor). William of Occam said "Causes shall not be multiplied beyond necessity," or to paraphrase it, "The simplest explanation is best". Solomonoff, and later Chaitin, argue persuasively that one gets a universally good prediction procedure if one takes a weighted combination of all programs that explain the data and observes what they print next. Moreover, this inference will work in many problems not handled by statistics. For example, this procedure will eventually predict the subsequent digits of π. When this procedure is applied to coin flips that come up heads with probability 0.7, this too will be inferred. When applied to the stock market, the procedure should essentially find all the "laws" of the stock market and extrapolate them optimally. In principle, such a procedure would have found Newton's laws of physics. Of course, such inference is highly impractical, because weeding out all computer programs that fail to generate existing data will take impossibly long. We would predict what happens tomorrow a hundred years from now. ……
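The weighted-combination predictor described in this excerpt can be made concrete with a toy sketch (ours, not the book's): instead of all programs, we use a tiny hand-picked hypothesis class, weight each hypothesis by 2 to the power of minus its description length as a stand-in for the universal prior, discard hypotheses inconsistent with the observed data, and predict the next bit by weighted vote. The hypotheses and their bit lengths below are invented purely for illustration.

```python
from fractions import Fraction

# Each hypothesis is (description_length_in_bits, rule: history -> next bit).
# The rules and lengths are made-up stand-ins for "all programs that explain the data".
HYPOTHESES = [
    (2, lambda h: 0),                       # "always 0"
    (2, lambda h: 1),                       # "always 1"
    (4, lambda h: h[-1] if h else 0),       # "repeat the last bit"
    (4, lambda h: (1 - h[-1]) if h else 0), # "alternate bits"
]

def predict_next(history):
    """Weight each hypothesis by 2^(-length), keep those that reproduce the
    observed history, and return the weighted probability that the next bit is 1."""
    weights = {0: Fraction(0), 1: Fraction(0)}
    for length, rule in HYPOTHESES:
        # Keep only hypotheses that regenerate every observed bit so far.
        if all(rule(history[:i]) == history[i] for i in range(len(history))):
            weights[rule(history)] += Fraction(1, 2 ** length)
    total = weights[0] + weights[1]
    return float(weights[1] / total) if total else 0.5

print(predict_next([0, 1, 0, 1, 0]))  # only the "alternate" rule survives -> 1.0
```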