Principles of Neurocomputing for Science and Engineering

Publication date: July 2003  Publisher: China Machine Press  Authors: Fredric M. Ham, Ivica Kostanic  Pages: 642
Tag標(biāo)簽:無  

Overview

This is an excellent textbook that focuses on the fundamental principles of artificial neural networks and on how to apply neurocomputing techniques to real problems in science and engineering: pattern recognition, optimization, event classification, control and identification of nonlinear systems, statistical analysis, and more. Its features include:

  •   Algorithms: most training algorithms are set off in ruled boxes so readers can find them easily.
  •   MATLAB functions: some training algorithms come with an accompanying MATLAB implementation (indicated in boldface in the text). The code segments are relatively short and take only a few minutes to type into MATLAB.
  •   MATLAB Toolbox: the book makes extensive use of MATLAB's Neural Network Toolbox to illustrate certain neurocomputing concepts.
  •   Web site: the book's Web site at http://www.mhhe.com/engcs/electrical/ham provides the latest and most complete information.
  •   Examples: most chapters include detailed worked examples that clarify important neurocomputing concepts.
  •   Problem sets: each chapter ends with a substantial set of problems applying neurocomputing techniques. Some require MATLAB and the Neural Network Toolbox; in certain cases MATLAB function code is provided.
  •   Appendix: Appendix A gives a comprehensive introduction to the mathematical foundations of neurocomputing.

Table of Contents

About the Authors
Preface
Acknowledgments
List of Important Symbols and Operators
List of Important Abbreviations

PART I  Fundamental Neurocomputing Concepts and Selected Neural Network Architectures and Learning Rules

1 Introduction to Neurocomputing
  1.1 What Is Neurocomputing?
  1.2 Historical Notes
  1.3 Neurocomputing and Neuroscience
  1.4 Classification of Neural Networks
  1.5 Guide to the Book
  References

2 Fundamental Neurocomputing Concepts
  2.1 Introduction
  2.2 Basic Models of Artificial Neurons
  2.3 Basic Activation Functions
  2.4 Hopfield Model of the Artificial Neuron
  2.5 Adaline and Madaline
  2.6 Simple Perceptron
  2.7 Feedforward Multilayer Perceptron
  2.8 Overview of Basic Learning Rules for a Single Neuron
  2.9 Data Preprocessing
  Problems
  References

3 Mapping Networks
  3.1 Introduction
  3.2 Associative Memory Networks
  3.3 Backpropagation Learning Algorithms
  3.4 Accelerated Learning Backpropagation Algorithms
  3.5 Counterpropagation
  3.6 Radial Basis Function Neural Networks
  Problems
  References

4 Self-Organizing Networks
  4.1 Introduction
  4.2 Kohonen Self-Organizing Map
  4.3 Learning Vector Quantization
  4.4 Adaptive Resonance Theory (ART) Neural Networks
  Problems
  References

5 Recurrent Networks and Temporal Feedforward Networks
  5.1 Introduction
  5.2 Overview of Recurrent Neural Networks
  5.3 Hopfield Associative Memory
  5.4 Simulated Annealing
  5.5 Boltzmann Machine
  5.6 Overview of Temporal Feedforward Networks
  5.7 Simple Recurrent Network
  5.8 Time-Delay Neural Networks
  5.9 Distributed Time-Lagged Feedforward Neural Networks
  Problems
  References

PART II  Applications of Neurocomputing

6 Neural Networks for Optimization Problems
  6.1 Introduction
  6.2 Neural Networks for Linear Programming Problems
  6.3 Neural Networks for Quadratic Programming Problems
  6.4 Neural Networks for Nonlinear Continuous Constrained Optimization Problems
  Problems
  References

7 Solving Matrix Algebra Problems with Neural Networks
  7.1 Introduction
  7.2 Inverse and Pseudoinverse of a Matrix
  7.3 LU Decomposition
  7.4 QR Factorization
  7.5 Schur Decomposition
  7.6 Spectral Factorization - Eigenvalue Decomposition (EVD) (Symmetric Eigenvalue Problem)
  7.7 Neural Network Approach for the Symmetric Eigenvalue Problem
  7.8 Singular Value Decomposition
  7.9 A Neurocomputing Approach for Solving the Algebraic Lyapunov Equation
  7.10 A Neurocomputing Approach for Solving the Algebraic Riccati Equation
  Problems
  References

8 Solution of Linear Algebraic Equations Using Neural Networks
  8.1 Introduction
  8.2 Systems of Simultaneous Linear Algebraic Equations
  8.3 Least-Squares Solution of Systems of Linear Equations
  8.4 A Least-Squares Neurocomputing Approach for Solving Systems of Linear Equations
  8.5 Conjugate Gradient Learning Rule for Solving Systems of Linear Equations
  8.6 A Generalized Robust Approach for Solving Systems of Linear Equations Corrupted with Noise
  8.7 Regularization Methods for Ill-Posed Problems with Ill-Determined Numerical Rank
  8.8 Matrix Splittings for Iterative Discrete-Time Methods for Solving Linear Equations
  8.9 Total Least-Squares Problem
  8.10 An L-Norm (Minimax) Neural Network for Solving Linear Equations
  8.11 An L1-Norm (Least-Absolute-Deviations) Neural Network for Solving Linear Equations
  Problems
  References

9 Statistical Methods Using Neural Networks
  9.1 Introduction
  9.2 Principal-Component Analysis
  9.3 Learning Algorithms for Neural Network Adaptive Estimation of Principal Components
  9.4 Principal-Component Regression
  9.5 Partial Least-Squares Regression
  9.6 A Neural Network Approach for Partial Least-Squares Regression
  9.7 Robust PLSR: A Neural Network Approach
  Problems
  References

10 Identification, Control, and Estimation Using Neural Networks
  10.1 Introduction
  10.2 Linear System Representation
  10.3 Autoregressive Moving Average Models
  10.4 Identification of Linear Systems with ARMA Models
  10.5 Parametric System Identification of Linear Systems Using PLSNET
  10.6 Nonlinear System Representation
  10.7 Identification and Control of Nonlinear Dynamical Systems
  10.8 Independent-Component Analysis: Blind Separation of Unknown Source Signals
  10.9 Spectrum Estimation of Sinusoids in Additive Noise
  10.10 Other Case Studies
  Problems
  References

Appendix A  Mathematical Foundation for Neurocomputing
  A.1 Introduction
  A.2 Linear Algebra
  A.3 Principles of Multivariable Analysis
  A.4 Lyapunov's Direct Method
  A.5 Unconstrained Optimization Methods
  A.6 Constrained Nonlinear Programming
  A.7 Random Variables and Stochastic Processes
  A.8 Fuzzy Set Theory
  A.9 Selected Trigonometric Identities
  References

Name Index
Subject Index
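Among the fundamentals listed above is the simple perceptron (Section 2.6) and its learning rule. As a rough illustration of the kind of algorithm the book develops, here is a minimal sketch of perceptron training with mistake-driven updates. This is an illustrative Python version, not the book's own code (its examples are in MATLAB); the function name and the toy data are hypothetical.

```python
import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=100):
    """Train a single perceptron on inputs X (n_samples x n_features)
    with bipolar targets y in {-1, +1}. Returns the weight vector,
    with the bias weight appended as the last component."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append constant bias input
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        mistakes = 0
        for xi, target in zip(Xb, y):
            pred = 1 if xi @ w >= 0 else -1
            if pred != target:            # update weights only on errors
                w += lr * target * xi
                mistakes += 1
        if mistakes == 0:                 # converged on separable data
            break
    return w

# Learn a linearly separable AND-like mapping on four points
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])
w = train_perceptron(X, y)
preds = [1 if np.dot(np.append(x, 1.0), w) >= 0 else -1 for x in X]
print(preds)
```

Weights change only on misclassified samples, which is the defining feature of the perceptron rule; on linearly separable data such as this, training converges in a finite number of passes.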


Comments and Ratings

User comments (5 in total)
  •   Currently reading it and getting a lot out of it. The English edition is a bit more work, ha.
  •   The book is very accessible and well suited for beginners.
  •   I read the Chinese edition but felt some things in it were not explained clearly, so I got an original English copy to read.
  •   The English edition in particular is not easy to study from.
  •   Strongly recommended for anyone just starting out with neural networks: the explanations are detailed, the derivations are worked through step by step, and the appendix collects the mathematical background for reference. The paper quality is a bit poor, though, and my copy arrived dirty.
 
