Machine Translation Based on Artificial Neural Networks (基于人工神經(jīng)網(wǎng)絡(luò)的機(jī)器翻譯)

Published: June 2007  Publisher: Science Press  Author: Xu Luomai (許羅邁)  Pages: 216
Tags: none

Synopsis

Corpus-based statistical machine translation divides translation into two processing stages: a translation model and a language model. The author attempts to apply artificial neural network techniques to both stages, so that they cover the entire machine translation process, which is original work. Using a neuron self-learning method, the system starts from a small number of examples and builds up a machine lexicon and the corresponding translations through self-learning. The experiments reported in this study show that, within a fixed domain, the system can produce reasonably fluent target-language output. Training the translation model with a distributed neural network architecture largely overcomes the limited learning capacity of a single network and opens up a new line of thinking for neural network language processing, which is of considerable significance.

For the language model, the author also proposes a new solution. Departing from the common practice of training neural networks on complex syntactic and semantic features, the method takes part-of-speech tags as its training input and a self-devised set of word-movement symbols as its training targets, a distinctive treatment. Although the author notes that this method did not achieve the expected results, if the distributed neural network architecture were also applied to training the language model, as the author suggests, the success or failure of this distinctive approach remains an open question.
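
As a rough, hypothetical illustration of the kind of language-model training described above (this is not the author's actual NeuroTrans design; the POS tag set, the movement symbols, and the training pairs below are all invented for demonstration), a minimal Python sketch could train a small feed-forward network to map a window of part-of-speech tags to a word-movement symbol:

# Hypothetical illustration only: a tiny feed-forward classifier mapping a
# window of part-of-speech tags to a "word movement" symbol. The tag set,
# movement symbols, and training pairs are invented and do not come from the book.
import numpy as np

rng = np.random.default_rng(0)

POS_TAGS = ["DET", "ADJ", "NOUN", "VERB", "PREP"]   # toy tag inventory
MOVES = ["KEEP", "SWAP_LEFT", "SWAP_RIGHT"]         # toy movement symbols
WINDOW = 3                                          # POS tags per example

def one_hot(i, n):
    v = np.zeros(n)
    v[i] = 1.0
    return v

def encode(tags):
    # Concatenate one-hot vectors for a window of POS tags.
    return np.concatenate([one_hot(POS_TAGS.index(t), len(POS_TAGS)) for t in tags])

# Invented training pairs: POS window -> movement symbol.
samples = [
    (["DET", "ADJ", "NOUN"], "KEEP"),
    (["NOUN", "PREP", "NOUN"], "SWAP_RIGHT"),
    (["VERB", "DET", "NOUN"], "KEEP"),
    (["ADJ", "NOUN", "VERB"], "SWAP_LEFT"),
]
X = np.stack([encode(tags) for tags, _ in samples])
y = np.array([MOVES.index(m) for _, m in samples])

# One hidden layer, trained by plain gradient descent on cross-entropy loss.
n_in, n_hidden, n_out = WINDOW * len(POS_TAGS), 8, len(MOVES)
W1 = rng.normal(scale=0.1, size=(n_in, n_hidden))
W2 = rng.normal(scale=0.1, size=(n_hidden, n_out))

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

for epoch in range(1000):
    h = np.tanh(X @ W1)                         # hidden activations
    p = softmax(h @ W2)                         # predicted symbol probabilities
    grad_out = p.copy()
    grad_out[np.arange(len(y)), y] -= 1.0       # gradient of cross-entropy w.r.t. logits
    grad_h = (grad_out @ W2.T) * (1.0 - h**2)   # backprop through tanh
    W2 -= 0.5 * h.T @ grad_out / len(y)
    W1 -= 0.5 * X.T @ grad_h / len(y)

pred = softmax(np.tanh(X @ W1) @ W2).argmax(axis=1)
print([MOVES[i] for i in pred])                 # ideally reproduces the training labels

The sketch only shows the input/output framing suggested by the synopsis (tags in, movement symbols out); the book's actual representation, network architecture, and training regime differ.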

Table of Contents

Preface
Acknowledgements
Chapter One Prologue
Chapter Two MT state of the art
  2.1 MT as symbolic systems
  2.2 Practical MT
  2.3 Alternative technique of MT
    2.3.1 Theoretical foundation
    2.3.2 Translation model
    2.3.3 Language model
  2.4 Discussion
Chapter Three Connectionist solutions
  3.1 NLP models
  3.2 Representation
  3.3 Phonological processing
  3.4 Learning verb past tense
  3.5 Part of speech tagging
  3.6 Chinese collocation learning
  3.7 Syntactic parsing
    3.7.1 Learning active/passive transformation
    3.7.2 Confluent preorder parsing
    3.7.3 Parsing with flat structures
    3.7.4 Parsing embedded clauses
    3.7.5 Parsing with deeper structures
  3.8 Discourse analysis
    3.8.1 Story gestalt and text understanding
    3.8.2 Processing stories with scriptural knowledge
  3.9 Machine translation
  3.10 Conclusion
Chapter Four NeuroTrans design considerations
  4.1 Scalability and extensibility
  4.2 Transfer or interlingual
  4.3 Hybrid or fully connectionist
  4.4 The use of linguistic knowledge
  4.5 Translation as a two-stage process
  4.6 Selection of network models
  4.7 Connectionist implementation
  4.8 Connectionist representation issues
  4.9 Conclusion
Chapter Five A neural lexicon model
  5.1 Language data
  5.2 Knowledge representation
    5.2.1 Symbolic approach
    5.2.2 The statistical approach
    5.2.3 Connectionist approach
    5.2.4 NeuroTrans' input/output representation
    5.2.5 NeuroTrans' lexicon representation
  5.3 Implementing the neural lexicon
    5.3.1 Words in context
    5.3.2 Context with weights
    5.3.3 Details of algorithm
    5.3.4 The Neural Lexicon Builder
  5.4 Training
    5.4.1 Sample preparation
    5.4.2 Training results
    5.4.3 Generalization test
  5.5 Discussion
    5.5.1 Adequacy
    5.5.2 Scalability and Extensibility
    5.5.3 Efficiency
    5.5.4 Weaknesses
Chapter Six Implementing the language model
  6.1 Overview
  6.2 Design
    6.2.1 Redefining the generation problem
    6.2.2 Defining jumble activity
    6.2.3 Language model structure
  6.3 Implementation
    6.3.1 Network structure / Sampling / Training and results
    6.3.2 Generalization test
  6.4 Discussion
    6.4.1 Insufficient data
    6.4.2 Information richness
    6.4.3 Insufficient contextual information
    6.4.4 Distributed language model
Chapter Seven Conclusion
Chapter Eight References
Index




User reviews (1 in total)

  •   Not much real technical content; long-winded but empty.