Publication date: September 2005. Publisher: 人民郵電出版社 (Posts & Telecom Press). Authors: Box, Reinsel, Jenkins. Pages: 598. Word count: 840,000.
Tags: none
Content Summary
Since its first edition in 1970, this book has been revised and reissued repeatedly, and its classic, authoritative treatment has made it the standard reference on time series analysis. It covers the building of stochastic (statistical) models for time series and their use in many important application areas, including forecasting; model specification, estimation, identification, and diagnostic checking; the identification, fitting, and checking of transfer function models for dynamic relationships; modeling the effects of intervention events; and process control. The exposition is concise, emphasizes practical techniques, and is illustrated with numerous examples. The book is suitable as a textbook for advanced undergraduates or graduate students in statistics and related fields, and as a reference for practicing statisticians.
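To make the modeling cycle described above concrete (identification, estimation, diagnostic checking, forecasting), the sketch below shows one way it might look in code. It is a minimal illustration, not material from the book: it assumes Python with the third-party statsmodels library, works on a simulated series, and the ARIMA(1,1,1) order and the coefficients 0.6 and 0.4 are arbitrary choices.

```python
# Minimal sketch of the Box-Jenkins cycle (identify, estimate, check, forecast).
# Illustrative only: the data are simulated, and the ARIMA(1,1,1) order and the
# coefficients below are arbitrary assumptions, not values taken from the book.
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox

np.random.seed(0)

# Simulate an ARIMA(1,1,1) series: draw a stationary ARMA(1,1) sample
# (phi = 0.6, theta = 0.4) and integrate it once with a cumulative sum.
arma = ArmaProcess(ar=[1, -0.6], ma=[1, 0.4])
y = arma.generate_sample(nsample=300).cumsum()

# Identification would normally inspect the sample ACF/PACF of the differenced
# series; here the (p, d, q) order is simply assumed known.
fit = ARIMA(y, order=(1, 1, 1)).fit()
print(fit.summary())

# Diagnostic check: Ljung-Box portmanteau test on the residuals,
# with degrees of freedom adjusted for the two estimated parameters.
print(acorr_ljungbox(fit.resid, lags=[12], model_df=2))

# Forecast 10 steps ahead with 95% probability limits.
forecast = fit.get_forecast(steps=10)
print(forecast.predicted_mean)
print(forecast.conf_int())
```

In practice each pass through this loop may suggest a different order or a transformation, which is exactly the iterative model-building strategy the book develops.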
About the Authors
George E. P. Box: an internationally renowned statistician. He founded the Department of Statistics at the University of Wisconsin in 1960 and served as its chair; he is now an emeritus professor there. Box has published more than 200 papers and many important books, of which this volume and Statistics for Experimenters are his best-known works.
Gwilym M. Jenkins: the late internationally renowned statistician. In 1966 he founded the British …
Table of Contents
1 INTRODUCTION 1
1.1 Four Important Practical Problems 2
1.1.1 Forecasting Time Series 2
1.1.2 Estimation of Transfer Functions 3
1.1.3 Analysis of Effects of Unusual Intervention Events to a System 4
1.1.4 Discrete Control Systems 5
1.2 Stochastic and Deterministic Dynamic Mathematical Models 7
1.2.1 Stationary and Nonstationary Stochastic Models for Forecasting and Control 7
1.2.2 Transfer Function Models 12
1.2.3 Models for Discrete Control Systems 14
1.3 Basic Ideas in Model Building 16
1.3.1 Parsimony 16
1.3.2 Iterative Stages in the Selection of a Model 16

Part I Stochastic Models and Their Forecasting 19

2 AUTOCORRELATION FUNCTION AND SPECTRUM OF STATIONARY PROCESSES 21
2.1 Autocorrelation Properties of Stationary Models 21
2.1.1 Time Series and Stochastic Processes 21
2.1.2 Stationary Stochastic Processes 23
2.1.3 Positive Definiteness and the Autocovariance Matrix 26
2.1.4 Autocovariance and Autocorrelation Functions 29
2.1.5 Estimation of Autocovariance and Autocorrelation Functions 30
2.1.6 Standard Error of Autocorrelation Estimates 32
2.2 Spectral Properties of Stationary Models 35
2.2.1 Periodogram of a Time Series 35
2.2.2 Analysis of Variance 36
2.2.3 Spectrum and Spectral Density Function 37
2.2.4 Simple Examples of Autocorrelation and Spectral Density Functions 41
2.2.5 Advantages and Disadvantages of the Autocorrelation and Spectral Density Functions 43
A2.1 Link Between the Sample Spectrum and Autocovariance Function Estimate 44

3 LINEAR STATIONARY MODELS 46
3.1 General Linear Process 46
3.1.1 Two Equivalent Forms for the Linear Process 46
3.1.2 Autocovariance Generating Function of a Linear Process 49
3.1.3 Stationarity and Invertibility Conditions for a Linear Process 50
3.1.4 Autoregressive and Moving Average Processes 52
3.2 Autoregressive Processes 54
3.2.1 Stationarity Conditions for Autoregressive Processes 54
3.2.2 Autocorrelation Function and Spectrum of Autoregressive Processes 55
3.2.3 First-Order Autoregressive (Markov) Process 58
3.2.4 Second-Order Autoregressive Process 60
3.2.5 Partial Autocorrelation Function 64
3.2.6 Estimation of the Partial Autocorrelation Function 67
3.2.7 Standard Errors of Partial Autocorrelation Estimates 68
3.3 Moving Average Processes 69
3.3.1 Invertibility Conditions for Moving Average Processes 69
3.3.2 Autocorrelation Function and Spectrum of Moving Average Processes 70
3.3.3 First-Order Moving Average Process 72
3.3.4 Second-Order Moving Average Process 73
3.3.5 Duality Between Autoregressive and Moving Average Processes 75
3.4 Mixed Autoregressive-Moving Average Processes 77
3.4.1 Stationarity and Invertibility Properties 77
3.4.2 Autocorrelation Function and Spectrum of Mixed Processes 78
3.4.3 First-Order Autoregressive-First-Order Moving Average Process 80
3.4.4 Summary 83
A3.1 Autocovariances, Autocovariance Generating Function, and Stationarity Conditions for a General Linear Process 85
A3.2 Recursive Method for Calculating Estimates of Autoregressive Parameters 87

4 LINEAR NONSTATIONARY MODELS 89
4.1 Autoregressive Integrated Moving Average Processes 89
4.1.1 Nonstationary First-Order Autoregressive Process 89
4.1.2 General Model for a Nonstationary Process Exhibiting Homogeneity 92
4.1.3 General Form of the Autoregressive Integrated Moving Average Process 96
4.2 Three Explicit Forms for the Autoregressive Integrated Moving Average Model 99
4.2.1 Difference Equation Form of the Model 99
4.2.2 Random Shock Form of the Model 100
4.2.3 Inverted Form of the Model 106
4.3 Integrated Moving Average Processes 109
4.3.1 Integrated Moving Average Process of Order (0,1,1) 110
4.3.2 Integrated Moving Average Process of Order (0,2,2) 114
4.3.3 General Integrated Moving Average Process of Order (0,d,q) 118
A4.1 Linear Difference Equations 120
A4.2 IMA(0,1,1) Process With Deterministic Drift 125
A4.3 ARIMA Processes With Added Noise 126
A4.3.1 Sum of Two Independent Moving Average Processes 126
A4.3.2 Effect of Added Noise on the General Model 127
A4.3.3 Example for an IMA(0,1,1) Process with Added White Noise 128
A4.3.4 Relation Between the IMA(0,1,1) Process and a Random Walk 129
A4.3.5 Autocovariance Function of the General Model with Added Correlated Noise 129

5 FORECASTING 131
5.1 Minimum Mean Square Error Forecasts and Their Properties 131
5.1.1 Derivation of the Minimum Mean Square Error Forecasts 133
5.1.2 Three Basic Forms for the Forecast 135
5.2 Calculating and Updating Forecasts 139
5.2.1 Convenient Format for the Forecasts 139
5.2.2 Calculation of the ψ Weights 139
5.2.3 Use of the ψ Weights in Updating the Forecasts 141
5.2.4 Calculation of the Probability Limits of the Forecasts at Any Lead Time 142
5.3 Forecast Function and Forecast Weights 145
5.3.1 Eventual Forecast Function Determined by the Autoregressive Operator 146
5.3.2 Role of the Moving Average Operator in Fixing the Initial Values 147
5.3.3 Lead l Forecast Weights 148
5.4 Examples of Forecast Functions and Their Updating 151
5.4.1 Forecasting an IMA(0,1,1) Process 151
5.4.2 Forecasting an IMA(0,2,2) Process 154
5.4.3 Forecasting a General IMA(0,d,q) Process 156
5.4.4 Forecasting Autoregressive Processes 157
5.4.5 Forecasting a (1,0,1) Process 160
5.4.6 Forecasting a (1,1,1) Process 162
5.5 Use of State Space Model Formulation for Exact Forecasting 163
5.5.1 State Space Model Representation for the ARIMA Process 163
5.5.2 Kalman Filtering Relations for Use in Prediction 164
5.6 Summary 166
A5.1 Correlations Between Forecast Errors 169
A5.1.1 Autocorrelation Function of Forecast Errors at Different Origins 169
A5.1.2 Correlation Between Forecast Errors at the Same Origin with Different Lead Times 170
A5.2 Forecast Weights for Any Lead Time 172
A5.3 Forecasting in Terms of the General Integrated Form 174
A5.3.1 General Method of Obtaining the Integrated Form 174
A5.3.2 Updating the General Integrated Form 176
A5.3.3 Comparison with the Discounted Least Squares Method 176

Part II Stochastic Model Building 181

6 MODEL IDENTIFICATION 183
6.1 Objectives of Identification 183
6.1.1 Stages in the Identification Procedure 184
6.2 Identification Techniques 184
6.2.1 Use of the Autocorrelation and Partial Autocorrelation Functions in Identification 184
6.2.2 Standard Errors for Estimated Autocorrelations and Partial Autocorrelations 188
6.2.3 Identification of Some Actual Time Series 188
6.2.4 Some Additional Model Identification Tools 197
6.3 Initial Estimates for the Parameters 202
6.3.1 Uniqueness of Estimates Obtained from the Autocovariance Function 202
6.3.2 Initial Estimates for Moving Average Processes 202
6.3.3 Initial Estimates for Autoregressive Processes 204
6.3.4 Initial Estimates for Mixed Autoregressive-Moving Average Processes 206
6.3.5 Choice Between Stationary and Nonstationary Models in Doubtful Cases 207
6.3.6 More Formal Tests for Unit Roots in ARIMA Models 208
6.3.7 Initial Estimate of Residual Variance 211
6.3.8 Approximate Standard Error for w̄ 212
6.4 Model Multiplicity 214
6.4.1 Multiplicity of Autoregressive-Moving Average Models 214
6.4.2 Multiple Moment Solutions for Moving Average Parameters 216
6.4.3 Use of the Backward Process to Determine Starting Values 218
A6.1 Expected Behavior of the Estimated Autocorrelation Function for a Nonstationary Process 218
A6.2 General Method for Obtaining Initial Estimates of the Parameters of a Mixed Autoregressive-Moving Average Process 220

7 MODEL ESTIMATION 224
7.1 Study of the Likelihood and Sum of Squares Functions 224
7.1.1 Likelihood Function 224
7.1.2 Conditional Likelihood for an ARIMA Process 226
7.1.3 Choice of Starting Values for Conditional Calculation 227
7.1.4 Unconditional Likelihood; Sum of Squares Function; Least Squares Estimates 228
7.1.5 General Procedure for Calculating the Unconditional Sum of Squares 233
7.1.6 Graphical Study of the Sum of Squares Function 238
7.1.7 Description of "Well-Behaved" Estimation Situations; Confidence Regions 241
7.2 Nonlinear Estimation 248
7.2.1 General Method of Approach 248
7.2.2 Numerical Estimates of the Derivatives 249
7.2.3 Direct Evaluation of the Derivatives 251
7.2.4 General Least Squares Algorithm for the Conditional Model 252
7.2.5 Summary of Models Fitted to Series A to F 255
7.2.6 Large-Sample Information Matrices and Covariance Estimates 256
7.3 Some Estimation Results for Specific Models 259
7.3.1 Autoregressive Processes 260
7.3.2 Moving Average Processes 262
7.3.3 Mixed Processes 262
7.3.4 Separation of Linear and Nonlinear Components in Estimation 263
7.3.5 Parameter Redundancy 264
7.4 Estimation Using Bayes' Theorem 267
7.4.1 Bayes' Theorem 267
7.4.2 Bayesian Estimation of Parameters 269
7.4.3 Autoregressive Processes 270
7.4.4 Moving Average Processes 272
7.4.5 Mixed Processes 274
7.5 Likelihood Function Based on the State Space Model 275
A7.1 Review of Normal Distribution Theory 279
A7.1.1 Partitioning of a Positive-Definite Quadratic Form 279
A7.1.2 Two Useful Integrals 280
A7.1.3 Normal Distribution 281
A7.1.4 Student's t-Distribution 283
A7.2 Review of Linear Least Squares Theory 286
A7.2.1 Normal Equations 286
A7.2.2 Estimation of Residual Variance 287
A7.2.3 Covariance Matrix of Estimates 288
A7.2.4 Confidence Regions 288
A7.2.5 Correlated Errors 288
A7.3 Exact Likelihood Function for Moving Average and Mixed Processes 289
A7.4 Exact Likelihood Function for an Autoregressive Process 296
A7.5 Examples of the Effect of Parameter Estimation Errors on Probability Limits for Forecasts 304
A7.6 Special Note on Estimation of Moving Average Parameters 307

8 MODEL DIAGNOSTIC CHECKING 308
8.1 Checking the Stochastic Model 308
8.1.1 General Philosophy 308
8.1.2 Overfitting 309
8.2 Diagnostic Checks Applied to Residuals 312
8.2.1 Autocorrelation Check 312
8.2.2 Portmanteau Lack-of-Fit Test 314
8.2.3 Model Inadequacy Arising from Changes in Parameter Values 317
8.2.4 Score Tests for Model Checking 318
8.2.5 Cumulative Periodogram Check 321
8.3 Use of Residuals to Modify the Model 324
8.3.1 Nature of the Correlations in the Residuals When an Incorrect Model Is Used 324
8.3.2 Use of Residuals to Modify the Model 325

9 SEASONAL MODELS 327
9.1 Parsimonious Models for Seasonal Time Series 327
9.1.1 Fitting versus Forecasting 328
9.1.2 Seasonal Models Involving Adaptive Sines and Cosines 329
9.1.3 General Multiplicative Seasonal Model 330
9.2 Representation of the Airline Data by a Multiplicative (0,1,1) × (0,1,1)12 Seasonal Model 333
9.2.1 Multiplicative (0,1,1) × (0,1,1)12 Model 333
9.2.2 Forecasting 334
9.2.3 Identification 341
9.2.4 Estimation 344
9.2.5 Diagnostic Checking 349
9.3 Some Aspects of More General Seasonal Models 351
9.3.1 Multiplicative and Nonmultiplicative Models 351
9.3.2 Identification 353
9.3.3 Estimation 355
9.3.4 Eventual Forecast Functions for Various Seasonal Models 355
9.3.5 Choice of Transformation 358
9.4 Structural Component Models and Deterministic Seasonal Components 359
9.4.1 Deterministic Seasonal and Trend Components and Common Factors 360
9.4.2 Models with Regression Terms and Time Series Error Terms 361
A9.1 Autocovariances for Some Seasonal Models 366

Part III Transfer Function Model Building 371

10 TRANSFER FUNCTION MODELS 373
10.1 Linear Transfer Function Models 373
10.1.1 Discrete Transfer Function 374
10.1.2 Continuous Dynamic Models Represented by Differential Equations 376
10.2 Discrete Dynamic Models Represented by Difference Equations 381
10.2.1 General Form of the Difference Equation 381
10.2.2 Nature of the Transfer Function 383
10.2.3 First- and Second-Order Discrete Transfer Function Models 384
10.2.4 Recursive Computation of Output for Any Input 390
10.2.5 Transfer Function Models with Added Noise 392
10.3 Relation Between Discrete and Continuous Models 392
10.3.1 Response to a Pulsed Input 393
10.3.2 Relationships for First- and Second-Order Coincident Systems 395
10.3.3 Approximating General Continuous Models by Discrete Models 398
A10.1 Continuous Models With Pulsed Inputs 399
A10.2 Nonlinear Transfer Functions and Linearization 404

11 IDENTIFICATION, FITTING, AND CHECKING OF TRANSFER FUNCTION MODELS 407
11.1 Cross Correlation Function 408
11.1.1 Properties of the Cross Covariance and Cross Correlation Functions 408
11.1.2 Estimation of the Cross Covariance and Cross Correlation Functions 411
11.1.3 Approximate Standard Errors of Cross Correlation Estimates 413
11.2 Identification of Transfer Function Models 415
11.2.1 Identification of Transfer Function Models by Prewhitening the Input 417
11.2.2 Example of the Identification of a Transfer Function Model 419
11.2.3 Identification of the Noise Model 422
11.2.4 Some General Considerations in Identifying Transfer Function Models 424
11.3 Fitting and Checking Transfer Function Models 426
11.3.1 Conditional Sum of Squares Function 426
11.3.2 Nonlinear Estimation 429
11.3.3 Use of Residuals for Diagnostic Checking 431
11.3.4 Specific Checks Applied to the Residuals 432
11.4 Some Examples of Fitting and Checking Transfer Function Models 435
11.4.1 Fitting and Checking of the Gas Furnace Model 435
11.4.2 Simulated Example with Two Inputs 441
11.5 Forecasting Using Leading Indicators 444
11.5.1 Minimum Mean Square Error Forecast 444
11.5.2 Forecast of CO2 Output from Gas Furnace 448
11.5.3 Forecast of Nonstationary Sales Data Using a Leading Indicator 451
11.6 Some Aspects of the Design of Experiments to Estimate Transfer Functions 453
A11.1 Use of Cross Spectral Analysis for Transfer Function Model Identification 455
A11.1.1 Identification of Single Input Transfer Function Models 455
A11.1.2 Identification of Multiple Input Transfer Function Models 456
A11.2 Choice of Input to Provide Optimal Parameter Estimates 457
A11.2.1 Design of Optimal Inputs for a Simple System 457
A11.2.2 Numerical Example 460

12 INTERVENTION ANALYSIS MODELS AND OUTLIER DETECTION 462
12.1 Intervention Analysis Methods 462
12.1.1 Models for Intervention Analysis 462
12.1.2 Example of Intervention Analysis 465
12.1.3 Nature of the MLE for a Simple Level Change Parameter Model 466
12.2 Outlier Analysis for Time Series 469
12.2.1 Models for Additive and Innovational Outliers 469
12.2.2 Estimation of Outlier Effect for Known Timing of the Outlier 470
12.2.3 Iterative Procedure for Outlier Detection 471
12.2.4 Examples of Analysis of Outliers 473
12.3 Estimation for ARMA Models With Missing Values 474

Part IV Design of Discrete Control Schemes 481

13 ASPECTS OF PROCESS CONTROL 483
13.1 Process Monitoring and Process Adjustment 484
13.1.1 Process Monitoring 484
13.1.2 Process Adjustment 487
13.2 Process Adjustment Using Feedback Control 488
13.2.1 Feedback Adjustment Chart 489
13.2.2 Modeling the Feedback Loop 492
13.2.3 Simple Models for Disturbances and Dynamics 493
13.2.4 General Minimum Mean Square Error Feedback Control Schemes 497
13.2.5 Manual Adjustment for Discrete Proportional-Integral Schemes 499
13.2.6 Complementary Roles of Monitoring and Adjustment 503
13.3 Excessive Adjustment Sometimes Required by MMSE Control 505
13.3.1 Constrained Control 506
13.4 Minimum Cost Control With Fixed Costs of Adjustment and Monitoring 508
13.4.1 Bounded Adjustment Scheme for Fixed Adjustment Cost 508
13.4.2 Indirect Approach for Obtaining a Bounded Adjustment Scheme 510
13.4.3 Inclusion of the Cost of Monitoring 511
13.5 Monitoring Values of Parameters of Forecasting and Feedback Adjustment Schemes 514
A13.1 Feedback Control Schemes Where the Adjustment Variance Is Restricted 516
A13.1.1 Derivation of Optimal Adjustment 517
A13.2 Choice of the Sampling Interval 526
A13.2.1 Illustration of the Effect of Reducing Sampling Frequency 526
A13.2.2 Sampling an IMA(0,1,1) Process 526

Part V Charts and Tables 531
COLLECTION OF TABLES AND CHARTS 533
COLLECTION OF TIME SERIES USED FOR EXAMPLES IN THE TEXT AND IN EXERCISES 540
REFERENCES 556

Part VI EXERCISES AND PROBLEMS 569

INDEX 589
Media Reviews
Time series analysis is a highly practical and rapidly developing data analysis technique, now widely applied in fields such as industrial quality control, genetic engineering, and financial data analysis. None of these developments can be discussed without mentioning G. E. P. Box and G. M. Jenkins and their jointly authored Time Series Analysis: Forecasting and Control, first published in 1970. Because of their great contributions to time series data analysis, the ARIMA models introduced in this book have come to be known as Box-Jenkins models.

In this classic work on time series analysis, these eminent statisticians use plain language and a wealth of examples to convey the essence of the subject clearly and intuitively, sparing the reader excessive mathematical derivations and proofs so that practical skills can be mastered quickly and the direct yet profound ideas behind them appreciated. Every reader who studies this book carefully will benefit greatly.