Time Series Analysis: Forecasting and Control

By: Box, George E. P.
Contributor(s): Jenkins, Gwilym M. | Reinsel, Gregory C. | Ljung, Greta M.
Material type: Text
Series: Wiley Series in Probability and Statistics
Publisher: New York : John Wiley & Sons, Incorporated, 2015
Copyright date: ©2016
Edition: 5th ed.
Description: 1 online resource (1257 pages)
Content type: text
Media type: computer
Carrier type: online resource
ISBN: 9781118674918
Subject(s): Feedback control systems - Mathematical models
Genre/Form: Electronic books
Additional physical formats: Print version: Time Series Analysis: Forecasting and Control
DDC classification: 519.5/5
LOC classification: TJ216
Contents:
Time Series Analysis: Forecasting and Control -- Contents -- Preface to the Fifth Edition -- Preface to the Fourth Edition -- Preface to the Third Edition -- Chapter 1: Introduction -- 1.1 Five Important Practical Problems -- 1.1.1 Forecasting Time Series -- 1.1.2 Estimation of Transfer Functions -- 1.1.3 Analysis of Effects of Unusual Intervention Events to a System -- 1.1.4 Analysis of Multivariate Time Series -- 1.1.5 Discrete Control Systems -- 1.2 Stochastic and Deterministic Dynamic Mathematical Models -- 1.2.1 Stationary and Nonstationary Stochastic Models for Forecasting and Control -- 1.2.2 Transfer Function Models -- 1.2.3 Models for Discrete Control Systems -- 1.3 Basic Ideas in Model Building -- 1.3.1 Parsimony -- 1.3.2 Iterative Stages in the Selection of a Model -- Appendix A1.1 Use of the R Software -- Exercises -- Part One: Stochastic Models and Their Forecasting -- Chapter 2: Autocorrelation Function and Spectrum of Stationary Processes -- 2.1 Autocorrelation Properties of Stationary Models -- 2.1.1 Time Series and Stochastic Processes -- 2.1.2 Stationary Stochastic Processes -- 2.1.3 Positive Definiteness and the Autocovariance Matrix -- 2.1.4 Autocovariance and Autocorrelation Functions -- 2.1.5 Estimation of Autocovariance and Autocorrelation Functions -- 2.1.6 Standard Errors of Autocorrelation Estimates -- 2.2 Spectral Properties of Stationary Models -- 2.2.1 Periodogram of a Time Series -- 2.2.2 Analysis of Variance -- 2.2.3 Spectrum and Spectral Density Function -- 2.2.4 Simple Examples of Autocorrelation and Spectral Density Functions -- 2.2.5 Advantages and Disadvantages of the Autocorrelation and Spectral Density Functions -- Appendix A2.1 Link Between the Sample Spectrum and Autocovariance Function Estimate -- Exercises -- Chapter 3: Linear Stationary Models -- 3.1 General Linear Process.
3.1.1 Two Equivalent Forms for the Linear Process -- 3.1.2 Autocovariance Generating Function of a Linear Process -- 3.1.3 Stationarity and Invertibility Conditions for a Linear Process -- 3.1.4 Autoregressive and Moving Average Processes -- 3.2 Autoregressive Processes -- 3.2.1 Stationarity Conditions for Autoregressive Processes -- 3.2.2 Autocorrelation Function and Spectrum of Autoregressive Processes -- 3.2.3 The First-Order Autoregressive Process -- 3.2.4 Second-Order Autoregressive Process -- 3.2.5 Partial Autocorrelation Function -- 3.2.6 Estimation of the Partial Autocorrelation Function -- 3.2.7 Standard Errors of Partial Autocorrelation Estimates -- 3.2.8 Calculations in R -- 3.3 Moving Average Processes -- 3.3.1 Invertibility Conditions for Moving Average Processes -- 3.3.2 Autocorrelation Function and Spectrum of Moving Average Processes -- 3.3.3 First-Order Moving Average Process -- 3.3.4 Second-Order Moving Average Process -- 3.3.5 Duality Between Autoregressive and Moving Average Processes -- 3.4 Mixed Autoregressive-Moving Average Processes -- 3.4.1 Stationarity and Invertibility Properties -- 3.4.2 Autocorrelation Function and Spectrum of Mixed Processes -- 3.4.3 First-Order Autoregressive-First-Order Moving Average Process -- 3.4.4 Summary -- Appendix A3.1 Autocovariances, Autocovariance Generating Function, and Stationarity Conditions for a General Linear Process -- Appendix A3.2 Recursive Method for Calculating Estimates of Autoregressive Parameters -- Exercises -- Chapter 4: Linear Nonstationary Models -- 4.1 Autoregressive Integrated Moving Average Processes -- 4.1.1 Nonstationary First-Order Autoregressive Process -- 4.1.2 General Model for a Nonstationary Process Exhibiting Homogeneity -- 4.1.3 General Form of the ARIMA Model -- 4.2 Three Explicit Forms for the ARIMA Model -- 4.2.1 Difference Equation Form of the Model.
4.2.2 Random Shock Form of the Model -- 4.2.3 Inverted Form of the Model -- 4.3 Integrated Moving Average Processes -- 4.3.1 Integrated Moving Average Process of Order (0, 1, 1) -- 4.3.2 Integrated Moving Average Process of Order (0, 2, 2) -- 4.3.3 General Integrated Moving Average Process of Order (0, d, q) -- Appendix A4.1 Linear Difference Equations -- Appendix A4.2 IMA(0, 1, 1) Process with Deterministic Drift -- Appendix A4.3 ARIMA Processes with Added Noise -- A4.3.1 Sum of Two Independent Moving Average Processes -- A4.3.2 Effect of Added Noise on the General Model -- A4.3.3 Example for an IMA(0, 1, 1) Process with Added White Noise -- A4.3.4 Relation Between the IMA(0, 1, 1) Process and a Random Walk -- A4.3.5 Autocovariance Function of the General Model with Added Correlated Noise -- Exercises -- Chapter 5: Forecasting -- 5.1 Minimum Mean Square Error Forecasts and Their Properties -- 5.1.1 Derivation of the Minimum Mean Square Error Forecasts -- 5.1.2 Three Basic Forms for the Forecast -- 5.2 Calculating Forecasts and Probability Limits -- 5.2.1 Calculation of ψ Weights -- 5.2.2 Use of the ψ Weights in Updating the Forecasts -- 5.2.3 Calculation of the Probability Limits at Different Lead Times -- 5.2.4 Calculation of Forecasts Using R -- 5.3 Forecast Function and Forecast Weights -- 5.3.1 Eventual Forecast Function Determined by the Autoregressive Operator -- 5.3.2 Role of the Moving Average Operator in Fixing the Initial Values -- 5.3.3 Lead l Forecast Weights -- 5.4 Examples of Forecast Functions and Their Updating -- 5.4.1 Forecasting an IMA(0, 1, 1) Process -- 5.4.2 Forecasting an IMA(0, 2, 2) Process -- 5.4.3 Forecasting a General IMA(0, d, q) Process -- 5.4.4 Forecasting Autoregressive Processes -- 5.4.5 Forecasting a (1, 0, 1) Process -- 5.4.6 Forecasting a (1, 1, 1) Process.
5.5 Use of State-Space Model Formulation for Exact Forecasting -- 5.5.1 State-Space Model Representation for the ARIMA Process -- 5.5.2 Kalman Filtering Relations for Use in Prediction -- 5.5.3 Smoothing Relations in the State Variable Model -- 5.6 Summary -- Appendix A5.1 Correlation Between Forecast Errors -- A5.1.1 Autocorrelation Function of Forecast Errors at Different Origins -- A5.1.2 Correlation Between Forecast Errors at the Same Origin with Different Lead Times -- Appendix A5.2 Forecast Weights for Any Lead Time -- Appendix A5.3 Forecasting in Terms of the General Integrated Form -- A5.3.1 General Method of Obtaining the Integrated Form -- A5.3.2 Updating the General Integrated Form -- A5.3.3 Comparison with the Discounted Least-Squares Method -- Exercises -- Part Two: Stochastic Model Building -- Chapter 6: Model Identification -- 6.1 Objectives of Identification -- 6.1.1 Stages in the Identification Procedure -- 6.2 Identification Techniques -- 6.2.1 Use of the Autocorrelation and Partial Autocorrelation Functions in Identification -- 6.2.2 Standard Errors for Estimated Autocorrelations and Partial Autocorrelations -- 6.2.3 Identification of Models for Some Actual Time Series -- 6.2.4 Some Additional Model Identification Tools -- 6.3 Initial Estimates for the Parameters -- 6.3.1 Uniqueness of Estimates Obtained from the Autocovariance Function -- 6.3.2 Initial Estimates for Moving Average Processes -- 6.3.3 Initial Estimates for Autoregressive Processes -- 6.3.4 Initial Estimates for Mixed Autoregressive-Moving Average Processes -- 6.3.5 Initial Estimate of Error Variance -- 6.3.6 Approximate Standard Error for w̄ -- 6.3.7 Choice Between Stationary and Nonstationary Models in Doubtful Cases -- 6.4 Model Multiplicity -- 6.4.1 Multiplicity of Autoregressive-Moving Average Models -- 6.4.2 Multiple Moment Solutions for Moving Average Parameters.
6.4.3 Use of the Backward Process to Determine Starting Values -- Appendix A6.1 Expected Behavior of the Estimated Autocorrelation Function for a Nonstationary Process -- Exercises -- Chapter 7: Parameter Estimation -- 7.1 Study of the Likelihood and Sum-of-Squares Functions -- 7.1.1 Likelihood Function -- 7.1.2 Conditional Likelihood for an ARIMA Process -- 7.1.3 Choice of Starting Values for Conditional Calculation -- 7.1.4 Unconditional Likelihood, Sum-of-Squares Function, and Least-Squares Estimates -- 7.1.5 General Procedure for Calculating the Unconditional Sum of Squares -- 7.1.6 Graphical Study of the Sum-of-Squares Function -- 7.1.7 Examination of the Likelihood Function and Confidence Regions -- 7.2 Nonlinear Estimation -- 7.2.1 General Method of Approach -- 7.2.2 Numerical Estimates of the Derivatives -- 7.2.3 Direct Evaluation of the Derivatives -- 7.2.4 General Least-Squares Algorithm for the Conditional Model -- 7.2.5 ARIMA Models Fitted to Series A-F -- 7.2.6 Large-Sample Information Matrices and Covariance Estimates -- 7.3 Some Estimation Results for Specific Models -- 7.3.1 Autoregressive Processes -- 7.3.2 Moving Average Processes -- 7.3.3 Mixed Processes -- 7.3.4 Separation of Linear and Nonlinear Components in Estimation -- 7.3.5 Parameter Redundancy -- 7.4 Likelihood Function Based on the State-Space Model -- 7.5 Estimation Using Bayes' Theorem -- 7.5.1 Bayes' Theorem -- 7.5.2 Bayesian Estimation of Parameters -- 7.5.3 Autoregressive Processes -- 7.5.4 Moving Average Processes -- 7.5.5 Mixed Processes -- Appendix A7.1 Review of Normal Distribution Theory -- A7.1.1 Partitioning of a Positive-Definite Quadratic Form -- A7.1.2 Two Useful Integrals -- A7.1.3 Normal Distribution -- A7.1.4 Student's t Distribution -- Appendix A7.2 Review of Linear Least-Squares Theory -- A7.2.1 Normal Equations and Least Squares.
A7.2.2 Estimation of Error Variance.
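
The table of contents above traces the book's iterative model-building cycle: identification (Chapters 2, 3, and 6), estimation and checking (Chapter 7), and forecasting (Chapter 5). As a rough sketch of that cycle using base R's stats functions on a simulated IMA(0, 1, 1) series (an illustration under assumed parameter values, not a script from the book):

    ## Illustrative sketch (not from the book): the identify / estimate /
    ## check / forecast cycle of Chapters 2-7, applied to a simulated
    ## IMA(0, 1, 1) series.
    set.seed(1)

    ## Note: arima.sim() and arima() write the MA polynomial as
    ## (1 + theta*B), the opposite sign convention to the book's
    ## (1 - theta*B), so ma = -0.7 here corresponds to theta = 0.7.
    z <- arima.sim(model = list(order = c(0, 1, 1), ma = -0.7), n = 200)

    ## Identification: sample ACF and PACF of the differenced series.
    w <- diff(z)
    acf(w)
    pacf(w)

    ## Estimation: maximum likelihood fit of the tentative model.
    fit <- arima(z, order = c(0, 1, 1))

    ## Diagnostic checking: Ljung-Box test on the residuals, with
    ## fitdf equal to the number of estimated ARMA parameters.
    Box.test(residuals(fit), lag = 12, type = "Ljung-Box", fitdf = 1)

    ## Forecasting: minimum mean square error forecasts and approximate
    ## 95% probability limits at lead times 1-10.
    fc <- predict(fit, n.ahead = 10)
    limits <- cbind(lower = fc$pred - 1.96 * fc$se,
                    upper = fc$pred + 1.96 * fc$se)

The same cycle applies to a real series: difference until stationary, read tentative orders off the ACF and PACF, fit, and re-examine the residuals before trusting the forecasts.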
Summary: Praise for the Fourth Edition: "The book follows faithfully the style of the original edition. The approach is heavily motivated by real-world time series, and by developing a complete approach to model building, estimation, forecasting and control." - Mathematical Reviews

Bridging classical models and modern topics, the Fifth Edition of Time Series Analysis: Forecasting and Control maintains a balanced presentation of the tools for modeling and analyzing time series. Also describing the latest developments that have occurred in the field over the past decade through applications from areas such as business, finance, and engineering, the Fifth Edition continues to serve as one of the most influential and prominent works on the subject.

Time Series Analysis: Forecasting and Control, Fifth Edition provides a clearly written exploration of the key methods for building, classifying, testing, and analyzing stochastic models for time series and describes their use in five important areas of application: forecasting; determining the transfer function of a system; modeling the effects of intervention events; developing multivariate dynamic models; and designing simple control schemes. Along with these classical uses, the new edition covers modern topics with new features that include:

- A redesigned chapter on multivariate time series analysis with an expanded treatment of vector autoregressive (VAR) models, along with a discussion of the analytical tools needed for modeling vector time series
- An expanded chapter on special topics covering unit root testing, time-varying volatility models such as ARCH and GARCH, nonlinear time series models, and long memory models
- Numerous examples drawn from finance, economics, engineering, and other related fields
- The use of the publicly available R software for graphical illustrations and numerical calculations, along with scripts that demonstrate the use of R for model building and forecasting
- Updates to literature references throughout and new end-of-chapter exercises
- Streamlined chapter introductions and revisions that update and enhance the exposition

Time Series Analysis: Forecasting and Control, Fifth Edition is a valuable real-world reference for researchers and practitioners in time series analysis, econometrics, finance, and related fields. The book is also an excellent textbook for beginning graduate-level courses in advanced statistics, mathematics, economics, finance, engineering, and physics.
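
For instance, the ψ weights of Section 5.2 determine the lead-l forecast-error variance Var[e_t(l)] = σ_a²(1 + ψ_1² + ... + ψ_{l-1}²), a calculation base R supports directly. A minimal sketch under assumed ARMA(1, 1) parameter values (illustrative only, not one of the book's scripts):

    ## Illustrative sketch: psi weights of an ARMA(1, 1) model and the
    ## forecast-error variances they imply (Section 5.2).
    ## ARMAtoMA() uses R's sign convention (1 + theta*B) for the MA
    ## polynomial, the opposite of the book's (1 - theta*B).
    psi <- ARMAtoMA(ar = 0.8, ma = -0.4, lag.max = 9)  # psi_1, ..., psi_9

    sigma2 <- 1.5                  # assumed innovation variance sigma_a^2
    ## Var[e(l)] = sigma_a^2 * (1 + psi_1^2 + ... + psi_{l-1}^2)
    var_e <- sigma2 * cumsum(c(1, psi^2))   # lead times l = 1, ..., 10

    ## Approximate 95% probability limits around the lead-l forecast:
    ## forecast +/- 1.96 * sqrt(var_e[l]).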

Description based on publisher-supplied metadata and other sources.

Electronic reproduction. Ann Arbor, Michigan : ProQuest Ebook Central, 2018. Available via World Wide Web. Access may be limited to ProQuest Ebook Central affiliated libraries.
