Basic Data Analysis for Time Series with R.

By: Derryberry, DeWayne R.
Material type: Text
Publisher: Somerset : John Wiley & Sons, Incorporated, 2014
Copyright date: ©2014
Edition: 1st ed.
Description: 1 online resource (320 pages)
Content type: text
Media type: computer
Carrier type: online resource
ISBN: 9781118593363
Subject(s): R (Computer program language) | Time-series analysis -- Data processing | Time-series analysis
Genre/Form: Electronic books.
Additional physical formats: Print version: Basic Data Analysis for Time Series with R
DDC classification: 001.42202855133
LOC classification: QA280 .D475 2014
Contents:
Intro -- Basic Data Analysis for Time Series with R -- Contents -- Preface -- What This Book Is About -- Motivation -- Required Background -- A Couple of Odd Features -- Acknowledgments -- PART I Basic Correlation Structures -- 1 R Basics -- 1.1 Getting Started -- 1.2 Special R Conventions -- 1.3 Common Structures -- 1.4 Common Functions -- 1.5 Time Series Functions -- 1.6 Importing Data -- Exercises -- 2 Review of Regression and More About R -- 2.1 Goals of This Chapter -- 2.2 The Simple(st) Regression Model -- 2.2.1 Ordinary Least Squares -- 2.2.2 Properties of OLS Estimates -- 2.2.3 Matrix Representation of the Problem -- 2.3 Simulating the Data from a Model and Estimating the Model Parameters in R -- 2.3.1 Simulating Data -- 2.3.2 Estimating the Model Parameters in R -- 2.4 Basic Inference for the Model -- 2.5 Residuals Analysis-What Can Go Wrong… -- 2.6 Matrix Manipulation in R -- 2.6.1 Introduction -- 2.6.2 OLS the Hard Way -- 2.6.3 Some Other Matrix Commands -- Exercises -- 3 The Modeling Approach Taken in This Book and Some Examples of Typical Serially Correlated Data -- 3.1 Signal and Noise -- 3.2 Time Series Data -- 3.3 Simple Regression in the Framework -- 3.4 Real Data and Simulated Data -- 3.5 The Diversity of Time Series Data -- 3.6 Getting Data into R -- 3.6.1 Overview -- 3.6.2 The Diskette and the scan() and ts() Functions-New York City Temperatures -- 3.6.3 The Diskette and the read.table() Function-The Semmelweis Data -- 3.6.4 Cut and Paste Data to a Text Editor -- Exercises -- 4 Some Comments on Assumptions -- 4.1 Introduction -- 4.2 The Normality Assumption -- 4.2.1 Right Skew -- 4.2.2 Left Skew -- 4.2.3 Heavy Tails -- 4.3 Equal Variance -- 4.3.1 Two-Sample t-Test -- 4.3.2 Regression -- 4.4 Independence -- 4.5 Power of Logarithmic Transformations Illustrated -- 4.6 Summary -- Exercises.
5 The Autocorrelation Function and AR(1), AR(2) Models -- 5.1 Standard Models-What Are the Alternatives to White Noise? -- 5.2 Autocovariance and Autocorrelation -- 5.2.1 Stationarity -- 5.2.2 A Note About Conditions -- 5.2.3 Properties of Autocovariance -- 5.2.4 White Noise -- 5.2.5 Estimation of the Autocovariance and Autocorrelation -- 5.3 The acf() Function in R -- 5.3.1 Background -- 5.3.2 The Basic Code for Estimating the Autocovariance -- 5.4 The First Alternative to White Noise: Autoregressive Errors-AR(1), AR(2) -- 5.4.1 Definition of the AR(1) and AR(2) Models -- 5.4.2 Some Preliminary Facts -- 5.4.3 The AR(1) Model Autocorrelation and Autocovariance -- 5.4.4 Using Correlation and Scatterplots to Illustrate the AR(1) Model -- 5.4.5 The AR(2) Model Autocorrelation and Autocovariance -- 5.4.6 Simulating Data for AR(m) Models -- 5.4.7 Examples of Stable and Unstable AR(1) Models -- 5.4.8 Examples of Stable and Unstable AR(2) Models -- Exercises -- 6 The Moving Average Models MA(1) and MA(2) -- 6.1 The Moving Average Model -- 6.2 The Autocorrelation for MA(1) Models -- 6.3 A Duality Between MA(l) and AR(m) Models -- 6.4 The Autocorrelation for MA(2) Models -- 6.5 Simulated Examples of the MA(1) Model -- 6.6 Simulated Examples of the MA(2) Model -- 6.7 AR(m) and MA(l) Model acf() Plots -- Exercises -- PART II Analysis of Periodic Data and Model Selection -- 7 Review of Transcendental Functions and Complex Numbers -- 7.1 Background -- 7.2 Complex Arithmetic -- 7.2.1 The Number i -- 7.2.2 Complex Conjugates -- 7.2.3 The Magnitude of a Complex Number -- 7.3 Some Important Series -- 7.3.1 The Geometric and Some Transcendental Series -- 7.3.2 A Rationale for Euler's Formula -- 7.4 Useful Facts About Periodic Transcendental Functions -- Exercises -- 8 The Power Spectrum and the Periodogram -- 8.1 Introduction.
8.2 A Definition and a Simplified Form for p(f) -- 8.3 Inverting p(f) to Recover the Ck Values -- 8.4 The Power Spectrum for Some Familiar Models -- 8.4.1 White Noise -- 8.4.2 The Spectrum for AR(1) Models -- 8.4.3 The Spectrum for AR(2) Models -- 8.5 The Periodogram, a Closer Look -- 8.5.1 Why Is the Periodogram Useful? -- 8.5.2 Some Naïve Code for a Periodogram -- 8.5.3 An Example-The Sunspot Data -- 8.6 The Function spec.pgram() in R -- Exercises -- 9 Smoothers, the Bias-Variance Tradeoff, and the Smoothed Periodogram -- 9.1 Why Is Smoothing Required? -- 9.2 Smoothing, Bias, and Variance -- 9.3 Smoothers Used in R -- 9.3.1 The R Function lowess() -- 9.3.2 The R Function smooth.spline() -- 9.3.3 Kernel Smoothers in spec.pgram() -- 9.4 Smoothing the Periodogram for a Series with a Known and Unknown Period -- 9.4.1 Period Known -- 9.4.2 Period Unknown -- 9.5 Summary -- Exercises -- 10 A Regression Model for Periodic Data -- 10.1 The Model -- 10.2 An Example: The NYC Temperature Data -- 10.2.1 Fitting a Periodic Function -- 10.2.2 An Outlier -- 10.2.3 Refitting the Model with the Outlier Corrected -- 10.3 Complications 1: CO2 Data -- 10.4 Complications 2: Sunspot Numbers -- 10.5 Complications 3: Accidental Deaths -- 10.6 Summary -- Exercises -- 11 Model Selection and Cross-Validation -- 11.1 Background -- 11.2 Hypothesis Tests in Simple Regression -- 11.3 A More General Setting for Likelihood Ratio Tests -- 11.4 A Subtly Different Situation -- 11.5 Information Criteria -- 11.6 Cross-Validation (Data Splitting): NYC Temperatures -- 11.6.1 Explained Variation, R2 -- 11.6.2 Data Splitting -- 11.6.3 Leave-One-Out Cross-Validation -- 11.6.4 AIC as Leave-One-Out Cross-Validation -- 11.7 Summary -- Exercises -- 12 Fitting Fourier Series -- 12.1 Introduction: More Complex Periodic Models -- 12.2 More Complex Periodic Behavior: Accidental Deaths.
12.2.1 Fourier Series Structure -- 12.2.2 R Code for Fitting Large Fourier Series -- 12.2.3 Model Selection with AIC -- 12.2.4 Model Selection with Likelihood Ratio Tests -- 12.2.5 Data Splitting -- 12.2.6 Accidental Deaths-Some Comments on Periodic Data -- 12.3 The Boise River Flow Data -- 12.3.1 The Data -- 12.3.2 Model Selection with AIC -- 12.3.3 Data Splitting -- 12.3.4 The Residuals -- 12.4 Where Do We Go from Here? -- Exercises -- 13 Adjusting for AR(1) Correlation in Complex Models -- 13.1 Introduction -- 13.2 The Two-Sample t-Test-Uncut and Patch-Cut Forest -- 13.2.1 The Sleuth Data and the Question of Interest -- 13.2.2 A Simple Adjustment for t-Tests When the Residuals Are AR(1) -- 13.2.3 A Simulation Example -- 13.2.4 Analysis of the Sleuth Data -- 13.3 The Second Sleuth Case-Global Warming, a Simple Regression -- 13.3.1 The Data and the Question -- 13.3.2 Filtering to Produce (Quasi-)Independent Observations -- 13.3.3 Simulated Example-Regression -- 13.3.4 Analysis of the Regression Case -- 13.3.5 The Filtering Approach for the Logging Case -- 13.3.6 A Few Comments on Filtering -- 13.4 The Semmelweis Intervention -- 13.4.1 The Data -- 13.4.2 Why Serial Correlation? -- 13.4.3 How This Data Differs from the Patch/Uncut Case -- 13.4.4 Filtered Analysis -- 13.4.5 Transformations and Inference -- 13.5 The NYC Temperatures (Adjusted) -- 13.5.1 The Data and Prediction Intervals -- 13.5.2 The AR(1) Prediction Model -- 13.5.3 A Simulation to Evaluate These Formulas -- 13.5.4 Application to NYC Data -- 13.6 The Boise River Flow Data: Model Selection with Filtering -- 13.6.1 The Revised Model Selection Problem -- 13.6.2 Comments on R2 and R2pred -- 13.6.3 Model Selection After Filtering with a Matrix -- 13.7 Implications of AR(1) Adjustments and the "Skip" Method -- 13.7.1 Adjustments for AR(1) Autocorrelation.
13.7.2 Impact of Serial Correlation on p-Values -- 13.7.3 The "Skip" Method -- 13.8 Summary -- Exercises -- PART III Complex Temporal Structures -- 14 The Backshift Operator, the Impulse Response Function, and General ARMA Models -- 14.1 The General ARMA Model -- 14.1.1 The Mathematical Formulation -- 14.1.2 The arima.sim() Function in R Revisited -- 14.1.3 Examples of ARMA(m,l) Models -- 14.2 The Backshift (Shift, Lag) Operator -- 14.2.1 Definition of B -- 14.2.2 The Stationary Conditions for a General AR(m) Model -- 14.2.3 ARMA(m,l) Models and the Backshift Operator -- 14.2.4 More Examples of ARMA(m,l) Models -- 14.3 The Impulse Response Operator-Intuition -- 14.4 Impulse Response Operator, g(B)-Computation -- 14.4.1 Definition of g(B) -- 14.4.2 Computing the Coefficients, -- 14.4.3 Plotting an Impulse Response Function -- 14.5 Interpretation and Utility of the Impulse Response Function -- Exercises -- 15 The Yule-Walker Equations and the Partial Autocorrelation Function -- 15.1 Background -- 15.2 Autocovariance of an ARMA(m,l) Model -- 15.2.1 A Preliminary Result -- 15.2.2 The Autocovariance Function for ARMA(m,l) Models -- 15.3 AR(m) and the Yule-Walker Equations -- 15.3.1 The Equations -- 15.3.2 The R Function ar.yw() with an AR(3) Example -- 15.3.3 Information Criteria-Based Model Selection Using ar.yw() -- 15.4 The Partial Autocorrelation Plot -- 15.4.1 A Sequence of Hypothesis Tests -- 15.4.2 The pacf() Function-Hypothesis Tests Presented in a Plot -- 15.5 The Spectrum for ARMA Processes -- 15.6 Summary -- Exercises -- 16 Modeling Philosophy and Complete Examples -- 16.1 Modeling Overview -- 16.1.1 The Algorithm -- 16.1.2 The Underlying Assumption -- 16.1.3 An Example Using an AR(m) Filter to Model MA(3) -- 16.1.4 Generalizing the "Skip" Method -- 16.2 A Complex Periodic Model-Monthly River Flows, Furnas 1931-1978 -- 16.2.1 The Data.
16.2.2 A Saturated Model.
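
The contents above lean on a handful of base-R tools (scan(), ts(), acf(), spec.pgram(), ar.yw(), pacf(), and others). The short sketches that follow are illustrative only: the data are invented and the parameter values arbitrary, and none of it is code from the book, but each suggests the kind of workflow the corresponding chapters involve. In the spirit of Chapters 2-4: simulate data from a simple regression model, fit it by ordinary least squares, and inspect the residuals.

  ## Illustrative sketch, not code from the book: simulate, fit by OLS, check residuals.
  set.seed(42)
  n <- 100
  x <- seq(1, 10, length.out = n)
  y <- 2 + 0.5 * x + rnorm(n, sd = 1)   # true intercept 2, true slope 0.5
  fit <- lm(y ~ x)
  summary(fit)                          # basic inference for the model
  plot(residuals(fit), type = "b")      # look for structure (serial correlation) over the index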
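
In the spirit of Chapters 5-6, another illustrative sketch (not from the book): simulate AR(1) and MA(1) series with arima.sim() and compare their sample autocorrelation functions with acf(). The coefficients 0.7 and 0.5 are arbitrary choices.

  ## Illustrative sketch, not code from the book: AR(1) vs MA(1) autocorrelation.
  set.seed(42)
  ar1 <- arima.sim(model = list(ar = 0.7), n = 200)   # stable AR(1), phi = 0.7
  ma1 <- arima.sim(model = list(ma = 0.5), n = 200)   # MA(1), theta = 0.5
  par(mfrow = c(1, 2))
  acf(ar1, main = "AR(1), phi = 0.7")    # geometric decay across lags
  acf(ma1, main = "MA(1), theta = 0.5")  # roughly cuts off after lag 1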
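
For Part II (Chapters 8-12), an illustrative sketch of a raw and a smoothed periodogram via spec.pgram(), followed by a small trigonometric regression compared by AIC. The period of 12 and the noise level are invented for the example; the book's own analyses use its data sets.

  ## Illustrative sketch, not code from the book: periodogram and trigonometric regression.
  set.seed(42)
  tm <- 1:240                                    # e.g., 20 years of monthly observations
  y <- 10 * cos(2 * pi * tm / 12) + rnorm(240, sd = 4)
  spec.pgram(y, log = "no")                      # raw periodogram: peak near frequency 1/12
  spec.pgram(y, spans = c(5, 5), log = "no")     # kernel-smoothed periodogram
  fit1 <- lm(y ~ 1)                                                # no periodic signal
  fit2 <- lm(y ~ cos(2 * pi * tm / 12) + sin(2 * pi * tm / 12))    # one harmonic
  AIC(fit1, fit2)                                # the information criterion favors the periodic model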
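
For Chapter 13, an illustrative sketch of the general idea behind adjusting a regression for AR(1) errors: estimate the lag-1 autocorrelation of the residuals, then quasi-difference both response and predictor before refitting. This is a generic Cochrane-Orcutt-style step, not necessarily the book's exact filtering procedure.

  ## Illustrative sketch, not code from the book: quasi-difference filtering for AR(1) errors.
  set.seed(42)
  n <- 120
  x <- 1:n
  e <- arima.sim(model = list(ar = 0.6), n = n)    # AR(1) errors, phi = 0.6
  y <- 1 + 0.05 * x + e
  fit <- lm(y ~ x)
  phi <- acf(residuals(fit), plot = FALSE)$acf[2]  # estimated lag-1 autocorrelation
  yf <- y[-1] - phi * y[-n]                        # filtered response
  xf <- x[-1] - phi * x[-n]                        # filtered predictor
  summary(lm(yf ~ xf))                             # errors are now roughly independent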
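
For Part III (Chapters 14-16), an illustrative sketch that fits an AR model by the Yule-Walker equations with ar.yw() and reads the partial autocorrelation plot from pacf(); the AR(3) coefficients are arbitrary but chosen to be stationary.

  ## Illustrative sketch, not code from the book: Yule-Walker fitting and the PACF.
  set.seed(42)
  x <- arima.sim(model = list(ar = c(0.5, -0.3, 0.2)), n = 500)  # an AR(3) series
  fit <- ar.yw(x, order.max = 10)  # order selected by AIC among AR(0), ..., AR(10)
  fit$order                        # chosen order, typically 3 here
  fit$ar                           # Yule-Walker coefficient estimates
  pacf(x)                          # PACF roughly zero beyond lag 3
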
Summary: Written at a readily accessible level, Basic Data Analysis for Time Series with R emphasizes the careful analysis of data collected in increments of time or space. Balancing theoretical and practical approaches to analyzing data in the presence of serial correlation, the book presents a coherent, systematic, regression-based approach to model selection. It illustrates these principles of model selection and model building through the use of information criteria, cross-validation, hypothesis tests, and confidence intervals. With frequency-domain and time-domain methods and trigonometric regression as its primary themes, the book also covers Fourier series and Akaike's Information Criterion (AIC). In addition, Basic Data Analysis for Time Series with R features:
- Real-world examples to provide readers with practical, hands-on experience
- Multiple R software routines paired with graphical displays
- Numerous exercise sets intended to support readers' understanding of the core concepts
- Specific chapters devoted to the analysis of the Wolf sunspot number data and the Vostok ice core data sets

Description based on publisher-supplied metadata and other sources.

Electronic reproduction. Ann Arbor, Michigan : ProQuest Ebook Central, 2018. Available via World Wide Web. Access may be limited to ProQuest Ebook Central affiliated libraries.
