Blog Archives

Identification of ARMA processes

February 19, 2014

Last week (in the MAT8181 course), in order to identify the orders of an ARMA process, we saw the eacf method, and I mentioned the scan method introduced in Tsay and Tiao (1985). The code below – which produces the output of the scan procedure – has been adapted from old code by Steve Chen (where I included a visualization...
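
A minimal sketch of the eacf step mentioned above (the TSA package, the simulated ARMA(1,1) series and the chosen orders are assumptions for illustration, not the scan code from the post):

library(TSA)                      # provides eacf()
set.seed(1)
# simulate an ARMA(1,1) series, just to have something to identify
X <- arima.sim(n = 1000, model = list(ar = 0.6, ma = 0.4))
# the extended ACF table: the upper-left vertex of the triangle of "o"
# suggests the (p,q) orders
eacf(X, ar.max = 5, ma.max = 5)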

Read more »

Voting Twice in France

February 19, 2014

On the Monkey Cage blog, Baptiste Coulmont (a.k.a. @coulmont) recently uploaded a post entitled “You can vote twice! The many political appeals of proxy votes in France”, coauthored with Joël Gombin (a.k.a. @joelgombin) and myself. The study was initially written in French, as mentioned in a previous post. Baptiste posted additional information on his blog (http://coulmont.com/blog/…) and I also wanted to post some lines of code,...

Read more »

Bivariate Densities with N(0,1) Margins

February 18, 2014

This Monday, in the ACT8595 course, we came back to elliptical distributions and conditional independence (here is an old post on de Finetti’s theorem, and its extension to Hewitt-Savage’s). I showed simulations to illustrate those two notions of dependent variables, but I also wanted to spend some time visualizing densities. More specifically, what could be the joint density...
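
As a sketch of one construction with N(0,1) margins but a non-Gaussian joint density (my own illustration, not necessarily the densities used in the post): an equal mixture of two bivariate Gaussian densities with correlations +r and -r keeps standard normal margins.

# density of the 50/50 mixture of bivariate normals with correlations r and -r;
# each component has N(0,1) margins, hence so does the mixture
r <- 0.9
dmix <- function(x, y){
  f <- function(rho) exp(-(x^2 - 2*rho*x*y + y^2) / (2*(1-rho^2))) /
                     (2*pi*sqrt(1-rho^2))
  .5*f(r) + .5*f(-r)
}
u <- seq(-3, 3, length = 101)
z <- outer(u, u, dmix)
persp(u, u, z, theta = 30, phi = 30, col = "lightblue", shade = .5)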

Read more »

Temperatures Series as Random Walks

February 12, 2014

Last year, I mentioned in a post that unit-root tests are dangerous, because they might lead us to strange models. For instance, in one post, I obtained that the temperature observed in January 2013, in Montréal, might be considered a random walk process (or at least an integrated process). The code to extract the data has...
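
The data-extraction code is in the full post; as a hedged illustration of the kind of test behind that claim (a simulated random walk here, not the Montréal series, and the tseries package is an assumption):

library(tseries)                  # provides adf.test()
set.seed(1)
X <- cumsum(rnorm(31))            # a random walk, standing in for daily temperatures
# H0: unit root; a large p-value means the random-walk model is not rejected
adf.test(X)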

Read more »

Unit Root Tests

February 12, 2014

This week, in the MAT8181 Time Series course, we’ve discussed unit root tests. According to Wold’s theorem, if $(X_t)$ is (weakly) stationary then $X_t=\mu_t+\sum_{j=0}^{\infty}\psi_j\varepsilon_{t-j}$, where $(\varepsilon_t)$ is the innovation process, and where $(\mu_t)$ is some deterministic series (just to get a result as general as possible). Observe that …, as discussed in a previous post. To go one step further, there is also the...
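
A minimal sketch of the Dickey-Fuller idea discussed in the course (the simulated series, the lag choice and the use of the urca package are my own assumptions):

library(urca)                     # provides ur.df()
set.seed(1)
n <- 240
X <- cumsum(rnorm(n))             # the unit-root null is true here
# "by hand": regress the first difference on the lagged level;
# under the null, the coefficient on the lagged level is zero
DX <- diff(X)
LX <- X[-n]
summary(lm(DX ~ LX))
# same idea with the usual test statistics and critical values
summary(ur.df(X, type = "drift", lags = 1))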

Read more »

Personal Analytics with RSS Feeds

February 7, 2014

I am currently working on a paper on Academic Blogging, based on my own experience. And I wanted to do something similar to Stephen Wolfram’s personal analytics, on my own life. More specifically, I wanted to understand when I post my blog entries. If I post more entries during office hours, then it should mean that, indeed, I consider my blog as...
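
A minimal sketch of the idea, assuming the xml2 package and a hypothetical feed URL (the full post uses its own extraction code): pull the publication dates from the RSS feed and look at the posting hours.

library(xml2)                     # read_xml(), xml_find_all(), xml_text()
feed  <- read_xml("http://freakonometrics.hypotheses.org/feed")   # hypothetical URL
dates <- xml_text(xml_find_all(feed, "//item/pubDate"))
# RSS pubDate looks like "Wed, 19 Feb 2014 14:05:00 +0000" (English locale assumed)
posted <- strptime(dates, format = "%a, %d %b %Y %H:%M:%S", tz = "GMT")
hist(posted$hour, breaks = 0:24, xlab = "hour of day", main = "Posting hour")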

Read more »

Inference for ARMA(p,q) Time Series

January 30, 2014

As we mentioned in our previous post, as soon as there is a moving average part, inference becomes more complicated. Again, to illustrate, we do not need too general a model. Consider, here, some ARMA(1,1) process, $Z_t=\phi Z_{t-1}+\varepsilon_t+\theta\varepsilon_{t-1}$, where $(\varepsilon_t)$ is some white noise, and assume further that …
> theta=.7
> phi=.5
> n=1000
> Z=rep(0,n)
> set.seed(1)
> e=rnorm(n)
> for(t...
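
A hedged completion of that simulation (the loop is the standard ARMA(1,1) recursion written above; fitting with arima() is one possible estimation route, not necessarily the one from the post):

theta <- .7
phi   <- .5
n     <- 1000
Z     <- rep(0, n)
set.seed(1)
e     <- rnorm(n)
# ARMA(1,1) recursion: Z_t = phi*Z_{t-1} + e_t + theta*e_{t-1}
for(t in 2:n) Z[t] <- phi*Z[t-1] + e[t] + theta*e[t-1]
# maximum likelihood estimation of (phi, theta) with the base arima() function
arima(Z, order = c(1, 0, 1), include.mean = FALSE)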

Read more »

Inference for MA(q) Time Series

January 29, 2014

Yesterday, we saw how inference for AR(p) time series was possible. I started with that one because it is actually the simplest case. For instance, we can use ordinary least squares. There might be some bias (see e.g. White (1961)), but asymptotically, estimators are fine (consistent, with asymptotic normality). But when the noise is (auto)correlated, then it is more...
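
A hedged sketch of the point (the simulated MA(1) series and the use of arima() are my own illustration): once the noise enters with a lag, the regressors are unobserved, so OLS is no longer directly available.

set.seed(1)
n     <- 1000
theta <- .7
e     <- rnorm(n)
# MA(1) process: X_t = e_t + theta*e_{t-1}
X <- e + theta * c(0, e[-n])
# the lagged errors are not observed, so we cannot regress X_t on them;
# arima() maximizes the (Gaussian) likelihood instead
arima(X, order = c(0, 0, 1), include.mean = FALSE)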

Read more »

Inference for AR(p) Time Series

January 28, 2014

Consider a (stationary) autoregressive process, say of order 2, $Z_t=\phi_1 Z_{t-1}+\phi_2 Z_{t-2}+\varepsilon_t$, for some white noise $(\varepsilon_t)$ with variance $\sigma^2$. Here is some code to generate such a process:
> phi1=.25
> phi2=.7
> n=1000
> set.seed(1)
> e=rnorm(n)
> Z=rep(0,n)
> for(t in 3:n) Z[t]=phi1*Z[t-1]+phi2*Z[t-2]+e[t]
> Z=Z[…]
> n=length(Z)
> plot(Z,type="l")
Here, we have to estimate two sets of parameters: the autoregressive...
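
A hedged sketch of the estimation step for that AR(2) series (Yule-Walker via ar() and OLS via lm() are standard options; whether the full post uses these is not visible in the excerpt):

set.seed(1)
phi1 <- .25; phi2 <- .7
n <- 1000
e <- rnorm(n)
Z <- rep(0, n)
for(t in 3:n) Z[t] <- phi1*Z[t-1] + phi2*Z[t-2] + e[t]
Z <- Z[201:n]                     # drop a burn-in period (arbitrary choice here)
# Yule-Walker estimates of (phi1, phi2) and of the innovation variance
ar(Z, order.max = 2, aic = FALSE, method = "yule-walker")
# OLS: regress Z_t on its first two lags
m <- length(Z)
summary(lm(Z[3:m] ~ 0 + Z[2:(m-1)] + Z[1:(m-2)]))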

Read more »

Bias of Hill Estimators

January 28, 2014

In the MAT8595 course, yesterday we saw the Hill estimator of the tail index. To be more specific, we did see that if $\overline{F}(x)=P(X>x)=C x^{-\alpha}$, with $\alpha>0$, then Hill estimators for $\alpha$ are given by $\widehat{\alpha}_k=\left(\frac{1}{k}\sum_{i=0}^{k-1}\log X_{n-i:n}-\log X_{n-k:n}\right)^{-1}$, for $k\in\{1,\dots,n-1\}$. Then we did say that $\widehat{\alpha}_k$ satisfies some consistency, in the sense that $\widehat{\alpha}_k\to\alpha$ if $k\to\infty$, but not too fast, i.e. $k/n\to 0$ (under additional assumptions on the...
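
A minimal sketch of the estimator itself, on a simulated Pareto sample (the sample size and plotting choices are mine, not the post's):

set.seed(1)
n     <- 1000
alpha <- 1.5
X  <- runif(n)^(-1/alpha)         # Pareto sample: P(X > x) = x^(-alpha), x >= 1
Xs <- sort(X, decreasing = TRUE)  # largest observations first
k  <- 1:(n-1)
# Hill estimator of alpha: inverse of the mean log-excess above the (k+1)-th largest value
hill <- sapply(k, function(kk) 1 / (mean(log(Xs[1:kk])) - log(Xs[kk+1])))
plot(k, hill, type = "l", xlab = "k", ylab = "Hill estimator of alpha")
abline(h = alpha, lty = 2)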

Read more »