%matplotlib inline
import numpy as np

# Simulate an AR(1) error process with persistence rho around a mean beta
rng = np.random.default_rng(20210819)
eta = rng.standard_normal(5200)
rho = 0.8
beta = 10
epsilon = eta.copy()
for i in range(1, eta.shape[0]):
    epsilon[i] = rho * epsilon[i - 1] + eta[i]
y = beta + epsilon
# Drop the first 200 observations as burn-in
y = y[200:]
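For reference, the data-generating process simulated above is

$$\epsilon_t = \rho \epsilon_{t-1} + \eta_t, \qquad y_t = \beta + \epsilon_t, \qquad \eta_t \sim N(0, 1),$$

with $\rho = 0.8$ and $\beta = 10$. After discarding the 200 burn-in observations, 5,000 observations with a long-run mean of 10 remain for estimation.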
The three models are specified and estimated in the next cell. An AR(0) is included as a reference; the AR(0) is identical across all three estimators.
from statsmodels.tsa.ar_model import AutoReg
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.statespace.sarimax import SARIMAX

ar0_res = SARIMAX(y, order=(0, 0, 0), trend="c").fit()
sarimax_res = SARIMAX(y, order=(1, 0, 0), trend="c").fit()
arima_res = ARIMA(y, order=(1, 0, 0), trend="c").fit()
autoreg_res = AutoReg(y, 1, trend="c").fit()
[L-BFGS-B optimizer output from the two SARIMAX fits, condensed: the AR(0) converges at the starting values (final F = 1.9176, norm of projected gradient below pgtol), and the AR(1) converges after 2 iterations and 5 function evaluations (final F = 1.4137, relative reduction of F below factr*epsmch).]
The table below contains the estimated constant in each model, the estimated AR(1) coefficient, and the implied long-run mean, which is either equal to the estimated constant directly (AR(0) and ARIMA) or given by the ratio of the intercept to one minus the AR(1) coefficient (SARIMAX and AutoReg).
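A minimal sketch of how such a table could be assembled from the fitted results follows. It assumes the constant is the first element of each result's params and the AR(1) coefficient is the second, which holds for the models as specified above; inspect res.params if the ordering differs in your statsmodels version.

import pandas as pd

# Sketch: collect the constant, the AR(1) coefficient, and the implied
# long-run mean for each fitted model. Positional access to params
# (constant first, AR coefficient second) is an assumption; check
# res.params to confirm the ordering.
def summarize(res, has_ar=True, mean_is_const=False):
    p = np.asarray(res.params)
    const = p[0]
    ar1 = p[1] if has_ar else 0.0
    # ARIMA parameterizes the mean directly; SARIMAX/AutoReg use an intercept,
    # so their long-run mean is const / (1 - AR(1) coefficient)
    lr_mean = const if (mean_is_const or not has_ar) else const / (1 - ar1)
    return [const, ar1, lr_mean]

table = pd.DataFrame(
    {
        "AR(0)": summarize(ar0_res, has_ar=False),
        "SARIMAX": summarize(sarimax_res),
        "ARIMA": summarize(arima_res, mean_is_const=True),
        "AutoReg": summarize(autoreg_res),
    },
    index=["const", "AR(1)", "long-run mean"],
).T
print(table)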