es() is part of the smooth package and is a wrapper of the adam() function with distribution="dnorm". It implements Exponential Smoothing in the ETS form, selecting the most appropriate model among the 30 possible ones.
We will use some functions from the greybox package in this vignette for demonstration purposes. Let's load the necessary packages:
require(smooth)
require(greybox)
The simplest call for the es() function is:
ourModel <- es(BJsales, h=12, holdout=TRUE, silent=FALSE)
## Forming the pool of models based on... ANN, AAN, Estimation progress: 33%...44%...56%...67%...78%...89%...100%... Done!
ourModel
## Time elapsed: 0.2 seconds
## Model estimated using es() function: ETS(AMdN)
## Distribution assumed in the model: Normal
## Loss function type: likelihood; Loss function value: 237.7324
## Persistence vector g:
## alpha beta
## 1.000 0.249
## Damping parameter: 0.9034
## Sample size: 138
## Number of estimated parameters: 6
## Number of degrees of freedom: 132
## Information criteria:
## AIC AICc BIC BICc
## 487.4647 488.1060 505.0283 506.6080
##
## Forecast errors:
## ME: 2.83; MAE: 2.986; RMSE: 3.676
## sCE: 14.939%; Asymmetry: 87.8%; sMAE: 1.314%; sMSE: 0.026%
## MASE: 2.507; RMSSE: 2.396; rMAE: 0.963; rRMSE: 0.959
In this case, the function uses a branch-and-bound algorithm to form a pool of models to check, and then constructs the model with the lowest information criterion. As we can see, it also produces an output with brief information about the model, including forecast errors (these are reported only because holdout=TRUE). The function has also produced a graph with the actual values, fitted values and point forecasts.
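If we need particular elements of this summary programmatically, the standard extractor methods can be used. A brief sketch (these standard generics have methods for smooth models):
AIC(ourModel)     # Akaike Information Criterion of the model
logLik(ourModel)  # log-likelihood used in its calculation
nobs(ourModel)    # in-sample size used for the estimation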
If we need a prediction interval, then we can use the forecast() method:
plot(forecast(ourModel, h=12, interval="prediction"))
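If we need the numeric values rather than a plot, the forecast object can be saved and inspected. A minimal sketch, assuming the standard mean, lower and upper elements of the returned object:
ourForecast <- forecast(ourModel, h=12, interval="prediction")
ourForecast$mean   # point forecasts
ourForecast$lower  # lower bound of the prediction interval
ourForecast$upper  # upper bound of the prediction interval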
The same model can be reused for different purposes, for example, to produce forecasts based on newly available data:
es(BJsales, model=ourModel, h=12, holdout=FALSE)
## Time elapsed: 0 seconds
## Model estimated using es() function: ETS(AMdN)
## Distribution assumed in the model: Normal
## Loss function type: likelihood; Loss function value: 255.4039
## Persistence vector g:
## alpha beta
## 1.000 0.249
## Damping parameter: 0.9034
## Sample size: 150
## Number of estimated parameters: 1
## Number of degrees of freedom: 149
## Information criteria:
## AIC AICc BIC BICc
## 512.8078 512.8348 515.8184 515.8862
We can also extract the type of model in order to reuse it later:
modelType(ourModel)
## [1] "AMdN"
This handy function also works with ets() from the forecast package.
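For example, a sketch (assuming that the forecast package is installed):
etsModel <- forecast::ets(BJsales)
modelType(etsModel)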
If we need the actual values from the model, we can use the actuals() method from the greybox package:
actuals(ourModel)
## Time Series:
## Start = 1
## End = 138
## Frequency = 1
## [1] 200.1 199.5 199.4 198.9 199.0 200.2 198.6 200.0 200.3 201.2 201.6 201.5
## [13] 201.5 203.5 204.9 207.1 210.5 210.5 209.8 208.8 209.5 213.2 213.7 215.1
## [25] 218.7 219.8 220.5 223.8 222.8 223.8 221.7 222.3 220.8 219.4 220.1 220.6
## [37] 218.9 217.8 217.7 215.0 215.3 215.9 216.7 216.7 217.7 218.7 222.9 224.9
## [49] 222.2 220.7 220.0 218.7 217.0 215.9 215.8 214.1 212.3 213.9 214.6 213.6
## [61] 212.1 211.4 213.1 212.9 213.3 211.5 212.3 213.0 211.0 210.7 210.1 211.4
## [73] 210.0 209.7 208.8 208.8 208.8 210.6 211.9 212.8 212.5 214.8 215.3 217.5
## [85] 218.8 220.7 222.2 226.7 228.4 233.2 235.7 237.1 240.6 243.8 245.3 246.0
## [97] 246.3 247.7 247.6 247.8 249.4 249.0 249.9 250.5 251.5 249.0 247.6 248.8
## [109] 250.4 250.7 253.0 253.7 255.0 256.2 256.0 257.4 260.4 260.0 261.3 260.4
## [121] 261.6 260.8 259.8 259.0 258.9 257.4 257.7 257.9 257.4 257.3 257.6 258.9
## [133] 257.8 257.7 257.2 257.5 256.8 257.5
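The fitted values and the residuals of the model can be extracted in a similar way via the standard methods. A brief sketch:
fitted(ourModel)     # in-sample fitted values
residuals(ourModel)  # residuals of the model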
We can also use only the persistence or the initial values from the model to construct another one:
# Provided initials
es(BJsales, model=modelType(ourModel),
h=12, holdout=FALSE,
initial=ourModel$initial)
## Time elapsed: 0.02 seconds
## Model estimated using es() function: ETS(AMdN)
## Distribution assumed in the model: Normal
## Loss function type: likelihood; Loss function value: 255.2728
## Persistence vector g:
## alpha beta
## 0.9716 0.2812
## Damping parameter: 0.8747
## Sample size: 150
## Number of estimated parameters: 4
## Number of degrees of freedom: 146
## Information criteria:
## AIC AICc BIC BICc
## 518.5456 518.8215 530.5881 531.2793
# Provided persistence
es(BJsales, model=modelType(ourModel),
h=12, holdout=FALSE,
persistence=ourModel$persistence)
## Time elapsed: 0.02 seconds
## Model estimated using es() function: ETS(AMdN)
## Distribution assumed in the model: Normal
## Loss function type: likelihood; Loss function value: 255.3497
## Persistence vector g:
## alpha beta
## 1.000 0.249
## Damping parameter: 0.888
## Sample size: 150
## Number of estimated parameters: 4
## Number of degrees of freedom: 146
## Information criteria:
## AIC AICc BIC BICc
## 518.6993 518.9752 530.7419 531.4330
or provide some arbitrary values:
es(BJsales, model=modelType(ourModel),
h=12, holdout=FALSE,
initial=200)
## Time elapsed: 0.03 seconds
## Model estimated using es() function: ETS(AMdN)
## Distribution assumed in the model: Normal
## Loss function type: likelihood; Loss function value: 255.3529
## Persistence vector g:
## alpha beta
## 0.9893 0.2677
## Damping parameter: 0.8928
## Sample size: 150
## Number of estimated parameters: 5
## Number of degrees of freedom: 145
## Information criteria:
## AIC AICc BIC BICc
## 520.7058 521.1224 535.7589 536.8028
Using other parameters may lead to a completely different model and forecasts (see the discussion of the additional parameters in the online textbook about ADAM):
es(BJsales, h=12, holdout=TRUE, loss="MSEh", bounds="a", ic="BIC")
## Time elapsed: 0.78 seconds
## Model estimated using es() function: ETS(MAN)
## Distribution assumed in the model: Normal
## Loss function type: MSEh; Loss function value: 0.0018
## Persistence vector g:
## alpha beta
## 1.5008 0.0000
##
## Sample size: 138
## Number of estimated parameters: 4
## Number of degrees of freedom: 134
## Information criteria:
## AIC AICc BIC BICc
## 1022.671 1022.971 1034.380 1035.121
##
## Forecast errors:
## ME: -0.446; MAE: 1.206; RMSE: 1.345
## sCE: -2.356%; Asymmetry: -43.6%; sMAE: 0.531%; sMSE: 0.004%
## MASE: 1.012; RMSSE: 0.877; rMAE: 0.389; rRMSE: 0.351
You can play around with all the available parameters to see what their effect is on the final model.
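For instance, we could estimate the same model using the "MAE" loss instead of the likelihood. A sketch of one of many possible calls:
es(BJsales, h=12, holdout=TRUE, loss="MAE")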
In order to combine forecasts, we need to use the letter “C”:
es(BJsales, model="CCN", h=12, holdout=TRUE)
## Time elapsed: 0.21 seconds
## Model estimated: ETS(CCN)
## Loss function type: likelihood
##
## Number of models combined: 10
## Sample size: 138
## Average number of estimated parameters: 6.3367
## Average number of degrees of freedom: 131.6633
##
## Forecast errors:
## ME: 2.842; MAE: 2.993; RMSE: 3.686
## sCE: 15.002%; sMAE: 1.317%; sMSE: 0.026%
## MASE: 2.513; RMSSE: 2.403; rMAE: 0.966; rRMSE: 0.962
Model selection from a specified pool and combination of forecasts from a pool are done, respectively, as follows:
# Select the best model in the pool
es(BJsales, model=c("ANN","AAN","AAdN","MNN","MAN","MAdN"),
h=12, holdout=TRUE)
## Time elapsed: 0.1 seconds
## Model estimated using es() function: ETS(AAdN)
## Distribution assumed in the model: Normal
## Loss function type: likelihood; Loss function value: 238.2715
## Persistence vector g:
## alpha beta
## 0.9534 0.2925
## Damping parameter: 0.8622
## Sample size: 138
## Number of estimated parameters: 6
## Number of degrees of freedom: 132
## Information criteria:
## AIC AICc BIC BICc
## 488.5431 489.1843 506.1066 507.6863
##
## Forecast errors:
## ME: 2.814; MAE: 2.969; RMSE: 3.655
## sCE: 14.854%; Asymmetry: 87.8%; sMAE: 1.306%; sMSE: 0.026%
## MASE: 2.492; RMSSE: 2.382; rMAE: 0.958; rRMSE: 0.954
# Combine the pool of models
es(BJsales, model=c("CCC","ANN","AAN","AAdN","MNN","MAN","MAdN"),
h=12, holdout=TRUE)
## Time elapsed: 0.1 seconds
## Model estimated: ETS(CCN)
## Loss function type: likelihood
##
## Number of models combined: 6
## Sample size: 138
## Average number of estimated parameters: 6.4484
## Average number of degrees of freedom: 131.5516
##
## Forecast errors:
## ME: 2.851; MAE: 2.998; RMSE: 3.694
## sCE: 15.051%; sMAE: 1.319%; sMSE: 0.026%
## MASE: 2.517; RMSSE: 2.408; rMAE: 0.967; rRMSE: 0.964
Now we introduce an explanatory variable in ETS:
x <- BJsales.lead
and fit an ETSX model with the exogenous variable first:
es(BJsales, model="ZZZ", h=12, holdout=TRUE,
xreg=x)
## Time elapsed: 1.02 seconds
## Model estimated using es() function: ETSX(AAdN)
## Distribution assumed in the model: Normal
## Loss function type: likelihood; Loss function value: 237.5567
## Persistence vector g (excluding xreg):
## alpha beta
## 0.9498 0.2933
## Damping parameter: 0.8812
## Sample size: 138
## Number of estimated parameters: 7
## Number of degrees of freedom: 131
## Information criteria:
## AIC AICc BIC BICc
## 489.1133 489.9748 509.6041 511.7266
##
## Forecast errors:
## ME: 2.875; MAE: 2.998; RMSE: 3.7
## sCE: 15.178%; Asymmetry: 90%; sMAE: 1.319%; sMSE: 0.026%
## MASE: 2.517; RMSSE: 2.412; rMAE: 0.967; rRMSE: 0.966
If we want to check whether the lagged x can be used for forecasting purposes, we can use the xregExpander() function from the greybox package:
es(BJsales, model="ZZZ", h=12, holdout=TRUE,
xreg=xregExpander(x), regressors="use")
## Time elapsed: 0.48 seconds
## Model estimated using es() function: ETSX(ANN)
## Distribution assumed in the model: Normal
## Loss function type: likelihood; Loss function value: 251.4178
## Persistence vector g (excluding xreg):
## alpha
## 1
##
## Sample size: 138
## Number of estimated parameters: 6
## Number of degrees of freedom: 132
## Information criteria:
## AIC AICc BIC BICc
## 514.8355 515.4767 532.3990 533.9788
##
## Forecast errors:
## ME: 2.447; MAE: 2.925; RMSE: 3.462
## sCE: 12.917%; Asymmetry: 73.6%; sMAE: 1.287%; sMSE: 0.023%
## MASE: 2.455; RMSSE: 2.257; rMAE: 0.943; rRMSE: 0.904
We can also construct a model with the exogenous variables selected based on the information criterion:
es(BJsales, model="ZZZ", h=12, holdout=TRUE,
xreg=xregExpander(x), regressors="select")
## Time elapsed: 0.96 seconds
## Model estimated using es() function: ETS(AMdN)
## Distribution assumed in the model: Normal
## Loss function type: likelihood; Loss function value: 237.7324
## Persistence vector g:
## alpha beta
## 1.000 0.249
## Damping parameter: 0.9034
## Sample size: 138
## Number of estimated parameters: 6
## Number of degrees of freedom: 132
## Information criteria:
## AIC AICc BIC BICc
## 487.4647 488.1060 505.0283 506.6080
##
## Forecast errors:
## ME: 2.83; MAE: 2.986; RMSE: 3.676
## sCE: 14.939%; Asymmetry: 87.8%; sMAE: 1.314%; sMSE: 0.026%
## MASE: 2.507; RMSSE: 2.396; rMAE: 0.963; rRMSE: 0.959
Finally, if you work with M or M3 data, and need to test a function on a specific time series, you can use the following simplified call:
es(Mcomp::M3$N2457, silent=FALSE)
This command has taken the data, split it into in-sample and holdout parts, and produced a forecast with the horizon equal to the length of the holdout.