Here is the news (reminder: the nnetsauce.Lazy* classes do automated Machine Learning benchmarking of multiple models):
- LazyDeepMTS: there is no LazyMTS class anymore; instead, you can use LazyDeepMTS with n_layers=1 (a sketch follows after this list)
- LazyDeepMTS: the forecasting horizon can now be specified (see the updated docs and examples/lazy_mts_horizon.py)
- ClassicalMTS: classical models (for now, VAR and VECM adapted from statsmodels for a unified interface in nnetsauce) for multivariate time series forecasting (not available in LazyDeepMTS yet)
- partial_fit for CustomClassifier and CustomRegressor (a second sketch follows after this list)
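To make the first item concrete, here is a minimal sketch (not taken from the package's docs): it assumes LazyDeepMTS follows the usual Lazy* benchmarking interface, fit(X_train, X_test), and that n_layers=1 reproduces the former LazyMTS; the synthetic data and settings are purely illustrative.

# Minimal sketch: benchmark multivariate time series models with LazyDeepMTS;
# n_layers=1 is assumed to reproduce what LazyMTS used to do.
import numpy as np
import pandas as pd
import nnetsauce as ns

np.random.seed(123)
dates = pd.date_range(start="2015-01-31", periods=100, freq="M")
df = pd.DataFrame(np.random.rand(100, 2).cumsum(axis=0),
                  columns=["series1", "series2"], index=dates)
df_train, df_test = df.iloc[:80, :], df.iloc[80:, :]

regr_mts = ns.LazyDeepMTS(verbose=0, ignore_warnings=True, n_layers=1)
models, predictions = regr_mts.fit(df_train, df_test)  # assumed fit(train, test) interface
print(models)  # leaderboard of benchmarked models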
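And a minimal sketch for the last item, assuming a scikit-learn-style partial_fit(X, y) signature on CustomRegressor and a base learner (here SGDRegressor) that itself supports incremental updates; the same idea would apply to CustomClassifier.

# Sketch of incremental learning with CustomRegressor.partial_fit
# (scikit-learn-style signature is an assumption).
import nnetsauce as ns
from sklearn.datasets import make_regression
from sklearn.linear_model import SGDRegressor

X, y = make_regression(n_samples=300, n_features=5, noise=0.5, random_state=42)

regr = ns.CustomRegressor(obj=SGDRegressor())
regr.fit(X[:100], y[:100])                       # initial fit on the first batch
for start in (100, 200):                         # then update on later batches
    regr.partial_fit(X[start:start + 100], y[start:start + 100])
print(regr.predict(X[:5]))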
The Python version of ahead now contains a FitForecaster class, which does conformalized time series forecasting (that is, with uncertainty quantification). It is similar to R's ahead::fitforecast, and an example can be found here:
https://github.com/Techtonique/ahead_python/blob/main/examples/fitforecaster.py
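Below is a rough sketch of what a call could look like, assuming FitForecaster exposes a forecast()-style method like ahead's other Python classes; the import path, constructor defaults, and method name are assumptions, so refer to the linked example for the actual usage.

# Rough sketch only -- see examples/fitforecaster.py above for the real usage.
# The forecast() method name and default settings are assumptions here.
import numpy as np
import pandas as pd
from ahead import FitForecaster

np.random.seed(1)
y = pd.Series(np.cumsum(np.random.randn(120)),
              index=pd.date_range("2010-01-31", periods=120, freq="M"))

fc = FitForecaster()   # conformalized forecasting, i.e. with prediction intervals
res = fc.forecast(y)   # assumed: returns point forecasts with lower/upper bounds
print(res)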
misc is a package of utility functions that I use frequently and always wanted to have stored somewhere. The functions are mostly short, but (hopefully) do one thing well, and are powerful. misc::parfor is adapted from the excellent foreach::foreach. The difference is that misc::parfor calls a function in a loop. Among its advantages over foreach::foreach: you only need to set cl to use parallel processing (NULL to use all the cores). Here are a few examples of use of misc::parfor:
devtools::install_github("thierrymoudiki/misc")
library(misc)
misc::parfor(function(x) x^2, 1:10)
misc::parfor(function(x) x^2, 1:10, cl = 2)  # parallel processing on 2 cores
misc::parfor(function(x) x^2, 1:10, verbose = TRUE)
misc::parfor(function(x) x^3, 1:10, show_progress = FALSE)

foo <- function(x) {
  print(x)
  return(x*0.5)
}
misc::parfor(foo, 1:10, show_progress = FALSE, verbose = TRUE, combine = rbind)  # stack results by row
misc::parfor(foo, 1:10, show_progress = FALSE, verbose = TRUE, combine = cbind)  # stack results by column
foo2 <- function(x) {
  print(x)
  return(x*0.5)
}
misc::parfor(foo2, 1:10, show_progress = FALSE, verbose = TRUE, combine = '+')  # reduce results with '+'