Nixtla NeuralForecast Release Notes

Scalable and user-friendly neural forecasting algorithms.

v1.7.2

1 week ago

New Features

  • [FEAT] DeepNPTS model @elephaint (#990)
  • [FEAT] TiDE model @elephaint (#971)

Bug Fixes

  • [FIX] Refit after validation boolean @elephaint (#991)
  • fix cross_validation results with uneven windows @jmoralez (#989)
  • [FIX] fix wrong import doc PatchTST @elephaint (#967)
  • [FIX] raise exception nbeats h=1 with stacks @elephaint (#966)

Enhancement

  • reduce default warnings @jmoralez (#974)
  • Create CODE_OF_CONDUCT.md @tracykteal (#972)

v1.7.1

1 month ago

New Features

  • multi-node distributed training with spark @jmoralez (#935)
  • [FEAT] Add BiTCN model @elephaint (#958)
  • [FEAT] - Add iTransformer to neuralforecast @marcopeix (#944)
  • [FEAT] Add MLPMultivariate model @elephaint (#938)

Bug Fixes

  • [FIX] Fixes default settings of BiTCN @elephaint (#961)
  • [FIX] HINT not producing coherent forecasts @elephaint (#964)
  • [FIX] Fixes 948 multivariate predict/val issues when n_series > 1024 @elephaint (#962)
  • handle exogenous variables of TFT in parent class @jmoralez (#959)
  • fix early stopping in ray auto models @jmoralez (#953)
  • fix cross_validation when the id is the index @jmoralez (#951)

Documentation

  • add MLflow logging example @cargecla1 (#892)

v1.7.0

1 month ago

New Features

  • [FEAT] Added TSMixerx model @elephaint (#921)
  • Add Time-LLM @marcopeix (#908)
  • [FEAT] Added TSMixer model @elephaint (#914)
  • Add option to support user defined optimizer for NeuralForecast Models @JQGoh (#901)
  • [FEAT] Added NLinear model @ggattoni (#900)
  • [FEAT] Added DLinear model @cchallu (#875)
  • support refit in cross_validation @jmoralez (#842)
  • use environment variable to get id as column in outputs @jmoralez (#841)
  • support different column names for ids, times and targets @jmoralez (#838)
  • polars support @jmoralez (#829)
  • add callbacks to auto models @jmoralez (#795)

Bug Fixes

  • [FIX] Avoid raised error for varied step_size parameter during predict_insample() @JQGoh (#933)
  • [FIX] 926 auto ensure all models support alias and 924 Configuring hyperparameter space for Auto* Models @elephaint (#927)
  • fix base_multivariate window generation @jmoralez (#907)
  • Fix optuna multigpu @jmoralez (#889)
  • support saving and loading models with alias @jmoralez (#867)
  • [FIX] Polars .columns produces list rather than Pandas Index @akmalsoliev (#862)
  • add missing models to filename dict @jmoralez (#856)
  • ensure exogenous features are lists @jmoralez (#851)
  • fix save with save_dataset=False @jmoralez (#850)
  • copy config in optuna @jmoralez (#844)
  • Fixed: Exception: max_epochs is deprecated, use max_steps instead. @twobitunicorn (#835)
  • fix single column 2d array polars df @jmoralez (#830)
  • move scalers to core @jmoralez (#813)
  • [FIX] Default AutoPatchTST config @cchallu (#811)
  • [FIX] ReVin Numerical Stability @dluuo (#781)
  • On Windows, prevent long trial directory names @tg2k (#735)

Documentation

  • removed documentation for missing argument @yarnabrina (#913)
  • feat: Added cross-validation tutorial @MMenchero (#897)
  • chore: update license to apache-2 @AzulGarza (#882)
  • [FEAT] Model table in README @cchallu (#880)
  • redirect to mintlify docs @jmoralez (#816)
  • add missing models to documentation @jmoralez (#775)

Dependencies

  • add windows to CI @jmoralez (#814)
  • address future warnings @jmoralez (#898)
  • use scalers from coreforecast @jmoralez (#873)
  • add python 3.11 to CI @jmoralez (#839)

Enhancement

  • Reduce device transfers @elephaint (#923)
  • extract common methods to BaseModel @jmoralez (#915)
  • remove TQDMProgressBar callback @jmoralez (#899)
  • use fsspec in save and load methods @jmoralez (#895)
  • Feature/Check input for NaNs when available_mask = 1 @JQGoh (#894)
  • switch flake8 to ruff @Borda (#871)
  • use future instead of deprecation warnings @jmoralez (#849)
  • add frequency validation and futr_df debugging methods @jmoralez (#833)

v1.6.4

7 months ago

New Features

  • TemporalNorm with ReVIN learnable parameters @kdgutier (#768)
  • support optuna in auto models @jmoralez (#763)
  • [FEAT] TimesNet model @cchallu (#757)
  • add local_scaler_type @jmoralez (#754)
  • [FEAT] Implementation of Exogenous - NBEATSx @akmalsoliev (#738)

Bug Fixes

  • [FIX] futr_exog_list in Auto and HINT classes @cchallu (#773)
  • fix off by one error in BaseRecurrent available_ts @KeAWang (#759)

Documentation

  • [DOCS] Scaling tutorial @cchallu (#770)
  • [DOCS] Auto hyperparameter selection with optuna @cchallu (#767)
  • [DOCS] Update tutorials to v.1.6.3 @cchallu (#741)

Enhancement

  • check futr_exog_list are in futr_df @jmoralez (#769)

v1.6.2

9 months ago

What's Changed

Full Changelog: https://github.com/Nixtla/neuralforecast/compare/v1.6.1...v1.6.2

v1.6.1

10 months ago

New Models

  • DeepAR
  • FEDformer

New features

  • Available mask to flag missing data in the input data frame.
  • Improved the fit and cross_validation methods with a use_init_models parameter that restores models to their initial parameters.
  • Added robust losses: HuberLoss, TukeyLoss, HuberQLoss, and HuberMQLoss.
  • Added Bernoulli DistributionLoss to build temporal classifiers.
  • New exclude_insample_y parameter to all models to build models only based on exogenous regressors.
  • Added dropout to NBEATSx and NHITS models.
  • Improved the predict method of windows-based models to predict in batches and control memory usage, configurable through the new inference_windows_batch_size parameter.
  • Improvements to the HINT family of hierarchical models: identity reconciliation, AutoHINT, and reconciliation methods in hyperparameter selection.
  • Added inference_input_size hyperparameter to recurrent-based methods to control the historic length used during inference, reducing memory usage and inference times.

New tutorials and documentation

  • NeuralForecast map and how to add new models
  • Transformers for time-series
  • Predict insample tutorial
  • Interpretable Decomposition
  • Outlier Robust Forecasting
  • Temporal Classification
  • Predictive Maintenance
  • Statistical, Machine Learning, and Neural Forecasting methods

Fixed bugs and new protections

  • Fixed bug on MinMax scalers that returned NaN values when the mask had 0 values.
  • Fixed bug on y_loc and y_scale being in different devices.
  • Added early_stopping_steps to the HINT method.
  • Added protection in the fit method of all models to stop training when the training or validation loss becomes NaN, printing the input and output tensors for debugging.
  • Added protection to prevent the case val_check_step > max_steps from causing an error when early stopping is enabled.
  • Added PatchTST to save and load methods dictionaries.
  • Added AutoNBEATSx to core's MODEL_DICT.
  • Added protection to the NBEATSx-i model where horizon=1 causes an error due to collapsing trend and seasonality basis.

v1.5.0

1 year ago

What's Changed

New features, new models, miscellaneous changes, fixes, tutorial and documentation updates, new dependencies, and new contributors; see the full changelog for details.

Full Changelog: https://github.com/Nixtla/neuralforecast/compare/v1.4.0...v1.5.0

v1.4.0

1 year ago

New Models

  • Temporal Convolution Network (TCN)
  • AutoNBEATSx
  • AutoTFT (Transformers)

New features

  • Recurrent models (RNN, LSTM, GRU, DilatedRNN) can now take static, historical, and future exogenous variables. These variables are combined with lagged values through MLP decoders to produce "context" vectors, following the MQ-RNN model (https://arxiv.org/pdf/1711.11053.pdf).

  • The new DistributionLoss class allows all available models to produce probabilistic forecasts. Setting the loss hyperparameter to one of these losses makes the model learn and output the distribution parameters:

    • Bernoulli, Poisson, Normal, StudentT, Negative Binomial, and Tweedie distributions
    • Scale-decoupled optimization using Temporal Scalers to improve convergence and performance.
    • The predict method can return samples, quantiles, or distribution parameters.
  • sCRPS loss in PyTorch to minimize errors generating prediction intervals.

Optimization improvements

We included new optimization features commonly used to train neural models:

  • Added a learning rate scheduler based on torch.optim.lr_scheduler.StepLR. The new num_lr_decays hyperparameter controls the number of decays, evenly distributed during training.
  • Added early stopping based on validation loss. The new early_stop_patience_steps hyperparameter controls the number of validation checks without improvement after which training stops.
  • New valid_loss hyperparameter to allow different training and validation losses.

Training, scheduler, validation loss computation, and early stopping are now defined in steps (instead of epochs) to control the training procedure better. Use max_steps to define the number of training iterations. Note: max_epochs will be deprecated in the future.

New tutorials and documentation

  • Probabilistic Long-horizon forecasting
  • Save and Load Models to use them in different datasets
  • Temporal Fusion Transformer
  • Exogenous variables
  • Automatic hyperparameter tuning
  • Intermittent or Sparse Time Series
  • Detect Demand Peaks

v1.3.0

1 year ago

What's Changed

Full Changelog: https://github.com/Nixtla/neuralforecast/compare/v1.2.0...v1.3.0

v1.2.0

1 year ago

What's Changed

Full Changelog: https://github.com/Nixtla/neuralforecast/compare/v1.1.0...v1.2.0