
Conversation


@renovate renovate bot commented Jan 6, 2026

This PR contains the following updates:

Package: autogluon.tabular
Change: ==1.4.0 → ==1.5.0

Fixes: #366

Release Notes

autogluon/autogluon (autogluon.tabular)

v1.5.0

Compare Source

Version 1.5.0

We are happy to announce the AutoGluon 1.5.0 release!

AutoGluon 1.5.0 introduces new features and major improvements to both tabular and time series modules.

This release contains 131 commits from 17 contributors! See the full commit change-log here: autogluon/autogluon@1.4.0...1.5.0

Join the community:
Get the latest updates: Twitter

This release supports Python versions 3.10, 3.11, 3.12 and 3.13. Support for Python 3.13 is currently experimental, and some features might not be available when running Python 3.13 on Windows. Loading models trained on older versions of AutoGluon is not supported. Please re-train models using AutoGluon 1.5.0.


Spotlight

Chronos-2

AutoGluon v1.5 adds support for Chronos-2, our latest generation of foundation models for time series forecasting. Chronos-2 natively handles all types of dynamic covariates, and performs cross-learning from items in the batch. It produces multi-step quantile forecasts and is designed for strong out-of-the-box performance on new datasets.

Chronos-2 achieves state-of-the-art zero-shot accuracy among public models on major benchmarks such as fev-bench and GIFT-Eval, making it a strong default choice when little or no task-specific training data is available.

In AutoGluon, Chronos-2 can be used in zero-shot mode or fine-tuned on custom data. Both LoRA fine-tuning and full fine-tuning are supported. Chronos-2 integrates into the standard TimeSeriesPredictor workflow, making it easy to backtest, compare against classical and deep learning models, and combine with other models in ensembles.

from autogluon.timeseries import TimeSeriesPredictor

predictor = TimeSeriesPredictor(...)
predictor.fit(train_data, presets="chronos2")  # zero-shot mode

More details on zero-shot usage, fine-tuning and ensembling are available in the updated tutorial.
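
As a rough illustration of the comparison workflow described above, the sketch below fits the chronos2_ensemble preset and inspects the resulting models. The preset name comes from these release notes; train_data and test_data are assumed to be TimeSeriesDataFrame objects, and prediction_length is an illustrative choice.

from autogluon.timeseries import TimeSeriesPredictor

# Combine Chronos-2 with other models in an ensemble, then compare
# all trained models on held-out data.
predictor = TimeSeriesPredictor(prediction_length=48)  # illustrative horizon
predictor.fit(train_data, presets="chronos2_ensemble")

predictor.leaderboard(test_data)           # per-model backtest scores
forecasts = predictor.predict(train_data)  # quantile forecasts for the next 48 steps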

Tabular

TBA


General

Dependencies
Fixes and Improvements

Tabular

AutoGluon-Tabular v1.5 introduces several improvements focused on accuracy, robustness, and usability. The release adds new foundation models, updates the feature preprocessing pipeline, and improves GPU stability and memory estimation. New model portfolios are provided for both CPU and GPU workloads.
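
A minimal sketch of the TabularPredictor workflow these improvements plug into: the label column name, the best_quality preset, and the time limit below are illustrative choices carried over from earlier releases, not v1.5-specific settings; train_data and test_data are assumed to be pandas DataFrames.

from autogluon.tabular import TabularPredictor

# Fit on a DataFrame with a "target" column (column name is illustrative).
predictor = TabularPredictor(label="target").fit(
    train_data,
    presets="best_quality",  # preset name predates v1.5; see the v1.5 docs for new presets
    time_limit=3600,         # one-hour budget, adjust as needed
)

# Evaluate on held-out data and see which models made it into the final ensemble.
predictor.evaluate(test_data)
predictor.leaderboard(test_data)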

Highlights
  • New foundation models: RealTabPFN-2, RealTabPFN-2.5, and TabDPT are now available in AutoGluon-Tabular.
  • Updated preprocessing pipeline with more consistent feature handling across models.
  • Improved GPU stability and more reliable memory estimation during training.
  • New CPU and GPU portfolios tuned for better performance across a wide range of datasets.
  • Stronger benchmark results: with the new presets, AutoGluon-Tabular v1.5 achieves an 85% win rate over AutoGluon v1.4 Extreme on the 51 TabArena datasets, with a 3% reduction in mean relative error.
New Features
Fixes and Improvements

TimeSeries

AutoGluon v1.5 introduces substantial improvements to the time series module, with clear gains in both accuracy and usability. Across our benchmarks, v1.5 achieves up to an 80% win rate compared to v1.4. The release adds new models, more flexible ensembling options, and numerous bug fixes and quality-of-life improvements.

Highlights
  • Chronos-2 is now available in AutoGluon, with support for zero-shot inference as well as full and LoRA fine-tuning (tutorial).
  • Customizable ensembling logic: Adds item-level ensembling, multi-layer stack ensembles, and other advanced forecast combination methods (documentation).
  • New presets leading to major gains in accuracy & efficiency. AG-TS v1.5 achieves up to an 80% win rate over v1.4 on point and probabilistic forecasting tasks. With just a 10-minute time limit, v1.5 outperforms v1.4 running for 2 hours.
  • Usability improvements: Automatically determine an appropriate backtesting configuration by setting num_val_windows="auto" and refit_every_n_windows="auto". Easily access the validation predictions and perform rolling evaluation on custom data with the new predictor methods backtest_predictions and backtest_targets (see the sketch after this list).
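
A sketch of the usability options mentioned in the last highlight. The prediction_length is illustrative, and the exact signatures of backtest_predictions and backtest_targets should be checked against the documentation; the zero-argument calls below are an assumption.

from autogluon.timeseries import TimeSeriesPredictor

predictor = TimeSeriesPredictor(prediction_length=24)  # illustrative horizon
predictor.fit(
    train_data,
    num_val_windows="auto",        # let AutoGluon pick the number of backtest windows
    refit_every_n_windows="auto",  # and how often models are refit across windows
)

# Retrieve the stored validation forecasts and the matching targets
# (call form is an assumption; see the API reference for the exact signature).
val_predictions = predictor.backtest_predictions()
val_targets = predictor.backtest_targets()
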
New Features
API Changes and Deprecations
  • Remove outdated presets related to the original Chronos model: chronos, chronos_large, chronos_base, chronos_small, chronos_mini, chronos_tiny, chronos_ensemble. We recommend using the new presets chronos2, chronos2_small and chronos2_ensemble instead, for example as in the sketch below.
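
A minimal migration sketch; the old and new preset names are taken from the list above, the other settings are illustrative.

from autogluon.timeseries import TimeSeriesPredictor

predictor = TimeSeriesPredictor(prediction_length=24)  # illustrative horizon

# v1.4: predictor.fit(train_data, presets="chronos_small")  # preset removed in v1.5
predictor.fit(train_data, presets="chronos2_small")          # v1.5 replacement
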
Fixes and Improvements
Code Quality

Multimodal

Fixes and Improvements

Documentation and CI


Contributors

Full Contributor List (ordered by # of commits):

@​shchur @​canerturkmen @​Innixma @​prateekdesai04 @​abdulfatir @​LennartPurucker @​celestinoxp @​FANGAreNotGnu @​xiyuanzh @​nathanaelbosch @​betatim @​AdnaneKhan @​paulbkoch @​shou10152208 @​ryuichi-ichinose @​atschalz @​colesussmeier

New Contributors

Configuration

📅 Schedule: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).

🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.

Rebasing: Whenever PR is behind base branch, or you tick the rebase/retry checkbox.

🔕 Ignore: Close this PR and you won't be reminded about this update again.


  • If you want to rebase/retry this PR, check this box

This PR was generated by Mend Renovate. View the repository job log.

@renovate renovate bot force-pushed the renovate/autogluon.tabular-1.x branch 20 times, most recently from a14afe1 to 1627469 on January 13, 2026 12:18
@renovate renovate bot force-pushed the renovate/autogluon.tabular-1.x branch from 1627469 to e35f1d9 on January 13, 2026 15:13
@danielelotito marked this pull request as draft on January 13, 2026 16:24
@danielelotito (Collaborator) commented

I am working on making this PR fix #366 properly.

@renovate renovate bot force-pushed the renovate/autogluon.tabular-1.x branch 6 times, most recently from b41be3e to 843e7ce on January 14, 2026 12:47
@renovate renovate bot force-pushed the renovate/autogluon.tabular-1.x branch from 843e7ce to 1e1e13a on January 14, 2026 13:28

renovate bot commented Jan 14, 2026

Edited/Blocked Notification

Renovate will not automatically rebase this PR, because it does not recognize the last commit author and assumes somebody else may have edited the PR.

You can manually request rebase by checking the rebase/retry box above.

⚠️ Warning: custom changes will be lost.

@srikumar003 (Collaborator) left a comment

This update will break autoconf for two reasons:

  1. It deprecates the loading of older models entirely.
  2. The Pydantic model of min_gpu_recommender does not contain v3.0.0, which leads to model validation errors.

The required changes are in plugins/custom_experiments/autoconf/min_gpu_recommender.py and involve:

  1. Add the following block to load_model():

     elif model_version == "3.0.0":
         path_weights: str = str(
             object=importlib.resources.files(package="autoconf")
             / "AutoGluonModels"
             / "v3-0-0_ag-20260113_144447-clone-opt-train_frac_1"
         )

  2. Add "3.0.0" to the ModelVersion property like so:

     ModelVersion = ConstitutiveProperty(
         identifier="model_version",
         propertyDomain=PropertyDomain(
             variableType=VariableTypeEnum.CATEGORICAL_VARIABLE_TYPE,
             values=["1.1.0", "2.0.0", "3.0.0"],
         ),
     )

I have verified that with these two changes, autoconf keeps working.

However, we need to deprecate the models that are currently referenced, as they cannot be loaded with the new AutoGluon version.

danielelotito and others added 2 commits on January 20, 2026 11:30
@danielelotito (Collaborator) commented

@srikumar003 I have added the new models as suggested.
Let me know whether you think we should now act on

  "we need to deprecate the models that are currently referenced as they cannot be loaded with the new autogluon version"

for example by removing the corresponding fields from the Pydantic model.
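
For reference, a purely hypothetical sketch of one way that removal could look, reusing the ModelVersion property from the review comment above (imports and surrounding code would follow min_gpu_recommender.py; whether to do this at all is the open question in this thread):

# Hypothetical: keep only the model versions whose artifacts can still be
# loaded with autogluon 1.5, i.e. drop "1.1.0" and "2.0.0" from the domain.
ModelVersion = ConstitutiveProperty(
    identifier="model_version",
    propertyDomain=PropertyDomain(
        variableType=VariableTypeEnum.CATEGORICAL_VARIABLE_TYPE,
        values=["3.0.0"],
    ),
)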



Development

Successfully merging this pull request may close these issues.

build(autoconf): Managing library dependencies in custom experiments. Use-case: Autogluon models
