Model Card for Chronos Bolt BASE Fine-Tuned Model v1

The model was fine-tuned on a large amount of sales data with many covariates. The training data contains 2.3 million data points; other details about the data are confidential.

Technical Specifications

Model Architecture and Objective

The model is based on the amazon/chronos-2 architecture, fine-tuned specifically for intermittent time-series forecasting tasks. It leverages pre-trained capabilities for sequence-to-sequence modeling, adapted to handle multi-horizon forecasting scenarios.
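
A minimal usage sketch follows, assuming the `chronos-forecasting` package's `BaseChronosPipeline` interface; the exact pipeline class for a Chronos-2-based checkpoint may differ, and the context values below are placeholder data rather than the confidential training set.

```python
import torch
from chronos import BaseChronosPipeline

# Repo id taken from the model tree on this card; loading it this way
# assumes the checkpoint is consumable as a standard Chronos pipeline.
pipeline = BaseChronosPipeline.from_pretrained(
    "nieche/chronos-2-fine-tuned-LoRA-v1",
    device_map="cuda",          # use "cpu" if no GPU is available
    torch_dtype=torch.bfloat16,
)

# Placeholder history of an intermittent sales series (mostly zeros).
context = torch.tensor([0.0, 0.0, 3.0, 0.0, 1.0, 0.0, 0.0, 5.0, 0.0, 2.0])

# Multi-horizon forecast: 0.1/0.5/0.9 quantiles for the next 12 steps.
quantiles, mean = pipeline.predict_quantiles(
    context=context,
    prediction_length=12,
    quantile_levels=[0.1, 0.5, 0.9],
)
print(quantiles.shape)  # (1, 12, 3): one series, 12 steps, 3 quantiles
```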

Contact:

NIEXCHE (Fevzi KILAS)


License: MIT

Model size: 0.1B parameters (Safetensors, F32)

Model tree for nieche/chronos-2-fine-tuned-LoRA-v1
Base model: amazon/chronos-2