Google's 200M-parameter time-series foundation model with 16k context (github.com)
Google Research has released TimesFM, a pretrained time-series foundation model for forecasting, with an updated TimesFM 2.5 checkpoint. The newer version uses 200M parameters (down from 500M), extends context length to 16k, and adds continuous quantile forecasting up to a 1k horizon via an optional quantile head. The GitHub repo includes instructions and example code for running the model with PyTorch or Flax, along with notes about ongoing support updates.
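To make the numbers above concrete, here is a minimal sketch of the shapes involved in quantile forecasting with the stated limits (16k context, 1k horizon). This is an illustrative stand-in, not the TimesFM API; the function name, quantile levels, and extrapolation rule are all assumptions for demonstration.

```python
import numpy as np

# Illustrative constants taken from the article; not read from the model.
CONTEXT_LEN = 16_384  # TimesFM 2.5 supports contexts up to 16k points
HORIZON = 1_000       # quantile forecasts are supported up to a 1k horizon
QUANTILES = [0.1, 0.25, 0.5, 0.75, 0.9]  # example quantile levels

def fake_quantile_forecast(context: np.ndarray, horizon: int,
                           quantiles: list[float]) -> np.ndarray:
    """Hypothetical stand-in for a quantile head.

    Returns an array of shape (horizon, len(quantiles)): one value per
    forecast step per quantile level. Here we just hold the last observed
    value and widen the quantile band with the forecast step, which mimics
    growing uncertainty over the horizon.
    """
    last = context[-1]
    steps = np.arange(1, horizon + 1)[:, None]        # shape (horizon, 1)
    offsets = np.asarray(quantiles)[None, :] - 0.5    # shape (1, n_quantiles)
    return last + offsets * np.sqrt(steps)            # broadcast to (horizon, n_quantiles)

# Any context up to CONTEXT_LEN points; a real model would consume it directly.
series = np.sin(np.linspace(0.0, 20.0, 512))
fc = fake_quantile_forecast(series, HORIZON, QUANTILES)
print(fc.shape)  # (1000, 5): horizon steps x quantile levels
```

The key point is the output layout: a quantile head produces a band of values per step rather than a single point forecast, so downstream code indexes `[step, quantile]` instead of `[step]`.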
March 31, 2026 10:29
Source: Hacker News