---
license: cc-by-4.0
task_categories:
  - robotics
tags:
  - robot
  - ogbench
  - rl
  - imitation
  - learning
  - simulation
  - manipulation
---

# OGBench Data for Latent Particle World Models (LPWM)

This repository contains pre-processed 64×64 frames, together with the corresponding actions, for the `scene` and `cube` tasks from the OGBench benchmark. The data was used for training and evaluating Latent Particle World Models (LPWM).

LPWM is a self-supervised object-centric world model that autonomously discovers keypoints, bounding boxes, and object masks directly from video data. It is designed to scale to real-world multi-object datasets and is applicable in decision-making tasks such as goal-conditioned imitation learning.
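As a rough illustration of how such data is typically consumed by a world model, the sketch below mocks a single episode of 64×64 RGB frames with paired actions and slices it into (observation, action, next-observation) transition triples. The array shapes and action dimension are placeholders for illustration only, not the documented format of this dataset.

```python
import numpy as np

# Dummy episode mimicking the dataset's general layout:
# a sequence of 64x64 RGB frames plus one action per transition.
# T and the action dimension (5) are illustrative assumptions.
T = 8
frames = np.zeros((T, 64, 64, 3), dtype=np.uint8)    # video frames
actions = np.zeros((T - 1, 5), dtype=np.float32)     # one action per step

# World-model training triples: predict frame t+1 from frame t and action t.
obs, act, next_obs = frames[:-1], actions, frames[1:]
print(obs.shape, act.shape, next_obs.shape)
```

For the actual file layout and loading code, refer to the LPWM paper and its accompanying code release.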

## Citation

If you use this data or the LPWM model in your research, please cite the following paper:

```bibtex
@inproceedings{daniel2026latent,
  title={Latent Particle World Models: Self-supervised Object-centric Stochastic Dynamics Modeling},
  author={Tal Daniel and Carl Qi and Dan Haramati and Amir Zadeh and Chuan Li and Aviv Tamar and Deepak Pathak and David Held},
  booktitle={The Fourteenth International Conference on Learning Representations},
  year={2026},
  url={https://openreview.net/forum?id=lTaPtGiUUc}
}
```