MolmoWeb-4B-Native

Note that this is the Molmo-native checkpoint and is NOT compatible with Hugging Face transformers. See allenai/MolmoWeb-4B for the HF-compatible checkpoint.

MolmoWeb is a family of fully open multimodal web agents. MolmoWeb agents achieve state-of-the-art results, outperforming similar-scale open-weight-only models such as Fara-7B, UI-TARS-1.5-7B, and Holo1-7B. MolmoWeb-8B also surpasses set-of-marks (SoM) agents built on much larger closed frontier models like GPT-4o. We further demonstrate consistent gains from test-time scaling via parallel rollouts with best-of-N selection, achieving 94.7% and 60.5% pass@4 (compared to 78.2% and 35.3% pass@1) on WebVoyager and Online-Mind2Web, respectively.
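The best-of-N scheme is straightforward to sketch. Below is a minimal illustration of parallel rollouts with best-of-N selection, not the MolmoWeb implementation: `run_rollout` and `score_trajectory` are hypothetical stand-ins for the agent episode and whatever selector ranks finished trajectories.

```python
import concurrent.futures
import random

def run_rollout(task: str, seed: int) -> dict:
    """Placeholder for one independent agent episode on the task."""
    rng = random.Random(seed)
    return {"task": task, "seed": seed, "score_hint": rng.random()}

def score_trajectory(rollout: dict) -> float:
    """Placeholder verifier: assign a quality score to a finished rollout."""
    return rollout["score_hint"]

def best_of_n(task: str, n: int = 4) -> dict:
    """Launch n rollouts in parallel and keep the highest-scoring one."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=n) as pool:
        rollouts = list(pool.map(lambda seed: run_rollout(task, seed), range(n)))
    return max(rollouts, key=score_trajectory)

if __name__ == "__main__":
    best = best_of_n("find the cheapest direct flight on the page", n=4)
    print(best)
```

This is the sense in which pass@4 improves on pass@1: four independent attempts are made, and the task counts as solved if the selected attempt succeeds.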

Learn more about the MolmoWeb family in our announcement blog post and tech report.

MolmoWeb-4B-Native is based on the Molmo2 architecture, which uses Qwen3-8B as the language model and SigLIP 2 as the vision backbone.
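As a rough illustration of that composition, here is a generic vision-encoder-plus-LLM sketch; this is not the actual Molmo2 code, and all class names and dimensions are assumptions for exposition.

```python
import torch
import torch.nn as nn

class ToyVLM(nn.Module):
    """Generic vision-language composition: encode the image, project its
    features into the LLM's embedding space, and decode jointly with text."""

    def __init__(self, vision_encoder: nn.Module, projector: nn.Module, llm: nn.Module):
        super().__init__()
        self.vision_encoder = vision_encoder  # plays the role of SigLIP 2
        self.projector = projector            # maps vision features to the LLM hidden size
        self.llm = llm                        # plays the role of Qwen3

    def forward(self, pixel_values: torch.Tensor, text_embeds: torch.Tensor) -> torch.Tensor:
        image_tokens = self.projector(self.vision_encoder(pixel_values))
        # Prepend projected image tokens to the text embeddings before decoding.
        return self.llm(torch.cat([image_tokens, text_embeds], dim=1))

# Toy instantiation with linear placeholders (dimensions are made up).
vlm = ToyVLM(nn.Linear(768, 512), nn.Linear(512, 1024), nn.Identity())
out = vlm(torch.randn(1, 16, 768), torch.randn(1, 8, 1024))
```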

Ai2 is committed to open science. The MolmoWeb datasets are available here. All other artifacts used in creating MolmoWeb (training code, evaluations, intermediate checkpoints) will be made available, furthering our commitment to open-source AI development and reproducibility.

Usage

Please refer to our GitHub repo for inference code.
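For orientation only, here is a hedged sketch of loading the HF-compatible sibling checkpoint (allenai/MolmoWeb-4B, not this native one) with transformers, assuming it follows the original Molmo usage pattern. `processor.process`, `generate_from_batch`, the prompt, and the screenshot path are all assumptions; the GitHub repo is the authoritative reference.

```python
# Hedged sketch: loads the HF-compatible allenai/MolmoWeb-4B (NOT this native
# checkpoint), assuming it follows the original Molmo transformers pattern.
# The processor/generation API for MolmoWeb may differ.
from PIL import Image
from transformers import AutoModelForCausalLM, AutoProcessor, GenerationConfig

repo = "allenai/MolmoWeb-4B"
processor = AutoProcessor.from_pretrained(repo, trust_remote_code=True, torch_dtype="auto", device_map="auto")
model = AutoModelForCausalLM.from_pretrained(repo, trust_remote_code=True, torch_dtype="auto", device_map="auto")

# A web-agent style prompt over a page screenshot ("screenshot.png" is a placeholder).
inputs = processor.process(images=[Image.open("screenshot.png")], text="Click the search box.")
inputs = {k: v.to(model.device).unsqueeze(0) for k, v in inputs.items()}

output = model.generate_from_batch(
    inputs,
    GenerationConfig(max_new_tokens=256, stop_strings="<|endoftext|>"),
    tokenizer=processor.tokenizer,
)
print(processor.tokenizer.decode(output[0, inputs["input_ids"].size(1):], skip_special_tokens=True))
```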

License and Use

This model is licensed under Apache 2.0. It is intended for research and educational use in accordance with Ai2’s Responsible Use Guidelines.

