---
language: en
license: cc-by-nc-4.0
extra_gated_heading: Register
extra_gated_button_content: I agree, request access
extra_gated_fields:
  Your Full Name: text
  Company or Institution: text
  Intended Use Case: text
  I confirm this use is Non-Commercial: checkbox
  I will provide proper attribution to the creators: checkbox
  I accept the data 'As-Is' without warranties: checkbox
---
# PALM: A Dataset and Baseline for Learning Multi-subject Hand Prior (Paper)
Zicong Fan, Edoardo Remelli, David Dimond, Fadime Sener, Liuhao Ge, Bugra Tekin, Cem Keskin, Shreyas Hampali
GitHub repo: https://github.com/facebookresearch/PALM
This is the repository for PALM, a large-scale dataset comprising calibrated multi-view, high-resolution RGB images and 3dMD hand scans. It features 263 subjects spanning a wide range of skin tones and hand sizes, 90k RGB images, and 13k high-quality hand scans with corresponding MANO registrations. This diversity and precision provide a foundation for learning a universal prior over human hand shape and appearance.
## Why use PALM?
Dataset summary:
- 90k multi-view RGB images
- Accurate MANO registration
- 263 subjects
- 7 calibrated RGB views
- High-resolution 2448 × 2048 images
- 13k 3dMD hand scans
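The MANO registrations pair each scan with low-dimensional pose and shape parameters. As a minimal sketch of what such a record looks like, the snippet below builds a placeholder parameter dict using the common MANO convention (48 axis-angle pose values including 3 for global rotation, 10 shape betas, 3 root translation values); the actual PALM file layout and field names may differ, so treat this as illustrative only:

```python
import numpy as np

def make_mano_params(n_frames: int, seed: int = 0) -> dict:
    """Build a placeholder MANO parameter dict with the standard shapes.

    Common MANO convention (assumed here, not confirmed for PALM):
      - pose: 48 axis-angle values per frame (3 global + 15 joints x 3)
      - betas: 10 shape coefficients, shared across a subject's frames
      - trans: 3 root translation values per frame
    """
    rng = np.random.default_rng(seed)
    return {
        "pose": rng.normal(size=(n_frames, 48)),   # axis-angle, radians
        "betas": rng.normal(size=(10,)),           # subject shape coefficients
        "trans": rng.normal(size=(n_frames, 3)),   # root translation
    }

params = make_mano_params(5)
print(params["pose"].shape, params["betas"].shape, params["trans"].shape)
```

Feeding such parameters to a MANO layer (e.g. the one used for the dataset's registrations) yields the posed hand mesh for comparison against the corresponding 3dMD scan.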