limits

#1
by JonusNattapong - opened

Loading the data file by file is too much. Do you have a way to load all the data without limits?


JonusNattapong changed discussion status to closed
Open Law Data Thailand org

Since this dataset is quite large (containing millions of records and heavy PDF files), it is not recommended to download everything at once unless necessary.

I recommend using the Hugging Face CLI (hf) to download only the specific directories you need. This is the fastest and most efficient way.

Command:

hf download open-law-data-thailand/soc-ratchakitcha --repo-type dataset --include "meta/*" --local-dir "downloads"

Directory Structure:
You can use the --include flag to target specific folders:

  • meta/: Contains the JSONL metadata files. This is likely what you are looking for.
  • Example: get only 2025 data with --include "meta/2025/*":
    hf download open-law-data-thailand/soc-ratchakitcha --repo-type dataset --include "meta/2025/*" --local-dir "downloads"
  • Example: get only 2020s data with --include "meta/202?/*":
    hf download open-law-data-thailand/soc-ratchakitcha --repo-type dataset --include "meta/202?/*" --local-dir "downloads"
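If you prefer Python over the CLI, the same glob patterns work with `snapshot_download` from the `huggingface_hub` library via its `allow_patterns` parameter. A minimal sketch mirroring the two CLI examples above (the constant names are just for illustration):

```python
# Glob patterns equivalent to the CLI --include examples above.
PATTERNS_2025 = ["meta/2025/*"]    # only the 2025 metadata
PATTERNS_2020S = ["meta/202?/*"]   # every year in the 2020s

if __name__ == "__main__":
    # Imported here so the patterns above can be reused without the dependency.
    from huggingface_hub import snapshot_download

    snapshot_download(
        repo_id="open-law-data-thailand/soc-ratchakitcha",
        repo_type="dataset",
        allow_patterns=PATTERNS_2025,  # swap in PATTERNS_2020S for the whole decade
        local_dir="downloads",
    )
```

This skips the PDFs exactly like the CLI flag does: anything not matching `allow_patterns` is never fetched.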

Please explore the data structure in the repository and be selective about what you download.

This method allows you to skip the heavy PDF files and download only the text/metadata you need for your application.

However, if you want to clone the whole repository, you may need to subscribe to a Hugging Face paid plan, or write a script that downloads in batches and pauses for 5 minutes between them to stay under the rate limits.
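If you go the scripted route, here is a minimal sketch that downloads one year at a time with a pause between batches. The per-year batching and the `year_batches` helper are illustrative assumptions, not an official recipe; the 5-minute pause is the figure mentioned above:

```python
import time

def year_batches(start, end):
    """Split a year range into per-year allow_patterns lists (hypothetical batching scheme)."""
    return [[f"meta/{y}/*"] for y in range(start, end + 1)]

if __name__ == "__main__":
    # Deferred import: only needed when actually downloading.
    from huggingface_hub import snapshot_download

    for patterns in year_batches(2020, 2025):
        snapshot_download(
            repo_id="open-law-data-thailand/soc-ratchakitcha",
            repo_type="dataset",
            allow_patterns=patterns,
            local_dir="downloads",
        )
        time.sleep(5 * 60)  # pause 5 minutes between batches to stay under rate limits
```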

Hope this helps. :)

thx boss

JonusNattapong changed discussion status to open
spicydog changed discussion status to closed
