# US Segmentation Dataset

The dataset is a collection of openly available ultrasound datasets. The individual datasets are stored as zip files in the `zips` directory. A utility for extracting all of them will be provided later.

## Instructions for uploading new datasets

1. Download the dataset from the source.
2. Convert the dataset into a zip archive if required.
3. Upload the zip file to the corresponding anatomy folder in the `zips` directory using:

   ```bash
   hf upload [repo_id] [local_path] [path_in_repo] --repo-type dataset --commit-message "add-dataset: "
   ```

   For example:

   ```bash
   hf upload us-segmentator/us-segmentation-dataset local_path_to_zip_file.zip zips//.zip --repo-type dataset --commit-message "add-dataset: "
   ```

## Instructions for downloading the dataset

### CLI

1. Make sure you have Git LFS installed.
2. Run the following command:

   ```bash
   git lfs install
   ```

3. Run the following command:

   ```bash
   hf download us-segmentator/us-segmentation-dataset --repo-type dataset --local-dir local_path_to_download_directory
   ```

4. The dataset will be downloaded to the given local directory.

> Note: The dataset can be downloaded without the `--local-dir` argument, but it will then be downloaded to the Hugging Face cache directory.

### Python

Use the `huggingface_hub` library to download specific files programmatically:

```python
from huggingface_hub import hf_hub_download

file_path = hf_hub_download(
    repo_id="us-segmentator/us-segmentation-dataset",
    filename="zips//.zip",
    repo_type="dataset",
    local_dir="local_path_to_download_directory",
)
print(f"File downloaded to: {file_path}")
```

## Pulling non-LFS changes only

If you are only pulling repository updates such as `README.md`, scripts, or other small files, and want to skip downloading the large LFS zip datasets during the pull, set `GIT_LFS_SKIP_SMUDGE` for your platform. This is useful when you are contributing documentation or code changes and do not need the dataset zip files on your machine.
### Linux/macOS

```bash
GIT_LFS_SKIP_SMUDGE=1 git pull
```

### Windows (PowerShell)

```powershell
$env:GIT_LFS_SKIP_SMUDGE = "1"
git pull
```
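The extraction utility mentioned at the top of this README is not yet provided. A minimal sketch of what it might look like is shown below; the `extract_all_zips` name and the mirrored output layout (each zip extracted into a folder matching its path under `zips/`) are assumptions, not the final interface:

```python
import zipfile
from pathlib import Path


def extract_all_zips(zip_root: str, out_root: str) -> list:
    """Extract every .zip found under zip_root into a mirrored tree under out_root.

    For example, zips/breast/some_dataset.zip would be extracted into
    out_root/breast/some_dataset/ (hypothetical paths for illustration).
    """
    extracted = []
    for zip_path in sorted(Path(zip_root).rglob("*.zip")):
        # Mirror the zip's relative location, dropping the .zip suffix
        target = Path(out_root) / zip_path.relative_to(zip_root).with_suffix("")
        target.mkdir(parents=True, exist_ok=True)
        with zipfile.ZipFile(zip_path) as zf:
            zf.extractall(target)
        extracted.append(target)
    return extracted
```

This keeps the anatomy-based folder structure of the `zips` directory intact in the extracted output, so downstream code can locate datasets by the same relative paths.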