---
title: LLM Code Deployment API
emoji: 🚀
colorFrom: blue
colorTo: green
sdk: docker
app_port: 7860
---

# LLM Code Deployment - Student API

This is a Hugging Face Space that hosts the student API endpoint for the LLM Code Deployment project. It receives build requests, generates code using LLMs, and deploys to GitHub Pages.

## Setup on Hugging Face Spaces

### 1. Create a New Space

1. Go to https://huggingface.co/new-space
2. Choose a name for your Space
3. Select **Docker** as the SDK
4. Click **Create Space**

### 2. Configure Environment Variables

Go to your Space's **Settings** → **Variables and secrets** tab and add:

**Required:**

- `STUDENT_EMAIL` - Your email address (matching the submission form)
- `STUDENT_SECRET` - Your secret key (matching the submission form)
- `GITHUB_TOKEN` - Your GitHub personal access token (with `repo` permissions)
- `GITHUB_USERNAME` - Your GitHub username
- `AIPIPE_TOKEN` - Your AIPipe token from https://aipipe.org/login

**Optional (advanced):**

- `LLM_PROVIDER` - `aipipe` (default), `anthropic`, or `openai`
- `LLM_MODEL` - Model name (default: `google/gemini-2.0-flash-lite-001`)
- `AIPIPE_BASE_URL` - AIPipe endpoint (default: `https://aipipe.org/openrouter/v1`)

### 3. Get Your AIPipe Token

1. Visit https://aipipe.org/login
2. Sign in with your `@ds.study.iitm.ac.in` email
3. Copy your API token
4. You get **$2 per month free** - don't exceed this

### 4. Deploy the Code

**Option A: Clone from GitHub**

```bash
git clone https://github.com/YOUR_USERNAME/YOUR_REPO.git
cd YOUR_REPO
git remote add space https://huggingface.co/spaces/YOUR_USERNAME/YOUR_SPACE
git push space main
```

**Option B: Direct Upload**

1. Upload all project files to your Space via the web interface
2. Ensure `Dockerfile`, `requirements.txt`, and all code files are present

### 5. Wait for Build

The Space will automatically build using the Dockerfile. This may take 5-10 minutes.

### 6. Get Your API Endpoint

Once deployed, your endpoint will be:

```
https://YOUR_USERNAME-YOUR_SPACE.hf.space/api/build
```

Use this URL when submitting to the instructor's Google Form.

## Testing Your Deployment

Test your deployed endpoint from your machine before submitting the URL:

```bash
curl -X POST https://YOUR_USERNAME-YOUR_SPACE.hf.space/api/build \
  -H "Content-Type: application/json" \
  -d '{
    "email": "your-email@ds.study.iitm.ac.in",
    "secret": "your-secret",
    "task": "test-task-123",
    "round": 1,
    "nonce": "test-nonce-456",
    "brief": "Create a simple Hello World page with Bootstrap",
    "checks": ["Page displays Hello World"],
    "evaluation_url": "https://example.com/evaluate",
    "attachments": []
  }'
```

Expected response:

```json
{
  "status": "accepted",
  "message": "Task test-task-123-1 accepted for processing",
  "task": "test-task-123",
  "round": 1
}
```

## Health Check

Check if your Space is running:

```bash
curl https://YOUR_USERNAME-YOUR_SPACE.hf.space/health
```

## Monitoring

View logs in the **Logs** tab of your Space to monitor:

- Task requests received
- Code generation progress
- GitHub deployment status
- Evaluation notification results

## Troubleshooting

### Space won't start

- Check the **Logs** tab for build errors
- Ensure all environment variables are set correctly
- Verify `Dockerfile` and `requirements.txt` are present

### Authentication errors

- Verify `STUDENT_SECRET` matches what you submitted in the form
- Check `STUDENT_EMAIL` is correct
- Ensure `GITHUB_TOKEN` has `repo` permissions

### LLM generation fails

- Verify `AIPIPE_TOKEN` is set and valid
- Check you haven't exceeded the $2/month quota
- Review logs for specific error messages

### GitHub deployment fails

- Ensure `GITHUB_TOKEN` has the correct permissions
- Check `GITHUB_USERNAME` is correct
- Verify the token hasn't expired

## Cost Management

**AIPipe Limits:**

- Free tier: $2 per month for `@ds.study.iitm.ac.in` emails
- Recommended models:
  - `google/gemini-2.0-flash-lite-001` (cheapest)
  - `anthropic/claude-3-haiku` (good quality)
  - `openai/gpt-4.1-nano` (balanced)

**Monitoring usage:**

- Check https://aipipe.org/usage
- Each code generation uses ~1000-2000 tokens
- Budget for ~50-100 task submissions per month

## Local Development

To test locally before deploying to Spaces:

```bash
# Set environment variables
cp .env.example .env
# Edit .env with your credentials

# Run with Docker
docker build -t llm-code-deploy .
docker run -p 7860:7860 --env-file .env llm-code-deploy

# Or run directly with Python
uv run python main.py student-api
```

## Support

For issues:

1. Check the project README.md
2. Review the Hugging Face Spaces documentation
3. Contact the course instructors

## License

MIT License - see the LICENSE file.
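A common cause of the "Space won't start" and "Authentication errors" failures above is a missing or misspelled environment variable. The snippet below is a minimal sketch of how the app might validate its configuration at startup; `load_config` is a hypothetical helper (not part of the project code), but the variable names and defaults match the Setup section:

```python
import os

# Required variables from "2. Configure Environment Variables".
REQUIRED = ["STUDENT_EMAIL", "STUDENT_SECRET", "GITHUB_TOKEN",
            "GITHUB_USERNAME", "AIPIPE_TOKEN"]

def load_config(env=os.environ):
    """Fail fast with a clear message if any required variable is unset.

    Hypothetical helper: illustrates the contract, not the actual project code.
    """
    missing = [k for k in REQUIRED if not env.get(k)]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
    return {
        **{k: env[k] for k in REQUIRED},
        # Defaults mirror the "Optional (advanced)" list above.
        "LLM_PROVIDER": env.get("LLM_PROVIDER", "aipipe"),
        "LLM_MODEL": env.get("LLM_MODEL", "google/gemini-2.0-flash-lite-001"),
        "AIPIPE_BASE_URL": env.get("AIPIPE_BASE_URL", "https://aipipe.org/openrouter/v1"),
    }
```

Running a check like this locally (against your `.env`) before pushing to the Space turns a silent build failure into an explicit error message.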
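The request/response contract exercised by the curl command under "Testing Your Deployment" can also be sketched in Python, which is handy for scripting repeated tests. The helper names here are hypothetical; the field names come directly from the sample payload and expected response above:

```python
import json

def make_build_request(email, secret, task, round_, nonce, brief,
                       checks, evaluation_url, attachments=None):
    """Build the JSON body expected by POST /api/build (field names from the docs)."""
    return {
        "email": email,
        "secret": secret,
        "task": task,
        "round": round_,
        "nonce": nonce,
        "brief": brief,
        "checks": checks,
        "evaluation_url": evaluation_url,
        "attachments": attachments or [],
    }

def is_accepted(response_body):
    """Check a raw response body against the documented 'accepted' shape."""
    data = json.loads(response_body)
    return data.get("status") == "accepted" and "task" in data

# Mirrors the sample curl payload from "Testing Your Deployment".
payload = make_build_request(
    "your-email@ds.study.iitm.ac.in", "your-secret", "test-task-123",
    1, "test-nonce-456", "Create a simple Hello World page with Bootstrap",
    ["Page displays Hello World"], "https://example.com/evaluate",
)
print(json.dumps(payload))
```

To actually send the request, pass `json.dumps(payload)` as the body of a POST to your Space's `/api/build` URL with a `Content-Type: application/json` header, exactly as the curl example does.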