# 🚀 Quick Setup Guide for Hugging Face Space
Your Neural Pong demo is ready to deploy! Follow these steps:
## Step 1: Create Your Hugging Face Space

- Go to Hugging Face Spaces: https://huggingface.co/spaces
- Click "Create new Space"
- Fill in the details:
  - Space name: `neural-pong` (or your preferred name)
  - SDK: Select "Docker" ⚠️ Important!
  - Hardware: Select "GPU" → "T4 small" (or larger)
  - Visibility: Public or Private
- Click "Create Space"
## Step 2: Upload Files Using Git

```bash
cd /share/u/wendler/code/toy-wm-hf-space

# Initialize git (if not already done)
git init

# Add all files
git add .

# Commit
git commit -m "Initial commit: Neural Pong demo"

# Add your Space as remote (replace YOUR_USERNAME and SPACE_NAME)
git remote add origin https://huggingface.co/spaces/YOUR_USERNAME/SPACE_NAME

# Push everything
git push -u origin main
```
Note: The checkpoint file is 225 MB, so push with Git rather than the web uploader, and make sure the checkpoint is tracked with Git LFS (the Hub rejects large files that aren't LFS-tracked), as shown below.
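
A minimal sketch for tracking the checkpoint with Git LFS before the first push, assuming `git-lfs` is installed and the checkpoint lives under `checkpoints/`:

```bash
# Track the large checkpoint with Git LFS before committing it
git lfs install
git lfs track "*.pt"
git add .gitattributes
git add checkpoints/
git commit -m "Track checkpoint with Git LFS"
```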
## Step 3: Wait for Build

- After pushing, Hugging Face will automatically start building
- Go to your Space page → "Logs" tab to watch progress
- Build time: 5-15 minutes (installing PyTorch, etc.)
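
If you'd rather watch from a terminal, the public Spaces API exposes the Space's runtime status. This is a hedged sketch: the endpoint shape and field names are assumptions, so fall back to the "Logs" tab if the output doesn't match (replace YOUR_USERNAME/SPACE_NAME):

```bash
# Query the Space's runtime status; the stage should move from BUILDING to RUNNING
curl -s https://huggingface.co/api/spaces/YOUR_USERNAME/SPACE_NAME \
  | python3 -m json.tool | grep -i "stage"
```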
## Step 4: Test Your Space

- Once the build completes, visit your Space URL
- You should see the Pong interface
- Wait for the model to load (a loading spinner is shown)
- Click "Start Stream"
- Use the Arrow Keys or WASD to play!
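
As a quick smoke test from the command line, you can check that the Space answers HTTP requests. Spaces are usually also served at a `*.hf.space` subdomain; the exact subdomain below is an assumption, so copy the real one from the Space's "Embed this Space" dialog if it differs:

```bash
# Expect an HTTP 200 once the Space is running (subdomain is an assumption)
curl -I https://YOUR_USERNAME-SPACE_NAME.hf.space
```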
## Quick Commands

```bash
# Run the setup script
./setup.sh

# Check files are ready
ls -la

# Test Docker build locally (optional)
docker build -t neural-pong .
docker run -p 7860:7860 neural-pong
```
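
If you test locally on a machine with an NVIDIA GPU, you'll likely also want to pass the GPU through to the container. A sketch assuming the NVIDIA Container Toolkit is installed:

```bash
# Same local test, but with GPU access inside the container
docker run --gpus all -p 7860:7860 neural-pong
```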
## Troubleshooting

### Build Fails?

- Check the "Logs" tab for errors
- Verify the checkpoint path in `configs/inference.yaml`
- Ensure GPU is selected in the Space settings
### Model Won't Load?

- Verify the checkpoint exists: `checkpoints/ckpt-step=053700-metric=0.00092727.pt`
- Check the path in `configs/inference.yaml`
- Look for errors in the "Logs" tab
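
A quick local sanity check for both points, using the file names above (the `grep` pattern is only a guess at how the checkpoint is referenced in the config):

```bash
# 1) Checkpoint should be ~225 MB; a file of a few hundred bytes is an unresolved LFS pointer
ls -lh checkpoints/ckpt-step=053700-metric=0.00092727.pt

# 2) The config should reference the same file
grep -n "ckpt" configs/inference.yaml
```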
## What's Included

- ✅ `app.py` - Flask application (no single-user limitation)
- ✅ `checkpoints/` - Model checkpoint (225MB)
- ✅ `src/` - All necessary source code (15 Python files)
- ✅ `static/index.html` - Frontend interface
- ✅ `configs/inference.yaml` - Model configuration
- ✅ `Dockerfile` - Container configuration
- ✅ `requirements.txt` - Python dependencies
## Need More Help?

- See `SETUP_GUIDE.md` for detailed instructions
- See `DEPLOYMENT.md` for technical details
- Check the Hugging Face Spaces docs: https://huggingface.co/docs/hub/spaces
Ready? Run `./setup.sh` to get started! 🚀