Dominick Wirzba
Chronuid
AI & ML interests
None yet
Recent Activity
reacted to danielhanchen's post with 🔥 · 2 days ago
We collaborated with Hugging Face to enable you to train MoE models 12× faster with 35% less VRAM via our new Triton kernels (no accuracy loss). 🤗
Train gpt-oss locally on 12.8GB VRAM with our free notebooks: https://unsloth.ai/docs/new/faster-moe
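As a rough illustration of what the linked notebooks set up, here is a minimal sketch assuming Unsloth's published FastLanguageModel API; the checkpoint name, sequence length, and LoRA settings below are illustrative assumptions, not values taken from the post.

```python
# Minimal sketch, assuming Unsloth's FastLanguageModel API.
# The checkpoint name and hyperparameters are illustrative, not from the post.
from unsloth import FastLanguageModel

# Load a quantized model so it fits on a low-VRAM GPU.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/gpt-oss-20b",  # assumed checkpoint name
    max_seq_length=2048,
    load_in_4bit=True,
)

# Attach LoRA adapters so only a small fraction of weights is trained.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    lora_alpha=16,
)
```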
reacted to sergiopaniego's post with ❤️ · about 1 month ago
This super detailed tutorial by @Paulescu is pure gold 💪 "Fine-tuning a Small Language Model for browser control with GRPO and OpenEnv"
LFM2-350M (@LiquidAI) + BrowserGym (OpenEnv) + GRPO (TRL) for learning browser control 🤖
https://paulabartabajo.substack.com/p/fine-tuning-lfm2-350m-for-browser
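For a flavor of the GRPO side of that stack, here is a minimal sketch using TRL's GRPOTrainer; the toy prompt dataset and length-based reward are illustrative stand-ins for the tutorial's actual BrowserGym environment reward, not its real setup.

```python
# Minimal GRPO sketch with TRL's GRPOTrainer.
# The dataset and reward function are placeholders; the tutorial instead
# derives prompts and rewards from BrowserGym episodes.
from datasets import Dataset
from trl import GRPOConfig, GRPOTrainer

# Toy prompt dataset in TRL's standard "prompt" format.
train_dataset = Dataset.from_dict(
    {"prompt": ["Click the search button.", "Open the first result."]}
)

def reward_len(completions, **kwargs):
    # Placeholder reward: prefer short completions. A real browser-control
    # reward would score whether the emitted action succeeds in the env.
    return [-float(len(c)) for c in completions]

trainer = GRPOTrainer(
    model="LiquidAI/LFM2-350M",  # model named in the post
    reward_funcs=reward_len,
    args=GRPOConfig(
        output_dir="lfm2-grpo",
        num_generations=4,              # completions sampled per prompt
        per_device_train_batch_size=4,  # must be divisible by num_generations
    ),
    train_dataset=train_dataset,
)
trainer.train()
```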
liked a model · about 2 months ago
google/functiongemma-270m-it