Bailey Wallace
Bailey00
AI & ML interests
None yet
Recent Activity
reacted to danielhanchen's post with ❤️ about 3 hours ago
You don’t need to set LLM parameters anymore! 🚀
llama.cpp uses only the context length + compute your local setup needs. Unsloth also auto-applies the correct model settings
Try in Unsloth Studio - now with precompiled llama.cpp binaries.
GitHub: https://github.com/unslothai/unsloth
reacted to danielhanchen's post with 🔥 about 3 hours ago
A new way to use Unsloth.
Coming soon...
upvoted a collection 5 days ago
Qwen3.5
Organizations
None yet