hacksider/deep-live-cam
At first I could not find the model and provider settings!
Do you know why?
When you open Settings and try to scroll with the mouse, if the cursor lands over the updates text, that text starts scrolling instead of the page.
Updates are not that important, yet they come first.
LLM models should have their own separate tab in Settings; that is important.
The other issue is that the text is too small and I cannot enlarge it. Do you know how?
How do I configure the provider and model?
Why is this not in the GUI?
Is it something like this?
{
  "default_provider": "llama.cpp",
  "providers": {
    "llama.cpp": {
      "url": "http://192.168.1.68:8080/v1",
      "default_model": "llm"
    }
  }
}
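A minimal sketch of how a config like that could be consumed, assuming the server at that `url` speaks the OpenAI-compatible chat API (the function and variable names below are my own, not from the app):

```python
import json
import urllib.request

# Same shape as the config shown above.
CONFIG = {
    "default_provider": "llama.cpp",
    "providers": {
        "llama.cpp": {
            "url": "http://192.168.1.68:8080/v1",
            "default_model": "llm",
        }
    },
}

def build_chat_request(config: dict, prompt: str) -> tuple[str, bytes]:
    """Return (endpoint, JSON body) for an OpenAI-compatible /chat/completions call."""
    provider = config["providers"][config["default_provider"]]
    endpoint = provider["url"].rstrip("/") + "/chat/completions"
    body = json.dumps({
        "model": provider["default_model"],
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return endpoint, body

if __name__ == "__main__":
    url, data = build_chat_request(CONFIG, "hello")
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    # Sending the request requires the llama.cpp server to actually be running:
    # with urllib.request.urlopen(req) as resp:
    #     print(json.load(resp))
```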
This could become a meaningful direction for future LLM design.
Save this as macros.ahk and run it. Before sending a prompt to your coding agent, press Ctrl + Alt + 1 and paste your prompt into any regular chatbot, then send the output to the agent. This is the actual, boring, real way to "10x your prompting". Use the other number keys to avoid repeating yourself over and over again. I use this macro probably 100-200 times per day. AutoHotkey isn't as new or hyped as a lot of other workflows, but there's a reason it's still widely used after 17 years. Don't overcomplicate it.

; Requires AutoHotkey v1.1+
; All macros are `Ctrl + Alt + <variable>`
^!1::
Send, Please help me more clearly articulate what I mean with this message (write the message in a code block):
return

^!2::
Send, Please make the following changes:
return

^!3::
Send, It seems you got cut off by the maximum response limit. Please continue by picking up where you left off.
return

Ctrl + Alt + 1 works best with Instruct models (non-thinking). Reasoning causes some models to ramble and miss the point. I've just been using GPT-5.x for this.

Is this a truly new model, or was some other model used as a base?
How was it trained?
Are the datasets available?
Is there any transparency?
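For what it's worth, the canned-prefix idea from the AutoHotkey post above is not tied to Windows; here is a hypothetical Python restatement of the same macros (the table and function name are mine), useful if you want the prefixes in a script rather than a hotkey:

```python
# Canned prompt prefixes, mirroring the Ctrl + Alt + <n> macros above.
PREFIXES = {
    1: "Please help me more clearly articulate what I mean with this message "
       "(write the message in a code block): ",
    2: "Please make the following changes: ",
    3: "It seems you got cut off by the maximum response limit. "
       "Please continue by picking up where you left off.",
}

def expand(key: int, prompt: str = "") -> str:
    """Prepend the canned prefix for `key` to `prompt`."""
    return PREFIXES[key] + prompt

print(expand(2, "rename foo to bar"))
# -> Please make the following changes: rename foo to bar
# Prefix 3 is normally sent on its own, with no extra prompt text.
```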