frumu
posted an update Feb 23

How do I configure the provider and model?

Why isn't this in the GUI? Or is it? Would something like this work:

{
  "default_provider": "llama.cpp",
  "providers": {
    "llama.cpp": {
      "url": "http://192.168.1.68:8080/v1",
      "default_model": "llm"
    }
  }
}

Yes, that will work for custom providers like llama.cpp.

On Linux, the config file should be at ~/.config/tandem/config.json
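As a minimal sketch, the config from this thread could be written to that path with a short script. The schema and location are taken from the posts above; the local server URL and model name are the asker's and should be adjusted for your setup.

```python
import json
from pathlib import Path

# Provider config as shown in this thread; edit url/default_model for your server.
config = {
    "default_provider": "llama.cpp",
    "providers": {
        "llama.cpp": {
            "url": "http://192.168.1.68:8080/v1",
            "default_model": "llm",
        }
    },
}

# Linux config location mentioned in this thread.
config_path = Path.home() / ".config" / "tandem" / "config.json"
config_path.parent.mkdir(parents=True, exist_ok=True)
config_path.write_text(json.dumps(config, indent=2))
```

After writing the file, restarting the app should pick up the new default provider.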

You can also try adding a custom provider in the desktop GUI settings and giving it http://192.168.1.68:8080/v1; that will just register the provider as "custom" rather than llama.cpp.

I will add a note to improve support for llama.cpp so it is easier to add.
