modAI Help

Supported AI Services

Executing AI requests

Enabling the system setting modai.api.execute_on_server moves the execution of AI requests to the server side, so the network traffic (and your API keys) never appears in the visitor's browser.

It can also be enabled per service using the modai.api.{service}.execute_on_server format. For example, to enable this only for ChatGPT, the setting would be modai.api.openai.execute_on_server.
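As a sketch of how such a per-service override typically resolves against the global setting (the setting names follow this document; the lookup logic itself is an illustrative assumption, not modAI's actual implementation):

```python
def execute_on_server(settings: dict, service: str) -> bool:
    """Resolve whether requests for `service` run server-side.

    A service-specific setting such as modai.api.openai.execute_on_server
    takes precedence; otherwise the global modai.api.execute_on_server
    applies. (Illustrative logic only -- not modAI's actual code.)
    """
    specific = settings.get(f"modai.api.{service}.execute_on_server")
    if specific is not None:
        return bool(specific)
    return bool(settings.get("modai.api.execute_on_server", False))

# Hypothetical settings: server-side execution enabled for OpenAI only.
settings = {
    "modai.api.execute_on_server": False,
    "modai.api.openai.execute_on_server": True,
}
```

With these values, requests to OpenAI run on the server while every other service keeps the global (client-side) behavior.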

OpenAI (ChatGPT)

ChatGPT is the default AI service. Fill out the modai.api.openai.key setting and adjust any models as desired.

Google Gemini

Add a valid API key to the modai.api.google.key setting to use Google Gemini.

To change a prompt to use Google Gemini, set its corresponding model setting, e.g.:

  • modai.global.model: google/gemini-2.0-flash

Anthropic (Claude)

Add a valid API key to the modai.api.anthropic.key to use Claude.

To change a prompt to use Claude, set its corresponding model setting, e.g.:

  • modai.global.model: anthropic/claude-3-5-haiku-latest

OpenRouter.ai

Add a valid API key to the modai.api.openrouter.key to use OpenRouter.

To change a prompt to use OpenRouter, set its corresponding model setting, e.g.:

  • modai.global.model: openrouter/meta-llama/llama-4-scout:free

Custom Services/Models

  • Service name: custom

Some services, such as Open WebUI, provide a wrapper around multiple models. To use a custom model via these services, fill out modai.api.custom.url, modai.api.custom.key, and optionally modai.api.custom.compatibility, which tells modAI which API format to emulate (almost always leave this as openai).
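Because the compatibility setting usually stays on openai, the custom endpoint is expected to accept OpenAI-style chat requests. A rough sketch of what such a request looks like (the URL path, headers, and payload follow the OpenAI chat-completions convention; the values below are placeholders, and modAI performs this call internally):

```python
import json


def build_chat_request(base_url: str, api_key: str, model: str, prompt: str):
    """Assemble an OpenAI-compatible chat completion request.

    Sketch only: shows the shape of the call a custom (openai-compatible)
    endpoint is expected to accept, not modAI's internal code.
    """
    url = base_url.rstrip("/") + "/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, headers, body


# Placeholder values -- substitute your own modai.api.custom.url and key.
url, headers, body = build_chat_request(
    "http://localhost:3000/api", "sk-example", "llama3.1:8b", "Hello"
)
```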

To use the custom service, set the following fields:

  • modai.api.custom.url: {your custom URL}

  • modai.api.custom.key: {your API key}

Then, for each model you want to use, set the corresponding "model" setting to the prefix "custom/" followed by the model name, e.g.:

  • modai.global.model: custom/llama3.1:8b
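The {service}/{model} convention used throughout can be illustrated with a small parser. Splitting on the first slash only is what lets OpenRouter model names keep their own internal slashes (this helper is a hypothetical sketch, not part of modAI):

```python
def split_model(value: str) -> tuple[str, str]:
    """Split a model setting such as 'custom/llama3.1:8b' into
    (service, model). Only the first '/' separates the service, so
    'openrouter/meta-llama/llama-4-scout:free' keeps the slash
    inside the model name. (Illustrative helper, not modAI code.)"""
    service, _, model = value.partition("/")
    return service, model
```

For example, split_model("custom/llama3.1:8b") yields the service "custom" and the model name "llama3.1:8b" that the custom endpoint receives.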

Last modified: 17 April 2025