Configuration

AI Settings

The AI Settings panel centrally manages every AI service provider that Alfheim's AI features depend on. Once providers are configured, features such as character chat testing, knowledge base search, translation assistance, and image generation can all use them.

Overview

  • Multi-provider support
  • Unified API Key and model management
  • Custom provider configuration
  • Shared provider pool across all AI features

Supported AI Providers

Provider     Description
---------    -----------
Gemini       Google's AI service
OpenAI       ChatGPT series models
Anthropic    Claude series models
Ollama       Locally deployed open-source models
Custom       Any service compatible with the OpenAI API format

Adding a Provider

  1. Click Add Provider in the AI Settings panel.
  2. Select the provider type.
  3. Fill in the configuration:
    • Name: A recognizable name for this configuration.
    • API Key: The key obtained from the provider.
    • Base URL: Custom API endpoint. Optional for most providers, but required for Ollama and custom providers.
    • Default Model: Select or enter the model name to use.
  4. Click Save.
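The checks behind step 3 can be sketched as a small validation routine. This is a hypothetical illustration only: the field names ("type", "name", "api_key", "base_url", "default_model") are assumptions for this sketch, not Alfheim's documented storage format.

```python
# Hypothetical sketch of the provider-form rules from step 3.
# Field names are assumptions; Alfheim's actual format is not documented here.

KEY_REQUIRED = {"gemini", "openai", "anthropic"}   # remote services need a key
BASE_URL_REQUIRED = {"ollama", "custom"}           # Base URL is mandatory here

def validate_provider(config: dict) -> list[str]:
    """Return human-readable problems; an empty list means the config is usable."""
    problems = []
    if not config.get("name"):
        problems.append("Name is required")
    if config.get("type") in KEY_REQUIRED and not config.get("api_key"):
        problems.append("API Key is required for this provider type")
    if config.get("type") in BASE_URL_REQUIRED and not config.get("base_url"):
        problems.append("Base URL is required for Ollama and custom providers")
    if not config.get("default_model"):
        problems.append("Default Model is required")
    return problems

# A well-formed local Ollama entry needs no API key:
local = {
    "type": "ollama",
    "name": "Local Ollama",
    "base_url": "http://localhost:11434",
    "default_model": "llama3",
}
```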

Managing Providers

Edit

Select an existing provider configuration to modify it.

Delete

Remove provider configurations that are no longer in use.

Enable/Disable

Temporarily disable a provider without deleting its configuration.

AI Features Overview

After configuring providers, the following features can use AI capabilities:

Feature             Module                    Description
-------             ------                    -----------
Character Chat      Character Designer        Simulate conversations with characters to test personality
Semantic Search     Knowledge Base (Huginn)   Retrieve knowledge entries by meaning
Consistency Check   Knowledge Base (Mimir)    Detect contradictions in settings
AI Translation      Localization              Batch translate game content
Image Generation    Art Studio                AI-assisted generation of visual assets

About Image Generation

AI image generation is primarily handled through the ComfyUI connection in Nidavellir. The AI generation panel in Art Studio may also use the providers configured here.

Using Ollama (Local Models)

Ollama allows you to run open-source AI models on your local computer without API keys or network connectivity.

Configuration Steps

  1. Install and start the Ollama service locally.
  2. Add an Ollama-type provider in the AI Settings panel.
  3. Enter the Ollama service address (default: http://localhost:11434).
  4. Select the downloaded model.

Advantages

  • Fully offline operation; data never leaves your machine.
  • No API fees.
  • Suitable for projects with data privacy requirements.

Using Custom Providers

If your AI service is compatible with the OpenAI API format, you can connect it via a custom provider.

  1. Select the "Custom" type.
  2. Enter the service's Base URL.
  3. Enter the API Key (if required).
  4. Enter the model name.

Tips

  • It is recommended to configure at least one provider to unlock all AI features in Alfheim.
  • Different features may have different model requirements: character chat and translation benefit from more capable models, while simple search can use lighter models.
  • If cost is a concern, you can configure different providers for different purposes (e.g., Ollama for daily testing, commercial API for final output).
  • API Keys are stored in local project configuration and are never uploaded or shared.
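The per-purpose split suggested above can be pictured as a simple routing table. This is purely illustrative; Alfheim assigns providers to features through the UI, and the purpose names below are made up for the sketch:

```python
# Hypothetical per-purpose routing, mirroring the tip above:
# a local Ollama provider for cheap iteration, a commercial API for output.

ROUTES = {
    "character_chat": "commercial",  # nuanced dialogue wants a capable model
    "translation": "commercial",     # benefits from a stronger model
    "semantic_search": "local",      # a lighter model is enough
    "daily_testing": "local",        # Ollama keeps iteration free
}

def provider_for(purpose: str, default: str = "local") -> str:
    """Pick a provider name for a purpose, falling back to the local one."""
    return ROUTES.get(purpose, default)
```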