AI Settings
The AI Settings panel centrally manages every AI service provider that Alfheim's AI features rely on. Once configured, features such as character chat testing, knowledge base search, translation assistance, and image generation all draw on the same provider pool.
Overview
- Multi-provider support
- Unified API Key and model management
- Custom provider configuration
- Shared provider pool across all AI features
Supported AI Providers
| Provider | Description |
|---|---|
| Gemini | Google's AI service |
| OpenAI | ChatGPT series models |
| Anthropic | Claude series models |
| Ollama | Locally deployed open-source models |
| Custom | Any service compatible with OpenAI API format |
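For orientation, each provider type in the table has a standard public base URL. The mapping below is a reference sketch only; Alfheim may or may not pre-fill these values, and the custom entry is always user-supplied.

```python
# Standard public endpoints for each provider type. These are the services'
# own documented base URLs; Alfheim may pre-fill different defaults.
DEFAULT_BASE_URLS = {
    "gemini": "https://generativelanguage.googleapis.com",
    "openai": "https://api.openai.com/v1",
    "anthropic": "https://api.anthropic.com",
    "ollama": "http://localhost:11434",  # local service, no API key needed
    # "custom": always supplied by the user
}
```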
Adding a Provider
- Click Add Provider in the AI Settings panel.
- Select the provider type.
- Fill in the configuration:
- Name: A recognizable name for this configuration.
- API Key: The key obtained from the provider.
- Base URL (optional): Custom API endpoint. Required for Ollama and custom providers.
- Default Model: Select or enter the model name to use.
- Click Save.
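The fields in the form above map naturally onto a single configuration record per provider. The sketch below is a hypothetical shape for such a record, not Alfheim's actual schema; field names are illustrative.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ProviderConfig:
    """Hypothetical shape of one provider entry (not Alfheim's real schema)."""
    name: str                       # recognizable label, e.g. "My OpenAI"
    provider_type: str              # "gemini" | "openai" | "anthropic" | "ollama" | "custom"
    default_model: str              # model name selected or entered by the user
    api_key: Optional[str] = None   # not needed for a local Ollama instance
    base_url: Optional[str] = None  # required for Ollama and custom providers
    enabled: bool = True            # disabled entries are kept but not used

cfg = ProviderConfig(
    name="Local Ollama",
    provider_type="ollama",
    default_model="llama3",
    base_url="http://localhost:11434",
)
```

Keeping `enabled` as a separate flag is what allows the Enable/Disable action below to work without deleting the configuration.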
Managing Providers
Edit
Select an existing provider configuration to modify it.
Delete
Remove provider configurations that are no longer in use.
Enable/Disable
Temporarily disable a provider without deleting its configuration.
AI Features Overview
After configuring providers, the following features can use AI capabilities:
| Feature | Module | Description |
|---|---|---|
| Character Chat | Character Designer | Simulate conversations with characters to test personality |
| Semantic Search | Knowledge Base (Huginn) | Retrieve knowledge entries by meaning |
| Consistency Check | Knowledge Base (Mimir) | Detect contradictions in settings |
| AI Translation | Localization | Batch translate game content |
| Image Generation | Art Studio | AI-assisted generation of visual assets |
About Image Generation
AI image generation is primarily handled through the ComfyUI connection in Nidavellir; the AI generation panel in Art Studio may also use the providers configured here.
Using Ollama (Local Models)
Ollama allows you to run open-source AI models on your local computer without API keys or network connectivity.
Configuration Steps
- Install and start the Ollama service locally.
- Add an Ollama-type provider in the AI Settings panel.
- Enter the Ollama service address (default: http://localhost:11434).
- Select the downloaded model.
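Once the service is running, requests go to Ollama's own HTTP API. The helper below builds a request body for Ollama's `/api/chat` endpoint (a real Ollama endpoint); the helper function itself is illustrative and not part of Alfheim.

```python
import json

def ollama_chat_payload(model: str, prompt: str) -> dict:
    """Build a request body for Ollama's /api/chat endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # request one complete JSON reply instead of a stream
    }

# POSTing this body to http://localhost:11434/api/chat returns the reply.
body = json.dumps(ollama_chat_payload("llama3", "Hello"))
```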
Advantages
- Fully offline operation; data never leaves your machine.
- No API fees.
- Suitable for projects with data privacy requirements.
Using Custom Providers
If your AI service is compatible with the OpenAI API format, you can connect it via a custom provider.
- Select the "Custom" type.
- Enter the service's Base URL.
- Enter the API Key (if required).
- Enter the model name.
Tips
- It is recommended to configure at least one provider to unlock all AI features in Alfheim.
- Different features may have different model requirements: character chat and translation benefit from more capable models, while simple search can use lighter models.
- If cost is a concern, you can configure different providers for different purposes (e.g., Ollama for daily testing, commercial API for final output).
- API Keys are stored in local project configuration and are never uploaded or shared.