# Configuration

## How to configure `config.yaml`
You can edit it directly in the TUI under Configuration → General Settings (recommended), or manually edit `config/config.yaml`.
> 💡 The actual config file is located at `C:\Users\{username}\AppData\Roaming\mailslide\config\config.yaml`.
```yaml
llm_mode: per_plugin
plugin_modules:
  - custom_plugins.my_plugin_module
jobs:
  - name: "My Job"
    account: "[email protected]"
    source: "Inbox"
    destination: "Inbox/processed"
    manual_review_destination: "Inbox/manual_review"
    batch_flush_enabled: true
    limit: 10
    plugins:
      - add_category
```
| Field | Description |
|---|---|
| `name` | Job name (any name you like) |
| `enable` | Whether this job is enabled (`true`/`false`, default `true`) |
| `account` | Outlook account email address or PST file name |
| `source` | Source folder (e.g., `Inbox`) |
| `destination` | Destination folder after processing (optional; if omitted, emails are not moved and may be processed again on the next run, which is acceptable for simple testing) |
| `manual_review_destination` | Folder for emails with skipped or failed LLM outcomes that need manual review (optional) |
| `limit` | Maximum number of emails to process per run |
| `llm_mode` | LLM call mode (`per_plugin` by default; `share_deprecated` for legacy behavior) |
| `plugin_modules` | List of additional plugin module paths, dynamically imported at startup to register custom plugins |
| `ui_language` | UI language (`zh-TW` / `en-US`, default `en-US`) |
| `batch_flush_enabled` | Job-level batch-write toggle (default `true`; affects some file-writing modules) |
| `plugins` | List of enabled plugins (optional) |
> 💡 Tip: If you use the `move_to_folder` plugin to let the LLM decide the destination folder, you can omit `destination`; the plugin handles moving the emails.
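As an illustration, a job that delegates moving to the LLM might look like the following sketch. Only the `move_to_folder` plugin name comes from the documentation above; the job name and account value are placeholders:

```yaml
jobs:
  - name: "Auto-sort Job"            # hypothetical job name
    account: "[email protected]"  # placeholder; use your real account
    source: "Inbox"
    # destination is omitted on purpose: the move_to_folder
    # plugin picks the target folder for each email
    plugins:
      - move_to_folder
```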
## LLM Call Modes
- `per_plugin` (default): Each plugin that needs an LLM calls the LLM independently. Suitable for mixing multiple plugins.
- `share_deprecated` (legacy mode): One LLM call is shared across all LLM plugins for a single email. This mode easily causes `action_mismatch`/`skipped` outcomes due to differing `action` values and is not recommended for new configurations.
`llm_mode` can be set at:
- Global level: `config.llm_mode`
- Job-level override: `job.llm_mode`
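The two levels can be combined as sketched below: the global value applies to every job unless a job sets its own. The job names here are placeholders:

```yaml
llm_mode: per_plugin              # global default for all jobs
jobs:
  - name: "New Job"               # inherits per_plugin from the global setting
    plugins:
      - add_category
  - name: "Legacy Job"
    llm_mode: share_deprecated    # job-level override applies to this job only
    plugins:
      - add_category
```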
Backward compatibility: The legacy values `shared`/`shared_legacy` still run, but they are treated as `share_deprecated` and log a warning.
## Configure LLM (Optional)
If you want to use plugins with integrated AI analysis features, it's recommended to edit the configuration in the TUI under Configuration → LLM Settings; you can also manually edit `config/llm-config.yaml`:
```yaml
api_base: "http://localhost:11434/v1"
api_key: "your-api-key"
model: "llama3"
```
Supports OpenAI-compatible API formats such as Ollama and LM Studio.
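For example, pointing the same file at LM Studio's local server might look like this sketch. LM Studio's OpenAI-compatible server listens on port 1234 by default; the key and model name below are placeholders:

```yaml
api_base: "http://localhost:1234/v1"  # LM Studio's default local server address
api_key: "lm-studio"                  # local servers typically accept any non-empty key
model: "your-local-model"             # placeholder; use the model loaded in LM Studio
```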