# Configuration & Discovery
LLLM favors composition over hard-coded imports. Auto-discovery plus lightweight YAML allows an entire system to be wired together without editing Python entry points.
## Configuration Files
- `config/<system>/default.yaml` – runtime settings consumed by agents and systems:
  - `name`, `log_type`, `log_dir`, `ckpt_dir`, and randomness controls.
  - `agent_configs` describing each model (name, prompt path, temperature, `max_completion_tokens`, optional `api_type`).
  - Execution guards such as `max_exception_retry`, `max_interrupt_times`, `max_llm_recall`.
  - Proxy activation and deploy toggles.
- `lllm.toml` – the discovery manifest read by `lllm.config.find_config_file` and `lllm.discovery.auto_discover`.
Example (`template/lllm.toml`):
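The template's exact contents aren't reproduced here; a minimal manifest along these lines would match the workflow described below (the key names `namespace`, `prompt_dirs`, and `proxy_dirs` are illustrative assumptions, not a confirmed schema):

```toml
# Illustrative sketch of a discovery manifest; key names are assumptions.
namespace = "myproject"

# Folders (relative to this file) scanned during auto-discovery.
prompt_dirs = ["prompts"]
proxy_dirs = ["proxies"]
```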
Place this file at the project root or point to it via `$LLLM_CONFIG`. The loader searches parent directories recursively, so you can run tools from subdirectories without losing context.
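The parent search can be sketched roughly like this (an illustrative stand-in for `lllm.config.find_config_file`; the real signature may differ):

```python
from pathlib import Path


def find_config_file(start=".", name="lllm.toml"):
    """Walk from `start` upward through its parents until a manifest
    named `name` is found; return None if the search exhausts the tree.
    (Sketch only; lllm.config.find_config_file's real logic may differ.)"""
    here = Path(start).resolve()
    for folder in [here, *here.parents]:
        candidate = folder / name
        if candidate.is_file():
            return candidate
    return None
```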
## Environment Variables
| Variable | Purpose |
|---|---|
| `LLLM_CONFIG` | Absolute path to a config file or folder; overrides auto-detection. |
| `LLLM_AUTO_DISCOVER` | Set to `0`, `false`, or `no` to skip auto-discovery (manual registration only). |
| `TMP_DIR` | Overrides the default temp/cache directory used by utils and error logging. |
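A minimal sketch of how an opt-out flag like `LLLM_AUTO_DISCOVER` is typically interpreted (the helper name and exact parsing are assumptions, not the library's API):

```python
import os


def discovery_enabled() -> bool:
    """Treat "0", "false", or "no" (case-insensitive) as an opt-out;
    anything else, including an unset variable, leaves discovery on.
    (Illustrative sketch, not LLLM's actual implementation.)"""
    value = os.environ.get("LLLM_AUTO_DISCOVER", "1")
    return value.strip().lower() not in {"0", "false", "no"}
```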
## Auto-Discovery Workflow
`lllm.auto_discover()` runs when the package is imported:
- Resolve the config path (`LLLM_CONFIG`, an explicit argument, or the nearest `lllm.toml`).
- Load the TOML and collect prompt/proxy folders relative to the file.
- Import every `.py` file under those folders (excluding `__init__` and private files).
- Register each `Prompt` (keyed by `namespace/module.prompt.path`).
- Register each `BaseProxy` subclass (keyed by `_proxy_path` or `<namespace>/<class>`).
Because registration happens at import time, simply adding a new prompt module to the folder makes it available across the repo without touching central registries.
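The import step can be sketched as follows (illustrative only; the real `lllm.discovery.auto_discover` also performs the registration and namespacing described above):

```python
import importlib.util
from pathlib import Path


def import_py_files(folder):
    """Load every .py file under `folder`, skipping __init__.py and
    private (underscore-prefixed) modules, so import-time side effects
    such as prompt/proxy registration can run. Returns imported stems.
    (Sketch of the discovery import step; the real logic may differ.)"""
    imported = []
    for path in sorted(Path(folder).rglob("*.py")):
        if path.stem == "__init__" or path.stem.startswith("_"):
            continue
        spec = importlib.util.spec_from_file_location(path.stem, path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)  # runs module-level registration code
        imported.append(path.stem)
    return imported
```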
## YAML Tips
- Keep secrets (API keys) out of the YAML and load them via environment variables inside your system/agent code.
- Use multiple YAML files (e.g., `config/prod.yaml`) and load the desired profile before building a system.
- Version-control template configs but store user-specific overrides elsewhere; the CLI scaffold already sets up a namespaced config folder.
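Putting the first two tips together, a profile file might look like this (field values are illustrative; the key names mirror those listed under Configuration Files, and secrets stay in the environment):

```yaml
# config/prod.yaml (illustrative): no API keys here; load secrets from env vars in code
name: prod-system
log_type: file
log_dir: logs/
ckpt_dir: checkpoints/
agent_configs:
  - name: planner            # model name
    prompt: prompts/planner.txt
    temperature: 0.2
    max_completion_tokens: 1024
max_exception_retry: 3
```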
## Disabling Discovery
When packaging LLLM as a reusable library, you may want to opt out of auto-imports. Set `LLLM_AUTO_DISCOVER=0`, then register prompts and proxies manually:
```python
from lllm import register_prompt, register_proxy

register_prompt(my_prompt)
register_proxy("custom/my_proxy", MyProxy)
```
This pattern is useful for unit tests or environments where dynamic imports are restricted.