# MeshBot Environment Configuration
#
# Configuration Priority (12-factor app principles):
# 1. Command-line arguments (highest priority)
# 2. Environment variables (this file)
# 3. Default values (lowest priority)
#
# Naming Convention:
# - Environment variables: UPPER_SNAKE_CASE (e.g., MESHCORE_PORT)
# - CLI arguments: kebab-case with -- prefix (e.g., --meshcore-port)
# - Config attributes: snake_case (e.g., meshcore.port)
#
# See: meshbot run --help or meshbot test --help
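The priority order above (CLI argument, then environment variable, then default) can be sketched as follows. This is an illustrative stand-alone helper, not MeshBot's actual resolution code; the `resolve` name is hypothetical.

```python
import os

def resolve(cli_value, env_name, default):
    """Resolve a setting using the priority order described above:
    CLI argument > environment variable > built-in default."""
    if cli_value is not None:          # 1. command-line argument wins
        return cli_value
    env_value = os.environ.get(env_name)
    if env_value is not None:          # 2. then the environment
        return env_value
    return default                     # 3. finally the built-in default

# e.g. with no CLI flag and MESHCORE_PORT unset, the default applies
port = resolve(None, "MESHCORE_PORT", "/dev/ttyUSB0")
```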

# LLM Configuration (OpenAI-compatible endpoints)
# Model format: openai:MODEL_NAME
#
# MODEL_NAME depends on your provider:
# OpenAI (default): gpt-4o-mini, gpt-4o, gpt-4-turbo
# OpenRouter: openai/gpt-4o-mini, anthropic/claude-3-5-sonnet-20241022, etc.
# Ollama: llama3.2, mistral, etc.
# LM Studio: Use the model name shown in LM Studio
#
# Examples:
# LLM_MODEL=openai:gpt-4o-mini # OpenAI (default)
# LLM_MODEL=openai:openai/gpt-4o-mini # OpenRouter with GPT-4o-mini
# LLM_MODEL=openai:anthropic/claude-3-5-sonnet-20241022 # OpenRouter with Claude
# LLM_MODEL=openai:llama3.2 # Ollama
LLM_MODEL=openai:gpt-4o-mini
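The `openai:MODEL_NAME` format above splits at the first colon, so provider-prefixed names like OpenRouter's `openai/gpt-4o-mini` survive intact. A sketch of that parsing; the helper name is hypothetical, not MeshBot's actual API:

```python
def parse_llm_model(value: str) -> tuple[str, str]:
    """Split an LLM_MODEL string such as 'openai:gpt-4o-mini' into
    (provider, model_name). The model name may itself contain a
    slash, e.g. OpenRouter's 'openai/gpt-4o-mini'."""
    provider, _, model = value.partition(":")  # split at the FIRST colon only
    if not model:
        raise ValueError(f"expected 'provider:model', got {value!r}")
    return provider, model

# parse_llm_model("openai:anthropic/claude-3-5-sonnet-20241022")
# -> ("openai", "anthropic/claude-3-5-sonnet-20241022")
```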

# API Key - required for most LLM providers
# This key is used for both the main agent and Memori memory features
LLM_API_KEY=your_api_key_here

# Base URL for OpenAI-compatible endpoints (optional)
# Leave unset for OpenAI (default)
# This setting applies to both the main agent and Memori memory features
# Set this when using alternative providers:
# OpenRouter: https://openrouter.ai/api/v1
# Ollama: http://localhost:11434/v1
# LM Studio: http://localhost:1234/v1
# Any other OpenAI-compatible endpoint
# LLM_BASE_URL=https://openrouter.ai/api/v1

# Optional LLM Configuration (Advanced)
# These have sensible defaults but can be customized if needed
# LLM_MAX_TOKENS=500 # Maximum tokens for LLM responses
# LLM_TEMPERATURE=0.7 # LLM temperature (0.0-2.0, lower = more focused)
# LLM_MAX_MESSAGE_LENGTH=120 # Maximum message length in characters
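A length cap like `LLM_MAX_MESSAGE_LENGTH=120` implies long replies are split into multiple chunks before sending. A hedged sketch of one way to do that (word-boundary splitting with a hard split for oversized words); the function name and strategy are illustrative, not necessarily MeshBot's:

```python
def chunk_message(text: str, max_len: int = 120) -> list[str]:
    """Split a long reply into chunks of at most max_len characters,
    breaking on word boundaries where possible."""
    chunks, current = [], ""
    for word in text.split():
        candidate = f"{current} {word}".strip()
        if len(candidate) <= max_len:
            current = candidate          # word still fits in this chunk
        else:
            if current:
                chunks.append(current)   # flush the full chunk
            # a single word longer than max_len is hard-split
            while len(word) > max_len:
                chunks.append(word[:max_len])
                word = word[max_len:]
            current = word
    if current:
        chunks.append(current)
    return chunks
```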

# Optional system prompt file (default: prompts/default.md)
# Use this to specify a custom system prompt file
# LLM_PROMPT_FILE=prompts/custom.md

# MeshCore Configuration
# Connection type (mock = simulated device for testing; see meshbot run --help
# for hardware connection options, e.g. serial or TCP)
MESHCORE_CONNECTION_TYPE=mock
# Node name - will be set as advertised name on startup
# Bot will respond to DMs and @NodeName mentions in channels
MESHCORE_NODE_NAME=MeshBot
# Channel to listen to (0 for General, or specific channel name/number)
MESHCORE_LISTEN_CHANNEL=0
# MESHCORE_PORT=/dev/ttyUSB0
# MESHCORE_HOST=192.168.1.100
# MESHCORE_BAUDRATE=115200
# MESHCORE_DEBUG=false
# MESHCORE_AUTO_RECONNECT=true

# Message sending delays (for LoRa duty-cycle compliance)
# Delay between multi-chunk messages in seconds (default: 5.0)
# This respects LoRa duty-cycle restrictions: a 1% limit, as in Europe,
# requires ~49.5s of wait after a 500ms transmission
# Reduce for less restrictive regions (US: 2-3s); increase for stricter
# limits (Europe: 5-10s)
# MESHCORE_MESSAGE_DELAY=5.0
# Number of retry attempts for failed message sends (default: 1)
# MESHCORE_MESSAGE_RETRY=1
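The ~50s figure quoted above follows directly from the duty-cycle definition: airtime / (airtime + wait) must stay at or below the limit. A small worked calculation (the helper name is illustrative):

```python
def min_wait_after_tx(airtime_s: float, duty_cycle: float) -> float:
    """Minimum silence required after a transmission so that
    airtime / (airtime + wait) <= duty_cycle.
    Rearranged: wait >= airtime * (1 - duty_cycle) / duty_cycle."""
    return airtime_s * (1.0 - duty_cycle) / duty_cycle

# EU 1% duty cycle after a 500 ms message:
wait = min_wait_after_tx(0.5, 0.01)   # 49.5 seconds, i.e. the ~50s above
```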

# Weather Configuration
# Required: Set coordinates for weather queries
# Weather tool will not work without these values
# Example: Ipswich, UK coordinates shown below
WEATHER_LATITUDE=52.0597
WEATHER_LONGITUDE=1.1455
# Optional: Number of forecast days (default: 3)
# WEATHER_FORECAST_DAYS=3
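Since the weather tool fails without valid coordinates, a startup check can fail fast on out-of-range values. A sketch under the assumption that such validation is useful here; the helper name is hypothetical:

```python
def validate_coordinates(lat: float, lon: float) -> None:
    """Fail fast on out-of-range coordinates before any weather query."""
    if not -90.0 <= lat <= 90.0:
        raise ValueError(f"WEATHER_LATITUDE {lat} outside [-90, 90]")
    if not -180.0 <= lon <= 180.0:
        raise ValueError(f"WEATHER_LONGITUDE {lon} outside [-180, 180]")

validate_coordinates(52.0597, 1.1455)  # the Ipswich example above passes
```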

# Logging Configuration
LOG_LEVEL=INFO
# LOG_FILE=meshbot.log