# .env.example
# ============================================
# LaunchDarkly Configuration
# ============================================
# Get your SDK key from: https://app.launchdarkly.com/settings/projects
LD_SDK_KEY=your_launchdarkly_sdk_key_here
LD_API_KEY=your_launchdarkly_api_key_here
# ============================================
# Authentication Method Selection
# ============================================
# Choose authentication method: "sso" or "api-key"
# AUTH_METHOD=sso # Use Bedrock via AWS SSO
AUTH_METHOD=api-key # Default: backward compatible with direct API keys
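# With AUTH_METHOD=sso, authenticate before starting the app. A minimal
# sketch, assuming the standard AWS CLI v2 SSO flow (profile name is the
# placeholder from AWS_PROFILE below, not a real profile):
#   aws sso login --profile your-sso-profile-name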
# ============================================
# SSO Configuration (when AUTH_METHOD=sso)
# ============================================
AWS_REGION=us-east-1
AWS_PROFILE=your-sso-profile-name # Your AWS SSO profile name
# Bedrock Inference Profile Region (optional)
# Controls the region prefix for auto-corrected model IDs
# If not set, automatically derived from AWS_REGION (e.g., us-east-1 → us)
# Options: us, eu, ap, ca, sa, af, me
# BEDROCK_INFERENCE_REGION=us # Default: auto-detected from AWS_REGION
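# Illustrative example of the auto-correction (model ID is an assumption,
# shown only to demonstrate the prefixing): with BEDROCK_INFERENCE_REGION=us,
# a base model ID such as anthropic.claude-3-5-sonnet-20240620-v1:0 becomes
# the cross-region inference profile us.anthropic.claude-3-5-sonnet-20240620-v1:0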
# ============================================
# API Key Configuration (when AUTH_METHOD=api-key)
# ============================================
# Get your Anthropic API key from: https://console.anthropic.com/
ANTHROPIC_API_KEY=your_anthropic_api_key_here
# Get your OpenAI API key from: https://platform.openai.com/api-keys
OPENAI_API_KEY=your_openai_api_key_here
# Get your Mistral API key from: https://console.mistral.ai/
MISTRAL_API_KEY=your_mistral_api_key_here
# ============================================
# Provider-Specific Configuration
# ============================================
# Bedrock Embedding Configuration (when using Bedrock embeddings)
BEDROCK_EMBEDDING_DIMENSIONS=1024 # Options: 256, 512, 1024
BEDROCK_EMBEDDING_MODEL=amazon.titan-embed-text-v2:0
# Optional: MCP Tool Configuration
# These are used for advanced research capabilities (leave blank if not using MCP tools)
ARXIV_MCP_SERVER_PATH=/Users/your_username/.local/bin/arxiv-mcp-server
SEMANTIC_SCHOLAR_SERVER_PATH=/tmp/semantic_scholar_server.py
# Optional: Database Configuration (if using database MCP tools)
# DATABASE_URL=postgresql://user:password@localhost:5432/your_database
# Optional: GitHub Integration (if using GitHub MCP server)
# GITHUB_TOKEN=your_github_personal_access_token
# Optional: Slack Integration (if using Slack MCP server)
# SLACK_BOT_TOKEN=xoxb-your-slack-bot-token
# SLACK_SIGNING_SECRET=your_slack_signing_secret
# ============================================
# Application Configuration
# ============================================
# API Server Configuration
API_HOST=localhost
API_PORT=8000
# Streamlit UI Configuration
UI_HOST=localhost
UI_PORT=8501
# Vector Store Configuration
VECTOR_STORE_PATH=data/vector_store/
EMBEDDING_MODEL=amazon.titan-embed-text-v2:0 # Bedrock Titan V2
# EMBEDDING_MODEL=text-embedding-3-small # OpenAI (backward-compatible default)
CHUNK_SIZE=1000
CHUNK_OVERLAP=200
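# Example of how these two values interact: with CHUNK_SIZE=1000 and
# CHUNK_OVERLAP=200, consecutive chunks advance by 800 characters
# (chunk 1 covers characters 0-999, chunk 2 covers 800-1799, and so on)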
# Optional: Override embedding provider auto-detection
# EMBEDDING_PROVIDER=bedrock # Force Bedrock embeddings
# EMBEDDING_PROVIDER=openai # Force OpenAI embeddings (requires OPENAI_API_KEY)
# Development Mode
DEBUG=true
LOG_LEVEL=INFO