# OpenRouter API Reference

Quick reference for The Glass Box League LLM integration.
## Base URL

```
https://openrouter.ai/api/v1
```
## Authentication

```
Authorization: Bearer $OPENROUTER_API_KEY
```
## Chat Completions Endpoint

```
POST /chat/completions
```
### Request Body

```json
{
  "model": "google/gemini-2.0-flash-lite-preview-02-05:free",
  "messages": [
    {"role": "system", "content": "You are an Among Us player."},
    {"role": "user", "content": "What do you do?"}
  ],
  "temperature": 0.7,
  "max_tokens": 1024,
  "top_p": 0.9,
  "frequency_penalty": 0.0,
  "presence_penalty": 0.0,
  "stream": false,
  "response_format": {"type": "json_object"}
}
```
### Response

```json
{
  "id": "gen-...",
  "model": "google/gemini-2.0-flash-lite-preview-02-05:free",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "{\"action\": \"move\", \"target\": \"electrical\"}"
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 100,
    "completion_tokens": 50,
    "total_tokens": 150
  }
}
```
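Pulling the assistant's text and token usage out of a response shaped like the example above is a dictionary walk. A minimal sketch (`extract_reply` is a hypothetical helper name):

```python
import json


def extract_reply(response: dict):
    """Return (content, total_tokens) from a chat-completions
    response dict shaped like the example above."""
    content = response["choices"][0]["message"]["content"]
    usage = response.get("usage", {})
    return content, usage.get("total_tokens")


sample = {
    "choices": [
        {
            "index": 0,
            "message": {
                "role": "assistant",
                "content": "{\"action\": \"move\", \"target\": \"electrical\"}",
            },
            "finish_reason": "stop",
        }
    ],
    "usage": {"prompt_tokens": 100, "completion_tokens": 50, "total_tokens": 150},
}

content, total = extract_reply(sample)
print(json.loads(content)["action"])  # move
print(total)                          # 150
```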
## Key Parameters

| Parameter | Type | Description |
|---|---|---|
| `model` | string | Model ID (see `available_models.json`) |
| `messages` | array | Conversation history |
| `temperature` | float | Randomness (0.0-2.0) |
| `max_tokens` | int | Max response length |
| `top_p` | float | Nucleus sampling (0.0-1.0) |
| `stream` | bool | Enable streaming |
| `response_format` | object | Force JSON output |
| `seed` | int | For deterministic output |
## Structured Output (JSON Mode)

```json
{
  "response_format": {
    "type": "json_object"
  }
}
```
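Even with JSON mode requested, some models ignore `response_format` and wrap their JSON in a markdown code fence. A defensive parse is worth having; this is a best-effort sketch (`parse_json_reply` is a hypothetical helper, not part of the API):

```python
import json

FENCE = "`" * 3  # markdown code-fence marker


def parse_json_reply(text: str):
    """Best-effort JSON parse for model replies: drop any markdown
    fence lines (``` or ```json) before handing off to json.loads."""
    lines = [ln for ln in text.strip().splitlines()
             if not ln.strip().startswith(FENCE)]
    return json.loads("\n".join(lines))


print(parse_json_reply('{"action": "move"}')["action"])  # move
fenced = FENCE + "json\n" + '{"action": "vote"}' + "\n" + FENCE
print(parse_json_reply(fenced)["action"])                # vote
```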
## Headers

```
Content-Type: application/json
Authorization: Bearer $OPENROUTER_API_KEY
HTTP-Referer: https://your-app.com   (optional, for rankings)
X-Title: Glass Box League            (optional, for rankings)
```
## Python Example

```python
import os

import requests


def chat(messages, model="google/gemini-2.0-flash-lite-preview-02-05:free"):
    response = requests.post(
        "https://openrouter.ai/api/v1/chat/completions",
        headers={
            "Authorization": f"Bearer {os.getenv('OPENROUTER_API_KEY')}",
            "Content-Type": "application/json",
        },
        json={
            "model": model,
            "messages": messages,
            "temperature": 0.7,
            "response_format": {"type": "json_object"},
        },
        timeout=60,
    )
    response.raise_for_status()  # surface HTTP errors instead of KeyError below
    return response.json()["choices"][0]["message"]["content"]
```
## Free Models

See `available_models.json` for current free-tier models.
Run `python fetch_models.py` to refresh the list.
## Rate Limits

- Free tier: varies by model
- Check response headers for remaining quota
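Since the exact rate-limit header names vary by provider, a loose scan of the response headers is one way to surface quota information. A sketch (`rate_limit_info` is a hypothetical helper; pass it `response.headers` from the Python example above):

```python
def rate_limit_info(headers):
    """Return any headers that look rate-limit related, matching
    loosely because exact header names vary by provider."""
    return {k: v for k, v in headers.items()
            if "ratelimit" in k.lower().replace("-", "")}


# Example with a fake headers dict (real code would pass response.headers):
fake = {"X-RateLimit-Remaining": "9", "Content-Type": "application/json"}
print(rate_limit_info(fake))  # {'X-RateLimit-Remaining': '9'}
```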