Kode-cli/.env.example
Radon Co 3c9b0ec9d1 prompt(api): Add OpenAI Responses API support with SSE streaming
WHAT: Implement a Responses API adapter with full SSE streaming support so that Kode CLI can work with GPT-5 and other models that require the OpenAI Responses API format

WHY: GPT-5 and newer models use the OpenAI Responses API (distinct from Chat Completions), which streams results as SSE events. Kode CLI needed a conversion layer to translate between the Anthropic API format and the Responses API format for seamless model integration

HOW: Created a ResponsesAPIAdapter that converts Anthropic UnifiedRequestParams to the Responses API format (instructions, input array, max_output_tokens, stream=true) and added an SSE parser that collects streaming chunks and converts them back into a UnifiedResponse. Fixed ModelAdapterFactory to properly select the Responses API for GPT-5 models. Made parseResponse async across all adapters. Added production tests validating the end-to-end conversion with actual API calls (see the sketch below)
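
For orientation, here is a minimal TypeScript sketch of the two conversion steps described above: building a Responses API request from unified (Anthropic-style) params, and accumulating SSE deltas back into a single response. The interface shapes on the unified side, the default token limit, and the stop-reason handling are assumptions and simplifications; only the Responses API request fields (instructions, input, max_output_tokens, stream) and the ResponsesAPIAdapter / UnifiedRequestParams / UnifiedResponse names come from the commit message.

```typescript
// Hypothetical shapes; the real UnifiedRequestParams/UnifiedResponse in Kode may differ.
interface UnifiedRequestParams {
  model: string;
  systemPrompt?: string;
  messages: { role: 'user' | 'assistant'; content: string }[];
  maxTokens?: number;
}

interface UnifiedResponse {
  content: string;
  stopReason: string;
}

// Map unified params onto a Responses API request body:
// system prompt -> instructions, messages -> input array, maxTokens -> max_output_tokens.
function toResponsesAPIRequest(params: UnifiedRequestParams) {
  return {
    model: params.model,
    instructions: params.systemPrompt,
    input: params.messages.map(m => ({ role: m.role, content: m.content })),
    max_output_tokens: params.maxTokens ?? 4096, // default is an assumption
    stream: true,
  };
}

// Consume the SSE stream, accumulate output_text deltas, and return a UnifiedResponse.
async function collectResponsesStream(
  body: ReadableStream<Uint8Array>,
): Promise<UnifiedResponse> {
  const reader = body.getReader();
  const decoder = new TextDecoder();
  let buffer = '';
  let text = '';
  let status = 'incomplete';

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    // SSE events are separated by blank lines; each "data:" line carries a JSON payload.
    const events = buffer.split('\n\n');
    buffer = events.pop() ?? '';
    for (const event of events) {
      for (const line of event.split('\n')) {
        if (!line.startsWith('data:')) continue;
        const payload = line.slice(5).trim();
        if (payload === '[DONE]') continue;
        const chunk = JSON.parse(payload);
        if (chunk.type === 'response.output_text.delta') {
          text += chunk.delta;
        } else if (chunk.type === 'response.completed') {
          status = chunk.response?.status ?? 'completed';
        }
      }
    }
  }
  return { content: text, stopReason: status };
}
```

The adapter in the commit would do roughly this behind ModelAdapterFactory, which is presumably why parseResponse had to become async across all adapters: the full text is only known once the SSE stream has been consumed.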
2025-11-09 01:29:04 -08:00

# Environment Variables for Production API Tests
# Copy this file to .env and fill in your actual API keys

# Enable production test mode
PRODUCTION_TEST_MODE=true

# GPT-5 Codex Test Configuration
TEST_GPT5_API_KEY=your_gpt5_api_key_here
TEST_GPT5_BASE_URL=http://127.0.0.1:3000/openai

# MiniMax Codex Test Configuration
TEST_MINIMAX_API_KEY=your_minimax_api_key_here
TEST_MINIMAX_BASE_URL=https://api.minimaxi.com/v1

# WARNING:
# - Never commit .env files to version control!
# - The .env file is already in .gitignore
# - API keys should be kept secret and secure
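
As a usage illustration, the production tests might read these variables along the following lines. The loadProductionTestConfig helper, its return shape, and the early-return behavior are hypothetical, not taken from the repo; only the variable names and default URLs above are real.

```typescript
// Minimal sketch of loading the production-test configuration from the environment.
export interface ProductionTestConfig {
  gpt5: { apiKey: string; baseURL: string };
  minimax: { apiKey: string; baseURL: string };
}

export function loadProductionTestConfig(): ProductionTestConfig | null {
  // Skip production tests entirely unless the flag is explicitly enabled.
  if (process.env.PRODUCTION_TEST_MODE !== 'true') return null;

  const gpt5Key = process.env.TEST_GPT5_API_KEY;
  const minimaxKey = process.env.TEST_MINIMAX_API_KEY;
  if (!gpt5Key || !minimaxKey) return null;

  return {
    gpt5: {
      apiKey: gpt5Key,
      baseURL: process.env.TEST_GPT5_BASE_URL ?? 'http://127.0.0.1:3000/openai',
    },
    minimax: {
      apiKey: minimaxKey,
      baseURL: process.env.TEST_MINIMAX_BASE_URL ?? 'https://api.minimaxi.com/v1',
    },
  };
}
```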