WHAT: Fix critical CLI crash caused by a content.filter() error and implement OpenAI Responses API integration with comprehensive testing
WHY: The CLI was crashing with 'TypeError: undefined is not an object (evaluating "content.filter")' when using OpenAI models, preventing users from making API calls. It also needed proper Responses API support with reasoning tokens.
HOW:
• Fix content extraction from OpenAI response structure in legacy path
• Add JSON/Zod schema detection in responsesAPI adapter
• Create comprehensive test suite for both integration and production scenarios
• Document the new adapter architecture and usage
CRITICAL FIXES:
• claude.ts: Extract content from response.choices[0].message.content instead of the undefined response.content
• responsesAPI.ts: Detect whether the tool schema is already a JSON Schema (has a 'type' property) or a Zod schema before conversion; both fixes are sketched below
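Both fixes in a minimal TypeScript sketch, assuming simplified response and schema shapes; the helper names and the zod-to-json-schema converter are illustrative, not the actual Kode implementation:

```ts
import { type ZodTypeAny } from 'zod'
import { zodToJsonSchema } from 'zod-to-json-schema' // assumed converter dependency

// claude.ts fix: read the assistant text from choices[0].message.content and
// fall back to an empty string, so downstream code never sees undefined content.
function extractOpenAIContent(response: any): string {
  return response?.choices?.[0]?.message?.content ?? ''
}

// responsesAPI.ts fix: only run Zod conversion when the schema is not already
// a plain JSON Schema object, detected via its 'type' property (the heuristic
// described in the fix above).
function toToolParameters(schema: ZodTypeAny | Record<string, unknown>) {
  if (typeof schema === 'object' && schema !== null && 'type' in schema) {
    return schema // already JSON Schema, pass through untouched
  }
  return zodToJsonSchema(schema as ZodTypeAny)
}
```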
FILES:
• src/services/claude.ts - Critical bug fix for OpenAI response content extraction
• src/services/adapters/responsesAPI.ts - Robust schema detection for tool parameters
• src/test/integration-cli-flow.test.ts - Integration tests for full flow
• src/test/chat-completions-e2e.test.ts - End-to-end Chat Completions compatibility tests
• src/test/production-api-tests.test.ts - Production API tests with environment configuration
• docs/develop/modules/openai-adapters.md - New adapter system documentation
• docs/develop/README.md - Updated development documentation
WHAT: Add support for OpenAI Responses API in Kode CLI adapter
WHY: Enable GPT-5 and similar models that require Responses API instead of Chat Completions; fix HTTP 400 errors and schema conversion failures
HOW:
- Fixed tool format to use the flat structure matching the API spec
- Added missing critical parameters (include array, parallel_tool_calls, store, tool_choice)
- Implemented robust schema conversion handling both Zod and pre-built JSON schemas
- Added array-based content parsing for Anthropic compatibility
- Created comprehensive integration tests exercising the full claude.ts flow
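A sketch of the resulting request shape under these changes, assuming a simplified tool type; the include entry shown is one documented value, and the defaults here are illustrative rather than the adapter's actual settings:

```ts
// Flat tool format plus the added request parameters (illustrative values).
interface ResponsesTool {
  type: 'function'
  name: string
  description?: string
  parameters: Record<string, unknown> // JSON Schema, flat rather than nested under "function"
}

function buildResponsesRequest(model: string, input: unknown[], tools: ResponsesTool[]) {
  return {
    model,
    input,
    tools, // flat tool objects, matching the API spec
    tool_choice: 'auto' as const,
    parallel_tool_calls: true,
    store: false,
    include: ['reasoning.encrypted_content'], // example include entry
    stream: true,
  }
}
```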
AFFECTED FILES:
- src/services/adapters/responsesAPI.ts: Complete adapter implementation
- src/services/openai.ts: Simplified request handling
- src/test/integration-cli-flow.test.ts: New integration test suite
- src/test/responses-api-e2e.test.ts: Enhanced with production test capability
VERIFICATION:
- Integration tests pass: bun test src/test/integration-cli-flow.test.ts
- Production tests: PRODUCTION_TEST_MODE=true bun test src/test/responses-api-e2e.test.ts
WHAT: Implement a Responses API adapter with full SSE streaming support so the Kode CLI can work with GPT-5 and other models that require the OpenAI Responses API format
WHY: GPT-5 and newer models use the OpenAI Responses API (distinct from Chat Completions), which returns streaming SSE responses. The Kode CLI needed a conversion layer to translate between the Anthropic API format and the OpenAI Responses API format for seamless model integration
HOW: Created a ResponsesAPIAdapter that converts Anthropic UnifiedRequestParams to the Responses API format (instructions, input array, max_output_tokens, stream=true) and added an SSE parser that collects streaming chunks and converts them back to the UnifiedResponse format. Fixed ModelAdapterFactory to properly select the Responses API for GPT-5 models. Updated parseResponse to be async across all adapters. Added production tests validating end-to-end conversion with actual API calls
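A minimal sketch of the SSE collection step, assuming a fetch() Response body and the response.output_text.delta event name from the public streaming format; the real adapter rebuilds a full UnifiedResponse rather than a plain string:

```ts
// Collect streamed text deltas from a Responses API SSE stream (sketch only).
async function collectOutputText(res: Response): Promise<string> {
  const reader = res.body!.getReader()
  const decoder = new TextDecoder()
  let buffer = ''
  let text = ''
  while (true) {
    const { done, value } = await reader.read()
    if (done) break
    buffer += decoder.decode(value, { stream: true })
    const lines = buffer.split('\n')
    buffer = lines.pop() ?? '' // keep any partial line for the next chunk
    for (const line of lines) {
      if (!line.startsWith('data:')) continue
      const payload = line.slice(5).trim()
      if (!payload || payload === '[DONE]') continue
      const event = JSON.parse(payload)
      if (event.type === 'response.output_text.delta') {
        text += event.delta // accumulate text deltas in arrival order
      }
    }
  }
  return text
}
```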
- Implement model adapter factory for unified API handling (selection logic sketched after this list)
- Add response state manager for conversation continuity
- Support GPT-5 Responses API with continuation tokens
- Add model capabilities type system
- Include deployment guide and test infrastructure
- Enhance error handling and debugging for model interactions
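An illustrative sketch of the factory selection and the async parseResponse contract; the capability field, the ChatCompletionsAdapter name, and the gpt-5 prefix check are assumptions rather than the actual Kode code:

```ts
// Sketch: route GPT-5 style models to the Responses API adapter (assumed shapes).
interface ModelCapabilities {
  apiFormat: 'chat_completions' | 'responses' // assumed capability flag
}

interface ModelAdapter {
  parseResponse(raw: unknown): Promise<unknown> // async across all adapters
}

class ResponsesAPIAdapter implements ModelAdapter {
  constructor(readonly model: string) {}
  async parseResponse(raw: unknown) { return raw } // placeholder body
}

class ChatCompletionsAdapter implements ModelAdapter {
  constructor(readonly model: string) {}
  async parseResponse(raw: unknown) { return raw } // placeholder body
}

class ModelAdapterFactory {
  static create(model: string, caps: ModelCapabilities): ModelAdapter {
    // GPT-5 family models require the Responses API; everything else keeps
    // the Chat Completions path.
    if (caps.apiFormat === 'responses' || model.startsWith('gpt-5')) {
      return new ResponsesAPIAdapter(model)
    }
    return new ChatCompletionsAdapter(model)
  }
}
```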