Mokksy and AI-Mocks
Mokksy and AI-Mocks are mock HTTP and LLM (Large Language Model) servers inspired by WireMock, with support for response streaming and Server-Sent Events (SSE). They are designed for building, testing, and mocking LLM API responses (such as those of the OpenAI API) during development.
Project Overview
Mokksy and AI-Mocks form a suite of tools for mocking HTTP and LLM (Large Language Model) APIs during testing and development.
Components
Mokksy
- Core mock HTTP server inspired by WireMock, with support for response streaming and Server-Sent Events (SSE); serves as the foundation for the AI-Mocks servers
AI-Mocks
- Specialized mock server implementations built on top of Mokksy
- Currently supports:
  - OpenAI API (ai-mocks-openai)
  - Anthropic API (ai-mocks-anthropic)
- Allows developers to mock LLM API responses for testing and development, as shown in the sketch below
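As a rough illustration of what this DSL-based mocking might look like, the sketch below defines a mocked OpenAI chat completion with ai-mocks-openai. The package and names used here (MockOpenai, completion, responds, assistantContent, baseUrl) are assumptions inferred from the project's description of a fluent Kotlin DSL, not verified signatures; consult the module's own documentation for the exact API.

```kotlin
import me.kpavlov.aimocks.openai.MockOpenai

fun main() {
    // Sketch only: import, class, and DSL names are assumptions, not verified API.
    // Start a local mock of the OpenAI API.
    val openai = MockOpenai(verbose = true)

    // Describe the request to match and the canned response to return.
    openai.completion {
        model = "gpt-4o-mini"
        systemMessageContains("helpful assistant")
        userMessageContains("Hello")
    } responds {
        assistantContent = "Hello! How can I help you today?"
        finishReason = "stop"
    }

    // Point any OpenAI-compatible client at the mock instead of api.openai.com.
    println("Mock OpenAI base URL: ${openai.baseUrl()}")
}
```

In a test, the mock would typically be started once and the recorded requests verified with Kotest assertions, in line with the fluent API described under Key Features below.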
Key Features
- Streaming Support: True streaming of responses and Server-Sent Events (SSE), as shown in the sketch after this list
- Response Control: Flexibility to control server responses directly
- Delay Simulation: Support for simulating response delays and delays between chunks
- Modern API: Fluent Kotlin DSL API with Kotest Assertions
- Error Simulation: Ability to mock negative scenarios and error responses
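The streaming, delay, and error-simulation features above could be combined roughly as follows. This is a minimal sketch under assumed DSL names (respondsStream, responseChunks, delayBetweenChunks, respondsError, httpStatus); the real builder properties may differ.

```kotlin
import kotlin.time.Duration.Companion.milliseconds
import me.kpavlov.aimocks.openai.MockOpenai

fun main() {
    // Sketch only: DSL names below are assumptions, not verified API.
    val openai = MockOpenai(verbose = true)

    // Streaming Support + Delay Simulation: reply as an SSE stream,
    // chunk by chunk, with an initial delay and a delay between chunks.
    openai.completion {
        model = "gpt-4o-mini"
        userMessageContains("stream please")
    } respondsStream {
        responseChunks = listOf("All", " we", " need", " is", " Love")
        delay = 100.milliseconds             // delay before the first chunk
        delayBetweenChunks = 50.milliseconds // delay between SSE chunks
        finishReason = "stop"
    }

    // Error Simulation: return an error response for a matching request
    // so the client's failure handling can be exercised.
    openai.completion {
        userMessageContains("boom")
    } respondsError {
        httpStatus = 500
        body = """{"error": {"message": "simulated failure"}}"""
    }
}
```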