# Changelog
All notable changes to this project will be documented in this file.
The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.
## 0.1.0 - 2025-01-XX

### Added
- Initial release of ullm
- Support for OpenAI, Anthropic, Groq, and AWS Bedrock providers
- Chat completion API (`completion()` and `acompletion()`)
- OpenAI Responses API (`responses()` and `aresponses()`)
- Streaming support for all providers
- Tool calling/function calling support
- Structured output with Pydantic models
- Exponential backoff retry logic using tenacity
- Comprehensive test suite
- CI/CD with GitHub Actions
- Full API compatibility with litellm core features
- Examples and documentation
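The exponential-backoff retry behavior listed above can be illustrated with a minimal, self-contained sketch. This is the general pattern that tenacity automates, written in plain Python for clarity; the `flaky_call` function is a hypothetical stand-in for a transiently failing provider request, not part of ullm's API.

```python
import random
import time


def retry_with_backoff(func, max_attempts=5, base_delay=0.5, max_delay=8.0):
    """Call func, retrying on any exception with exponential backoff."""
    for attempt in range(max_attempts):
        try:
            return func()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts: propagate the last error
            # Delay doubles each attempt, capped at max_delay, with a
            # little jitter so many clients don't retry in lockstep.
            delay = min(base_delay * (2 ** attempt), max_delay)
            time.sleep(delay + random.uniform(0, 0.1))


# Example: a call that fails twice, then succeeds on the third attempt.
calls = {"n": 0}


def flaky_call():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"


print(retry_with_backoff(flaky_call, base_delay=0.01))  # -> ok
```

Libraries like tenacity add policy knobs on top of this core loop (which exceptions to retry, wait strategies, attempt limits), which is why ullm delegates to it rather than hand-rolling the logic.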
### Features
- 100x smaller memory footprint than litellm (~2MB vs ~200MB)
- 24x faster import time
- Modern Python tooling (uv, ruff, mypy)
- Type hints throughout
- Async/await support
- Minimal dependencies (httpx, pydantic, tenacity)