All notable changes to the HelpingAI Python SDK will be documented in this file.
- 📦 Package Structure: Fixed setuptools packaging configuration to properly include all subpackages (`HelpingAI.*`), resolving `ModuleNotFoundError` when importing `HAI` from the installed package
- 🛠️ Import Paths: Corrected import paths in `HelpingAI/__init__.py` to ensure all client submodules are accessible when importing the main `HelpingAI` package
- 🔄 Backward Compatibility: Verified that existing code importing `HAI` from `HelpingAI` continues to function correctly after refactoring
- 🧹 Code Cleanup: Removed redundant imports and streamlined `HelpingAI/__init__.py` for clarity and maintainability
- Updated documentation to reflect correct import paths and usage examples
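The packaging fix above amounts to telling setuptools to collect every `HelpingAI.*` subpackage. The snippet below is an illustrative sketch, not the SDK's actual `setup.py`: it builds a throwaway directory tree mirroring the layout and shows that `find_packages` with an `include` pattern picks up the client subpackage.

```python
# Sketch of the packaging fix: find_packages(include=["HelpingAI", "HelpingAI.*"])
# collects every subpackage, so client submodules ship with the wheel.
# The directory tree below is created locally just to demonstrate the call.
import pathlib
import tempfile

from setuptools import find_packages

root = pathlib.Path(tempfile.mkdtemp())
for pkg in ["HelpingAI", "HelpingAI/client"]:
    (root / pkg).mkdir(parents=True)
    (root / pkg / "__init__.py").touch()  # mark each directory as a package

found = find_packages(where=str(root), include=["HelpingAI", "HelpingAI.*"])
print(sorted(found))  # ['HelpingAI', 'HelpingAI.client']
```

Without the `HelpingAI.*` pattern (or with a bare `packages=["HelpingAI"]`), the `client` subpackage would be missing from the wheel, producing exactly the `ModuleNotFoundError` described above.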
- 🔄 Multi-Format API Support: Enhanced models API to support both HelpingAI format (array of strings) and v1/models schema format
- 🔧 Auto-Detection: Automatic detection of response format in the `list()` method for seamless compatibility
- 🔗 v1 Models Schema Support: New `from_v1_models_data()` method for explicit v1/models schema handling
- 📊 Custom Logging System: New `HelpingAI/logging.py` with structured logging, colored output, and environment configuration (may get replaced with LITPRINTER)
- 🎨 Enhanced Error Diagnostics: Advanced error parsing with helpful suggestions and context-aware recommendations
- 🧹 Major Refactor: Split `HelpingAI/client.py` into modular submodules: `base.py`, `completions.py`, `chat.py`, and `main.py` under `HelpingAI/client/` for improved maintainability and clarity
- 🔄 Backward Compatibility: The original `client.py` now re-exports all main client classes for a seamless transition
- 🧩 Internal Structure: All client logic and classes are now organized in dedicated files, reducing file size and improving code navigation
- 🔐 Enhanced Security: The models endpoint now requires authentication (`auth_required=True`)
- 🧹 Code Cleanup: Removed all hardcoded model names for fully dynamic, API-driven model management
- 📡 Dynamic Model Loading: Models are now fetched entirely from API responses instead of fallback hardcoded lists
- 🧠 Simplified Think Logic: Simplified the complex `hide_think` parameter logic, as it is now handled internally by HelpingAI's backend
- 🎯 Flexible Data Handling: `Model.from_api_data()` now accepts both string (HelpingAI format) and dict (v1/models schema) inputs
- 🔍 Better Error Reporting: Error messages now show the actual available models from API responses
- 🧩 Format Agnostic: Seamless handling of different API response structures without breaking changes
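The dual-format parsing described above can be sketched as follows. This is a minimal, self-contained illustration of the idea behind `Model.from_api_data()`, not the SDK's actual class; the field names (`id`, `owned_by`) are assumptions modeled on the v1/models schema.

```python
# Minimal sketch of format-agnostic model parsing: accept either a bare
# model-name string (HelpingAI format) or a v1/models-style dict.
from dataclasses import dataclass
from typing import Union


@dataclass
class Model:
    id: str
    owned_by: str = "HelpingAI"

    @classmethod
    def from_api_data(cls, data: Union[str, dict]) -> "Model":
        if isinstance(data, str):
            # HelpingAI format: the entry is just the model name
            return cls(id=data)
        if isinstance(data, dict):
            # v1/models schema: a dict with at least an "id" field
            return cls(id=data["id"], owned_by=data.get("owned_by", "HelpingAI"))
        raise TypeError(f"Unsupported model data type: {type(data)!r}")


print(Model.from_api_data("Dhanishtha-2.0-preview").id)
print(Model.from_api_data({"id": "Dhanishtha-2.0-preview", "owned_by": "HelpingAI"}).owned_by)
```

The same dispatch-on-type pattern is what lets `list()` auto-detect the response format without any caller-side changes.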
- 🛠️ Advanced Error Handling: Modular error parsing with helper methods for message extraction, model name detection, and streaming suggestions
- 📋 Smart Error Messages: Context-aware error enhancement with actionable suggestions based on status codes and request patterns
- 🌈 Colored Log Output: Custom logging system with color-coded levels, timestamps, and configurable output destinations
- ⚙️ Environment Configuration: Logging configuration via environment variables (`HAI_LOG_LEVEL`, `HAI_LOG_FILE`, `HAI_LOG_CONSOLE`)
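A possible way to set these variables before running an application; the variable names come from this changelog, but the accepted values shown here are assumptions.

```shell
# Hypothetical values -- adjust to taste before launching your app.
export HAI_LOG_LEVEL=DEBUG        # minimum level to emit
export HAI_LOG_FILE=/tmp/hai.log  # also write logs to this file
export HAI_LOG_CONSOLE=1          # keep colored console output enabled
```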
- 🔌 MCP Integration: Full Model Context Protocol (MCP) support for external tool connections
- 🖥️ Multiple Transport Types: Support for stdio, SSE, and streamable-http MCP servers
- 🔄 Automatic Tool Discovery: MCP tools automatically converted to the standard tool calling format
- 📁 Resource Support: Built-in `list_resources` and `read_resource` tools for MCP resources
- 🔀 Mixed Tools Support: Seamlessly combine MCP servers with standard tool definitions
- ⚡ Process Management: Automatic cleanup of MCP server processes on exit
- 🔁 Reconnection Logic: Handles server disconnections automatically
- 🛡️ Graceful Error Handling: Works without the MCP package installed, raising helpful error messages
- 📦 Optional MCP Dependency: Install with `pip install HelpingAI[mcp]` for MCP features
- New MCP integration documentation and examples
- 🛠️ Extended Tools Compatibility: Enhanced tools framework to support MCP server configurations
- 🌐 Popular MCP Servers: Ready support for mcp-server-time, mcp-server-fetch, mcp-server-filesystem, and more
- 🏗️ Backward Compatibility: Fully backward compatible with no breaking changes to existing functionality
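To make the mixed-tools idea concrete, here is a hypothetical configuration covering the three transports listed above. The key names (`command`, `args`, `url`, `transport`, `mcpServers`) are assumptions modeled on common MCP client conventions, not the SDK's documented API; consult the MCP integration docs for the real shape.

```python
# Hypothetical MCP server configuration for the three supported transports;
# all key names below are illustrative assumptions.
mcp_servers = {
    "time": {                      # stdio: spawn a local server process
        "command": "uvx",
        "args": ["mcp-server-time"],
    },
    "fetch": {                     # SSE: connect to an already-running server
        "url": "http://localhost:8000/sse",
        "transport": "sse",
    },
    "files": {                     # streamable-http transport
        "url": "http://localhost:8001/mcp",
        "transport": "streamable-http",
    },
}

# Mixed tools: MCP servers alongside a standard function-style tool definition.
tools = [
    {"mcpServers": mcp_servers},   # hypothetical wrapper key
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Look up current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    },
]
```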
- 🔧 Tool Calling Framework: New `@tools` decorator for effortless tool creation
- 🔄 Direct Tool Execution: New `.call()` method on the HAI client for executing tools without registry manipulation
- 🤖 Automatic Schema Generation: Type hint-based JSON schema creation with docstring parsing
- 📝 Smart Documentation: Multi-format docstring parsing (Google, Sphinx, NumPy styles)
- 🧠 Thread-Safe Tool Registry: Reliable tool management in multi-threaded environments
- 🔍 Tool Validation: Automatic parameter validation against JSON schema
- Extended Python Support: Now supports Python 3.7-3.14
- Streaming Support: Real-time response streaming
- Advanced Filtering: Hide reasoning blocks with the `hide_think` parameter
- New comprehensive Tool Calling Guide
- 🔄 Universal Compatibility: Seamless integration with existing standard tool definitions
- Updated Models: Support for latest models (Dhanishtha-2.0-preview, Dhanishtha-2.0-preview-mini)
- Improved Model Management: Better fallback handling and detailed model descriptions
- Simplified Tool Execution: Direct tool calling with `client.call(tool_name, arguments)` syntax
- Deprecated `get_tools_format()` in favor of `get_tools()`
- Updated documentation to reflect current model names and best practices
- 🛡️ Enhanced Tool Error Handling: Comprehensive exception types for tool operations
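The mechanism behind the `@tools` decorator and `client.call(...)` can be illustrated with a self-contained sketch: derive a JSON schema from type hints and the docstring, register the function, and execute it by name. This is a simplified re-implementation of the idea for illustration, not the SDK's actual code.

```python
# Minimal sketch of the @tools mechanism: type hints -> JSON schema,
# plus a registry enabling direct execution by name.
import inspect
from typing import Any, Callable, Dict

_REGISTRY: Dict[str, Callable[..., Any]] = {}
_PY_TO_JSON = {int: "integer", float: "number", str: "string", bool: "boolean"}


def tools(fn: Callable[..., Any]) -> Callable[..., Any]:
    params = inspect.signature(fn).parameters
    # Attach a standard function-style tool schema built from type hints.
    fn.schema = {
        "type": "function",
        "function": {
            "name": fn.__name__,
            "description": (fn.__doc__ or "").strip(),
            "parameters": {
                "type": "object",
                "properties": {
                    name: {"type": _PY_TO_JSON.get(p.annotation, "string")}
                    for name, p in params.items()
                },
                "required": [
                    name for name, p in params.items()
                    if p.default is inspect.Parameter.empty
                ],
            },
        },
    }
    _REGISTRY[fn.__name__] = fn
    return fn


def call(tool_name: str, arguments: Dict[str, Any]) -> Any:
    """Direct execution by name, analogous to client.call(tool_name, arguments)."""
    return _REGISTRY[tool_name](**arguments)


@tools
def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b


print(call("add", {"a": 2, "b": 3}))  # 5
print(add.schema["function"]["parameters"]["required"])  # ['a', 'b']
```

The real SDK additionally parses Google, Sphinx, and NumPy docstring styles for per-parameter descriptions and keeps the registry thread-safe, neither of which this sketch attempts.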
- Dhanishtha-2.0 Integration: World's first intermediate thinking model with multi-phase reasoning
- Dhanishtha Models: Advanced reasoning capabilities with transparent thinking processes
- Function-Calling Friendly Interface: Familiar API design
- Enhanced Error Handling: Comprehensive exception types
- Support for Dhanishtha-2.0-preview model
- Improved error handling for API requests
- Enhanced streaming capabilities
- Various bug fixes and performance improvements
- Initial support for tool calling
- Enhanced type hints for better IDE support
- Connection handling for unstable networks
- Token counting accuracy
- Initial public release
- Support for chat completions
- Basic streaming functionality
- Error handling framework
For more details, see the documentation or GitHub repository.