This repository was archived by the owner on Sep 3, 2025. It is now read-only.

Conversation

@mvilanova
Contributor

Requirements Changes

  • Added tiktoken package to requirements
  • Updated several package versions:
    • aiohappyeyeballs from 2.4.4 to 2.4.6
    • aiohttp from 3.11.11 to 3.11.12
    • boto3 from 1.36.13 to 1.36.19
    • botocore from 1.36.13 to 1.36.19
    • google-api-python-client from 2.160.0 to 2.161.0
    • numpy from 2.2.2 to 2.2.3
    • openai from 1.61.0 to 1.62.0

Functional Changes in src/dispatch/ai/service.py

  1. New Constants
  • Added a MAX_TOKENS = 128000 constant
  2. New Functions
  • Added num_tokens_from_string(message: str, model: str) -> tuple[list[int], int, tiktoken.Encoding]

    • Calculates the token count for a given string using the specified model
    • Returns the tokenized message, token count, and encoding object
  • Added truncate_prompt(tokenized_prompt: list[int], num_tokens: int, encoding: tiktoken.Encoding) -> str

    • Truncates prompts that exceed the maximum token limit
    • Returns the truncated prompt as a string
  3. Modified Functions
  • Updated generate_case_signal_summary

    • Added token counting and truncation logic
    • Separated prompt construction from the API call
  • Updated generate_incident_summary

    • Added token counting and truncation logic before making API calls

Key Improvements

  1. Token Management: Added functionality to handle large prompts by checking their token count and truncating them if they exceed the limit
  2. Better Error Handling: Improved logging for tokenization issues
  3. Code Organization: Separated prompt construction from API calls for better maintainability

The changes primarily focus on managing token limits in AI prompts and updating dependencies to their latest versions.

@mvilanova mvilanova added the enhancement New feature or request label Feb 13, 2025
@mvilanova mvilanova merged commit 6e22f32 into main Feb 14, 2025
9 checks passed
@mvilanova mvilanova deleted the feature/llm-context-length-limit branch February 14, 2025 19:10
