This repository was archived by the owner on Jun 5, 2025. It is now read-only.

Bump litellm from 1.63.0 to 1.63.3#1255

Closed
dependabot[bot] wants to merge 1 commit into main from dependabot/pip/litellm-1.63.3

Conversation


@dependabot dependabot bot commented on behalf of github Mar 10, 2025

Bumps litellm from 1.63.0 to 1.63.3.
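In practice, a patch-level bump like this one amounts to a one-line change in the project's pinned pip requirements (illustrative sketch; the actual requirements file path in this repository is not shown in the PR):

```diff
-litellm==1.63.0
+litellm==1.63.3
```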

Release notes

Sourced from litellm's releases.

v1.63.2-stable

Full Changelog: BerriAI/litellm@v1.61.20-stable...v1.63.2-stable

  1. New Models / Updated Models

    1. Add supports_pdf_input: true for specific Bedrock Claude models 
  2. LLM Translation

    1. Support /openai/ passthrough for Assistant endpoints
    2. Bedrock Claude - fix amazon anthropic claude 3 tool calling transformation on invoke route
    3. Bedrock Claude - response_format support for claude on invoke route
    4. Bedrock - pass description if set in response_format
    5. Bedrock - Fix passing response_format: {"type": "text"}
    6. OpenAI - Handle sending image_url as str to openai
    7. Deepseek - Fix deepseek 'reasoning_content' error
    8. Caching - Support caching on reasoning content
    9. Bedrock - handle thinking blocks in assistant message
    10. Anthropic - Return signature on anthropic streaming + migrate to signature field instead of signature_delta
    11. Support format param for specifying image type
    12. Anthropic - /v1/messages endpoint - thinking param support: note: this refactors the [BETA] unified /v1/messages endpoint, to just work for the Anthropic API.
    13. Vertex AI - handle $id in response schema when calling vertex ai
  3. Spend Tracking Improvements

    1. Batches API - Fix cost calculation to run on retrieve_batch
    2. Batches API - Log batch models in spend logs / standard logging payload
  4. Management Endpoints / UI

    1. Allow team/org filters to be searchable on the Create Key Page
    2. Add created_by and updated_by fields to Keys table
    3. Show 'user_email' on key table on UI
    4. (Feat) - Show Error Logs on LiteLLM UI
    5. UI - Allow admin to control default model access for internal users
    6. (UI) - Allow Internal Users to View their own logs
    7. (UI) Fix session handling with cookies
    8. Keys Page - Show 100 Keys Per Page, Use full height, increase width of key alias
  5. Logging / Guardrail Integrations

    1. Fix prometheus metrics w/ custom metrics
  6. Performance / Loadbalancing / Reliability improvements

    1. Cooldowns - Support cooldowns on models called with client side credentials
    2. Tag-based Routing - ensures tag-based routing across all endpoints (/embeddings, /image_generation, etc.)
  7. General Proxy Improvements

    1. Raise BadRequestError when unknown model passed in request
    2. Enforce model access restrictions on Azure OpenAI proxy route
    3. Reliability fix - Handle emojis in text - fix orjson error
    4. Model Access Patch - don't overwrite litellm.anthropic_models when running auth checks
    5. Enable setting timezone information in docker image

... (truncated)

Commits

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Bumps [litellm](https://github.com/BerriAI/litellm) from 1.63.0 to 1.63.3.
- [Release notes](https://github.com/BerriAI/litellm/releases)
- [Commits](https://github.com/BerriAI/litellm/commits)

---
updated-dependencies:
- dependency-name: litellm
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
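The `update-type: version-update:semver-patch` field above classifies the bump by which semantic-version component changed (1.63.0 → 1.63.3 only touches the patch component). A minimal sketch of that classification, assuming plain `MAJOR.MINOR.PATCH` version strings (illustrative only, not Dependabot's actual implementation):

```python
def classify_bump(old: str, new: str) -> str:
    """Classify a semver bump by the first (most significant) component that changed."""
    old_parts = [int(x) for x in old.split(".")]
    new_parts = [int(x) for x in new.split(".")]
    labels = ("version-update:semver-major",
              "version-update:semver-minor",
              "version-update:semver-patch")
    for label, a, b in zip(labels, old_parts, new_parts):
        if a != b:
            return label
    return "none"

print(classify_bump("1.63.0", "1.63.3"))  # version-update:semver-patch
```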
@dependabot dependabot bot added the Dependencies and python labels Mar 10, 2025

dependabot bot commented on behalf of github Mar 11, 2025

Superseded by #1262.

@dependabot dependabot bot closed this Mar 11, 2025
@dependabot dependabot bot deleted the dependabot/pip/litellm-1.63.3 branch March 11, 2025 04:20