feat(turn-detector): drop transformers dependency#5657

Open
chenghao-mou wants to merge 1 commit into main from chenghao/feat/drop-transformers-dependency

Conversation

@chenghao-mou
Member

@chenghao-mou chenghao-mou commented May 6, 2026

drop the transformers dependency in favor of a lightweight hub + tokenizers combo.

benchmarked with the script on 500 files:

| Op (mean) | en old | en new | intl old | intl new |
| --- | --- | --- | --- | --- |
| render | 43 μs | 11 μs (~3.9x) | 56 μs | 17 μs (~3.3x) |
| encode | 194 μs | 85 μs (~2.3x) | 260 μs | 129 μs (~2.0x) |
| cold-load | 1592 ms | 361 ms (~4.4x) | 896 ms | 435 ms (~2.1x) |
  • en download files still works
  • multi download files still works
  • en inference still works
  • multi inference still works
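For reviewers, the swap described above can be sketched roughly like this, assuming the model repo ships a standard serialized `tokenizer.json` (the repo id, filename, and function name below are illustrative, not taken from this PR's diff):

```python
from tokenizers import Tokenizer  # Rust-backed fast tokenizer, no torch


def load_turn_detector_tokenizer(repo_id: str) -> Tokenizer:
    """Stand-in for transformers.AutoTokenizer.from_pretrained: download only
    the serialized fast-tokenizer file and load it with the tokenizers lib."""
    from huggingface_hub import hf_hub_download  # lightweight Hub client

    path = hf_hub_download(repo_id=repo_id, filename="tokenizer.json")
    return Tokenizer.from_file(path)


if __name__ == "__main__":
    # Offline demonstration that `tokenizers` alone covers encode: build a
    # tiny word-level tokenizer in memory (this vocab is made up).
    from tokenizers.models import WordLevel
    from tokenizers.pre_tokenizers import Whitespace

    tok = Tokenizer(WordLevel({"hello": 0, "world": 1, "[UNK]": 2},
                              unk_token="[UNK]"))
    tok.pre_tokenizer = Whitespace()
    print(tok.encode("hello world").ids)  # [0, 1]
```

Skipping the `transformers` import entirely is what drives the cold-load win in the table above: only the hub download and the Rust binding are paid for.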

drop the transformers dependency in favor of a light hub + tokenizers combo.
@chenghao-mou chenghao-mou requested a review from a team May 6, 2026 09:23
Contributor

@devin-ai-integration (bot) left a comment


✅ Devin Review: No Issues Found

Devin Review analyzed this PR and found no potential bugs to report.

View in Devin Review to see 4 additional findings.

