chore: Add basic statistics to demo apps #985

Open
msluszniak wants to merge 7 commits into main from @ms/add-basic-statictics-to-demo-apps

Conversation


@msluszniak msluszniak commented Mar 19, 2026

Description

Added inference times and, for some apps, TTFT and tokens/second.
Fixed T2S streaming for unfinished sentences.
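The TTFT and tokens/second statistics mentioned above can be derived from per-token timestamps collected during generation. A minimal sketch, assuming a hypothetical `computeTokenStats` helper (the names here are illustrative, not this PR's actual API):

```typescript
// Illustrative helper: derive TTFT and tokens/second from the generation
// start time and the arrival timestamp of each token (all in milliseconds).
interface TokenStats {
  ttftMs: number;         // time to first token
  tokensPerSecond: number; // generation throughput
}

function computeTokenStats(startMs: number, tokenTimestampsMs: number[]): TokenStats {
  if (tokenTimestampsMs.length === 0) {
    throw new Error("no tokens generated");
  }
  const ttftMs = tokenTimestampsMs[0] - startMs;
  const totalMs = tokenTimestampsMs[tokenTimestampsMs.length - 1] - startMs;
  // Guard against division by zero for single-token outputs.
  const tokensPerSecond =
    totalMs > 0 ? (tokenTimestampsMs.length / totalMs) * 1000 : 0;
  return { ttftMs, tokensPerSecond };
}

// Example: generation starts at t = 0 ms; tokens arrive at 150, 200, 250, 300 ms.
const stats = computeTokenStats(0, [150, 200, 250, 300]);
// stats.ttftMs === 150; stats.tokensPerSecond === (4 / 300) * 1000 ≈ 13.33
```

TTFT is just the gap before the first token, while tokens/second is measured over the whole generation window, so a slow first token lowers both numbers.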

Introduces a breaking change?

  • Yes
  • No

Type of change

  • Bug fix (change which fixes an issue)
  • New feature (change which adds functionality)
  • Documentation update (improves or adds clarity to existing documentation)
  • Other (chores, tests, code style improvements etc.)

Tested on

  • iOS
  • Android

Testing instructions

  • Run all apps and check that basic statistics are displayed correctly
  • Check that text without an end-of-sentence mark is processed correctly in the text-to-speech example

Screenshots

Related issues

Closes #959

Checklist

  • I have performed a self-review of my code
  • I have commented my code, particularly in hard-to-understand areas
  • I have updated the documentation accordingly
  • My changes generate no new warnings

Additional notes

@msluszniak msluszniak self-assigned this Mar 19, 2026
@msluszniak msluszniak added chore PRs that are chores demo app labels Mar 19, 2026
@msluszniak msluszniak force-pushed the @ms/add-basic-statictics-to-demo-apps branch from 1815fd9 to cbabf51 on March 20, 2026 13:08

barhanc commented Mar 20, 2026

Instance segmentation app is missing the inference timing.


msluszniak commented Mar 20, 2026

Instance segmentation app is missing the inference timing.

And model listing as well. It's because instance segmentation was merged after the PR with demo app fixes. I'll add these to instance segmentation. Good catch.

DONE


barhanc commented Mar 20, 2026

Additionally, there are no inference timings for the other speech models, but I understand it would be difficult to measure the inference time properly there, as they operate in streaming mode. Maybe @IgorSwat has some idea how this could be done.

@msluszniak

@barhanc about streaming tasks, yeah, I eventually decided that I don't have a good idea how to present these times


IgorSwat commented Mar 20, 2026

Additionally, there are no inference timings for the other speech models, but I understand it would be difficult to measure the inference time properly there, as they operate in streaming mode. Maybe @IgorSwat has some idea how this could be done.

We could measure the average time of executing transcribe() inside the streaming loop - the question is whether we want to make a breaking change just for that.

Same goes for Text to Speech - measuring its inference time is possible, but it would require digging into the TypeScript API at the very least.
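The averaging idea above could be sketched roughly as follows. This is a stand-in, assuming a hypothetical `transcribe()` signature and chunk type rather than the library's actual streaming API:

```typescript
// Sketch of timing each transcribe() call inside a streaming loop and
// reporting the running average. transcribe() and AudioChunk are
// placeholders, not the real API of the library under discussion.
type AudioChunk = Float32Array;

async function transcribe(chunk: AudioChunk): Promise<string> {
  // Placeholder for the real model call.
  return `${chunk.length} samples`;
}

async function streamWithTiming(chunks: AudioChunk[]): Promise<number> {
  let totalMs = 0;
  let calls = 0;
  for (const chunk of chunks) {
    const t0 = Date.now();
    await transcribe(chunk);
    totalMs += Date.now() - t0;
    calls += 1;
  }
  // Average per-call inference time in ms; 0 if nothing was transcribed.
  return calls > 0 ? totalMs / calls : 0;
}
```

Note this measures only the model call itself, not audio capture or chunking overhead, which is why it avoids the streaming-mode ambiguity raised above; whether exposing it justifies a breaking API change is a separate question.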

@IgorSwat IgorSwat added the bug fix PRs that are fixing bugs label Mar 20, 2026
@msluszniak msluszniak requested a review from barhanc March 21, 2026 12:22