LLM tests restructuring #1440

Merged: 11 commits from llm-tests into openvinotoolkit:master on Dec 27, 2024

Conversation

ilya-lavrenov (Contributor) commented Dec 26, 2024

  • Merged the chat scenario tests into test_llm_pipeline.py
  • Created a dedicated test_continuous_batching.py with CB-specific tests, in addition to test_llm_pipeline.py, which covers basic LLM pipeline functionality (see the sketch after this description)

CVS-159921
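To make the new layout concrete, here is a minimal sketch of how the two files might be organized, assuming pytest and the openvino_genai Python API; the test names, test bodies, and the model_path fixture are illustrative placeholders, not the actual contents of this PR.

```python
# test_llm_pipeline.py: basic LLM pipeline functionality, now also hosting the chat scenario tests
# (illustrative sketch only; model_path is a hypothetical fixture)
import openvino_genai as ov_genai

def test_chat_scenario(model_path):
    pipe = ov_genai.LLMPipeline(model_path, "CPU")
    pipe.start_chat()
    answer = pipe.generate("What is OpenVINO?", max_new_tokens=20)
    pipe.finish_chat()
    assert len(answer) > 0


# test_continuous_batching.py: CB-specific tests kept separate from the basic pipeline tests
def test_cb_pipeline(model_path):
    scheduler_config = ov_genai.SchedulerConfig()
    pipe = ov_genai.ContinuousBatchingPipeline(model_path, scheduler_config, "CPU")
    results = pipe.generate(["What is OpenVINO?"], [ov_genai.GenerationConfig(max_new_tokens=20)])
    assert len(results) == 1
```

The intent of the split is that test_llm_pipeline.py stays focused on generic pipeline behaviour (including chat), while scheduler- and batching-specific checks live in test_continuous_batching.py.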

github-actions bot added the labels category: continuous batching (Continuous batching), category: LLM (LLM pipeline: stateful, static), category: GHA (CI based on Github actions), and no-match-files on Dec 26, 2024
ilya-lavrenov added this to the 2025.0 milestone on Dec 26, 2024
ilya-lavrenov self-assigned this on Dec 26, 2024
ilya-lavrenov changed the title from "Llm tests" to "LLM tests restructuring" on Dec 26, 2024
```python
mean_gen_duration, std_gen_duration = perf_metrics.get_generate_duration()
assert (mean_gen_duration, std_gen_duration) == (perf_metrics.get_generate_duration().mean, perf_metrics.get_generate_duration().std)
assert mean_gen_duration > 0 and load_time + mean_gen_duration < total_time
# TODO: looks like total_time does not actually count load_time, as the model is read via read_model from cache
# assert mean_gen_duration > 0 and load_time + mean_gen_duration < total_time
```
ilya-lavrenov (Contributor, Author) commented on the snippet above:

@pavel-esir please have a look
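For context on the snippet under discussion, here is a hedged sketch of how those perf-metrics values are typically obtained with the openvino_genai Python API; the helper name, prompt, and models_path argument are placeholders and not part of the PR.

```python
import time
import openvino_genai as ov_genai

def check_generate_durations(models_path):  # hypothetical helper, not from the PR
    start_time = time.perf_counter()
    pipe = ov_genai.LLMPipeline(models_path, "CPU")
    result = pipe.generate(["Why is the Sun yellow?"], max_new_tokens=20)
    total_time = (time.perf_counter() - start_time) * 1000  # wall-clock ms around load + generate

    perf_metrics = result.perf_metrics
    load_time = perf_metrics.get_load_time()             # reported in ms
    gen_duration = perf_metrics.get_generate_duration()  # MeanStdPair with .mean / .std, in ms

    assert gen_duration.mean > 0
    # As the TODO above notes, checking load_time + gen_duration.mean < total_time can fail
    # when the model is served from cache, which is why that assertion was commented out.
```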

ilya-lavrenov merged commit 82b44fa into openvinotoolkit:master on Dec 27, 2024
59 checks passed
ilya-lavrenov deleted the llm-tests branch on December 27, 2024 at 03:48