In OpenVINO GenAI 2024.5.0, even when the `do_sample` parameter of `LLMPipeline.generate()` or `GenerationConfig` is `True`, the generated text is identical run-to-run. The 2024.4.0 behavior matches my expectation.
Test model: converted TinyLlama, obtained with the command line below.
Reproducer Python script (generates text 3 times):

```python
import openvino_genai as ov_genai

pipe = ov_genai.LLMPipeline("TinyLlama-1.1B-Chat-v1.0-int8-ov", "CPU")

if __name__ == "__main__":
    print(ov_genai.__version__)
    prompt = "The Sun is yellow because"
    print(f"prompt: {prompt}")
    for i in range(3):
        print(f"--- response {i:02} ---")
        print(pipe.generate(prompt, do_sample=True, max_new_tokens=32))
```
Outputs with OpenVINO GenAI 2024.5.0 (openvino-genai==2024.5.0)
```
2024.5.0.0
prompt: The Sun is yellow because
--- response 00 ---
of its orange hue.
YELLOW2 A miniature tulip, made from clay.
YELLOW3 My Red
--- response 01 ---
of its orange hue.
YELLOW2 A miniature tulip, made from clay.
YELLOW3 My Red
--- response 02 ---
of its orange hue.
YELLOW2 A miniature tulip, made from clay.
YELLOW3 My Red
```
Outputs with OpenVINO GenAI 2024.4.0 (openvino-genai==2024.4.0)
```
2024.4.0.0
prompt: The Sun is yellow because
--- response 00 ---
the concentration of magnesium ion in it is high than others. This results in the excess chloride ion in the salt pile could leach mag
--- response 01 ---
its spectrum (Visible lights): Sunlight falls on the surface of the earth, turning it orange. AcdeThe Sun is orange because sunlight caused suf
--- response 02 ---
it is hotter than most magista using this teenager. To stick to synechocystis. Anaerobic conditions cannot be called
```
Note

I may be mistaken, but the root of the issue may be here: `openvino.genai/src/cpp/src/llm_pipeline.cpp`, lines 258 to 287 at ee91fcf.

In this code, `Sampler` (L284) is created as a local instance on each `generate()` call, and each instance owns its own RNG (random number generator). However, C++ RNG objects produce the same sequence when no seed is given.
Thank you for reading and regards,