
[Good First Issue]: Verify baichuan2-7b-chat with GenAI text_generation #273

Open
p-wysocki opened this issue Mar 1, 2024 · 12 comments
Labels: good first issue (Good for newcomers)

Comments

@p-wysocki
Collaborator

Context

This task covers enabling tests for baichuan2-7b-chat. You can find more details in the openvino_notebooks LLM chatbot README.md.

Please ask general questions in the main issue at #259

What needs to be done?

Described in the main Discussion issue at: #259

Example Pull Requests

Described in the main Discussion issue at: #259

Resources

Contact points

Described in the main Discussion issue at: #259

Ticket

No response

@mengbingrock
Contributor

Hi OpenVINO developers,
I'm interested in GSoC this summer. Could I take this task, please? It is a good chance to get familiar with the codebase and development workflow.

Thanks

@p-wysocki
Collaborator Author

Hello @mengbingrock! Thanks for taking a look, I assigned you. Please let us know if you have any questions. :)

@p-wysocki p-wysocki moved this from Contributors Needed to Assigned in Good first issues Mar 7, 2024
@mengbingrock
Contributor

Hello developers,
I ran into a problem during export:

[ WARNING ] Cannot apply model.to_bettertransformer because of the exception:
The model type baichuan is not yet supported to be used with BetterTransformer. Feel free to open an issue at https://github.com/huggingface/optimum/issues if you would like this model type to be supported. Currently supported models are: dict_keys(['albert', 'bark', 'bart', 'bert', 'bert-generation', 'blenderbot', 'bloom', 'camembert', 'blip-2', 'clip', 'codegen', 'data2vec-text', 'deit', 'distilbert', 'electra', 'ernie', 'fsmt', 'gpt2', 'gptj', 'gpt_neo', 'gpt_neox', 'hubert', 'layoutlm', 'm2m_100', 'marian', 'markuplm', 'mbart', 'opt', 'pegasus', 'rembert', 'prophetnet', 'roberta', 'roc_bert', 'roformer', 'splinter', 'tapas', 't5', 'vilt', 'vit', 'vit_mae', 'vit_msn', 'wav2vec2', 'xlm-roberta', 'yolos', 'stablelm_epoch', 'aquila', 'codegen2']).. Usage model with stateful=True may be non-effective if model does not contain torch.functional.scaled_dot_product_attention
Overriding 1 configuration item(s)
        - use_cache -> True
./build/greedy_causal_lm ./Baichuan2-7B-Chat/pytorch/dldt/FP16/ "Why is the Sun yellow?"
Exception from src/inference/src/infer_request.cpp:196:
Check '::getPort(port, name, {_impl->get_inputs(), _impl->get_outputs()})' failed at src/inference/src/infer_request.cpp:198:
Port for tensor name position_ids was not found.

position_ids is indeed not present in openvino_model.xml.
Do I need to look into optimum.intel for a solution?

Thanks

openvino_model.xml.txt
openvino_tokenizer.xml.txt
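Note for anyone reproducing this: the exception above originates from the sample looking up an input tensor named position_ids on the compiled model, which can only succeed if the exported IR declares that input. A rough Python equivalent of that step (model path, shapes, and values are purely illustrative):

```python
import numpy as np
import openvino as ov

core = ov.Core()
compiled = core.compile_model("Baichuan2-7B-Chat/pytorch/dldt/FP16/openvino_model.xml", "CPU")
request = compiled.create_infer_request()

prompt_len = 8  # stand-in for the tokenized prompt length
request.set_tensor("input_ids", ov.Tensor(np.zeros((1, prompt_len), dtype=np.int64)))
request.set_tensor("attention_mask", ov.Tensor(np.ones((1, prompt_len), dtype=np.int64)))
# This lookup is what fails with "Port for tensor name position_ids was not found"
# when the exported model has no such input.
request.set_tensor("position_ids", ov.Tensor(np.arange(prompt_len, dtype=np.int64).reshape(1, prompt_len)))
```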

@p-wysocki
Collaborator Author

@pavel-esir

@pavel-esir
Contributor

position_ids is indeed not present in openvino_model.xml. Do I need to look into optimum.intel for a solution?

Hi @mengbingrock, thanks for your analysis! Yes, looking into optimum.intel should help find out why position_ids is not present in the IR. You can put a breakpoint or a print right before the forward method to see which arguments are fed into the network inputs.

I will also take a look at what inputs are fed to forward in the original HF repo, but a bit later.
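A lightweight way to double-check the exported inputs without a debugger is to read the IR with the OpenVINO Python API and list them (a minimal sketch; the path is a placeholder):

```python
import openvino as ov

core = ov.Core()
model = core.read_model("Baichuan2-7B-Chat/pytorch/dldt/FP16/openvino_model.xml")

# Every input the graph expects; position_ids should appear here once the
# export is fixed.
for inp in model.inputs:
    print(inp.get_any_name(), inp.get_partial_shape())
```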

@mengbingrock
Contributor

Thank you for your reply, @pavel-esir
Before forwarding, greedy_causal_lm.cpp (line 78) checks whether an input named position_ids is present, and it fails to find it.

I think the IR is responsible for this error. It is generated from convert.py, in
def convert_optimum_causallm_base(model, args, model_config=None, compress_only=False):
which calls optimum.intel. This is a custom model and it needs special configuration during export. I noticed there is existing work that exported it to ONNX with the correct input names; I'm looking at that conversion code to understand it.

ref:
https://github.com/wangzhaode/llm-export/releases/tag/baichuan2-7b-chat-onnx

@mengbingrock
Contributor

Hi @pavel-esir,
I ran this command directly, and it produced a similar XML to last time.

optimum-cli export openvino --trust-remote-code --model ~/.cache/huggingface/hub/models--baichuan-inc--Baichuan2-7B-Chat/snapshots/ea66ced17780ca3db39bc9f8aa601d8463db3da5 --task text-generation-with-past bcaichuan

openvino_model_export.xml.txt

No position_ids is present; the parameter is missing from the IR.

Where should I look next? Thank you for your guidance.
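As a side note, the same export can also be driven from Python instead of optimum-cli, which makes it easier to set a breakpoint inside optimum.intel (a sketch, assuming optimum-intel with OpenVINO support is installed; the model id and output directory are placeholders):

```python
from optimum.intel import OVModelForCausalLM

# export=True runs essentially the same conversion path as
# `optimum-cli export openvino`, so breakpoints inside
# optimum/exporters/openvino can be hit from here.
model = OVModelForCausalLM.from_pretrained(
    "baichuan-inc/Baichuan2-7B-Chat",
    export=True,
    trust_remote_code=True,
)
model.save_pretrained("baichuan2-7b-chat-ov")
```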

@pavel-esir
Contributor

@mengbingrock thanks for the update. I'm currently debugging the conversion in optimum to see why position_ids disappeared.

@pavel-esir
Contributor

pavel-esir commented Mar 25, 2024

@mengbingrock I finally managed to get an IR with position_ids: openvino_model.xml.txt.

In order to do so, here https://github.com/huggingface/optimum-intel/blob/main/optimum/exporters/openvino/model_configs.py#L77
TextDecoderOnnxConfig should be changed to TextDecoderWithPositionIdsOnnxConfig.

We will open a PR for that soon, but in the meantime, as a workaround, you can modify the file locally:
venv_path/site-packages/optimum/exporters/openvino/model_configs.py
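For readers following along, a rough sketch of the kind of change described above (the class name here is hypothetical; the exact class, decorators, and surrounding code live in optimum-intel's model_configs.py):

```python
from optimum.exporters.onnx.config import TextDecoderWithPositionIdsOnnxConfig


# Hypothetical illustration: the baichuan export config subclasses
# TextDecoderWithPositionIdsOnnxConfig instead of TextDecoderOnnxConfig,
# so position_ids is declared as a model input in the generated IR.
class BaichuanOpenVINOConfig(TextDecoderWithPositionIdsOnnxConfig):
    # The rest of the config (normalized config, dummy input generators,
    # tasks-manager registration) stays as in model_configs.py.
    pass
```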

@mengbingrock
Contributor

I really appreciate your work on this, @pavel-esir! Next time I'll try to find the root cause myself to save your valuable time.
I've drafted the PR, but it will come after your commit to optimum.

@p-wysocki p-wysocki moved this from Assigned to In Review in Good first issues Apr 3, 2024
@mlukasze mlukasze moved this from In Review to Contributors Needed in Good first issues Sep 18, 2024
@aryan0931

.take


github-actions bot commented Dec 4, 2024

Thank you for looking into this issue! Please let us know if you have any questions or require any help.

@p-wysocki p-wysocki moved this from Contributors Needed to Assigned in Good first issues Dec 5, 2024