
"Need to turn the model to a MoE first" error #2

Open · Harry-zzh opened this issue May 10, 2022 · 5 comments

@Harry-zzh

I just removed the --do_train and --do_eval lines in bert_base_mnli_example.sh and added a --do_predict line. But when I run it, the "Need to turn the model to a MoE first" error occurs. I wonder why this happens; thanks a lot.
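
For reference, the edit is roughly this (a sketch; every other argument in the script stays exactly as in the repo's example):

    # in bert_base_mnli_example.sh -- removed:
    --do_train \
    --do_eval \
    # added instead:
    --do_predict \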

@Harry-zzh (Author)

Also, the model being loaded is the already-trained one.

@shadymcy

When I use bert_base_mnli_example.sh, add a --preprocess_importance argument, and remove the --do_train argument to compute the importance scores, I get "FileNotFoundError: [Errno 2] No such file or directory: 'importance_files/importance_mnli.pkl'". Where can I get that file? Thanks a lot!
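
My own guesses so far (neither verified): the pickle looks like the output of this preprocessing run rather than an input, so if the script still passes its path somewhere (e.g. a --moebert_load_importance line, if the script has one), that line may need to be dropped for this step. Also, Python raises exactly this error when it writes a pickle into a directory that does not exist yet, so it may be enough to create the directory first:

    # assumption: the preprocessing run writes importance_mnli.pkl itself
    # and only needs the target directory to exist
    mkdir -p importance_files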

@CaffreyR

Hi @shadymcy, have you solved this problem? I have encountered the same one! Many thanks!

@shadymcy

@CaffreyR I have not... sorry.

@wintersurvival commented Dec 8, 2022

Commenting out the importance_processor setup and its is_moe check in transformers/models/bert/modeling_bert_moe.py works:

    #self.importance_processor = ImportanceProcessor(config, layer_idx, config.moebert_expert_num, 0)

    #if not self.importance_processor.is_moe:
    #    raise RuntimeError("Need to turn the model to a MoE first.")
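    # note: this skips the importance-score machinery entirely, which
    # should be fine for prediction-only runs (my assumption from the
    # class name, not verified against the rest of the repo)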

@shadymcy @CaffreyR @Harry-zzh
