Sorry for not writing enough comments. I just added comments to the hyperparameters used in fairseq-RoBERTa/launch/FreeLB/mnli-fp32-clip.sh and huggingface-transformers/launch/run_glue.sh, so you can start reading the code from these scripts.
The fairseq version is more convoluted; the Huggingface transformers version should be much easier to read. The whole algorithm is in huggingface-transformers/examples/run_glue_freelb.py, plus a small modification for the dropout mask in the ALBERT model. The fairseq version also includes our implementations of FreeAT and YOPO, but those will take more time to read.
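To give a rough idea of the structure in run_glue_freelb.py, here is a minimal, self-contained sketch of a FreeLB-style update on a toy one-parameter linear model. This is an illustration only, not the repo's actual code: the model, data, and hyperparameter values (`K`, `adv_lr`, `eps`, `lr`) are made up for the example. The key pattern matches the paper: run `K` inner ascent steps on an input perturbation while accumulating the parameter gradient at each step, then descend on the averaged gradient.

```python
import random

def freelb_train(xs, ys, K=3, adv_lr=0.05, eps=0.3, lr=0.05, steps=100, seed=0):
    """FreeLB-style training of a toy model y = w * x (illustrative sketch).

    Loss: 0.5 * sum_i (w * (x_i + delta_i) - y_i)^2, with the perturbation
    delta playing the role of the embedding-space perturbation in FreeLB.
    """
    rnd = random.Random(seed)
    w = 0.0
    for _ in range(steps):
        # random init of the perturbation inside the eps-ball
        deltas = [rnd.uniform(-eps, eps) for _ in xs]
        grad_w_acc = 0.0
        for _ in range(K):
            grads_d = []
            for i, (x, y) in enumerate(zip(xs, ys)):
                xd = x + deltas[i]
                r = w * xd - y                 # residual on perturbed input
                grad_w_acc += r * xd           # dL/dw, accumulated over steps
                grads_d.append(r * w)          # dL/d delta_i (ascent direction)
            # gradient ascent on delta, then project back into the eps-ball
            for i, g in enumerate(grads_d):
                deltas[i] = max(-eps, min(eps, deltas[i] + adv_lr * g))
        # descend on the gradient averaged over the K ascent steps
        w -= lr * grad_w_acc / (K * len(xs))
    return w

# Toy data with true weight 2; FreeLB should recover something close to it,
# slightly biased toward robustness against the input perturbation.
w = freelb_train([1.0, 2.0, 3.0, 4.0], [2.0, 4.0, 6.0, 8.0])
```

In the real implementation the same loop runs on the word-embedding output of BERT/RoBERTa/ALBERT, with the perturbation norm-constrained per token and the parameter gradients accumulated by repeated `loss.backward()` calls before a single optimizer step.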
The code is still hard to understand, including the bash shell scripts.