Add exportable baby llama example #4345
Conversation
Helpful links: see artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/4345. Note: links to docs will display an error until the docs builds have completed.
As of commit d9ab717 with merge base 5a20a49: you can merge normally (1 unrelated failure). FLAKY: the following job failed, but the failure was likely due to flakiness present on trunk.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
This pull request was exported from Phabricator. Differential Revision: D60073137
Summary: Add a small LLaMa model, based on the babyllama paper. Note that this test case is only one layer by default, and the number of layers can be adjusted in the test. Differential Revision: D60073137

Force-pushed from 29a52f9 to 5016b17.
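The summary above describes a one-layer-by-default model whose depth the test can adjust. A minimal sketch of that idea follows; the names (`ModelArgs`, `n_layers`, the parameter-count helper) and all dimension values are illustrative assumptions, not the actual ExecuTorch example's API:

```python
from dataclasses import dataclass

# Hypothetical config mirroring the PR's description: a baby-llama-style
# decoder that defaults to a single transformer layer, with the layer
# count adjustable by the test. All names and sizes here are assumptions.
@dataclass
class ModelArgs:
    dim: int = 64          # embedding width
    n_layers: int = 1      # one layer by default, per the PR summary
    n_heads: int = 4
    vocab_size: int = 512
    ffn_hidden: int = 176  # SwiGLU hidden size

def param_count(args: ModelArgs) -> int:
    """Rough parameter count for a llama-style decoder (ignoring norms/biases)."""
    attn = 4 * args.dim * args.dim        # wq, wk, wv, wo projections
    ffn = 3 * args.dim * args.ffn_hidden  # w1, w2, w3 (SwiGLU feed-forward)
    per_layer = attn + ffn
    embed = args.vocab_size * args.dim    # token embedding table
    head = args.dim * args.vocab_size     # output projection
    return args.n_layers * per_layer + embed + head

if __name__ == "__main__":
    # Default config is one layer; a test can scale depth via n_layers.
    print(param_count(ModelArgs()))
    print(param_count(ModelArgs(n_layers=4)))
```

Keeping the default at one layer keeps the exported test artifact tiny while still exercising the full per-layer operator set; parameter count grows linearly as the test raises `n_layers`.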
Summary (updated): Add a small LLaMa model, based on the babyllama paper. Note that this test case is only one layer by default, and the number of layers can be adjusted in the test. Removed some pyre changes that broke the OSS AoT export, and added some required passes and operators. Differential Revision: D60073137

Force-pushed from 5016b17 to f623ca8.
Force-pushed from f623ca8 to 7e8c44a.
Force-pushed from 7e8c44a to a45da52.
Force-pushed from a45da52 to 15b6369.
Force-pushed from 15b6369 to 3f40f6b.
Force-pushed from 3f40f6b to 4b4d460.
Force-pushed from 4b4d460 to d9ab717.
This pull request has been merged in 1e14333.
Summary: Add a small LLaMa model, based on the babyllama paper. Note that this test case is only one layer by default, and the number of layers can be adjusted in the test.
Differential Revision: D60073137