Update filename and paper title to align with the article
PiperOrigin-RevId: 555913385
k8lion authored and copybara-github committed Aug 11, 2023
1 parent 67c43bb commit 78dc19f
Showing 1 changed file with 2 additions and 2 deletions.
muNet/NAX.ipynb → muNet/TransformerExpansions.ipynb (4 changes: 2 additions & 2 deletions)
@@ -22,9 +22,9 @@
     "id": "7BQOCQE5tZiU"
    },
    "source": [
-    "# Neural Architecture EXpansion (NAX): Incremental scaling of Transformer-based Neural Networks\n",
+    "# Composable Function-preserving Expansions for Transformer Architectures\n",
     "\n",
-    "This notebook contains implementations of the six function-preserving transformations of transformer-based models proposed in \"Neural Architecture EXpansion (NAX): Incremental scaling of Transformer-based Neural Networks\". We provide a basic implementation of a generic transformer architecture and show that each transformation is function-preserving, both for individual architectural components and the whole transformer model, as well as for individual transformations and combinations of transformations.\n"
+    "This notebook contains implementations of the six function-preserving transformations of transformer-based models proposed in \"Composable Function-preserving Expansions for Transformer Architectures\". We provide a basic implementation of a generic transformer architecture and show that each transformation is function-preserving, both for individual architectural components and the whole transformer model, as well as for individual transformations and combinations of transformations.\n"
     ]
    },
    {
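For context on the renamed notebook's subject, here is a minimal sketch of what "function-preserving" means for an expansion, not taken from the notebook itself: the use of NumPy, the function names, and the specific MLP-widening scheme are illustrative assumptions. The idea is that new capacity is initialized so the expanded model computes exactly the same function as the original (here, new hidden units get zero outgoing weights).

```python
# Illustrative sketch only: a function-preserving width expansion of a 2-layer MLP.
# NumPy and all names here are assumptions, not the notebook's actual API.
import numpy as np

def mlp(x, W1, b1, W2, b2):
    # x -> ReLU(x @ W1 + b1) @ W2 + b2
    h = np.maximum(x @ W1 + b1, 0.0)
    return h @ W2 + b2

def expand_hidden(W1, b1, W2, extra, rng):
    # Add `extra` hidden units. New incoming weights may be arbitrary,
    # but new outgoing rows are zero, so the added units contribute nothing.
    d_in, _ = W1.shape
    _, d_out = W2.shape
    W1_new = np.concatenate([W1, 0.01 * rng.standard_normal((d_in, extra))], axis=1)
    b1_new = np.concatenate([b1, np.zeros(extra)])
    W2_new = np.concatenate([W2, np.zeros((extra, d_out))], axis=0)
    return W1_new, b1_new, W2_new

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
W1, b1 = rng.standard_normal((8, 16)), rng.standard_normal(16)
W2, b2 = rng.standard_normal((16, 3)), rng.standard_normal(3)

W1e, b1e, W2e = expand_hidden(W1, b1, W2, extra=8, rng=rng)
# The expanded model reproduces the original outputs exactly.
assert np.allclose(mlp(x, W1, b1, W2, b2), mlp(x, W1e, b1e, W2e, b2))
```

The notebook applies this principle to six transformer-specific expansions and verifies preservation for individual components, the full model, and compositions of transformations.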
