- transformers==2.2.1
- python>=3.6
- torch==1.3.0
This repository is an implementation of First-Order MAML under continual learning on NLU tasks. The original method is proposed at https://arxiv.org/abs/1905.12588. This work applies that approach to the NLU domain, and the implementation is divided into two parts:
- Continual MAML: the MAML implementation is adapted from meta learning bert and modified to support First-Order MAML as well as OML (Online-aware Meta-Learning); a minimal sketch of the first-order update is shown after this list.
- Dataloader: the dataloader is adapted and modified from meta learning bert and transformers to provide batches of GLUE tasks. There are currently two dataloader implementations, `task_glue` and `task_glue_wo_saving`. Both scripts take the same inputs and produce the same outputs, but their processing times differ: `task_glue` preprocesses the text and saves the features locally, while `task_glue_wo_saving` computes features from the text on the fly without saving anything. As a result, `task_glue` may be slower the first time a dataset is loaded; the second sketch below illustrates the trade-off.
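Below is a minimal sketch of one first-order MAML meta-update, assuming a model that follows the transformers convention of returning the loss as the first output when labels are provided. The names `support_batch`, `query_batch`, `inner_lr`, and `inner_steps` are illustrative and not the repository's actual API.

```python
# Hypothetical sketch of one first-order MAML (FOMAML) meta-update.
# Assumes `model(**batch)` returns a tuple whose first element is the loss,
# as in transformers GLUE fine-tuning; names here are illustrative only.
import copy
import torch


def fomaml_meta_step(model, meta_optimizer, support_batch, query_batch,
                     inner_lr=1e-3, inner_steps=5):
    # Inner loop: adapt a copy of the model on the support set of one task.
    learner = copy.deepcopy(model)
    inner_optimizer = torch.optim.SGD(learner.parameters(), lr=inner_lr)
    learner.train()
    for _ in range(inner_steps):
        inner_optimizer.zero_grad()
        support_loss = learner(**support_batch)[0]
        support_loss.backward()
        inner_optimizer.step()

    # Outer loop: evaluate the adapted weights on the query set.
    query_loss = learner(**query_batch)[0]
    grads = torch.autograd.grad(query_loss, learner.parameters())

    # First-order approximation: copy the query gradients onto the original
    # (meta) parameters and step the meta-optimizer, ignoring second-order terms.
    meta_optimizer.zero_grad()
    for param, grad in zip(model.parameters(), grads):
        param.grad = grad.detach()
    meta_optimizer.step()
    return query_loss.item()
```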
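The practical difference between the two dataloader scripts is whether preprocessed features are cached on disk. The sketch below only illustrates that trade-off and is not the actual `task_glue` code; the cache file name and the simple feature layout are assumed for the example.

```python
# Illustration of the caching trade-off between the two loaders; not the
# actual task_glue / task_glue_wo_saving code. Cache path and feature layout
# are assumed for this example.
import os
import torch


def load_features_cached(sentences, labels, tokenizer,
                         cache_path="cached_glue_features.pt"):
    """task_glue-style: preprocess once, save locally, reuse on later calls."""
    if os.path.exists(cache_path):
        return torch.load(cache_path)
    input_ids = [tokenizer.encode(text, max_length=128) for text in sentences]
    features = {"input_ids": input_ids, "labels": labels}
    torch.save(features, cache_path)  # slow only the first time a dataset is used
    return features


def load_features_on_the_fly(sentences, labels, tokenizer):
    """task_glue_wo_saving-style: recompute features every call, nothing is saved."""
    input_ids = [tokenizer.encode(text, max_length=128) for text in sentences]
    return {"input_ids": input_ids, "labels": labels}
```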