# User Guide
Minerva can work in either of two modes:

- **Submit mode**: the main Minerva mode, in which you exchange one step of our pipeline for your own solution, then train and evaluate the modified model.
- **Dry mode**: a mode in which you run or train the pipeline as-is to make sure that everything is working correctly.
Also, three types of Neptune support are available:

- **No Neptune**: choose this option if you want to work without Neptune support.
- **Neptune locally**: choose this option if you want to run the pipeline locally and use Neptune to visualize the results.
- **Neptune's cloud**: choose this option if you want to run the pipeline in the cloud available through Neptune.
This user guide is organized as follows:
1. Submit mode
1.1. No Neptune
1.2. Neptune locally
1.3. Neptune's cloud
2. Dry mode
2.1. Dry eval
2.1.1. No Neptune
2.1.2. Neptune locally
2.1.3. Neptune's cloud
2.2. Dry train
2.2.1. No Neptune
2.2.2. Neptune locally
2.2.3. Neptune's cloud
## Submit mode

Submit mode is the main Minerva mode, in which you exchange one step of our pipeline for your own solution, then train and evaluate the modified model.

Choose a task, for example `task1.ipynb`. Write your implementation for the task by filling in the `CONFIG` dictionary or the body of the `solution` function according to the instructions:

```python
CONFIG = {}

def solution():
    return something
```
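What actually goes into `CONFIG` and what `solution` must return is defined entirely by the task notebook, not by Minerva itself. The snippet below is a purely hypothetical sketch, assuming a task that asks for two hyperparameters and a single value derived from them; the keys and the return value shown here are invented for illustration, so always follow the notebook's instructions for the real ones.

```python
# Purely hypothetical sketch -- the real keys and the object that solution()
# must return are specified in the task notebook (e.g. task1.ipynb).
CONFIG = {
    'num_filters': 32,  # hypothetical hyperparameter a task might ask for
    'dropout': 0.5,     # hypothetical hyperparameter a task might ask for
}

def solution():
    # Return whatever object the task's instructions ask for; here we pretend
    # the task simply wants the configured dropout value back.
    return CONFIG['dropout']
```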
You can submit your solution with three types of Neptune support.
### No Neptune

Choose this option if you want to work without Neptune support.

- Download data (once):

  - fashion_mnist: data are downloaded automatically.
  - whales:
    - Download the file `imgs.zip` from the Right Whale Recognition challenge site on Kaggle (you must be logged in to Kaggle to do that).
    - Extract `imgs.zip` to `resources/whales/data/`.
    - After that, the folder `resources/whales/data/` should contain two elements: the file `metadata.csv` and the folder `imgs` with images.
- In the `neptune.yaml` file (a sketch of the result is shown at the end of this subsection):

  - Comment out the `pip-requirements-file` line.
  - Uncomment the Local setup paths and set them as follows:
    - fashion_mnist:
      - `data_dir`: doesn't matter,
      - `solution_dir`: `resources/fashion_mnist/solution`.
    - whales:
      - `data_dir`: `resources/whales/data/`,
      - `solution_dir`: `resources/whales/solution/`.

    Note: you can also set a different `solution_dir` if you previously trained another pipeline instance using the dry train sub-mode.
  - Comment out the Cloud setup paths.
- Type:

  ```
  python main.py -- submit --problem fashion_mnist --task_nr 1
  ```

  Run time with GPU: TODO

  ```
  python main.py -- submit --problem whales --task_nr 1
  ```

  Run time with GPU: TODO

Note: if you want to submit a task using a notebook other than the default one, add `--filepath path/to/your/notebook` at the end of the command.
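As referenced above, the local edits to `neptune.yaml` could end up looking roughly like the sketch below for the whales problem. This is an assumption about the file's layout, not a verbatim copy: the keys named in the steps (`pip-requirements-file`, `data_dir`, `solution_dir`) may sit under a different section in the actual file, so only adjust the lines named in the steps and keep the rest as it ships with the repository.

```yaml
# Sketch only -- the surrounding keys and nesting in your neptune.yaml may differ.
# pip-requirements-file: ...   # commented out for local runs; keep the shipped value

# Local setup paths (uncommented and filled in for the whales problem):
data_dir: resources/whales/data/
solution_dir: resources/whales/solution/

# Cloud setup paths (commented out for local runs):
# data_dir: /public/whales
# solution_dir: /public/minerva/resources/whales/solution
```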
### Neptune locally

Choose this option if you want to run the pipeline locally and use Neptune to visualize the results.

- Download data in the same way as for no Neptune support.
- Edit the `neptune.yaml` file in the same way as for no Neptune support.
- Type:

  ```
  neptune run -- submit --problem fashion_mnist --task_nr 1
  ```

  Run time with GPU: TODO

  ```
  neptune run -- submit --problem whales --task_nr 1
  ```

  Run time with GPU: TODO
### Neptune's cloud

Choose this option if you want to run the pipeline in the cloud available through Neptune.

- In the `neptune.yaml` file (a sketch of the result is shown at the end of this subsection):

  - Uncomment the `pip-requirements-file` line.
  - Comment out the Local setup paths.
  - Uncomment the Cloud setup paths and set them as follows:
    - fashion_mnist:
      - `data_dir`: doesn't matter,
      - `solution_dir`: `/public/minerva/resources/fashion_mnist/solution`.
    - whales:
      - `data_dir`: `/public/whales`,
      - `solution_dir`: `/public/minerva/resources/whales/solution`.

- Type:

  ```
  neptune send \
      --environment keras-2.0-gpu-py3 \
      --worker gcp-gpu-medium \
      -- submit --problem fashion_mnist --task_nr 1
  ```

  Run time with GPU: TODO

  ```
  neptune send \
      --environment pytorch-0.2.0-gpu-py3 \
      --worker gcp-gpu-medium \
      -- submit --problem whales --task_nr 1
  ```

  Run time with GPU: TODO

Note: make sure you typed the correct `--environment` and `--worker`. If you omit them, the script won't run.
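For comparison with the local sketch above, the cloud edits could look roughly like this for the whales problem. Again, this is an assumption about the exact layout of `neptune.yaml` rather than a verbatim excerpt; the requirements filename shown is a placeholder, so keep whatever value the repository ships with.

```yaml
# Sketch only -- the surrounding keys and nesting in your neptune.yaml may differ.
pip-requirements-file: requirements.txt   # assumed filename; keep the shipped value

# Local setup paths (commented out for cloud runs):
# data_dir: resources/whales/data/
# solution_dir: resources/whales/solution/

# Cloud setup paths (uncommented and filled in for the whales problem):
data_dir: /public/whales
solution_dir: /public/minerva/resources/whales/solution
```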
## Dry mode

In dry mode you can run or train the pipeline to make sure that everything is working correctly. We provide two dry sub-modes:

- **Dry eval**: run an existing pipeline and evaluate it.
- **Dry train**: train a new instance of the pipeline and then evaluate it.
### Dry eval

In dry eval sub-mode you can run our pipeline to make sure that everything is working correctly.

#### No Neptune

- Download data in the same way as in submit mode.
- Edit the `neptune.yaml` file in the same way as in submit mode.
- Type:

  ```
  python main.py -- dry_eval --problem fashion_mnist
  ```

  Run time with GPU: less than 1 minute.

  ```
  python main.py -- dry_eval --problem whales
  ```

  Run time with GPU: about 3 minutes.
#### Neptune locally

- Download data in the same way as in submit mode.
- Edit the `neptune.yaml` file in the same way as in submit mode.
- Type:

  ```
  neptune run -- dry_eval --problem fashion_mnist
  ```

  Run time with GPU: less than 1 minute.

  ```
  neptune run -- dry_eval --problem whales
  ```

  Run time with GPU: about 3 minutes.
#### Neptune's cloud

- Edit the `neptune.yaml` file in the same way as in submit mode.
- Type:

  ```
  neptune send \
      --environment keras-2.0-gpu-py3 \
      --worker gcp-gpu-medium \
      -- dry_eval --problem fashion_mnist
  ```

  Run time with GPU: less than 1 minute.

  ```
  neptune send \
      --environment pytorch-0.2.0-gpu-py3 \
      --worker gcp-gpu-medium \
      -- dry_eval --problem whales
  ```

  Run time with GPU: TODO
### Dry train

In dry train sub-mode you can train a new instance of the pipeline and then evaluate it.

#### No Neptune

- Download data in the same way as in submit mode.
- Edit the `neptune.yaml` file in the same way as in submit mode, except for `solution_dir`. Set `solution_dir` to the local path where the new pipeline instance will be stored, e.g. `output/trained_solution` (a sketch is shown at the end of this subsection).

  Note: make sure that you choose a path which doesn't already contain a pipeline. Minerva doesn't overwrite existing models.
- Type:

  ```
  python main.py -- dry_train --problem fashion_mnist
  ```

  Run time with GPU: TODO

  ```
  python main.py -- dry_train --problem whales
  ```

  Run time with GPU: TODO
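As referenced above, the only difference from the submit-mode setup is that `solution_dir` points to a fresh output directory. A rough sketch for a local whales dry train, with the same caveats about the exact `neptune.yaml` layout as before:

```yaml
# Sketch only -- the surrounding keys and nesting in your neptune.yaml may differ.
data_dir: resources/whales/data/
# Fresh directory for the newly trained pipeline instance; it must not already
# contain a trained pipeline, because Minerva will not overwrite existing models.
solution_dir: output/trained_solution
```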
#### Neptune locally

- Download data in the same way as in submit mode.
- Edit the `neptune.yaml` file in the same way as for no Neptune support. Remember to set a new `solution_dir`.
- Type:

  ```
  neptune run -- dry_train --problem fashion_mnist
  ```

  Run time with GPU: about 15 minutes.

  ```
  neptune run -- dry_train --problem whales
  ```

  Run time with GPU: TODO
#### Neptune's cloud

- Edit the `neptune.yaml` file in the same way as in submit mode, except for `solution_dir`. Set `solution_dir` to the path in Neptune's cloud where the new pipeline instance will be stored, e.g. `/output/trained_solution`. The path must start with `/output` (a sketch is shown at the end of this subsection).
- Type:

  ```
  neptune send \
      --environment keras-2.0-gpu-py3 \
      --worker gcp-gpu-medium \
      -- dry_train --problem fashion_mnist
  ```

  Run time with GPU: TODO

  ```
  neptune send \
      --environment pytorch-0.2.0-gpu-py3 \
      --worker gcp-gpu-medium \
      -- dry_train --problem whales
  ```

  Run time with GPU: TODO
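As a final illustration, and with the same caveats about the exact `neptune.yaml` layout, a cloud dry train setup for the whales problem could look roughly like this:

```yaml
# Sketch only -- the surrounding keys and nesting in your neptune.yaml may differ.
data_dir: /public/whales
# Must start with /output so the trained instance is stored in Neptune's cloud.
solution_dir: /output/trained_solution
```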