From bbc2c8f81c77847c19f500e397516da7a1fd16df Mon Sep 17 00:00:00 2001
From: Ejaaz Merali
Date: Wed, 15 May 2019 14:24:13 -0400
Subject: [PATCH] [scipost] minor

---
 docs/scipost/main.tex | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/scipost/main.tex b/docs/scipost/main.tex
index 36dd6fb0..ce5da5ac 100644
--- a/docs/scipost/main.tex
+++ b/docs/scipost/main.tex
@@ -309,8 +309,8 @@ \subsection{Training}
 Figure~\ref{fig:KL} shows the convergence of the fidelity and KL divergence
 during training.
 The convergence time will, in general, depend on the choice of hyperparameters.
-Finally, the network parameters $\bm{\lambda}$, together with the callbacks, can be
-saved (or loaded) to a file:
+Finally, the network parameters $\bm{\lambda}$, together with the \verb|MetricEvaluator|'s data,
+can be saved (or loaded) to a file:
 \begin{python}
 state.save(
     "filename.pt",
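The pattern the patched paragraph describes (persisting the network parameters together with the metric data recorded during training, then restoring both from the same file) can be sketched generically. This is not the library's actual `state.save` API, whose call is truncated above; the dictionary keys and the use of `pickle` here are illustrative assumptions only.

```python
import os
import pickle
import tempfile

# Hypothetical stand-ins for the trained parameters (lambda) and the
# metric history collected by a MetricEvaluator-style callback.
state = {
    "weights": [0.12, -0.34, 0.56],
    "metrics": {"Fidelity": [0.80, 0.91, 0.97], "KL": [0.50, 0.20, 0.05]},
}

path = os.path.join(tempfile.gettempdir(), "filename.pt")

# Save parameters and metric data together in one file ...
with open(path, "wb") as f:
    pickle.dump(state, f)

# ... and load them back in a later session.
with open(path, "rb") as f:
    loaded = pickle.load(f)

assert loaded == state
```

Storing the callback's recorded metrics alongside the parameters means a reloaded state carries its own training history, so convergence plots like Figure KL can be regenerated without retraining.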