From 6afae218ff3efb44aa01dfe61e2136f56513aeac Mon Sep 17 00:00:00 2001
From: Gernot Bauer
Date: Mon, 17 Jul 2023 15:27:27 +0200
Subject: [PATCH] fixed typo

---
 .../adjust_non_polar_non_asssociating.ipynb | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/examples/pcsaft/parameter_adjustment/adjust_non_polar_non_asssociating.ipynb b/examples/pcsaft/parameter_adjustment/adjust_non_polar_non_asssociating.ipynb
index 3dbb99124..d91bd9912 100644
--- a/examples/pcsaft/parameter_adjustment/adjust_non_polar_non_asssociating.ipynb
+++ b/examples/pcsaft/parameter_adjustment/adjust_non_polar_non_asssociating.ipynb
@@ -145,7 +145,7 @@
     "3. Normalization: The relative differences (with the applied loss functions) are divided by the number of data points in each corresponding `DataSet`.\n",
     "4. Weighted cost calculation: The costs of each `DataSet` are weighted based on the provided (normalized) `weights`.\n",
     "\n",
-    "The following example demonstrates the construction of an estimator using vapor pressure and liquid density data. We utilize `weights = [3, 2]` (normalized `weights = [0.6, 0.4]`) and a `Loss.huber(0.05)` loss function, which treats predictions above 0.05% linearly in the cost function instead of squaring them (outlier treatment).\n"
+    "The following example demonstrates the construction of an estimator using vapor pressure and liquid density data. We utilize `weights = [3, 2]` (normalized `weights = [0.6, 0.4]`) and a `Loss.huber(0.05)` loss function, which treats predictions above 5% linearly in the cost function instead of squaring them (outlier treatment).\n"
 ]
 },
 {