Commit

change example in the docs
basnijholt committed Mar 7, 2019
1 parent 51f4292 commit d057831
Showing 1 changed file with 6 additions and 6 deletions.
12 changes: 6 additions & 6 deletions docs/source/tutorial/tutorial.AverageLearner.rst
@@ -74,7 +74,7 @@ So, the ``learner`` compares the loss of potential new intervals (or triangles

The relative importance of both can be adjusted by a hyperparameter ``learner.weight``, see the doc-string for more information.

-Let's again try to learn some functions but now with uniformly distributed noise. We start with 1D and then go to 2D.
+Let's again try to learn some functions but now with [heteroscedastic](https://en.wikipedia.org/wiki/Heteroscedasticity) noise. We start with 1D and then go to 2D.

`~adaptive.AverageLearner1D`
............................
@@ -119,13 +119,13 @@

 def noisy_ring(xy_seed):
     import numpy as np
-    import random
+    from random import uniform
     (x, y), seed = xy_seed
-    random.seed(xy_seed)  # to make the random function deterministic
     a = 0.2
-    ring = x + np.exp(-(x**2 + y**2 - 0.75**2)**2/a**4)
-    noise = random.uniform(-0.5, 0.5)
-    return ring + noise
+    z = (x**2 + y**2 - 0.75**2) / a**2
+    plateau = np.arctan(z)
+    noise = uniform(-10, 10) * np.exp(-z**2)
+    return plateau + noise

 learner = adaptive.AverageLearner2D(noisy_ring, bounds=[(-1, 1), (-1, 1)])
 runner = adaptive.Runner(learner, goal=lambda l: l.loss() < 0.01)
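For reference, the new example function from this diff can be run standalone, without adaptive installed. The sketch below reproduces ``noisy_ring`` and evaluates it directly; the explicit ``seed(...)`` call is an addition for reproducibility here (the committed version leaves seeding to the learner), and the seed-handling convention is an assumption, since ``AverageLearner2D`` lived on an experimental branch at the time of this commit. The noise term ``uniform(-10, 10) * np.exp(-z**2)`` is large only near the ring ``x**2 + y**2 = 0.75**2`` (where ``z`` is close to zero) and vanishes away from it, which is what makes the example heteroscedastic.

```python
import numpy as np
from random import seed, uniform

def noisy_ring(xy_seed):
    # Same signature as the docs example: a ((x, y), seed) tuple.
    (x, y), rng_seed = xy_seed
    seed(rng_seed)  # added here for reproducibility; not in the committed code
    a = 0.2
    z = (x**2 + y**2 - 0.75**2) / a**2
    plateau = np.arctan(z)
    # Noise amplitude peaks on the ring (z ~ 0) and decays as exp(-z**2).
    noise = uniform(-10, 10) * np.exp(-z**2)
    return plateau + noise

# Far from the ring the noise factor is negligible and arctan saturates:
print(noisy_ring(((0.0, 0.0), 1)))  # near -pi/2 (inside the ring)
print(noisy_ring(((1.0, 1.0), 1)))  # near +pi/2 (outside the ring)
```

On the ring itself, e.g. ``(0.75, 0.0)``, the same call returns a value dominated by the ``uniform(-10, 10)`` draw, so repeated evaluations with different seeds scatter widely there.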
