Robust Gaussian Processes via Relevance Pursuit (#2608)
Summary:
Pull Request resolved: #2608

This commit adds an implementation of the models and algorithms from the NeurIPS 2024 paper [Robust Gaussian Processes via Relevance Pursuit](https://arxiv.org/pdf/2410.24222).

Reviewed By: esantorella

Differential Revision: D65343571

fbshipit-source-id: 389cfca7ea1f67b5a2671ff7f9dc61bd535dde20
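
Briefly, the method equips each training observation i with its own additional noise variance rho_i, keeps the vector rho sparse, and grows its support greedily so that points best explained as outliers absorb large rho_i while the remaining data are fit by a standard GP. The following is a minimal, self-contained sketch of that greedy forward selection; it is an illustration only, not the API added by this commit. The function names, the fixed `rho_value` stand-in, and the fixed support size are assumptions; the paper's algorithm additionally optimizes the rho_i and performs model selection over the resulting solution path.

import math
import torch


def rbf_kernel(X: torch.Tensor, lengthscale: float = 1.0) -> torch.Tensor:
    # Squared-exponential kernel matrix for inputs of shape (n, d).
    sq_dists = torch.cdist(X, X).pow(2)
    return torch.exp(-0.5 * sq_dists / lengthscale**2)


def gp_mll(K: torch.Tensor, y: torch.Tensor, sigma2: float, rho: torch.Tensor) -> torch.Tensor:
    # Exact GP marginal log likelihood with per-point noise sigma2 + rho_i.
    C = K + torch.diag(sigma2 + rho)
    L = torch.linalg.cholesky(C)
    alpha = torch.cholesky_solve(y.unsqueeze(-1), L).squeeze(-1)
    return -0.5 * y @ alpha - L.diagonal().log().sum() - 0.5 * len(y) * math.log(2 * math.pi)


def greedy_outlier_selection(
    X: torch.Tensor,
    y: torch.Tensor,
    num_outliers: int,
    sigma2: float = 1e-2,
    rho_value: float = 10.0,
) -> tuple[list[int], torch.Tensor]:
    # Forward relevance pursuit, coarsely: repeatedly add the data point whose
    # inclusion in the support of rho most increases the marginal likelihood.
    K = rbf_kernel(X)
    rho = torch.zeros_like(y)
    support: list[int] = []
    for _ in range(num_outliers):
        base = gp_mll(K, y, sigma2, rho)
        gains = torch.full((len(y),), -math.inf)
        for i in range(len(y)):
            if i in support:
                continue
            trial = rho.clone()
            trial[i] = rho_value  # crude stand-in for optimizing rho_i
            gains[i] = gp_mll(K, y, sigma2, trial) - base
        best = int(gains.argmax())
        support.append(best)
        rho[best] = rho_value
    return support, rho

On toy data with a few corrupted targets, the returned support typically recovers the corrupted indices; the robustness comes from the large rho_i effectively down-weighting those observations in the GP fit.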
SebastianAment authored and facebook-github-bot committed Dec 4, 2024
1 parent 88f47bc commit 5012fe8
Showing 11 changed files with 2,186 additions and 5 deletions.
7 changes: 6 additions & 1 deletion botorch/models/gpytorch.py
@@ -402,6 +402,8 @@ def _apply_noise(
         mvn = self.likelihood(mvn, X)
         return mvn
 
+    # pyre-ignore[14]: Inconsistent override. Could not find parameter
+    # `Keywords(typing.Any)` in overriding signature.
     def posterior(
         self,
         X: Tensor,
@@ -470,6 +472,7 @@ def posterior(
             return posterior_transform(posterior)
         return posterior
 
+    # pyre-ignore[14]: Inconsistent override. Could not find parameter `noise`.
     def condition_on_observations(
         self, X: Tensor, Y: Tensor, **kwargs: Any
     ) -> BatchedMultiOutputGPyTorchModel:
@@ -632,7 +635,7 @@ def batch_shape(self) -> torch.Size:
             raise NotImplementedError(msg + " that are not broadcastble.")
         return next(iter(batch_shapes))
 
-    # pyre-fixme[15]: Inconsistent override in return types
+    # pyre-fixme[14]: Inconsistent override in return types
     def posterior(
         self,
         X: Tensor,
@@ -838,6 +841,8 @@ def _apply_noise(
         )
         return self.likelihood(mvn, X)
 
+    # pyre-ignore[14]: Inconsistent override. Could not find parameter
+    # `Keywords(typing.Any)` in overriding signature.
     def posterior(
         self,
         X: Tensor,
480 changes: 480 additions & 0 deletions botorch/models/likelihoods/sparse_outlier_noise.py

Large diffs are not rendered by default.
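
Conceptually, once the sparse per-point variances rho have been determined, the resulting model is an exact GP whose observation noise at point i is sigma2 + rho_i. Below is a minimal GPyTorch sketch of that structure using a fixed heteroskedastic likelihood as a stand-in; it does not use the classes defined in the new module, and the hard-coded outlier indices and rho values are purely illustrative.

import math
import torch
import gpytorch


class ToyGP(gpytorch.models.ExactGP):
    # A plain exact GP; only the likelihood below carries the per-point noise.
    def __init__(self, train_x, train_y, likelihood):
        super().__init__(train_x, train_y, likelihood)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())

    def forward(self, x):
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(x), self.covar_module(x)
        )


# Toy data with a handful of corrupted observations.
X = torch.linspace(0, 1, 50).unsqueeze(-1)
y = torch.sin(2 * math.pi * X).squeeze(-1)
y[[10, 25, 40]] += 5.0  # outliers

# Suppose relevance pursuit identified the corrupted points and assigned them
# large extra variances rho_i (hard-coded here for illustration).
sigma2 = 1e-2
rho = torch.zeros(50)
rho[[10, 25, 40]] = 10.0

# Observation noise at point i is sigma2 + rho_i, so the outliers are
# effectively down-weighted when fitting the GP to all 50 points.
likelihood = gpytorch.likelihoods.FixedNoiseGaussianLikelihood(noise=sigma2 + rho)
model = ToyGP(X, y, likelihood)

Training then proceeds as for any exact GP (e.g. with gpytorch.mlls.ExactMarginalLogLikelihood); the point of the new module is that the nonzero rho_i are learned rather than supplied.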

