
Add SoftKNNClassifierModel (#5072) #5072

Open

sunnyshen321 wants to merge 1 commit into facebook:main from sunnyshen321:export-D90894389

Conversation


@sunnyshen321 sunnyshen321 commented Mar 19, 2026

Summary:

X-link: meta-pytorch/botorch#3243

Add a differentiable Soft K-Nearest Neighbors classifier model for failure-aware
Bayesian optimization. Unlike tree-based classifiers (RF, XGBoost), SoftKNN is
fully differentiable, enabling gradient-based acquisition function optimization.

The model uses Gaussian kernel weights:
P(y=1|x) = sum(w_i * y_i) / sum(w_i)
where w_i = exp(-||x - x_i||^2 / (2 * sigma^2))

Implements the `construct_inputs` classmethod for seamless Ax integration.

Differential Revision: D90894389
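The kernel-weighted probability above can be sketched in a few lines of PyTorch. This is a minimal illustration of the math, not the actual SoftKNNClassifierModel API; the function name `soft_knn_prob`, its signature, and the toy data are assumptions made here for the example.

```python
import torch

def soft_knn_prob(X_train, y_train, X_query, sigma=1.0):
    """P(y=1|x) = sum_i(w_i * y_i) / sum_i(w_i), with Gaussian kernel
    weights w_i = exp(-||x - x_i||^2 / (2 * sigma^2))."""
    # Squared Euclidean distances between query and training points: (q, n)
    d2 = torch.cdist(X_query, X_train).pow(2)
    # Gaussian kernel weights
    w = torch.exp(-d2 / (2 * sigma ** 2))
    # Weighted vote over the binary training labels
    return (w * y_train).sum(dim=-1) / w.sum(dim=-1)

# Every op above is differentiable, so gradients flow back to the query
# points, which is what enables gradient-based acquisition optimization.
X_train = torch.tensor([[0.0], [1.0]])
y_train = torch.tensor([0.0, 1.0])  # 1 = failure
x = torch.tensor([[0.25]], requires_grad=True)
p = soft_knn_prob(X_train, y_train, x, sigma=0.5)
p.backward()
assert x.grad is not None  # unlike RF/XGBoost, gradients exist
```

Since x = 0.25 sits closer to the y = 0 training point, the predicted failure probability is below 0.5, and the gradient with respect to x points toward the y = 1 point.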

meta-cla bot added the CLA Signed label Mar 19, 2026
@meta-codesync

meta-codesync bot commented Mar 19, 2026

@sunnyshen321 has exported this pull request. If you are a Meta employee, you can view the originating Diff in D90894389.

sunnyshen321 pushed a commit to sunnyshen321/botorch that referenced this pull request Mar 19, 2026
@meta-codesync meta-codesync bot changed the title Add SoftKNNClassifierModel Add SoftKNNClassifierModel (#5072) Mar 19, 2026
sunnyshen321 pushed a commit to sunnyshen321/Ax that referenced this pull request Mar 19, 2026
sunnyshen321 pushed a commit to sunnyshen321/Ax that referenced this pull request Mar 19, 2026
sunnyshen321 pushed a commit to sunnyshen321/botorch that referenced this pull request Mar 19, 2026
sunnyshen321 pushed a commit to sunnyshen321/Ax that referenced this pull request Mar 19, 2026
sunnyshen321 pushed a commit to sunnyshen321/botorch that referenced this pull request Mar 19, 2026
Labels

CLA Signed Do not delete this pull request or issue due to inactivity. fb-exported meta-exported
