Add SoftKNNClassifierModel (#3243)#3243

Open
sunnyshen321 wants to merge 1 commit into meta-pytorch:main from sunnyshen321:export-D90894389

Conversation


@sunnyshen321 sunnyshen321 commented Mar 19, 2026

Summary:
X-link: facebook/Ax#5072

Add a differentiable Soft K-Nearest Neighbors classifier model for failure-aware
Bayesian optimization. Unlike tree-based classifiers (RF, XGBoost), SoftKNN is
fully differentiable, enabling gradient-based acquisition function optimization.

The model uses Gaussian kernel weights:
P(y=1|x) = sum(w_i * y_i) / sum(w_i)
where w_i = exp(-||x - x_i||^2 / (2 * sigma^2))

Implements construct_inputs classmethod for seamless Ax integration.

Differential Revision: D90894389
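
The kernel-weighted prediction above can be sketched in a few lines of PyTorch. This is an illustrative sketch, not the implementation added in this PR; `soft_knn_predict` and its argument names are hypothetical.

```python
import torch

def soft_knn_predict(X_train, y_train, X_query, sigma=1.0):
    # Gaussian kernel weights: w_i = exp(-||x - x_i||^2 / (2 * sigma^2))
    sq_dists = torch.cdist(X_query, X_train).pow(2)    # shape (q, n)
    weights = torch.exp(-sq_dists / (2 * sigma ** 2))  # shape (q, n)
    # P(y=1|x) = sum_i w_i * y_i / sum_i w_i
    return (weights * y_train).sum(dim=-1) / weights.sum(dim=-1)

# Every op above is differentiable, so gradients flow back to the candidate
# points -- the property that enables gradient-based acquisition optimization.
X_train = torch.tensor([[0.0], [1.0]])
y_train = torch.tensor([0.0, 1.0])
x = torch.tensor([[0.25]], requires_grad=True)
p = soft_knn_predict(X_train, y_train, x, sigma=0.5)
p.sum().backward()
assert x.grad is not None  # gradient w.r.t. the candidate point exists
```

By contrast, tree-based classifiers produce piecewise-constant probabilities with zero gradient almost everywhere, which is why they cannot drive gradient-based acquisition optimization.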

meta-cla Bot added the CLA Signed label Mar 19, 2026

meta-codesync Bot commented Mar 19, 2026

@sunnyshen321 has exported this pull request. If you are a Meta employee, you can view the originating Diff in D90894389.

sunnyshen321 pushed a commit to sunnyshen321/Ax that referenced this pull request Mar 19, 2026
meta-codesync Bot changed the title Add SoftKNNClassifierModel Add SoftKNNClassifierModel (#3243) Mar 19, 2026
sunnyshen321 pushed a commit to sunnyshen321/botorch that referenced this pull request Mar 19, 2026
sunnyshen321 pushed a commit to sunnyshen321/Ax that referenced this pull request Mar 19, 2026
sunnyshen321 pushed a commit to sunnyshen321/Ax that referenced this pull request Mar 19, 2026
sunnyshen321 pushed a commit to sunnyshen321/botorch that referenced this pull request Mar 19, 2026
sunnyshen321 pushed a commit to sunnyshen321/Ax that referenced this pull request Mar 19, 2026
sunnyshen321 pushed a commit to sunnyshen321/Ax that referenced this pull request Mar 19, 2026

codecov Bot commented Mar 19, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 99.98%. Comparing base (9a296b6) to head (2aec2e6).

Additional details and impacted files
@@           Coverage Diff           @@
##             main    #3243   +/-   ##
=======================================
  Coverage   99.98%   99.98%           
=======================================
  Files         220      221    +1     
  Lines       21844    21905   +61     
=======================================
+ Hits        21840    21901   +61     
  Misses          4        4           


Labels

CLA Signed, fb-exported, meta-exported
