GeoFLARE: added GALE_FA, an alternate attention to GALE, for GeoTransolver (#1405)

Merged
mnabian merged 33 commits into NVIDIA:main from dakhare-creator:geoflare
Apr 14, 2026

Conversation

@dakhare-creator (Contributor)

PhysicsNeMo Pull Request

Description

Checklist

Dependencies

Review Process

All PRs are reviewed by the PhysicsNeMo team before merging.

Depending on which files are changed, GitHub may automatically assign a maintainer for review.

We are also testing AI-based code review tools (e.g., Greptile), which may add automated comments with a confidence score.
This score reflects the AI's assessment of merge readiness; it is not a qualitative judgment of your work, nor an
indication that the PR will be accepted or rejected.

AI-generated feedback should be reviewed critically for usefulness.
You are not required to respond to every AI comment, but they are intended to help both authors and reviewers.
Please react to Greptile comments with 👍 or 👎 to provide feedback on their accuracy.

greptile-apps Bot commented Feb 11, 2026

Greptile Summary

This PR adds GAFLARE (Geometry-Aware FLARE), an alternative attention mechanism to GALE for the GeoTransolver model.

Key Changes:

  • New gaflare.py module implementing FLARE-based attention with geometry-aware context
  • Modified gale.py to support dynamic attention type selection via attention_type parameter
  • Updated geotransolver.py to expose attention_type parameter (default: "GALE")

Critical Issues Found:

  • gaflare.py line 121: use_te parameter is hardcoded to False, ignoring the constructor argument
  • gaflare.py line 122: Attention scale hardcoded to 1.0 instead of proper dim_head**-0.5 scaling
  • gale.py line 393: Unsafe use of globals() for class selection - security risk and violates MOD-009 coding standard
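The two gaflare.py fixes called out above could look roughly like this (a minimal single-module sketch with hypothetical names; the PR's actual classes and signatures may differ):

```python
import torch
import torch.nn as nn


class ScaledAttention(nn.Module):
    """Illustrative multi-head attention showing the two suggested fixes."""

    def __init__(self, dim: int, heads: int = 8, dim_head: int = 64, use_te: bool = False):
        super().__init__()
        self.heads = heads
        # Fix 2: derive the softmax scale from the head dimension
        # instead of hardcoding it to 1.0.
        self.scale = dim_head ** -0.5
        # Fix 1: honor the constructor argument instead of
        # hardcoding use_te = False inside the body.
        self.use_te = use_te
        self.to_qkv = nn.Linear(dim, heads * dim_head * 3, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, n, _ = x.shape
        qkv = self.to_qkv(x).chunk(3, dim=-1)
        # Split into per-head (b, heads, n, dim_head) tensors.
        q, k, v = (t.reshape(b, n, self.heads, -1).transpose(1, 2) for t in qkv)
        attn = (q @ k.transpose(-2, -1) * self.scale).softmax(dim=-1)
        return (attn @ v).transpose(1, 2).reshape(b, n, -1)
```

With `scale = 1.0`, attention logits grow with `dim_head` and the softmax saturates; the `dim_head ** -0.5` factor keeps logit variance roughly unit-scale regardless of head size.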

Minor Issues:

  • Typo in docstring: "Origional" → "Original"
  • Missing newline at end of gaflare.py

Important Files Changed

| Filename | Overview |
| --- | --- |
| physicsnemo/experimental/models/geotransolver/gaflare.py | New GAFLARE attention mechanism. Critical issues: use_te parameter hardcoded to False (line 121); attention scale set to 1.0 instead of proper scaling. Minor: typo in docstring; missing newline at EOF. |
| physicsnemo/experimental/models/geotransolver/gale.py | Integration changes to support attention type selection. Critical security issue: unsafe use of globals() for dynamic class instantiation (line 393) allows potential code execution. |
| physicsnemo/experimental/models/geotransolver/geotransolver.py | Added attention_type parameter to allow switching between GALE and GAFLARE. Clean pass-through implementation with a proper default value. |

@greptile-apps Bot left a comment

3 files reviewed, 6 comments

5 comment threads on physicsnemo/experimental/models/geotransolver/gaflare.py (outdated)

greptile-apps Bot commented Feb 11, 2026

Additional Comments (1)

physicsnemo/experimental/models/geotransolver/gale.py
Using globals() for dynamic class selection is unsafe: it allows arbitrary code execution if attention_type comes from untrusted input, and it violates MOD-009 (avoid string-based class selection). Consider an explicit mapping:

ATTENTION_CLASSES = {
    "GALE": GALE,
    "GAFLARE": GAFLARE,
}
if attention_type not in ATTENTION_CLASSES:
    raise ValueError(f"Unknown attention_type: {attention_type}")
self.Attn = ATTENTION_CLASSES[attention_type](
    hidden_dim,
    heads=num_heads,
    # ... rest of arguments
)

@coreyjadams coreyjadams self-requested a review February 12, 2026 01:27
@coreyjadams (Collaborator) left a comment

Hi @dakhare-creator - thanks for opening this. Overall it seems reasonable. Good to put it in experimental. I made a couple comments in the code, but also:

  • it'd be unusual for us to publish a "GeoFlare" model without also making available "Flare". Can you add standard FLARE attention to the experimental/nn folder so we can do that too?
  • We can also probably easily create a flare.py itself for the model.
  • How much overlap in the math and attention blocks is there between PhysicsAttention and FLARE Attention? PhysicsAttention can be used on 2d and 3d data, could this? I strongly suspect yes. Can we make this modular so we can reuse as much as possible between attention layers?

Once we're ready, let's also invite the FLARE authors to take a look.
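On the 2D/3D question raised above: generic attention layers operate on (batch, tokens, channels) sequences, so any spatial layout can feed the same block once flattened. A hypothetical helper (not from the PR) illustrating why the same attention could serve both 2D grids and 3D volumes:

```python
import torch


def flatten_to_tokens(x: torch.Tensor) -> torch.Tensor:
    """Flatten any channels-first spatial layout (2D grid, 3D volume, ...)
    into the (batch, tokens, channels) shape a generic attention layer
    expects. Hypothetical helper for illustration only."""
    b, c = x.shape[0], x.shape[1]
    # Collapse all trailing spatial dims into one token axis,
    # then move channels last.
    return x.reshape(b, c, -1).transpose(1, 2)


# A (B, C, H, W) grid and a (B, C, D, H, W) volume both become token sequences:
grid = torch.randn(2, 16, 8, 8)       # 2D data -> 64 tokens of width 16
volume = torch.randn(2, 16, 4, 4, 4)  # 3D data -> 64 tokens of width 16
assert flatten_to_tokens(grid).shape == (2, 64, 16)
assert flatten_to_tokens(volume).shape == (2, 64, 16)
```

If the attention math never inspects the spatial layout, only the tokenization step needs to be dimension-aware, which is what would make a shared, modular attention block feasible.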

1 comment thread on physicsnemo/experimental/models/geotransolver/gale.py (outdated)
8 comment threads on physicsnemo/experimental/models/geotransolver/gaflare.py (outdated)
@vpuri3 left a comment

Hi @dakhare-creator, I've left several comments in the code. Implementing FLARE directly on a geometry problem can be error-prone. I'd recommend first validating FLARE on one of the paper's benchmark problems

https://github.com/vpuri3/FLARE.py/blob/master/pdebench/models/flare.py

and then exposing the relevant layers to the geometry solver's interface.

2 comment threads on physicsnemo/experimental/models/geotransolver/gaflare.py (outdated)
3 comment threads on physicsnemo/experimental/models/geotransolver/gale_fa.py

mnabian commented Feb 25, 2026

/blossom-ci

@coreyjadams (Collaborator) commented

Hi @dakhare-creator - with the geometry encoded FLARE, does this also enable standard FLARE attention and model?

@dakhare-creator (Contributor, Author) commented

> Hi @dakhare-creator - with the geometry encoded FLARE, does this also enable standard FLARE attention and model?

Hi @coreyjadams, yes: if the context is None, the attention mechanism reduces to standard FLARE.
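The fallback described in this reply can be sketched as a single-head layer where a geometry context, when present, conditions the keys and values, and where context=None degenerates to plain attention. All names here are illustrative, not taken from the PR:

```python
from typing import Optional

import torch
import torch.nn as nn


class GeometryAwareAttention(nn.Module):
    """Hypothetical sketch: geometry-aware when given a context,
    plain (standard FLARE-style) attention when context is None."""

    def __init__(self, dim: int):
        super().__init__()
        self.scale = dim ** -0.5
        self.to_q = nn.Linear(dim, dim, bias=False)
        self.to_k = nn.Linear(dim, dim, bias=False)
        self.to_v = nn.Linear(dim, dim, bias=False)
        self.ctx_proj = nn.Linear(dim, dim, bias=False)

    def forward(self, x: torch.Tensor, context: Optional[torch.Tensor] = None) -> torch.Tensor:
        q, k, v = self.to_q(x), self.to_k(x), self.to_v(x)
        if context is not None:
            # Geometry-aware path: shift keys/values by the projected context.
            k = k + self.ctx_proj(context)
            v = v + self.ctx_proj(context)
        # With context=None this is plain scaled dot-product attention.
        attn = (q @ k.transpose(-2, -1) * self.scale).softmax(dim=-1)
        return attn @ v
```

The design keeps a single code path: the standard model is just the geometry-aware one invoked without a context, so no separate FLARE class is strictly required.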


mnabian commented Feb 27, 2026

/blossom-ci

@mnabian mnabian requested a review from coreyjadams February 28, 2026 01:16
Comment thread physicsnemo/experimental/models/transolver/transolver.py Outdated
Comment thread physicsnemo/experimental/nn/flare_attention.py
@mnabian mnabian self-assigned this Mar 11, 2026

vpuri3 commented Mar 17, 2026

Hi all, is there any update on this?

@mnabian mnabian requested a review from coreyjadams April 13, 2026 23:29

mnabian commented Apr 13, 2026

/blossom-ci


mnabian commented Apr 13, 2026

/blossom-ci

1 similar comment

@mnabian mnabian enabled auto-merge April 13, 2026 23:46
@mnabian mnabian requested review from mnabian and removed request for mnabian April 14, 2026 00:45
@mnabian mnabian requested review from vpuri3 and removed request for vpuri3 April 14, 2026 00:48

mnabian commented Apr 14, 2026

/blossom-ci

@mnabian mnabian added this pull request to the merge queue Apr 14, 2026
Merged via the queue into NVIDIA:main with commit ef06ef0 Apr 14, 2026
4 checks passed
4 participants