
(Bug) Update sentence_encoder.py: clamping cos_sim between -1 and 1 to avoid floating point precision errors in torch.acos(cos_sim) #804

Merged
qiyanjun merged 1 commit into QData:master from Aniloid2:patch-1 on Apr 17, 2026

Conversation

@Aniloid2
Contributor

What does this PR do?

Summary

If we compare two equal embeddings, emb1 == emb2, the cosine similarity should be exactly 1. However, due to floating point precision, we can end up with a value slightly greater than 1, such as 1.00004. Since acos is undefined outside [-1, 1], torch.acos(cos_sim) returns NaN, causing get_angular_sim to return NaN instead of 1. Clamping with cos_sim = torch.clamp(cos_sim, -1.0, 1.0) keeps cos_sim within the valid domain expected by torch.acos(cos_sim).
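
For context, here is a minimal sketch of the patched function; the exact body in sentence_encoder.py may differ slightly, but the clamp sits between the cosine computation and acos:

```python
import math

import torch


def get_angular_sim(emb1, emb2):
    """Angular similarity, in [0, 1], between two batches of embeddings."""
    cos_sim = torch.nn.CosineSimilarity(dim=1)(emb1, emb2)
    # Floating point error can push cos_sim slightly outside [-1, 1]
    # (e.g. 1.00004 for identical embeddings), and torch.acos returns
    # NaN outside that domain. Clamp before calling acos.
    cos_sim = torch.clamp(cos_sim, -1.0, 1.0)
    return 1 - (torch.acos(cos_sim) / math.pi)
```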

I am using TextAttack to perform attacks on LLMs. For testing, I mostly run custom attacks that lead to different embeddings, emb1 and emb2. Occasionally, my attacks do not change any words, but due to the internal randomness of LLMs during the attack search, performing a second inference step results in a misclassification. Since the two samples are the same but classified differently during the USE metric evaluation, they should result in a cosine similarity of 1. However, I am encountering NaN values after conducting USE evaluations. I found that the issue is due to floating-point precision.
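
The failure is easy to reproduce in isolation:

```python
import torch

cos_sim = torch.tensor(1.00004)  # slightly outside acos's valid domain
print(torch.acos(cos_sim))                          # tensor(nan)
print(torch.acos(torch.clamp(cos_sim, -1.0, 1.0)))  # tensor(0.)
```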

Additions

  • Added a torch.clamp to avoid floating point precision errors

Changes

  • No changes

Deletions

  • No deletions made

Checklist

  • [x] The title of your pull request should be a summary of its contribution.
  • [x] Please write a detailed description of what parts have been newly added and what parts have been modified. Please also explain why certain changes were made.
  • [x] If your pull request addresses an issue, please mention the issue number in the pull request description to make sure they are linked (and people consulting the issue know you are working on it).
  • [x] To indicate a work in progress, please mark it as a draft on GitHub.
  • [x] Make sure existing tests pass.
  • [x] Add relevant tests. No quality testing = no merge.
  • [x] All public methods must have informative docstrings that work nicely with sphinx. For new modules/files, please add/modify the appropriate .rst file in TextAttack/docs/apidoc.

Collaborator

@yanjunqiAz left a comment

LGTM. This is a well-known floating point issue — cosine similarity can produce values like 1.0000001 due to precision, which makes acos return NaN. The clamp is the standard fix.

Verified the context: the fix goes right between the cosine computation and acos in get_angular_sim(). Correct and safe.

@qiyanjun qiyanjun merged commit 96b92be into QData:master Apr 17, 2026
