Fix attention backward dropout #81
Open
Issue #, if available:
None, although closely related to aws-neuron/aws-neuron-sdk#1156.
Description of changes:
The existing dropout implementation in the flash attention backward kernel had a couple of issues:

- Used `softmax_y` post-dropout for computing `softmax_dx_local`; the softmax backward step should use the pre-dropout softmax output.
- Did not apply dropout to `softmax_dy` before using it to compute `softmax_dx_local` (subsequently used to compute `dq` and `dk`).

The CR updates the implementation to correctly comply with the reference pseudocode provided in https://arxiv.org/pdf/2205.14135 (Section B.4, Algorithm 4).
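For reference, here is a minimal NumPy sketch of the corrected math, following Algorithm 4 of the paper above. The variable names (`softmax_y`, `softmax_dy`, `softmax_dx_local`) mirror the kernel's, but the code itself is illustrative, not the NKI implementation:

```python
import numpy as np

def attn_bwd_dropout_reference(q, k, v, do, keep_mask, p, scale):
    """Illustrative reference for the attention backward pass with dropout
    (https://arxiv.org/pdf/2205.14135, Section B.4, Algorithm 4). Not the
    NKI kernel; q/k/v/do are (seq, d), keep_mask is a 0/1 (seq, seq) array."""
    s = (q @ k.T) * scale
    softmax_y = np.exp(s - s.max(axis=-1, keepdims=True))
    softmax_y /= softmax_y.sum(axis=-1, keepdims=True)   # P, pre-dropout
    y_dropped = softmax_y * keep_mask / (1.0 - p)        # P after dropout

    dv = y_dropped.T @ do                                # dV sees the post-dropout P

    softmax_dy = do @ v.T
    # Fix 1: apply dropout to softmax_dy before the softmax backward step.
    softmax_dy = softmax_dy * keep_mask / (1.0 - p)

    # Fix 2: softmax backward uses the PRE-dropout softmax_y, not y_dropped.
    row = np.sum(softmax_dy * softmax_y, axis=-1, keepdims=True)
    softmax_dx_local = softmax_y * (softmax_dy - row)    # dS

    dq = (softmax_dx_local @ k) * scale
    dk = (softmax_dx_local.T @ q) * scale
    return dq, dk, dv
```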
Testing:
Please see detailed unit test requirements in the CONTRIBUTING.md
Tested via `nki.baremetal` (accuracy) and `nki.benchmark` (performance). I tested locally against a golden function to make sure the output is accurate and performance is as expected, with and without dropout.
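As a sketch of that style of check (a hypothetical harness, not the actual test code), the reference function above can serve as the golden output to compare against the kernel's results:

```python
rng = np.random.default_rng(0)
seq, d, p, scale = 128, 64, 0.1, 1.0 / 8.0
q, k, v, do = (rng.standard_normal((seq, d), dtype=np.float32) for _ in range(4))
# Assumes the same keep-mask is fed to both the kernel and the golden function.
keep_mask = (rng.random((seq, seq)) >= p).astype(np.float32)

dq_gold, dk_gold, dv_gold = attn_bwd_dropout_reference(q, k, v, do, keep_mask, p, scale)
# The dq/dk/dv produced by the kernel (run via nki.baremetal) would then be
# compared elementwise against these golden values, e.g. with
# np.testing.assert_allclose(dq_kernel, dq_gold, rtol=1e-2, atol=1e-2).
```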
Pull Request Checklist