
Add support for automatic mixed precision #31

Open
bpopeters wants to merge 18 commits into master from mixed-precision

Conversation

@bpopeters
Collaborator

This pull request adds decorators to EntmaxBisectFunction, SparsemaxBisectFunction, Entmax15Function, and SparsemaxFunction so that they cast their inputs to fp32 when they run inside an autocast context. This seems to be the right thing to do because bf16 and fp16 introduce numerical stability issues that cannot easily be solved. Upcasting should make it easy to incorporate entmax into any mixed-precision setting.

The actual code that needed to be written was incredibly simple; the vast majority of commits are me making silly mistakes while writing the tests.
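For reference, a minimal sketch of the pattern (using PyTorch's `torch.cuda.amp.custom_fwd`/`custom_bwd` decorators with `cast_inputs=torch.float32`; the Function body here is a placeholder, not the actual entmax code):

```python
import torch
from torch.cuda.amp import custom_fwd, custom_bwd

class _UpcastFunction(torch.autograd.Function):
    """Toy autograd Function illustrating the upcasting pattern."""

    @staticmethod
    @custom_fwd(cast_inputs=torch.float32)  # under autocast, floating CUDA inputs arrive as fp32
    def forward(ctx, x):
        # placeholder body; the real forward would compute the sparsemax/entmax mapping
        ctx.save_for_backward(x)
        return x.clone()

    @staticmethod
    @custom_bwd  # runs backward with the same autocast state as forward
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        return grad_output
```

With the cast in the forward, callers can wrap entmax calls in an autocast context without hitting the fp16/bf16 stability issues, at the cost of computing the mapping itself in fp32.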

@bpopeters
Collaborator Author

The tests pass, but they shouldn't -- torch.amp was only added in 1.10.
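One way to handle this (a hypothetical version guard, not necessarily the fix this PR takes; the test body is illustrative) would be to skip the autocast tests on older PyTorch:

```python
import pytest
import torch
from packaging import version

# Hypothetical guard: skip autocast tests when the installed
# PyTorch predates torch.amp (added in 1.10).
requires_amp = pytest.mark.skipif(
    version.parse(torch.__version__) < version.parse("1.10"),
    reason="torch.amp was added in PyTorch 1.10",
)

@requires_amp
@pytest.mark.skipif(not torch.cuda.is_available(), reason="needs a GPU")
def test_entmax15_autocast():
    from entmax import entmax15

    x = torch.randn(4, 8, device="cuda")
    with torch.autocast("cuda", dtype=torch.float16):
        p = entmax15(x, dim=-1)
    # with the upcasting decorators in place, the output stays fp32
    assert p.dtype == torch.float32
```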
