
Commit 26b4f71

EnayatUllah authored and facebook-github-bot committed
Fix DistributedDP Optimizer for Fast Gradient Clipping (#662)
Summary:
Pull Request resolved: #662

The step function incorrectly called "original_optimizer.original_optimizer" instead of "original_optimizer". This is now fixed.

Differential Revision: D60484128
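For context, here is a minimal sketch of the wrapper pattern the fix touches (the class name and the pre_step/reduce_gradients bodies are simplified placeholders, not the real Opacus implementation): the DP optimizer stores the user's torch optimizer as original_optimizer, so dereferencing that attribute twice lands on a plain torch.optim.Optimizer, which has no original_optimizer attribute and raises AttributeError.

```python
import torch

# Minimal sketch of the wrapper pattern; simplified, not the real Opacus class.
class DistributedDPOptimizerSketch:
    def __init__(self, optimizer: torch.optim.Optimizer):
        # The wrapped optimizer is a plain torch optimizer.
        self.original_optimizer = optimizer

    def pre_step(self) -> bool:
        return True  # per-sample clipping / noise checks omitted in this sketch

    def reduce_gradients(self) -> None:
        pass  # gradient all-reduce across workers omitted in this sketch

    def step(self):
        if self.pre_step():
            self.reduce_gradients()
            # Buggy call: self.original_optimizer is already the torch optimizer,
            # so chaining .original_optimizer again raises AttributeError.
            # return self.original_optimizer.original_optimizer.step()
            return self.original_optimizer.step()  # the fixed call
        else:
            return None
```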
1 parent f1d0e02 commit 26b4f71

2 files changed: +3 -1 lines changed


opacus/__init__.py
2 additions & 0 deletions

```diff
@@ -15,13 +15,15 @@
 
 from . import utils
 from .grad_sample import GradSampleModule
+from .grad_sample_fast_gradient_clipping import GradSampleModuleFastGradientClipping
 from .privacy_engine import PrivacyEngine
 from .version import __version__
 
 
 __all__ = [
     "PrivacyEngine",
     "GradSampleModule",
+    "GradSampleModuleFastGradientClipping",
     "utils",
     "__version__",
 ]
```
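With the re-export above, the fast gradient clipping module is importable from the top-level package. A usage sketch (the max_grad_norm argument shown here is an illustrative assumption about the wrapper's constructor):

```python
import torch.nn as nn

# After this change, the class can be imported from the top-level package
# rather than from opacus.grad_sample_fast_gradient_clipping directly.
from opacus import GradSampleModuleFastGradientClipping

model = nn.Linear(16, 2)
# max_grad_norm is an illustrative assumption about the wrapper's arguments.
wrapped = GradSampleModuleFastGradientClipping(model, max_grad_norm=1.0)
```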

opacus/optimizers/ddpoptimizer_fast_gradient_clipping.py
1 addition & 1 deletion

```diff
@@ -76,6 +76,6 @@ def step(
 
         if self.pre_step():
             self.reduce_gradients()
-            return self.original_optimizer.original_optimizer.step()
+            return self.original_optimizer.step()
         else:
             return None
```
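With the corrected delegation, step() behaves like a standard optimizer step inside a distributed training loop. A hedged sketch of the call site (the surrounding loop is illustrative; a real setup would construct the optimizer through Opacus's PrivacyEngine):

```python
# Illustrative call site; model, criterion, batch, and targets are assumed
# to come from an existing distributed (DDP) training setup.
optimizer.zero_grad()
loss = criterion(model(batch), targets)
loss.backward()
# pre_step() handles clipping/noising, reduce_gradients() syncs workers,
# and the call now correctly delegates to original_optimizer.step().
optimizer.step()
```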
