Commit 6a674f9
Fix DistributedDPOptimizer for Fast Gradient Clipping (meta-pytorch#662)
Summary:
Pull Request resolved: meta-pytorch#662
The `step` function incorrectly called `original_optimizer.original_optimizer` instead of `original_optimizer`; this is now fixed.
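The bug described above can be illustrated with a minimal sketch. The class and method names below are hypothetical stand-ins, not the actual Opacus code: a DP-style wrapper stores the wrapped optimizer in `original_optimizer` and delegates `step()` to it, and the bug is one extra level of `.original_optimizer` indirection.

```python
class ToyOptimizer:
    """Stand-in for a plain torch.optim-style optimizer."""

    def __init__(self):
        self.steps = 0

    def step(self):
        self.steps += 1


class DistributedWrapper:
    """Hypothetical wrapper that delegates step() to the wrapped optimizer."""

    def __init__(self, original_optimizer):
        self.original_optimizer = original_optimizer

    def step_buggy(self):
        # Bug: one level of indirection too many. The inner optimizer has
        # no .original_optimizer attribute, so this raises AttributeError.
        return self.original_optimizer.original_optimizer.step()

    def step_fixed(self):
        # Fix: call the wrapped optimizer directly.
        return self.original_optimizer.step()


opt = DistributedWrapper(ToyOptimizer())
opt.step_fixed()
assert opt.original_optimizer.steps == 1
```

The one-line fix in the commit amounts to dropping the redundant `.original_optimizer` hop, as `step_fixed` does here.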
Differential Revision: D60484128
1 parent 4804a51, commit 6a674f9
File tree: opacus/optimizers (2 files changed, +3 / -1 lines)
[Diff 1: code content not captured in the extraction; two lines added, at new line numbers 18 and 26.]
[Diff 2: code content not captured in the extraction; line 79 replaced (one line removed, one added).]