
Commit 24bb876

Igor Shilov authored and facebook-github-bot committed
reduce logging severity for set_to_none (#471)
Summary: We want to warn users about an unexpected behaviour of the `set_to_none` flag. Normally, both `nn.Module` and `Optimizer` let clients choose whether they want to remove the `.grad` attribute altogether or just set it to None. We, on the other hand, don't want to remove the attributes: it's more convenient to assume the `.grad_sample` attribute is always present. It's not an absolute requirement, but we did that historically and I don't see a case for changing it now.

However, the default value for `set_to_none` is False, meaning most users were getting annoying log messages on every training step. This change reduces the severity of those messages from info to debug.

Pull Request resolved: #471
Reviewed By: karthikprasad
Differential Revision: D38741747
Pulled By: ffuuugor
fbshipit-source-id: e77c4ec90c7aa70866d2f073034bd2fd2c27c476
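The behaviour described above can be sketched in plain Python. `TinyParam` and this standalone `zero_grad` are hypothetical stand-ins for a torch parameter and the Opacus method, used only to illustrate that the flag affects `.grad` but never `.grad_sample`; this is a minimal sketch, not the actual Opacus implementation.

```python
import logging

logger = logging.getLogger("opacus")


class TinyParam:
    """Hypothetical stand-in for a parameter carrying both a regular
    .grad and an Opacus per-sample .grad_sample attribute."""

    def __init__(self):
        self.grad = [1.0, 2.0]
        self.grad_sample = [[1.0], [2.0]]


def zero_grad(params, set_to_none=False):
    # Mirrors the behaviour described in the summary: the flag only
    # controls regular gradients; .grad_sample is always set to None,
    # while the attribute itself is kept so later code can rely on it.
    if set_to_none is False:
        logger.debug(
            "Despite set_to_none is set to False, "
            "opacus will set p.grad_sample to None due to "
            "non-trivial gradient accumulation behaviour"
        )
    for p in params:
        p.grad = None if set_to_none else [0.0] * len(p.grad)
        p.grad_sample = None  # regardless of the flag


params = [TinyParam()]
zero_grad(params, set_to_none=False)
assert params[0].grad == [0.0, 0.0]       # zeroed in place, not removed
assert params[0].grad_sample is None      # always reset to None
assert hasattr(params[0], "grad_sample")  # attribute itself is kept
```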
1 parent dd53ccc commit 24bb876

File tree

2 files changed: +2 −2 lines changed


opacus/grad_sample/gsm_base.py

Lines changed: 1 addition & 1 deletion
@@ -94,7 +94,7 @@ def zero_grad(self, set_to_none: bool = False):
         affects regular gradients. Per sample gradients are always set to None)
         """
         if set_to_none is False:
-            logger.info(
+            logger.debug(
                 "Despite set_to_none is set to False, "
                 "opacus will set p.grad_sample to None due to "
                 "non-trivial gradient accumulation behaviour"

opacus/optimizers/optimizer.py

Lines changed: 1 addition & 1 deletion
@@ -462,7 +462,7 @@ def zero_grad(self, set_to_none: bool = False):
         """

         if set_to_none is False:
-            logger.info(
+            logger.debug(
                 "Despite set_to_none is set to False, "
                 "opacus will set p.grad_sample and p.summed_grad to None due to "
                 "non-trivial gradient accumulation behaviour"
