
Comparing changes

Choose two branches to see what's changed or to start a new pull request.

base repository: meta-pytorch/opacus
base: 3c32510
head repository: meta-pytorch/opacus
compare: d701dc3
  • 1 commit
  • 1 file changed
  • 1 contributor

Commits on Jan 25, 2025

  1. Modifying DPLossFastGradientClipping to add support for generative tasks with ghost clipping (#722)
    
    Summary:
    Pull Request resolved: #722
    
    Generative NLP tasks output predictions of shape (B, T, C), i.e., (batch_size, sequence_length, vocab_size). To compute the cross-entropy loss in this case, the predictions are usually reshaped to (BxT, C) and the targets to (BxT). This creates an issue for the Ghost Clipping per-sample loss computation, because BxT is treated as the batch size: the current implementation produces a loss_per_sample variable of shape BxT while the coeff variable has shape B, causing a shape mismatch error. This diff fixes the error by collapsing loss_per_sample to shape B, i.e., the loss is averaged/summed across the sequence_length dimension.
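    The shape problem and the fix can be sketched in plain NumPy (this is an illustrative mock, not Opacus's actual DPLossFastGradientClipping code; the variable names loss_per_sample and the B/T/C shapes follow the summary above, and per_token_cross_entropy is a hypothetical helper):

    ```python
    import numpy as np

    def per_token_cross_entropy(logits, targets):
        """Cross-entropy per token for flattened logits (B*T, C) and targets (B*T,)."""
        logits = logits - logits.max(axis=-1, keepdims=True)  # numerical stability
        log_probs = logits - np.log(np.exp(logits).sum(axis=-1, keepdims=True))
        return -log_probs[np.arange(targets.size), targets]   # shape (B*T,)

    B, T, C = 4, 8, 10  # batch_size, sequence_length, vocab_size
    rng = np.random.default_rng(0)
    logits = rng.normal(size=(B, T, C))
    targets = rng.integers(0, C, size=(B, T))

    # The usual reshape for generative tasks: (B, T, C) -> (B*T, C), (B, T) -> (B*T,)
    flat_loss = per_token_cross_entropy(logits.reshape(-1, C), targets.reshape(-1))

    # Before the fix: a per-"sample" loss of shape (B*T,) mismatches per-sample
    # clipping coefficients of shape (B,).
    # The fix: collapse the sequence dimension so loss_per_sample has shape (B,).
    loss_per_sample = flat_loss.reshape(B, T).mean(axis=1)  # or .sum(axis=1)
    assert loss_per_sample.shape == (B,)
    ```

    Averaging versus summing over the sequence dimension should match whichever reduction the loss criterion uses, so the clipped gradients stay consistent with the non-private loss.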
    
    Differential Revision: D68047256
    aparna-aketi authored and facebook-github-bot committed Jan 25, 2025