MixedPrecision¶
- class lightning.pytorch.plugins.precision.MixedPrecision(precision, device, scaler=None)[source]¶
 Bases: Precision
 Plugin for Automatic Mixed Precision (AMP) training with torch.autocast.
- Parameters:
 - precision – The precision mode to use: '16-mixed' (torch.float16) or 'bf16-mixed' (torch.bfloat16).
 - device – The device string passed to torch.autocast (e.g. 'cuda').
 - scaler – An optional torch.cuda.amp.GradScaler to use.
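In most setups this plugin is created implicitly by Trainer(precision="16-mixed"). The sketch below is an illustration (not taken from the upstream docs) of constructing it explicitly with a custom scaler, assuming a CUDA device:

```python
import torch
from lightning.pytorch import Trainer
from lightning.pytorch.plugins.precision import MixedPrecision

# Optional: supply a custom GradScaler; if omitted, a default scaler is
# created for float16 mixed precision.
scaler = torch.cuda.amp.GradScaler(init_scale=2.0**14)
plugin = MixedPrecision(precision="16-mixed", device="cuda", scaler=scaler)

# Pass the plugin to the Trainer instead of (not in addition to) `precision=...`.
trainer = Trainer(accelerator="gpu", devices=1, plugins=[plugin])
```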
- clip_gradients(optimizer, clip_val=0.0, gradient_clip_algorithm=GradClipAlgorithmType.NORM)[source]¶
 Clips the gradients.
- Return type:
 None
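This method is normally invoked by the Trainer rather than called directly. As a brief illustration (not part of the upstream docs), clipping is usually requested through Trainer arguments, which are forwarded to the active precision plugin:

```python
from lightning.pytorch import Trainer

# The Trainer routes these settings into clip_gradients() on the active
# precision plugin during the optimizer step.
trainer = Trainer(
    precision="16-mixed",
    gradient_clip_val=1.0,           # becomes clip_val
    gradient_clip_algorithm="norm",  # "norm" or "value"
)
```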
- load_state_dict(state_dict)[source]¶
 Called when loading a checkpoint; implement to reload the precision plugin state given the precision plugin state_dict.
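A small illustrative round-trip (not from the upstream docs) of the state this hook restores, e.g. the GradScaler state for '16-mixed':

```python
from lightning.pytorch.plugins.precision import MixedPrecision

plugin = MixedPrecision(precision="16-mixed", device="cuda")
state = plugin.state_dict()    # captured into the Lightning checkpoint
plugin.load_state_dict(state)  # called again when the checkpoint is loaded
```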
- optimizer_step(optimizer, model, closure, **kwargs)[source]¶
 Hook to run the optimizer step.
- Return type:
 Any
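For orientation, here is a rough plain-PyTorch sketch (illustrative, with hypothetical `model`, `optimizer`, and `batch`) of the scaled AMP step sequence this hook wraps: the loss is computed under autocast, the scaled loss is backpropagated, and the scaler steps the optimizer and updates its scale factor.

```python
import torch

scaler = torch.cuda.amp.GradScaler()

def amp_step(model, optimizer, batch):
    optimizer.zero_grad()
    with torch.autocast(device_type="cuda", dtype=torch.float16):
        loss = model(batch).mean()   # stand-in for the real loss computation
    scaler.scale(loss).backward()    # backward on the scaled loss
    scaler.step(optimizer)           # unscales grads; skips the step on inf/NaN
    scaler.update()                  # adjusts the loss-scale factor
```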