deepmr.optim.PGDStep
- class deepmr.optim.PGDStep(*args: Any, **kwargs: Any)
 Proximal Gradient Method step.
This represents propagation through a single iteration of a Proximal Gradient Descent algorithm; it can be used to build unrolled architectures.
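For reference, a single iteration in this setting typically takes the standard proximal gradient form (a sketch inferred from the attributes below; the exact update used by deepmr, including any convergence check, may differ):

$$x_{k+1} = D\bigl(x_k - \gamma\,(A^H A\, x_k - A^H y)\bigr)$$

where $\gamma$ is the step size and the denoiser D plays the role of the proximal operator.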
- AHA
 Normal operator AHA = AH * A.
- Type:
 Callable | torch.Tensor
- AHy
 Adjoint AH of measurement operator A applied to the measured data y.
- Type:
 torch.Tensor
- D
 Signal denoiser for plug-and-play restoration.
- Type:
 Callable
- trainable
 If True, the gradient update step is trainable; otherwise it is not. The default is False.
- Type:
 bool, optional
Methods
- __init__(step, AHA, AHy, D[, trainable, tol])
- check_convergence(output, input, step)
- forward(input[, q])
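Below is a minimal usage sketch for unrolling several iterations, assuming PGDStep behaves like a torch.nn.Module whose forward call returns the updated estimate; the toy normal operator, data, and denoiser are hypothetical placeholders, not part of deepmr:

```python
import torch
import deepmr

# Hypothetical toy problem (placeholders): dense normal operator and
# adjoint-projected data for an 8-dimensional signal.
AHA = 2.0 * torch.eye(8)   # stand-in for the normal operator A^H A
AHy = torch.randn(8)       # stand-in for A^H y

def D(x):
    # Stand-in plug-and-play denoiser: soft-thresholding with threshold 0.1.
    return torch.sign(x) * torch.clamp(x.abs() - 0.1, min=0.0)

# Step size 0.5; trainable defaults to False per the attribute above.
pgd = deepmr.optim.PGDStep(0.5, AHA, AHy, D)

x = torch.zeros(8)         # initial guess
for _ in range(10):        # unrolled PGD iterations
    x = pgd(x)             # assumes forward(input) returns the new estimate
```

If the update is to be learned end-to-end, passing trainable=True marks the gradient update step as trainable, per the attribute description above.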