Bases: Module, ABC
Abstract base class for distribution losses. Concrete subclasses should
compute the loss from the prior and posterior distributions and from parameters sampled from the posterior.
The forward method should implement the loss logic for a single set of sampled weights.
Source code in src/methods/bayes/base/optimization.py
class BaseLoss(torch.nn.Module, ABC):
    """
    Abstract base class for distribution losses. Concrete subclasses should
    compute the loss from the prior and posterior distributions and from
    parameters sampled from the posterior. The forward method should
    implement the loss logic for a single set of sampled weights.
    """

    @abstractmethod
    def forward(self, *args, **kwargs) -> torch.Tensor:
        """
        Computes the loss for one set of sampled parameters.
        """
        ...
forward(*args, **kwargs)
abstractmethod
Computes the loss for one set of sampled parameters.
Source code in src/methods/bayes/base/optimization.py
@abstractmethod
def forward(self, *args, **kwargs) -> torch.Tensor:
    """
    Computes the loss for one set of sampled parameters.
    """
    ...
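To make the contract concrete, here is a minimal sketch of a possible subclass. The class name `MCKLLoss` and its argument names are hypothetical (not part of this codebase): it computes a single-sample Monte Carlo estimate of the KL divergence between the posterior and the prior, evaluated at one weight vector sampled from the posterior, which is exactly the "loss for one set of sampled weights" shape that `forward` is expected to have.

```python
import torch
from abc import ABC, abstractmethod


class BaseLoss(torch.nn.Module, ABC):
    """Abstract distribution loss, as defined above."""

    @abstractmethod
    def forward(self, *args, **kwargs) -> torch.Tensor:
        ...


class MCKLLoss(BaseLoss):
    """Hypothetical subclass: single-sample Monte Carlo estimate of
    KL(posterior || prior), evaluated at one sampled weight tensor."""

    def forward(
        self,
        posterior: torch.distributions.Distribution,
        prior: torch.distributions.Distribution,
        sampled_w: torch.Tensor,
    ) -> torch.Tensor:
        # log q(w) - log p(w) for a single posterior sample w;
        # summed over all weight dimensions to yield a scalar loss.
        return (posterior.log_prob(sampled_w) - prior.log_prob(sampled_w)).sum()
```

When the posterior equals the prior, the estimate is exactly zero for any sample, which is a quick sanity check for an implementation like this.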