Base Bayesian NN
BaseBayesNet
Bases: Module
General envelope around an arbitrary nn.Module that contains both plain nn.Modules and BayesModules as submodules.
Source code in src/methods/bayes/base/net.py
device
property
Return device of net
posterior: dict[str, ParamDist]
property
Returns posterior distribution for each weight
Returns:

| Type | Description |
|---|---|
| `dict[str, ParamDist]` | dictionary where the key is the name of a weight and the value is its posterior distribution |
prior: dict[str, Optional[ParamDist]]
property
Returns prior distribution for each weight
Returns:

| Type | Description |
|---|---|
| `dict[str, Optional[ParamDist]]` | dictionary where the key is the name of a weight and the value is its prior distribution, or `None` if the weight has no prior |
weights: dict[str, nn.Parameter]
property
Returns all weights in base_module
Returns:

| Type | Description |
|---|---|
| `dict[str, nn.Parameter]` | dictionary where the key is the name of a weight and the value is the weight itself |
__init__(base_module, module_dict)
Wrap base_module and register the submodules from module_dict for training.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `base_module` | `Module` | custom Module that has some BayesModule instances as submodules | required |
| `module_dict` | `dict[str, Module]` | all submodules of base_module that are supposed to be trained; each may be an nn.Module or a BayesModule. This division is required because base_module itself is not registered as a Module in this class. | required |
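The split between `base_module` and `module_dict` matters because `base_module` itself is not registered; only the entries of `module_dict` count as trainable submodules. A torch-free mimic of that contract (`TinyBayesNet` and all names here are illustrative stand-ins, not the library class):

```python
class TinyBayesNet:
    """Stand-in sketch: keep the whole model, but treat only the
    module_dict entries as registered/trainable submodules."""

    def __init__(self, base_module, module_dict):
        self.base_module = base_module      # stored, not "registered"
        self.registered = dict(module_dict) # these are what gets trained

# "fc" is Bayesian and should be trained; "head" stays frozen
model = {"fc": "bayes_fc_stub", "head": "plain_head_stub"}
net = TinyBayesNet(base_module=model, module_dict={"fc": model["fc"]})

print(sorted(net.registered))  # ['fc']
```

In the real class the same idea is expressed through PyTorch's module registration machinery rather than a plain dict.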
Source code in src/methods/bayes/base/net.py
eval()
flush_weights()
This method sets all weights computed by this layer to plain tensors, so the layer works correctly once it is initialized.
Source code in src/methods/bayes/base/net.py
get_path(module_name, parameter_name)
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `module_name` | `str` | module name | required |
| `parameter_name` | `str` | parameter name | required |

Returns:
str: path to weight
Source code in src/methods/bayes/base/net.py
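The docstring does not show the path format, but PyTorch addresses nested parameters as `"<module>.<parameter>"` (e.g. `fc.weight`), so a likely implementation joins the two names with a dot. A stand-alone sketch (`get_path` here is a local stand-in, not the library function):

```python
def get_path(module_name: str, parameter_name: str) -> str:
    # PyTorch's named_parameters() uses dot-separated paths,
    # e.g. "fc.weight" for the weight of submodule "fc"
    return f"{module_name}.{parameter_name}"

print(get_path("fc", "weight"))  # fc.weight
```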
sample()
Sample new parameters of base_module from current posterior distribution
Returns:

| Type | Description |
|---|---|
| `dict[str, nn.Parameter]` | newly sampled parameters in the form of a dictionary |
Source code in src/methods/bayes/base/net.py
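Conceptually, `sample()` draws one value per weight from its posterior and returns the draws keyed by weight name. A torch-free sketch of that loop, with `random.gauss` standing in for `ParamDist.rsample` (all names here are illustrative):

```python
import random

# stand-in for dict[str, ParamDist]: weight name -> (mean, std)
posterior = {"fc.weight": (0.0, 1.0), "fc.bias": (0.0, 0.1)}

def sample(posterior):
    sampled = {}
    for name, (mean, std) in posterior.items():
        # the real class would call the ParamDist's sampling method here
        sampled[name] = random.gauss(mean, std)
    return sampled

params = sample(posterior)
print(sorted(params))  # ['fc.bias', 'fc.weight']
```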
sample_model()
Sample the base model and return a deepcopy of it.
Returns: a deepcopy of the sampled base model
Source code in src/methods/bayes/base/net.py
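Because sampling writes new weights into `base_module` in place, `sample_model()` returns a `deepcopy` so the caller gets a frozen snapshot that later `sample()` calls cannot mutate. The idea, sketched without torch (`ToyNet` and `sample_model` are stand-ins):

```python
import copy
import random

class ToyNet:
    def __init__(self):
        self.weights = {"fc.weight": 0.0}

def sample_model(net):
    # draw new weights in place, then snapshot the whole model
    for name in net.weights:
        net.weights[name] = random.gauss(0.0, 1.0)
    return copy.deepcopy(net)

net = ToyNet()
snapshot = sample_model(net)
net.weights["fc.weight"] = 123.0  # mutate the live net afterwards...
print(snapshot.weights["fc.weight"] == 123.0)  # False: snapshot is frozen
```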
BayesLayer
Bases: Module, ABC
Abstract envelope around an arbitrary nn.Module that substitutes all of its nn.Parameters with ParamDist objects, transforming it into a Bayesian module. The new distribution is a variational distribution that mimics the true posterior distribution.
To specify a Bayesian module with a custom posterior, inherit from this class and specify the fields below.

Attributes:

- prior_distribution_cls: type of prior distribution used in this layer
- posterior_distribution_cls: type of posterior distribution used in this layer
- is_posterior_trainable: whether the posterior is trainable
- is_prior_trainable: whether the prior is trainable
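Under these conventions, a custom Bayesian layer is defined by subclassing `BayesLayer` and filling in those class attributes. A schematic, torch-free sketch with stand-in classes (`GaussianDistStub` and the base class here are illustrative, not the library's types):

```python
class ParamDistStub:
    """Stand-in for ParamDist."""

class GaussianDistStub(ParamDistStub):
    """Stand-in for a Gaussian variational distribution."""

class BayesLayerStub:
    """Stand-in for BayesLayer: subclasses declare these fields."""
    prior_distribution_cls = None
    posterior_distribution_cls = None
    is_posterior_trainable = False
    is_prior_trainable = False

class GaussianBayesLayer(BayesLayerStub):
    # the four fields named in the Attributes section above
    prior_distribution_cls = GaussianDistStub
    posterior_distribution_cls = GaussianDistStub
    is_posterior_trainable = True   # variational parameters are learned
    is_prior_trainable = False      # the prior stays fixed

print(GaussianBayesLayer.is_posterior_trainable)  # True
```

The real subclass would also implement the abstract pieces such as `flush_weights()`.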
Source code in src/methods/bayes/base/net.py
base_module: nn.Module
property
Return the base_module that stores the last sampled module. Return: nn.Module: stored module
device
property
Return device of module
net_distribution = BaseNetDistribution(module, weight_distribution=posterior)
instance-attribute
Posterior net distribution that is trained on data to evaluate the probability of each net given the data
posterior: dict[str, ParamDist]
property
Returns posterior distribution for each weight
Returns:

| Type | Description |
|---|---|
| `dict[str, ParamDist]` | dictionary where the key is the name of a weight and the value is its posterior distribution |
posterior_params = nn.ParameterList()
instance-attribute
Maps weight_name to distribution_args (an nn.ParameterDict); this step is needed so the nn.Parameters of the ParamDists are registered inside this class
prior: dict[str, Optional[ParamDist]] = {}
instance-attribute
Prior distribution for each weight
prior_params = nn.ParameterList()
instance-attribute
Maps weight_name to distribution_args (an nn.ParameterDict); this step is needed so the nn.Parameters of the ParamDists are registered inside this class
weights: dict[str, nn.Parameter]
property
Returns all weights in base_module
Returns:

| Type | Description |
|---|---|
| `dict[str, nn.Parameter]` | dictionary where the key is the name of a weight and the value is the weight itself |
__init__(module)
Wrap module and substitute its nn.Parameters with distributions.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `module` | `Module` | custom Module layer that is going to be converted to a BayesLayer | required |
Source code in src/methods/bayes/base/net.py
eval()
flush_weights()
abstractmethod
This method sets all weights computed by this layer to plain tensors, so the layer works correctly once it is initialized.
forward(*args, **kwargs)
sample()
Sample new parameters from the net distribution. Return: dict[str, nn.Parameter]: newly sampled parameters