lifelong_methods.models package

Submodules

lifelong_methods.models.cosine_linear module

Adapted from https://github.com/hshustc/CVPR19_Incremental_Learning/blob/master/cifar100-class-incremental/modified_linear.py

Reference: [1] Saihui Hou, Xinyu Pan, Chen Change Loy, Zilei Wang, Dahua Lin. Learning a Unified Classifier Incrementally via Rebalancing. CVPR 2019.

class lifelong_methods.models.cosine_linear.CosineLinear(in_features, out_features, sigma: Union[bool, float, int] = True)

Bases: torch.nn.modules.module.Module

reset_parameters()
forward(input_: torch.Tensor)

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.
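A minimal usage sketch for CosineLinear, assuming (following the referenced CVPR 2019 implementation) that the layer scores each input by the cosine similarity between the L2-normalized input and each L2-normalized weight row, optionally scaled by a learned sigma when sigma=True. The feature dimension and class count below are illustrative only:

    import torch
    from lifelong_methods.models.cosine_linear import CosineLinear

    # Illustrative sizes: 512-dim features, 10 classes.
    layer = CosineLinear(in_features=512, out_features=10, sigma=True)

    features = torch.randn(4, 512)   # a batch of 4 feature vectors
    scores = layer(features)         # cosine-similarity logits, expected shape (4, 10)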

class lifelong_methods.models.cosine_linear.SplitCosineLinear(in_features, out_features1, out_features2, sigma: Union[bool, float, int] = True)

Bases: torch.nn.modules.module.Module

forward(x)

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.
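A sketch of how the split variant might be used when the classifier is divided into an old-class head and a new-class head, as in the rebalancing setup of [1]. The sizes are illustrative, and the assumption that the two heads' outputs are concatenated follows the referenced implementation:

    import torch
    from lifelong_methods.models.cosine_linear import SplitCosineLinear

    # Illustrative sizes: 512-dim features, 50 previously seen classes, 10 new classes.
    layer = SplitCosineLinear(in_features=512, out_features1=50, out_features2=10, sigma=True)

    features = torch.randn(4, 512)
    scores = layer(features)   # assumed: old- and new-class logits concatenated, shape (4, 60)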

lifelong_methods.models.mlp module

class lifelong_methods.models.mlp.MLP(input_shape, num_tasks, classes_per_task, num_hidden_layers=1, hidden_sizes=128, multi_head=False)

Bases: torch.nn.modules.module.Module

forward(input_)

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.
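A construction sketch for MLP. It assumes input_shape accepts the shape of a single (unflattened) sample and that forward returns class scores; both points, and all sizes below, are assumptions for illustration:

    import torch
    from lifelong_methods.models.mlp import MLP

    # Illustrative task-incremental setting: 28x28 inputs, 5 tasks with 2 classes each.
    # Whether input_shape is a tuple or a flattened size is an assumption here.
    model = MLP(input_shape=(28, 28), num_tasks=5, classes_per_task=2,
                num_hidden_layers=1, hidden_sizes=128, multi_head=False)

    x = torch.randn(4, 28, 28)   # a batch of 4 samples
    out = model(x)               # class scores; exact shape depends on the multi_head setting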

lifelong_methods.models.resnet module

class lifelong_methods.models.resnet.ResNet(num_classes=10, num_layers=18)

Bases: torch.nn.modules.module.Module

forward(input_)

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.
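A construction sketch for the ImageNet-style ResNet wrapper. The input resolution and the interpretation of the output below are assumptions, not guarantees of this implementation:

    import torch
    from lifelong_methods.models.resnet import ResNet

    # Illustrative: a 10-class ResNet-18; the 3x224x224 input resolution is an assumption.
    model = ResNet(num_classes=10, num_layers=18)

    x = torch.randn(2, 3, 224, 224)
    out = model(x)   # output format (logits, or logits plus features) depends on the implementation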

lifelong_methods.models.resnetcifar module

Taken, with some modifications, from the code written by Yerlan Idelbayev: https://github.com/akamaster/pytorch_resnet_cifar10

Properly implemented ResNets for CIFAR-10 as described in [1]. The implementation and structure of this file are heavily influenced by [2], which is written for ImageNet and does not have option A for the identity shortcut. Moreover, most implementations found on the web are copy-pasted from torchvision's ResNet and have the wrong number of parameters. Proper ResNets for CIFAR-10 (for fair comparison, etc.) have the following numbers of layers and parameters, which this implementation indeed has:

    name       | layers | params
    ResNet20   |     20 | 0.27M
    ResNet32   |     32 | 0.46M
    ResNet44   |     44 | 0.66M
    ResNet56   |     56 | 0.85M
    ResNet110  |    110 | 1.7M
    ResNet1202 |   1202 | 19.4M

Reference: [1] Kaiming He, Xiangyu Zhang, Shaoqing Ren, Jian Sun. Deep Residual Learning for Image Recognition. arXiv:1512.03385

[2] https://github.com/pytorch/vision/blob/master/torchvision/models/resnet.py

class lifelong_methods.models.resnetcifar.ResNetCIFAR(num_classes=10, num_layers=20, relu_last_hidden=False)

Bases: torch.nn.modules.module.Module

forward(input_)

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.
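A construction sketch for the CIFAR-style ResNet. The 3x32x32 input size matches CIFAR-10 and the parameter table above; the exact return format of forward is an assumption:

    import torch
    from lifelong_methods.models.resnetcifar import ResNetCIFAR

    # ResNet-20 for CIFAR-10: 3x32x32 inputs, 10 classes (see the parameter table above).
    model = ResNetCIFAR(num_classes=10, num_layers=20, relu_last_hidden=False)

    x = torch.randn(2, 3, 32, 32)
    out = model(x)   # class scores; exact return format depends on the implementation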

Module contents