mmagic.models.editors.deblurganv2.deblurganv2_generator

Module Contents

Classes

FPNHead

Head for FPNInception, FPNInceptionSimple and FPNMobileNet.

FPN_inception

Inception-based Feature Pyramid Network (FPN) module.

FPNInception

Feature Pyramid Network (FPN) with four feature maps of resolutions 1/4, 1/8, 1/16, 1/32 and num_filter filters for all feature maps.

FPN_inceptionsimple

Simplified Inception-based Feature Pyramid Network (FPN) module.

FPNInceptionSimple

Feature Pyramid Network (FPN) with four feature maps of resolutions 1/4, 1/8, 1/16, 1/32 and num_filter filters for all feature maps.

FPN_mobilenet

MobileNet-based Feature Pyramid Network (FPN) module.

FPNMobileNet

Feature Pyramid Network (FPN) generator with a MobileNet backbone.

DeblurGanV2Generator

Defines the generator for DeblurGAN-v2 with the specified arguments.

Attributes

backbone_list

mmagic.models.editors.deblurganv2.deblurganv2_generator.backbone_list = ['FPNInception', 'FPNMobileNet', 'FPNInceptionSimple'][source]
class mmagic.models.editors.deblurganv2.deblurganv2_generator.FPNHead(num_in, num_mid, num_out)[source]

Bases: torch.nn.Module

Head for FPNInception, FPNInceptionSimple and FPNMobileNet.

forward(x)[source]

Forward function.

Parameters

x (torch.Tensor) – Input tensor.

Returns

The output tensor.

Return type

torch.Tensor
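
A minimal sketch of exercising the head in isolation; the channel sizes and spatial resolution below are illustrative assumptions, not values taken from this module:

import torch
from mmagic.models.editors.deblurganv2.deblurganv2_generator import FPNHead

# Illustrative channel sizes; any (num_in, num_mid, num_out) combination can be used.
head = FPNHead(num_in=256, num_mid=128, num_out=128)
feat = torch.randn(1, 256, 64, 64)   # dummy pyramid feature map
out = head(feat)                      # expected to keep the spatial size, with num_out channels
print(out.shape)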

class mmagic.models.editors.deblurganv2.deblurganv2_generator.FPN_inception(norm_layer, num_filter=256, pretrained='imagenet')[source]

Bases: torch.nn.Module

Base class for all neural network modules.

Your models should also subclass this class.

Modules can also contain other Modules, allowing them to be nested in a tree structure. You can assign the submodules as regular attributes:

import torch.nn as nn
import torch.nn.functional as F

class Model(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 20, 5)
        self.conv2 = nn.Conv2d(20, 20, 5)

    def forward(self, x):
        x = F.relu(self.conv1(x))
        return F.relu(self.conv2(x))

Submodules assigned in this way will be registered, and will have their parameters converted too when you call to(), etc.

Note

As per the example above, an __init__() call to the parent class must be made before assignment on the child.

Variables

training (bool) – Boolean representing whether this module is in training or evaluation mode.

unfreeze()[source]

Unfreeze params.

forward(x)[source]

Forward function.

Parameters

x (torch.Tensor) – Input tensor.

Returns

The output tensor.

Return type

torch.Tensor
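
A common pattern with a pretrained encoder is to keep it frozen early in training and call unfreeze() later. A minimal sketch, assuming the pretrained encoder weights can be downloaded and that unfreeze() re-enables gradient updates on the frozen parameters; the norm-layer choice is illustrative:

import torch.nn as nn
from mmagic.models.editors.deblurganv2.deblurganv2_generator import FPN_inception

# norm_layer is passed as a callable; nn.InstanceNorm2d here is an assumption.
fpn = FPN_inception(norm_layer=nn.InstanceNorm2d, pretrained='imagenet')

# ... train for a few epochs with the backbone frozen ...

fpn.unfreeze()  # re-enable gradient updates for the backbone parameters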

class mmagic.models.editors.deblurganv2.deblurganv2_generator.FPNInception(norm_layer, output_ch=3, num_filter=128, num_filter_fpn=256)[source]

Bases: torch.nn.Module

Feature Pyramid Network (FPN) with four feature maps of resolutions 1/4, 1/8, 1/16, 1/32 and num_filter filters for all feature maps.

unfreeze()[source]

Unfreeze params.

forward(x)[source]

Forward function.

Parameters

x (torch.Tensor) – Input tensor.

Returns

The output tensor.

Return type

torch.Tensor
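
A minimal end-to-end sketch of the generator: feed a blurry image tensor and read back a restored image of the same shape. The norm layer and input resolution are illustrative assumptions, and the pretrained encoder weights are assumed to be available:

import torch
import torch.nn as nn
from mmagic.models.editors.deblurganv2.deblurganv2_generator import FPNInception

net = FPNInception(norm_layer=nn.InstanceNorm2d)  # norm layer is an illustrative choice
net.eval()

blurry = torch.randn(1, 3, 256, 256)  # dummy blurry image; resolution is illustrative
with torch.no_grad():
    restored = net(blurry)
print(restored.shape)  # expected (1, output_ch, 256, 256), with output_ch defaulting to 3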

class mmagic.models.editors.deblurganv2.deblurganv2_generator.FPN_inceptionsimple(norm_layer, num_filters=256)[source]

Bases: torch.nn.Module

Base class for all neural network modules.

Your models should also subclass this class.

Modules can also contain other Modules, allowing them to be nested in a tree structure. You can assign the submodules as regular attributes:

import torch.nn as nn
import torch.nn.functional as F

class Model(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 20, 5)
        self.conv2 = nn.Conv2d(20, 20, 5)

    def forward(self, x):
        x = F.relu(self.conv1(x))
        return F.relu(self.conv2(x))

Submodules assigned in this way will be registered, and will have their parameters converted too when you call to(), etc.

Note

As per the example above, an __init__() call to the parent class must be made before assignment on the child.

Variables

training (bool) – Boolean representing whether this module is in training or evaluation mode.

unfreeze()[source]

Unfreeze params.

forward(x)[source]

Forward function.

Parameters

x (torch.Tensor) – Input tensor.

Returns

The output tensor.

Return type

torch.Tensor

class mmagic.models.editors.deblurganv2.deblurganv2_generator.FPNInceptionSimple(norm_layer, output_ch=3, num_filter=128, num_filter_fpn=256)[source]

Bases: torch.nn.Module

Feature Pyramid Network (FPN) with four feature maps of resolutions 1/4, 1/8, 1/16, 1/32 and num_filter filters for all feature maps.

unfreeze()[source]

Unfreeze the FPN network.

forward(x)[source]

Forward function.

Parameters

x (torch.Tensor) – Input tensor.

Returns

The output tensor.

Return type

torch.Tensor

class mmagic.models.editors.deblurganv2.deblurganv2_generator.FPN_mobilenet(norm_layer, num_filters=128, pretrained=None)[source]

Bases: torch.nn.Module

Base class for all neural network modules.

Your models should also subclass this class.

Modules can also contain other Modules, allowing them to be nested in a tree structure. You can assign the submodules as regular attributes:

import torch.nn as nn
import torch.nn.functional as F

class Model(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 20, 5)
        self.conv2 = nn.Conv2d(20, 20, 5)

    def forward(self, x):
        x = F.relu(self.conv1(x))
        return F.relu(self.conv2(x))

Submodules assigned in this way will be registered, and will have their parameters converted too when you call to(), etc.

Note

As per the example above, an __init__() call to the parent class must be made before assignment on the child.

Variables

training (bool) – Boolean representing whether this module is in training or evaluation mode.

unfreeze()[source]

Unfreeze params.

forward(x)[source]

Forward function.

Parameters

x (torch.Tensor) – Input tensor.

Returns

The output tensor.

Return type

torch.Tensor

class mmagic.models.editors.deblurganv2.deblurganv2_generator.FPNMobileNet(norm_layer, output_ch=3, num_filter=64, num_filter_fpn=128, pretrained=None)[source]

Bases: torch.nn.Module

Base class for all neural network modules.

Your models should also subclass this class.

Modules can also contain other Modules, allowing them to be nested in a tree structure. You can assign the submodules as regular attributes:

import torch.nn as nn
import torch.nn.functional as F

class Model(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 20, 5)
        self.conv2 = nn.Conv2d(20, 20, 5)

    def forward(self, x):
        x = F.relu(self.conv1(x))
        return F.relu(self.conv2(x))

Submodules assigned in this way will be registered, and will have their parameters converted too when you call to(), etc.

Note

As per the example above, an __init__() call to the parent class must be made before assignment on the child.

Variables

training (bool) – Boolean representing whether this module is in training or evaluation mode.

unfreeze()[source]

Unfreeze the FPN network.

forward(x)[source]

Forward function.

Parameters

x (torch.Tensor) – Input tensor.

Returns

The output tensor.

Return type

torch.Tensor
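
FPNMobileNet exposes the same forward interface with a lighter MobileNet encoder. A quick sketch that builds it without pretrained weights and counts its parameters; the constructor choices are illustrative assumptions:

import torch.nn as nn
from mmagic.models.editors.deblurganv2.deblurganv2_generator import FPNMobileNet

net = FPNMobileNet(norm_layer=nn.InstanceNorm2d, pretrained=None)  # no weight loading
n_params = sum(p.numel() for p in net.parameters())
print(f'FPNMobileNet parameters: {n_params / 1e6:.1f} M')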

class mmagic.models.editors.deblurganv2.deblurganv2_generator.DeblurGanV2Generator[source]

Defines the generator for DeblurGAN-v2 with the specified arguments.

Parameters

model (str) – Type of the generator model (see backbone_list above).
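
A minimal construction sketch. It assumes the class acts as a factory that dispatches on the model name from backbone_list and forwards the remaining keyword arguments to the chosen backbone class; the norm-layer argument is illustrative:

import torch.nn as nn
from mmagic.models.editors.deblurganv2.deblurganv2_generator import (
    DeblurGanV2Generator, backbone_list)

print(backbone_list)  # ['FPNInception', 'FPNMobileNet', 'FPNInceptionSimple']

# Assumed call pattern: model name plus the chosen backbone's keyword arguments.
generator = DeblurGanV2Generator(model='FPNMobileNet', norm_layer=nn.InstanceNorm2d)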
